In the Field with our Experts
The consulting experts at Baseline Consulting are in the trenches with our clients day in and day out. They're sharing how-to's and describing new best practices in near real-time!
February 22, 2011
The Dreaded Stairs
By Stephen Putman, Senior Consultant
Recently, a friend of mine posted a link on Facebook that reinforced a philosophy I have held for a long time, one that applies to every activity in life that is not duty-bound:
The Dreaded Stairs (part of The Fun Theory project)
I have long felt that humans do things for two reasons:
A) They're fun
B) They're lucrative
This applies to the field of Data Governance and Quality as it does to everything else. One of the reasons data governance and quality initiatives are not more widely adopted and followed is that the work is not terribly fun - data owners must be identified, policies and processes must be adopted, and the entire process must be monitored and attended once it is in place. It's also not seen as lucrative in a direct sense - the act of cleansing the data in a transaction usually doesn't provide immediate financial reward, and while the implementation of governance and quality initiatives can affect the company's bottom line, the benefits are very difficult to quantify in a traditional sense.
Phil Simon has produced a terrific series for The Data Roundtable on incentive ideas for data quality programs, so I will not address those here - he says it much better than I can. I am concerned with "fun." The video above demonstrates an innovative idea that turns a mundane but healthy activity (climbing stairs) into a joyful experience. What sort of innovative programs can be created to make managing high-quality data fun?
"Fun" is a difficult concept because it means something different to everyone. One way to find out what is "fun" to your employees is to conduct surveys or workshops and ask them directly. Another possibility could be to hold a "company carnival" in your parking lot and award raffle tickets, or a turn at the "boss' dunk tank," to employees who identify quality issues. The White House holds a yearly contest with government employees for the best quality-improvement or cost-savings idea (this is more of an incentive, but some people also consider contests like this fun).
These are just a few ideas off the top of my head - do you have creative people who can come up with others? If it is indeed true that fun makes unpleasant activities more palatable, this would be time well spent to reinforce data governance and quality in your organization.
photo by Robin Fensom via Flickr (Creative Commons license)
Stephen Putman has over 20 years' experience supporting client/server and internet-based operations, from small offices to major corporations. He has extensive experience in a variety of front-end development tools, as well as relational database design and administration, and is extremely effective in project management and leadership roles. He is the co-author of The Data Governance eBook, available at baseline-consulting.com/ebooks.
Feb 22, 2011 6:00:00 AM | data governance, data management, data quality
February 16, 2011
Three-Dimensional Chess
By Stephen Putman, Senior Consultant
I recently read Rob Gonzalez's blog post I've Got a Federated Bridge to Sell You (A Defense of the Warehouse) with great interest - a Semantic Web professional defending a technology that could be displaced by semantics! I agree with Mr. Gonzalez that semantically federated databases are not the answer in all business cases. However, traditional data warehouses and data marts are not the best answer in all cases either, and there are also cases where neither technology is the appropriate solution.
The appropriate technological solution for a given business case depends on a great many factors - a balancing act I like to call "three-dimensional chess."
An organization needs to consider many factors in choosing the right technology to solve an analytical requirement (a scoring sketch follows the list), including:
Efficiency/speed of query return - Is the right data stored or accessed in an efficient manner, and can it be accessed quickly and accurately?
Currency of data - How current is the data that is available?
Flexibility of model - Can the system accept new data inputs of differing structures with a minimum of remodeling and recoding?
Implementation cost, including maintenance - How much does it cost to implement and maintain the system?
Ease of use by end users - Can the data be accessed and manipulated by end users in familiar tools without damage to the underlying data set?
Relative fit to industry and organizational standards - This deals with the long-term maintainability of the system, which I addressed in a recent posting, Making It Fit.
Current staff skillsets/scarcity of resources to implement and maintain - Can your staff implement and maintain the system, or alternately, can you find the necessary resources in the market to do so at a reasonable cost?
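To make the balancing act concrete, here is a minimal weighted-scoring sketch. Everything in it is illustrative: the factor weights, the candidate systems, and the 1-5 scores are placeholder assumptions an organization would replace with its own judgments, not figures from this post.

```python
# Illustrative only: weights and scores are invented placeholders.

FACTORS = {
    "query_speed": 0.20,        # efficiency/speed of query return
    "data_currency": 0.15,
    "model_flexibility": 0.15,
    "cost": 0.20,               # higher score = cheaper to implement/maintain
    "ease_of_use": 0.10,
    "standards_fit": 0.10,
    "staff_skills": 0.10,
}

# Hypothetical 1-5 scores for two candidate architectures.
CANDIDATES = {
    "traditional warehouse": {
        "query_speed": 5, "data_currency": 2, "model_flexibility": 2,
        "cost": 2, "ease_of_use": 4, "standards_fit": 5, "staff_skills": 5,
    },
    "semantically federated database": {
        "query_speed": 3, "data_currency": 5, "model_flexibility": 5,
        "cost": 3, "ease_of_use": 2, "standards_fit": 3, "staff_skills": 2,
    },
}

def weighted_score(scores):
    """Combine per-factor scores using the agreed weights."""
    return sum(FACTORS[factor] * score for factor, score in scores.items())

for name, scores in CANDIDATES.items():
    print(f"{name}: {weighted_score(scores):.2f}")
```

The point of the exercise is not the arithmetic but the conversation it forces: the organization has to state its weights explicitly instead of letting a familiar hammer pick the nail.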
Fortunately, new tools and methodologies are constantly being developed that can optimize one or more of these factors, but balancing all of these sometimes mutually exclusive factors is a very difficult job. There are very few system architects who are well versed in many of the applicable systems, so architects tend to advocate the types of systems they are familiar with, bending requirements to fit the characteristics of the system. This causes the undesirable tendency captured in the saying, "When all you have is a hammer, everything looks like a nail."
Make sure that your organization is taking all factors into account when deciding how to solve an analytical requirement by developing or attracting people who are skilled at playing "three-dimensional chess."
Stephen Putman has over 20 years' experience supporting client/server and internet-based operations, from small offices to major corporations. He has extensive experience in a variety of front-end development tools, as well as relational database design and administration, and is extremely effective in project management and leadership roles. He is the co-author of The Data Governance eBook, available at baseline-consulting.com/ebooks.
Feb 16, 2011 6:00:00 AM | business analytics, data warehousing, information architecture, semantic web
February 01, 2011
Linked Data Today!
By Stephen Putman, Senior Consultant
I begin today with an invitation to a headache...click this link: The Linking Open Data Cloud Diagram
Ouch! That is a really complicated diagram. I believe that the Semantic Web suffers from the same difficulty that many worthy technologies do - the relative impossibility of describing the concept in simple terms, using ideas familiar to the vast majority of the audience. When this happens, the technology gets buried under well-meaning but hopelessly complex diagrams like this one. If you take the time to understand it, the concept is very powerful, but all the circles and lines immediately turn off most people.
Fortunately, there are simple things that you can do in your organization today that will introduce the concept of linked data to your staff and begin to leverage the great power that the concept holds. It will take a bit of transition, but once the idea takes hold you can take it in several more powerful directions.
Many companies treat their applications as islands unto themselves in their basic operations, regardless of any external feeds or reporting that occurs. One result of this is that basic, seldom-changing concepts such as Country, State, and Date/Time are replicated in each system throughout the company. A basic tenet of data management states that managing data in one place is preferable to managing it in several - every time something changes, it must be maintained in however many systems use it.
One of the basic concepts of linked data is that applications use a common repository for data like State, for example, and publish Uniform Resource Identifiers (URIs) - standardized location values that act much like Web-based URLs - for each value in the repository. Applications then link to the URI for the lookup value instead of the proprietary codes in use today. There are efforts to build global shared repositories for this type of data, but it is not necessary to place your trust in those data stores right away - all of this can occur within your company's firewall.
The transition to linked data does not need to be sudden or comprehensive; it can be accomplished incrementally to mitigate disruption to existing systems. Here are actions you can begin right now to start the transition (a small illustration follows the list):
If you are coding an application that uses these common lookups, store the URI in the parent table instead of the proprietary code.
If you are using "shrink-wrap" applications, construct views that reconcile the URIs and the proprietary codes, and encourage their use by end users.
Investigate usage of common repositories in all future development and packaged software acquisition.
Begin investigation of linking company-specific common data concepts, such as department, location, etc.
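As a small illustration of the first two actions, here is a sketch in Python. The repository, URIs, and legacy codes are all hypothetical examples of my own, not anything published in the original post:

```python
# Hypothetical example: a shared State repository keyed by URI.
STATE_REPOSITORY = {
    "http://data.example.com/ref/state/CA": {"name": "California"},
    "http://data.example.com/ref/state/NY": {"name": "New York"},
}

# A reconciliation view for "shrink-wrap" applications: legacy
# proprietary codes mapped to the shared URIs.
LEGACY_CODE_TO_URI = {
    "06": "http://data.example.com/ref/state/CA",
    "36": "http://data.example.com/ref/state/NY",
}

def state_name(record):
    """Resolve the display name for a record that stores a state URI."""
    return STATE_REPOSITORY[record["state_uri"]]["name"]

# A new application stores the URI in its parent table, not the code.
order = {"order_id": 1001, "state_uri": LEGACY_CODE_TO_URI["06"]}
print(state_name(order))  # -> California
```

When a repository value changes, it changes in one place; every application that links to the URI sees the correction.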
Once the transition to a common data store is under way, your organization will have lower administration costs and more consistent data throughout the company. You will also be leading your company into the coming era of linked data processing.
photo by steve_lodefink via Flickr (Creative Commons License)
Stephen Putman has over 20 years' experience supporting client/server and internet-based operations, from small offices to major corporations. He has extensive experience in a variety of front-end development tools, as well as relational database design and administration, and is extremely effective in project management and leadership roles. He is the co-author of The Data Governance eBook, available at information-management.com.
Feb 1, 2011 6:00:00 AM | data integration, data management
January 18, 2011
Succeed Despite Failing
By Stephen Putman, Senior Consultant
I just finished reading a post on the Netflix blog - 5 Lessons We've Learned Using Amazon Web Services (AWS). Even though the article is specific to a high-traffic cloud-based technology platform, I think it holds great lessons for optimizing any computer system, especially one that relies on outside sources, such as a business intelligence system.
Netflix develops its systems with the attitude that anything can fail at any point in the technology stack, and that the system should respond as gracefully as possible. This is a wonderful attitude to have for any system, and the lessons apply to a BI system just as easily:
1. You must unlearn what you have learned. Many people who develop and maintain BI systems come from the transactional application world and apply that experience to a BI system, which is fundamentally different in several ways - for example, a transactional system is optimized for the individual transaction, while a BI system is optimized for retrieving and manipulating often-huge data sets. Managers and developers who do not recognize these differences are doomed to fail with their systems, while people who successfully make the transition meet organizational goals much more easily.
2. Co-tenancy is hard. The BI system must manage many different types of loads and requests every day while appearing as responsive to the user as every other piece of software in use. The system administrator must balance data loads, operational reporting requests, and the construction and manipulation of analysis data sets, often at the same time. This is the same sort of paradigm shift as in lesson 1 - people who do not appreciate the complications of this environment are doomed to fail, because the success of a BI system is directly proportional to its frequency of use, and an inefficient system quickly becomes an unused one.
3. The best way to avoid failure is to fail constantly. This lesson seems counter-intuitive, but I've seen so many failed systems that assumed things would always work perfectly - source feeds would always deliver valid data, in the same place, at the same time - that this philosophy gains more credence daily. Systems should be tested for outages at every step of the process, and coded so that the response is graceful and as invisible to end users as possible. If you don't rehearse this in development, you will fail in production - take that to the bank. (A small sketch of this kind of graceful fallback follows.)
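To make lesson 3 concrete, here is a minimal sketch of graceful degradation for a source feed. It is my illustration, not code from Netflix or the original post; the feed and cache are stand-ins:

```python
import time

def fetch_feed():
    """Stand-in for a live source feed that can fail at any time."""
    raise ConnectionError("source system unavailable")

def load_with_fallback(fetch, cached_rows, retries=3, delay=0.5):
    """Try the live feed; after repeated failures, return the last good
    data set plus a staleness flag so reports can label it honestly."""
    for _ in range(retries):
        try:
            return fetch(), False   # fresh data
        except ConnectionError:
            time.sleep(delay)
    return cached_rows, True        # degrade gracefully, don't crash

rows, stale = load_with_fallback(fetch_feed, cached_rows=[("CA", 42)])
print(f"loaded {len(rows)} row(s), stale={stale}")
```

The load completes either way; the difference is whether end users see fresh data or clearly labeled stale data, rather than an outage.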
4. Learn with real scale, not toy models. Proper performance testing - on hardware and networking equivalent to production, with full data sets - would seem self-evident, but many development shops see it as an unnecessary expense that adds little to the finished product. As in lesson 3 above, if you do not rehearse the operation of your system at production scale, you have no way of knowing how it will respond in real-world situations, and you are effectively gambling with your career. The smart manager avoids this sort of gamble.
5. Commit yourself. This message surfaces in many different discussions, but it bears frequent repetition - a system as important as your enterprise business intelligence system needs strong, unwavering commitment from all levels of your organization to survive the inevitable struggles that occur in the implementation of such a large computer system.
It is sometimes surprising to realize that even though technology continues to become more complex and distributed, the same simple lessons can be learned from every system and applied to new systems. These lessons should be reviewed frequently in your quest to implement successful data processing systems.
photo by PseudoGil via Flickr (Creative Commons License)
Stephen Putman has over 20 years' experience supporting client/server and internet-based operations, from small offices to major corporations. He has extensive experience in a variety of front-end development tools, as well as relational database design and administration, and is extremely effective in project management and leadership roles. He is the co-author of The Data Governance eBook, available at information-management.com.
Jan 18, 2011 6:00:00 AM | business analytics, business intelligence (BI)
January 11, 2011
New Year's Resolutions: Assess and Revise Your BI Strategy
By Dick Voorhees, Senior Consultant
The New Year is upon us. And for many, the coming of the New Year involves making new resolutions, or reaffirming old ones. This resolution-making process includes corporations and organizations, not just individuals. In terms of personal resolutions, some undertake this process in earnest, but many seem to deal with resolutions superficially, or at least not very effectively. The same is frequently true for organizations as well.
So how should an organization go about deciding which "resolutions" to pursue in the New Year - which goals and objectives are both worthy and achievable? Often there are no "good" or "bad" opportunities a priori, but some are more likely to result in a successful outcome and/or have a more significant payoff than others.
Take stock of the opportunities and develop a list of key potential initiatives (or review the existing list, if one exists). Consider recent or imminent changes in the marketplace, competitors' actions, and governmental regulations. Which of these initiatives offers the possibility of consolidating or increasing market share, improving customer service, or making a necessary future investment (in the case of regulations)? And which best supports the existing goals and objectives of the organization?
Assess the capabilities and readiness of the organization to act on these initiatives. An opportunity might be a significant one, but if the organization can't respond effectively and in a timely manner, the opportunity will be lost, and the organization might better focus its attention and resources on another opportunity with a smaller potential payback but a much greater chance of success.
Develop a roadmap - a tactical plan - for addressing the opportunity. Determine which resources are required (hardware, software, capital, and most importantly people), what policies and procedures must be defined or changed, and so on.
Then be prepared to act! Sometimes the best intentions for the New Year fail not for lack of thought or foresight, but for lack of effective follow through. Develop the proper oversight/governance mechanisms, put the plan into action, and then make sure to monitor progress on a regular basis.
These are not difficult steps to follow, but organizations sometimes need help doing so. We've found that clients who call us have learned the hard way - either directly or through stories they've heard in their industries - that some careful planning, deliberate program design, and, if necessary, some skill assessment and training can take them a long way in their resolutions for success in 2011. Good luck!
photo by L.C.Nøttaasen via Flickr (Creative Commons)
Dick Voorhees is a seasoned technology professional with more than 25 years of experience in information technology, data integration, and business analytic systems. He is highly skilled at working with and leading mixed teams of business stakeholders and technologists on data enabling projects.
Jan 11, 2011 6:00:00 AM | assessments, best practices, business intelligence (BI), data governance, strategic planning
December 21, 2010
Do You Know What Your Reports Are Doing?
By Stephen Putman, Senior Consultant
The implementation of a new business intelligence system often requires the replication of existing reports in the new environment. In the process of designing, implementing, and testing the new system, issues of data elements not matching existing output invariably come up. Many times these discrepancies arise from data elements extrapolated from seemingly unrelated sources, or from calculations embedded in the reports themselves that often pre-date the tenure of the project team implementing the changes. How can you mitigate these issues in future implementations?
Issues of post-report data manipulation range from the simple - lack of documentation of the existing system - to the complex and insidious - "spreadmarts" and stand-alone desktop databases that use the enterprise system as a data source, for example. It is also possible that source systems make changes to existing data and feeds that are not documented or researched by the project team. The result is the same: frustration for the business users and the IT group in tracking down these outliers, not to mention the risk the enterprise absorbs by using unmanaged data in reports that drive business decisions.
The actions that correct the simple documentation issues center on organizational discipline (a compliance-audit sketch follows the list):
Establish (or follow) a documentation standard for the entire organization, and stick to it!
Implement gateways in development of applications and reports that ensure that undocumented objects are not released to production
Perform periodic audits to ensure compliance
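A periodic audit can be as simple as checking every released object for required documentation fields. The sketch below is hypothetical; the catalog entries and field names are assumptions of mine, not a real standard:

```python
# Hypothetical report catalog entries; "churn_calc" lacks documentation.
REPORTS = [
    {"name": "daily_sales", "description": "Sales by region", "owner": "finance"},
    {"name": "churn_calc", "description": "", "owner": ""},
]

REQUIRED_FIELDS = ("description", "owner")

def audit(objects):
    """Return every object missing any required documentation field."""
    return [o for o in objects
            if any(not o.get(field) for field in REQUIRED_FIELDS)]

for offender in audit(REPORTS):
    print(f"undocumented object: {offender['name']}")
```

Run as a gateway check before release and as a scheduled audit afterward, the same check serves both disciplines on the list above.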
Reining in the other sources of undocumented data is a more complicated task. The data management organization has to walk a fine line between controlling the data produced by the organization and curtailing the freedom of end users to respond to changing data requirements in their everyday jobs. The key is communication - business users need to be encouraged to record data requirements in an easy-to-use system and to understand the importance of sharing this information with the entire organization. If there is even a hint of disdain or punitive action attached to this communication, it will stop immediately, and these new derivations will remain a mystery until another system is designed.
The modern information management environment is moving more and more toward transparency and accountability, demanded by both internal and external constituencies. A well-documented reporting system supports this change in attitude, reducing risk in external reporting and increasing confidence in the veracity of internal reports, allowing everyone involved to make better decisions and drive the profitability of the business. It is a change whose time has come.
photo by r h via Flickr (Creative Commons License)
Stephen Putman has over 20 years' experience supporting client/server and internet-based operations, from small offices to major corporations. He has extensive experience in a variety of front-end development tools, as well as relational database design and administration, and is extremely effective in project management and leadership roles. He is the co-author of The Data Governance eBook, available at information-management.com.
Dec 21, 2010 6:00:00 AM | business analytics, business intelligence (BI), data warehousing
December 16, 2010
Keep It On Track
By Stephen Putman, Senior Consultant
In my recent blog posting, "Metadata is Key," I talked about one way of changing the mindset of managers and implementers in support of the coming "semantic wave" of linked data management. Today, I give you another way to prepare for the coming revolution - and to become more disciplined and effective in your project management whether you're going down the semantic road or not...
rathole (n) - [from the English idiom "down a rathole" for a waste of money or time] A technical subject that is known to be able to absorb infinite amounts of discussion time without more than an infinitesimal probability of arrival at a conclusion or consensus.
Anyone who has spent time implementing computer systems knows exactly what I'm talking about here. Meetings can devolve into lengthy discussions that have little to do with the subject at hand. Frequently, these meetings become quite emotional, which makes it difficult to refocus the discussion on the meeting's subject. The end result is frustration for the project team over "wasting time" on unrelated subjects, with a resulting lack of clarity and potential for schedule overruns.
One method for mitigating this issue is the presence of a "rathole monitor" in each meeting. I was introduced to this concept at a client several years ago, and I was impressed by the focus they had in meetings, much to the project's benefit. A "rathole monitor" is a person who does not actively participate in the meeting, but who understands the scope and breadth of the proposed solution very well and has enough standing in the organization to be trusted. This person listens to the discussion in the meeting and interrupts when he perceives that the conversation is veering off in an unrelated direction. It is important for this person to record the divergence and relay it to the project management team for later discussion - the tangent is usually useful to the project, and if these new ideas are not addressed later, people will keep their ideas to themselves, which could be detrimental to the project.
This method will pay dividends in current project management, but how does it relate to semantics and linked data? Semantic technology is all about the context and relationships of data objects - in fact, without those objects and relationships being well defined, semantic processing is impossible. Therefore, developing a mindset of scope and context is essential to the successful implementation of any semantically enabled application. Training your staff to think in these terms makes your organization perform in a more efficient and focused manner, which will surely lead to increased profitability and more effective operations.
photo by xJasonRogersx via Flickr (Creative Commons License)
Stephen Putman has over 20 years' experience supporting client/server and internet-based operations, from small offices to major corporations. He has extensive experience in a variety of front-end development tools, as well as relational database design and administration, and is extremely effective in project management and leadership roles. He is the co-author of The Data Governance eBook, available at information-management.com.
Dec 16, 2010 6:00:00 AM | data management, project management
December 14, 2010
Metadata is Key
By Stephen Putman, Senior Consultant
One of the most promising developments in data management over the last ten years is the rise of semantic processing, commonly referred to as the "Semantic Web." Briefly described, semantic processing creates a "web of data" complementing the "web of documents" of the World Wide Web. The benefits of such an array of linked data are many, but the main one could be the ability for machines to mine for needed data to enhance searches, recommendations, and the like, where humans do this now.
Unfortunately, the growth of the semantic data industry has been slower than anticipated, mainly due to a "chicken and egg" problem - the systems need descriptive metadata added to existing structures to function efficiently, but major data management companies are reluctant to invest heavily in tools to do this until an appropriate return on investment is demonstrated. I feel there is an even more basic issue with the adoption of semantics that has nothing to do with tools or investment - we need the implementers and managers of data systems to change how they think about their jobs, and to make metadata production central to the systems they produce.
The interoperability and discoverability of data is an increasingly important requirement for organizations of all types - the financial industry, for example, is keenly aware of the demands of XBRL-enabled reporting systems. Leaving external requirements aside, the same capabilities can benefit the internal reporting of the organization as well. Reporting systems go through extended periods of design and implementation, with their contents and design a seemingly well-guarded secret. Consequently, effort is required for departments not originally included in the system design to discover and use the appropriate data for their operations.
Organizing and publishing metadata about these reporting systems can mitigate the cost of this discovery and use across the entire organization. Here is a sample of the metadata produced by every database system, formally or informally (one way to capture it in a standard form is sketched after the list):
System-schema-table-column
Frequency of update
Input source(s)
Ownership-stewardship
Security level
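One way to capture that list in a standard, publishable form is a record per column. The sketch below is my own illustration; the field names mirror the list above but are not a formal standard:

```python
from dataclasses import dataclass, asdict, field
import json

@dataclass
class ColumnMetadata:
    system: str
    schema: str
    table: str
    column: str
    update_frequency: str              # e.g. "daily"
    input_sources: list = field(default_factory=list)
    steward: str = ""                  # ownership/stewardship
    security_level: str = "internal"

meta = ColumnMetadata(
    system="sales_dw", schema="mart", table="orders", column="state_code",
    update_frequency="daily", input_sources=["crm_extract"],
    steward="finance", security_level="internal",
)

# Publish in a machine-readable standard form.
print(json.dumps(asdict(meta), indent=2))
```

Once records like this exist for every reporting system, a department that was never in the original design can discover what a column means, where it comes from, and who owns it without an archaeology project.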
The collection and publication of such metadata in standard forms will prepare your organization for the coming "semantic wave," even if you do not have a specific application that can utilize the data at present. This will give you an advantage over companies that wait for these requirements to arrive and then have to play catch-up. You will also gain the advantage of a staff that thinks in terms of metadata capture and dissemination, which will help your company become more efficient in its data management functions.
photo by ~Brenda-Starr~ via Flickr (Creative Commons License)
Stephen Putman has over 20 years' experience supporting client/server and internet-based operations, from small offices to major corporations. He has extensive experience in a variety of front-end development tools, as well as relational database design and administration, and is extremely effective in project management and leadership roles. He is the co-author of The Data Governance eBook, available at information-management.com.
Dec 14, 2010 6:00:00 AM | data management, data stewardship, metadata
December 10, 2010
Making It Fit
By Stephen Putman, Senior Consultant
I've spent the last eighteen months at clients that have aging technology infrastructures and are oriented toward building applications rather than buying more integrated software packages. All of these organizations face a decision similar to the famed "build vs. buy" decision made when implementing a new enterprise computer system - do we acquire new technology to fulfill requirements, or adapt our existing systems to accomplish business goals?
Obviously, there are pros and cons to each approach, and external factors such as enterprise architecture requirements and resource constraints factor into the decision. However, there are considerations independent of those constraints whose answers may guide you to a more effective decision. These considerations are the subject of this article.
Ideally, there would not be a decision to make here at all - your technological investments are well managed, up-to-date, and flexible enough to adapt easily to new requirements. Unfortunately, this is rarely the case in most organizations. Toolsets are cobbled together from developer biases (from previous experience), enterprise standards, or inclusion of OEM packages with larger software packages such as ERP systems or packaged data warehouses. New business requirements often appear that do not fit neatly into this environment, which makes this decision necessary.
Acquire New
The apparent path of least resistance in addressing new business requirements is to purchase specialized packages that solve tactical issues well. This approach has the benefit of being the solution that would most closely fit the requirements at hand. However, the organization runs the risk of gathering a collection of ill-fitting software packages that could have difficulty solving future requirements. The best that can be hoped for in this scenario is that the organization leans toward obtaining tools that are based on a standardized foundation of technology such as Java. This enables future customization if necessary and ensures that there will be resources available to do the future work without substantial retraining.
Modify Existing Tools
The far more common approach to this dilemma is to adapt existing software tools to the new business requirements. The advantage to this approach is that your existing staff is familiar with the toolset and can adapt it to the given application without retraining. The main challenge in this approach is that the organization must weigh the speed of adaptation against the possible inefficiency of the tools in the given scenario and the inherent instability of asking a toolset to do things that it was not designed to do.
The "modify existing" approach has become much more common in the last ten to twenty years because of budgetary constraints imposed on the departments involved. Unless you work at a technology company in the commercial product development group, your department is likely perceived as a cost center, not a profit center, which means money spent on your operations is an expense rather than an investment. Therefore, you are asked to cut costs wherever possible, and technical inefficiencies are tolerated to a greater degree. This means you may not have the opportunity to acquire new technology even when it makes the most sense.
The decision to acquire new technology or extend existing technology to satisfy new business requirements is often a choice between unsatisfactory alternatives. The best way for an organization to make effective decisions, given all of the constraints, is to base its purchase decisions on standardized software platforms. That way, you have maximum flexibility when the decision falls to the "modify existing" option.
photo by orijinal via Flickr (Creative Commons License)
Stephen Putman has over 20 years' experience supporting client/server and internet-based operations, from small offices to major corporations. He has extensive experience in a variety of front-end development tools, as well as relational database design and administration, and is extremely effective in project management and leadership roles. He is the co-author of The Data Governance eBook, available at information-management.com.
Dec 10, 2010 6:00:00 AM | data warehousing, enterprise information management (EIM), information architecture, requirements
November 02, 2010
Understanding Where Our Work Comes From
By Mary Anne Hopper, Senior Consultant
I've written quite a bit about the importance of establishing rigor around the process of project intake and prioritization. If you're sitting there wondering how to even get started, I believe it is important to understand where these different work requests come from, because unlike application development projects, BI projects tend to have touch points across the organization. I tend to break the sources into three main categories: stand-alone, upstream applications, and enhancements.
Stand-alone BI projects are those that are not driven by new source system development. Project types include new data marts, reporting applications, or even re-architecting legacy reporting environments. Application projects are driven by changes in any of the upstream source systems we use in the BI environment, including new application development and changes to existing applications. Always remember that the smallest change in a source system can have the largest impact on the downstream BI application environment. The enhancements category is the catch-all for low-risk development that can be accomplished in a short amount of time.
Just as important as understanding where work requests come from is prioritizing them. All three categories need to be considered in the same prioritization queue - a step that challenges a lot of the clients I work with. So why is it so important to prioritize the work together? The first reason is resource availability. Resource impact points include project resources (everyone from the analysts to the developers to the testers to the business customers), environment availability and capacity (development and test), and release schedules. And most importantly, prioritizing all work together ensures the business gets its highest-value projects completed first (a toy illustration of one shared queue follows).
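Here is a toy illustration of a single shared queue, with all three categories ranked by business value. The requests and their values are invented for the example, not from the post:

```python
import heapq

# (negative value, request, category): heapq pops the smallest tuple
# first, so negating the business value yields highest-value-first.
requests = [
    (-90, "re-architect legacy reporting environment", "stand-alone"),
    (-75, "new CRM feed changes the customer mart", "upstream application"),
    (-40, "add a column to the weekly sales report", "enhancement"),
]
heapq.heapify(requests)

while requests:
    neg_value, request, category = heapq.heappop(requests)
    print(f"value={-neg_value:3d}  [{category}] {request}")
```

The mechanics are trivial; the hard part the post describes is organizational: getting all three categories scored on the same value scale in the first place.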
Mary Anne has 15 years of experience as a data management professional in all aspects of successfully delivering data solutions to support business needs. She has worked as both project manager and business analyst, leading business and technical project teams through data warehouse/data mart implementations, data integration, tool selection and implementation, and process automation projects.
Nov 2, 2010 6:00:00 AM | business analytics, business intelligence (BI), data governance, IT-business alignment, requirements
Baseline Consulting
Baseline Consulting is a management and technology consulting firm specializing in data integration and business analytics.
Recent Comments
Stephen Harding: Im a little late to this blog, but wanted to sa... | more »
On The Dreaded Stairs
Kasper Sørensen: Great blog posting and thanks for mentioning Da... | more »
On Data Profiling with SQL is Hazardous to Your Company's Health
Steve Putman: I think we should include a good appreciation f... | more »
On Responsible [Data] Stewardship
Powered by Typepad
