When Data Destroys Value (I)

Date: 10-06-2011

A survey by Gartner Research found that poor data quality costs companies an average of $8 million per year. In a different study published by The Data Warehousing Institute, more than 50% of companies reported customer dissatisfaction and cost problems due to poor data.

According to Gartner, about 80% of Business Intelligence implementations fail, while an Accenture survey of managers at Fortune 500 companies found that 59% cannot find the valuable information they need to do their jobs, 42% accidentally use the wrong information about once a week, and 53% believe the information they receive is not valuable to them.

It’s pretty clear that the main reason Business Intelligence fails to meet expectations is data quality, reinforcing the proverbial GIGO principle: garbage in, garbage out.

This is only getting worse: Aberdeen Group estimates that the amount of corporate data companies need to digest is growing at a 41% annual rate, drawn from an average of 15 unique data sources.

It’s clear that today’s Business Intelligence and Analytics software is light-years ahead of the semantic quality and structure of the data it consumes. After years and millions of dollars spent on BI deployments, folks in areas like Marketing, Sales, or Strategic Management feel like they are drowning in an ocean of data and yet thirsty for the strategic knowledge they need to grow the business.

It’s common to see state-of-the-art Business Intelligence applications that cannot deliver strategic direction or analysis until armies of analysts download the data into spreadsheets and manually clean, fix, and structure it for hours or days at a time.

Many folks recognize that transactional data has plenty of errors, at least at the customer, product line, brand, market, and segment level, but the errors never get fixed and, even worse, they populate the BI systems.
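Many of these errors are simple referential problems: transaction records that point at customer or product-line codes missing from the master data. A minimal sketch of the kind of automated check that could flag them before they reach a BI system follows; the master lists, field names, and values are all hypothetical, chosen only to illustrate the idea.

```python
# Hypothetical master data; in practice these would come from the
# data warehouse's dimension tables.
valid_customers = {"C001", "C002", "C003"}
valid_product_lines = {"PL-A", "PL-B"}

# Illustrative transaction rows, two of which contain reference errors.
transactions = [
    {"id": 1, "customer_id": "C001", "product_line": "PL-A"},
    {"id": 2, "customer_id": "C999", "product_line": "PL-B"},  # unknown customer
    {"id": 3, "customer_id": "C002", "product_line": "PL-X"},  # unknown product line
]

def audit(rows):
    # Return (row id, field, bad value) for every reference that does
    # not resolve against the master data.
    issues = []
    for row in rows:
        if row["customer_id"] not in valid_customers:
            issues.append((row["id"], "customer_id", row["customer_id"]))
        if row["product_line"] not in valid_product_lines:
            issues.append((row["id"], "product_line", row["product_line"]))
    return issues

print(audit(transactions))
# → [(2, 'customer_id', 'C999'), (3, 'product_line', 'PL-X')]
```

Running a check like this on every load, and routing the flagged rows back to the business owner of the data, is one way to keep the errors from silently propagating downstream.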

This is a classic case of cultural miscommunication. Marketing and Sales people think this is an IT problem and expect the data warehouse analysts to fix it, while IT thinks it’s a business problem and expects Sales or Marketing to take action. The result is that data seldom gets corrected, people give up, and running raw or bad data through sophisticated BI systems becomes the norm.

After a while, Sales, Marketing and Management don’t trust the data, can’t see the value of BI, quit using the system and continue making decisions based on intuition.

When analysts or power users need a quick answer about market share, profitability, or growth for a particular segment, product line, customer, or competitor, it takes days to go through the manual process: running the right queries, exporting the results to Microsoft Excel, manually cleansing data errors, adding look-ups from external data sources, and finally building pivot tables to find the answers.
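The cleanse-and-pivot steps of that workflow are mechanical enough to script. The sketch below shows, in plain Python, roughly what the hand work in Excel amounts to; the row data, field names, and normalization rules are assumptions for illustration, not a real customer schema.

```python
from collections import defaultdict

# Hypothetical raw rows as an analyst might export them: stray
# whitespace, inconsistent casing, and numbers stored as text.
raw_rows = [
    {"customer": " Acme Corp ", "segment": "ENTERPRISE", "revenue": "1200.50"},
    {"customer": "acme corp",   "segment": "Enterprise", "revenue": "800"},
    {"customer": "Beta Corp",   "segment": "smb",        "revenue": "300.25"},
]

def cleanse(row):
    # Normalize the inconsistencies that usually get fixed by hand:
    # trim whitespace, standardize casing, convert revenue to a number.
    return {
        "customer": row["customer"].strip().title(),
        "segment": row["segment"].strip().title(),
        "revenue": float(row["revenue"]),
    }

def pivot(rows, key):
    # Roughly what a pivot table does here: total revenue per key value.
    totals = defaultdict(float)
    for row in rows:
        totals[row[key]] += row["revenue"]
    return dict(totals)

clean = [cleanse(r) for r in raw_rows]
print(pivot(clean, "customer"))
# → {'Acme Corp': 2000.5, 'Beta Corp': 300.25}
```

Because the cleansing rules live in code rather than in someone’s head, the same script can be rerun at week-end or month-end, and everyone who runs it gets the same numbers.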

Even worse, they have to go through the entire process over and over again every time they need a progress update, either the following week or at month-end, quarter-end or year-end for each one of the business units and markets they serve.

Not only is this process inefficient, but fixing data based on analysts’ personal assumptions leads to multiple undesirable data silos and different versions of the truth.

As a result, it’s common for people to spend a large portion of team meetings arguing over whose data is correct instead of focusing on the critical issues. The numbers generated by Finance do not agree with the analysis performed by Marketing or the explanations provided by Sales. They need a single version of the truth, but different folks run different queries, make different data-cleansing assumptions, and customize their spreadsheets around different metrics.


by Bill Cabiro and Strat-Wise LLC


Please visit their web site at www.strat-wise.com for additional articles and resources on the strategic use of Business Intelligence and Analytics.

