How do data quality and access shape your strategic choices?
by Frank Buytendijk and Thomas Oestreich, August 2010
A recent report from IDC estimates that enterprise data stores will grow an average of 60 percent annually. Gartner research shows overall software spending growing from US$220 billion to more than US$231 billion, while IDC expects the analytics market to expand at an average rate of 7.5 percent over the next five years.
Fortunately, the cost of storing data is falling as storage capacity increases, so more and more data can be preserved longer for less money. This allows businesses to create massive stores of transactional, operational, and regulatory data that describe nearly every aspect of the enterprise. But what good is that information if you don’t know how to use it?
The real value of enterprise information is its capacity to support good decisions—decisions that can determine the success or failure of a new product, the effectiveness of a marketing campaign, the very future of a business.
Information Quality
For business decision-making, information should reduce uncertainty. But some uncertainty will always persist, depending on the quality of the information available. The essential element is access to predictive tools that guide decision-making despite that uncertainty. Understanding the different levels of information quality helps you choose the right tools for the job, and gives you an advantage over competitors with inferior tools or undisciplined decision-making processes.
Perfect information exists only when all uncertainty has been eliminated from a decision. But if no uncertainty exists, anyone with access to the same information can make the same correct decision. By the time perfect information exists, the opportunity to make a strategic, forward-looking decision has probably passed, and with it any competitive advantage.
Proxy information exists when there is no perfect information about the subject of your decision but plenty of information about something that looks like it. For instance, you may have access to best practices from other industries or an established correlation between two phenomena.
Partial information is what exists in most decision-making situations—indications of where things may go but nothing definitive. Partial information reduces uncertainty, but some always remains.
Conflicting information doesn’t reduce uncertainty—it increases it, driving down the decision-making value of your information.
Information vacuum occurs when you have no access to quality information, but your competitors do. This is the worst situation for decision-making, putting your organization at a significant disadvantage.
There are many ways to improve the quality of information: integrating disparate enterprise systems, building systems on a common data model, establishing a single global data store, and deploying powerful analytical and reporting tools. But understanding what you have to work with is essential before you act on your insights and predictions.
How Much Are You Willing to Pay?
There are three approaches to managing these information states: establishing an explicit decision-making process, considering the timing of a decision, and selecting the right amount of information for the decision.
In making business decisions, the question of how much you are willing to pay for valuable information is often dealt with implicitly. It may come out of someone else's budget, or political factors may push spending on the decision-making process too low or too high. Furthermore, different people simply have different approaches to using information in decision-making. Some like to number-crunch obsessively before reaching a conclusion; others trust their intuition more.
By asking how much you are willing to pay, some of these hidden assumptions become explicit. For example, let's say your company was considering an investment in solar energy to power a manufacturing plant. There are many uncertain elements in that decision: how the price of conventional energy will fluctuate over the coming years, the potential for innovation to increase efficiency and reduce the cost of a system in the near future, maintenance expenses, product life expectancy, and the likely return on investment.
The first step in the process is to determine which uncertain factors you are facing. Managers need to distinguish between the factors they can influence and the ones they cannot. For instance, signing a long-term maintenance contract turns an uncertain budget variable into a known constant. Now you have a list of factors you will not consider, factors with reasonable ranges, and a few constants. Here technology can help, for instance, by running simulations to generate a probability distribution across the various ranges, so you can make your decision with more confidence. You've created a better decision-making process.
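The simulation step described above can be sketched as a simple Monte Carlo run. This is a minimal illustration, not a real investment model: the factor names, ranges, and the toy ROI formula are all assumptions invented for the example.

```python
import random

# Hypothetical uncertain factors for the solar decision, each with an
# estimated (low, high) range. Names and numbers are illustrative only.
factor_ranges = {
    "energy_price_growth": (0.01, 0.08),    # annual conventional-energy price growth
    "panel_efficiency_gain": (0.00, 0.15),  # efficiency improvement before purchase
    "annual_maintenance": (5_000, 15_000),  # cost in dollars
}

def simulate_roi(n_runs=10_000, seed=42):
    """Draw each uncertain factor uniformly from its range and
    compute a toy ROI figure for each run."""
    rng = random.Random(seed)
    results = []
    for _ in range(n_runs):
        growth = rng.uniform(*factor_ranges["energy_price_growth"])
        gain = rng.uniform(*factor_ranges["panel_efficiency_gain"])
        maint = rng.uniform(*factor_ranges["annual_maintenance"])
        # Toy model: savings rise with energy prices and efficiency;
        # maintenance eats into them. The constants are placeholders.
        annual_savings = 100_000 * (1 + growth) * (1 + gain) - maint
        roi = annual_savings / 500_000  # against an assumed fixed system cost
        results.append(roi)
    return results

rois = simulate_roi()
rois.sort()
# Report a 90 percent interval so the decision is framed as a range, not a point.
low, high = rois[len(rois) // 20], rois[-(len(rois) // 20)]
print(f"90% of simulated ROI outcomes fall between {low:.1%} and {high:.1%}")
```

The point of the exercise is the output shape: a probability distribution over outcomes rather than a single estimate, which is what lets a manager decide "with more confidence" in the sense the text describes.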
But acquiring additional information can come at a cost. For instance, there might be analyst reports about energy pricing and technology advancements, but how much should you pay? You should not pay more for the report than a certain percentage of the difference between the initial range you estimated and the narrowed range that (you think) the information will lead to. This percentage is based on your classification of the decision. The higher the uncertainty and the higher the irreversibility of the decision, the higher this percentage can be.
Timing Is Everything
Some people make decisions immediately, acting more on instinct than information. Some wait until the last moment, holding out for late-breaking data that could change the outcome. And some are paralyzed by the prospect of committing, potentially dooming their organization to stagnation and failure. The trade-off is hard to judge: waiting longer reduces uncertainty, increasing the value of the decision-making process, but a window of opportunity could close. Somewhere in the middle lies an optimum.
If the cost of missing the opportunity (deciding too late) is higher than the cost of making the wrong decision (deciding too early), you should decide as early as possible. If this is the case, information systems need to compile and present critical data quickly and accurately to deliver a competitive edge. On the other hand, if the cost of a wrong decision is higher than the cost of missing an opportunity, delaying a decision can pay off as better information becomes available.
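The comparison above can be framed as an expected-cost calculation. This is a deliberately simplified sketch: the probabilities and costs are the decision-maker's own estimates, and a real analysis would model how both change over time rather than comparing two fixed points.

```python
def decide_timing(p_wrong_if_early, cost_wrong, p_miss_if_late, cost_missed):
    """Compare the expected cost of deciding early (risking a wrong
    decision) against waiting (risking a missed opportunity)."""
    expected_early = p_wrong_if_early * cost_wrong
    expected_late = p_miss_if_late * cost_missed
    return "decide early" if expected_early < expected_late else "wait"

# Illustrative estimates: a wrong early decision costs $1M with 30 percent
# likelihood, while waiting carries a 60 percent chance of losing a $2M
# opportunity. Expected costs: $300K early vs. $1.2M late.
print(decide_timing(0.30, 1_000_000, 0.60, 2_000_000))  # decide early
```

Reversing the estimates flips the answer, which is the text's point: the right timing depends entirely on which error is costlier for your situation.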
Say that in the quest to change to solar energy, you discover that LED lamps will deliver an immediate savings of 30 percent on lighting costs. But in two years, tax incentives will make solar energy more attractive, and results in other countries have been very encouraging. It is clearly better to wait a little bit. Go for LED lamps first, and invest a little bit in the meantime to make sure you are ready the moment the tax incentives become active. You’ve created the best decision-making time frame.
Information Overload
There is always the danger of analysis paralysis—trying to get even more information to reduce uncertainty. However, we also suffer from something called bounded rationality. The human brain simply cannot oversee the full consequences of complex decisions. There is a continuous trade-off between having enough information to reduce uncertainty and having so much information that uncertainty increases again. This is why managers, confronted by too much unstructured information, tend to either ignore the deluge or simply freeze, unable to determine the best course of action. Both reactions are undesirable.
We often see this paradox in data warehousing, for instance. The more relevant a data warehouse becomes for answering many different business questions, the harder it becomes to answer any single question well. Solving this paradox is an area where technologies such as business intelligence, search, and Enterprise 2.0 still have significant innovation potential.
But how do you know that you have just enough information if you don't know what information is hidden around the corner? The most important thing to remember is that value is all about perception. Some people simply value having more information than others. Once you know your own information intensity profile and you understand the preferences of others, you're well on your way to finding the right volume of decision-support information.
Now What?
Enabling good decision-making with quality information delivered on the right timetable should be a goal that is dear to the heart of every chief information officer. The CIO should review existing information lifecycle management (ILM) strategies. Typically, such an ILM strategy focuses on when to archive or delete data based on available storage, transactional demands, and regulatory requirements, but this is not enough. A more value-based ILM strategy makes sense, aimed at more managerial use of information.
From a governance point of view, starting a discussion on information value may improve your organization’s business and IT alignment. Information should be seen as a factor of production, like capital, labor, and other resources. There’s a cost to those resources, and there are value-adding activities leading to products and services with an economic value. Information should be treated no differently.
Frank Buytendijk and Thomas Oestreich work with executives worldwide to collect and create thought leadership.