CIO Information Matters

Masters of the Data

CIOs Tune in to the Importance of Data Quality, Data Governance, and Master Data Management (MDM)

By David Baum

Modern business applications produce ever more relevant and actionable information for decision makers, but in many cases the data sources are fragmented and inconsistent. Despite tremendous advancements at the application layer, nearly all IT initiatives succeed or fail based on the quality and consistency of the underlying data.

Why this glaring oversight? According to Mark A. Smith, CEO and Chief Research Officer at Ventana Research, CIOs are responsible for making information available to their businesses on a consistent and timely basis, but in most organizations, information management is seen as a delegated set of tasks and is not the CIO’s top priority.

“Key initiatives such as master data management, data virtualization, data quality, data integration and data governance are employed by just a fraction of organizations that should be mastering the science of information management,” Smith said, citing a Ventana Business Analytics Benchmark study of more than 2,800 organizations.

While CIOs are aware that effective information management results in faster decision-making, according to the Ventana study, only 43% of organizations have undertaken information management initiatives in data governance, data integration, data quality, master data management and data virtualization during the last two years, and less than one fifth have completed those projects. The largest obstacles to completing information management projects are insufficient staffing (68%), inadequate budget (63%) and insufficient training and skills (59%).

Smith believes that overcoming these obstacles and rectifying these deficiencies should be a top agenda item for CIOs. Consolidating and propagating accurate master data can reduce operational costs, increase supply chain and selling efficiencies, improve customer loyalty, and support sound corporate governance. “Other priorities, including business analytics, business applications, and Big Data, will not reach their full potential without top-notch information management that integrates business and IT efforts,” he notes.

Analyzing the Problem
Data quality problems are caused by a variety of factors, beginning with error-prone manual data entry by employees across multiple departments, each with its own rules and methods. For example, the sales department’s requirements for entering customer data into a sales automation application are quite different from the accounting department’s need for customer data in an Accounts Receivable application.

Other inbound information channels include data entry from self-service Web portals, supply-chain interactions, and a steady influx of automated data entry from click streams, equipment sensors, and systems integration efforts.

All of these channels can introduce – and even magnify – errant data. While data is often accurate within an individual system, inconsistencies are revealed when you extract it from that system and use it more broadly. For example, details about product inventory might be stored in both a warehouse management system and an order entry system, each with slightly different variants. Certain details might get updated in one system yet remain unchanged in the other, creating that dreaded situation: multiple versions of the truth.
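A minimal sketch makes the problem concrete. Assuming two hypothetical systems each hold their own copy of a product record, a simple comparison surfaces the fields where the "versions of the truth" have drifted apart (the data and field names here are invented for illustration):

```python
# Hypothetical copies of the same product record in two systems.
warehouse = {"SKU-1001": {"description": "Widget, blue", "on_hand": 140}}
order_entry = {"SKU-1001": {"description": "Blue Widget", "on_hand": 125}}

def find_conflicts(a, b):
    """Return the fields that disagree for every key present in both systems."""
    conflicts = {}
    for sku in a.keys() & b.keys():
        diffs = {field: (a[sku][field], b[sku][field])
                 for field in a[sku]
                 if a[sku].get(field) != b[sku].get(field)}
        if diffs:
            conflicts[sku] = diffs
    return conflicts

print(find_conflicts(warehouse, order_entry))
```

Both the description and the on-hand quantity diverge here, so a report or analytic query drawing on either system alone would silently disagree with the other.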

Local Progress
Data Quality initiatives often begin at a local level when a business unit realizes that a particular application or database is fraught with errors or inconsistencies. Most data quality tools offer a series of techniques for improving data, including the following:

  • Data profiling – assessing the data to understand its overall degree of accuracy
  • Data standardization – utilizing a business rules engine to ensure that data conforms to predefined quality rules
  • Geocoding – applying automated pattern-matching tools to correct name and address data and conform it to postal standards
  • Matching and linking – comparing data to align similar but slightly different records
  • Monitoring – keeping track of data quality over time and auto-correcting variations based on predefined business rules
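Three of these techniques can be sketched in a few lines of Python. The sample records, the postcode rule, and the standardization rule below are all invented assumptions, not part of any particular tool; real data quality suites apply far richer rule sets:

```python
import re
from difflib import SequenceMatcher

# Hypothetical customer records with typical quality defects.
records = [
    {"name": "ACME Corp", "postcode": "90210"},
    {"name": "acme corp.", "postcode": "9021"},   # truncated postcode
    {"name": "", "postcode": "10001"},            # missing name
]

def profile(records, field, rule):
    """Data profiling: share of records whose field is present and valid."""
    valid = sum(1 for r in records if rule(r.get(field, "")))
    return valid / len(records)

def standardize_name(name):
    """Data standardization: strip punctuation, normalize case and spacing."""
    return re.sub(r"[^\w\s]", "", name).strip().upper()

def match_score(a, b):
    """Matching and linking: similarity of two names after standardization."""
    return SequenceMatcher(None, standardize_name(a), standardize_name(b)).ratio()

print(profile(records, "postcode", lambda v: re.fullmatch(r"\d{5}", v) is not None))
print(match_score("ACME Corp", "acme corp."))
```

Profiling reports that only two of the three postcodes conform to the five-digit rule, and the match score shows that the first two records, though typed differently, standardize to the same name and are likely duplicates.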

Master Data Management, by contrast, is universal and enterprise-wide. Its goal is to manage master data objects that are shared across more than one transactional application. MDM solutions typically contain two architectural components:

  • Technology to profile, consolidate and synchronize master data across the enterprise
  • A platform to manage, cleanse, and enrich the structured and unstructured master data

MDM initiatives must consider the enterprise as a whole as data stewards seek to spot redundancies and identify multiple representations of the same basic information. An MDM solution manages master data objects that typically include Customer, Supplier, Site, Account, Asset, and Product.

An Enterprise Perspective
Addressing Data Quality within discrete systems allows CIOs to tackle MDM in phases, and obtain tangible business value at each stage. For example, you might use a data quality tool to clean up a customer data warehouse, then leverage that effort to initiate enterprise-wide data governance efforts, leading to an authoritative system of record that all information systems rely on.

Oracle Master Data Management is designed to consolidate, cleanse, and enrich key business data from across the enterprise and synchronize it with all applications, business processes, and analytical tools. Oracle’s enterprise master data management (MDM) suite of products consolidates and maintains complete, accurate, and authoritative master data across the enterprise and distributes this master information to all operational and analytical applications as a shared service.

Oracle offers a pre-built MDM suite for key master data objects such as Product, Customer, Supplier, Site, and Finance. These packaged solutions deliver business value in a fraction of the time it takes to build these assets from scratch.

Oracle’s MDM suite is a platform designed to consolidate, cleanse, govern, and share business data across the enterprise and across time. It includes pre-defined data models and access methods with powerful applications to centrally manage the quality and lifecycle of master business data. Oracle MDM eliminates inconsistencies in the core business data across applications and enables strong process controls on a centrally managed master data store. Oracle’s MDM portfolio also includes tools that directly support data governance within the master data stores.

Propagating clean, accurate master data throughout the enterprise increases operational efficiency, improves corporate governance, and enables analytic systems that provide a true representation of how the business is running.

“Our groundbreaking research eight years ago established information management as the foundation for every CIO,” Smith sums up. “This still holds true and has grown to now be seen as the blueprint for every organization that wants to enable faster decision making and have a foundation for advancing into big data and business analytics.”


National Australia Bank Improves Data Quality, Streamlines Regulatory Reporting
National Australia Bank (NAB) is a financial services organization employing more than 40,000 people, operating more than 1,800 branches and service centers, and responsible to more than 460,000 shareholders. The company provides more than 10.93 million customers worldwide with retail, business, and institutional banking services. NAB wanted to eliminate inconsistencies arising from storing data in 34 different financial and operational systems. The organization lacked consistency in how it maintained data about cost centers, branches, and general ledgers for various business units. To remedy this situation, the bank wanted to establish a master data repository underlying all of its information systems so that changes to data elements in one system would be applied universally across all other systems.

In addition, by establishing a standard change-control process, bank officers would be able to prevent ad hoc, unjustified, and erroneous data updates that could result in financial misrepresentations or inaccurate data in regulatory reports. They wanted to ensure that all finance systems stored data and produced results in a consistent fashion.

To achieve these goals, the bank decided to replace an aging, inflexible mainframe system used as a pseudo master data management tool with a more sophisticated, true master data management solution from Oracle. Working with Oracle Consulting, the bank implemented Oracle Data Relationship Management (DRM) to manage master data assets in 34 finance and operational applications, including human resources, general ledger, planning, and an Oracle Financial Management system. To prepare for the Oracle implementation, NAB spent a number of weeks gathering system requirements for interface designs. An Oracle consultant provided initial training on the Oracle Hyperion product. The bank opted for a centralized maintenance and governance approach, where one team looks after the master data. The implementation team included one project manager, two NAB team members, and a consultant from Oracle Consulting.

It took three months to deploy Oracle Data Relationship Management and the results have been exceptional. The new system ensures data consistency by establishing a formal, automated change control process, where updates to the Oracle master data system are fed into or applied to all of the other finance and operational systems. When master data was migrated from the bank’s mainframe and other systems to Oracle Data Relationship Management, NAB improved data quality from 90% to 99% by cleansing the source data and reducing the number of duplicate records. In addition, improving data quality has improved data accuracy in regulatory reports designed to improve supervision, transparency, and disclosure while enhancing risk management and governance practices in the Australian banking sector. Managers have greater insight into the state of the bank’s worldwide financial operation thanks to having a single, master view of financial data.
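The hub-and-spoke change control the bank put in place can be sketched abstractly. This is not NAB's or Oracle DRM's implementation, just a minimal illustration of the pattern: every master data change flows through one controlled path and fans out to every subscribed downstream system:

```python
class MasterDataHub:
    """Toy hub: one authoritative store that pushes changes to subscribers."""

    def __init__(self):
        self.master = {}
        self.subscribers = []  # downstream finance/operational systems

    def subscribe(self, system):
        self.subscribers.append(system)

    def update(self, key, record):
        """Single controlled change path: update the master, then fan out."""
        self.master[key] = record
        for system in self.subscribers:
            system[key] = dict(record)  # each system gets its own copy

# Hypothetical downstream systems represented as plain dicts.
hub = MasterDataHub()
general_ledger, hr_system = {}, {}
hub.subscribe(general_ledger)
hub.subscribe(hr_system)

hub.update("CC-100", {"cost_center": "Retail Banking"})
```

Because no system is updated except through the hub, ad hoc edits in one application can no longer drift out of sync with the others, which is exactly the failure mode the bank was trying to eliminate.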

BSI Improves Accuracy, Boosts Sales with Single Customer View
BSI (British Standards Institution) is the United Kingdom’s National Standards Body and the originator of many of the world’s most commonly used standards. The company works with more than 64,000 clients in 150 countries. BSI used Oracle’s enterprise data quality management suite to create a single, accurate, complete record of each of these clients in just one month. As a result, the accuracy of its customer and corporate data has improved to nearly 100%, and BSI can refresh information four times faster.

The project began with a simple recognition: BSI wanted to optimize customer insight by creating master customer records that captured each client’s profile, purchasing history, relationships, and other attributes in a single view. The goal was to eliminate inaccurate, incomplete, nonstandard, multi-format, and duplicate customer and transactional data from the customer database, which was growing three percent to four percent each year. Senior officers knew that having complete and accurate data would improve the organization’s ability to segment customers, increase marketing effectiveness, boost sales per customer, and reduce churn. In addition, by standardizing names, dates, and values in corporate and customer records, BSI could improve the performance and productivity of its marketing, sales, and operations teams, improving the match between customer needs and BSI services.

The need for standardization also extended to the publications, training documents, tools, and services that BSI sold online. BSI needed to ensure consistent coding, description, and pricing formats in its electronic catalogue. Always conscious of costs, BSI wanted to complete this master data management (MDM) project using its existing staff resources, despite a 20%, year-over-year increase in data volume. BSI chose Oracle Enterprise Data Quality, a suite of automated cleansing, matching, and profiling solutions, for its intuitive functionality, adaptability, and value. IT professionals used the software to aggregate more than 5 million disparate data records, held in multiple databases and formats, into a set of “golden” customer records that span a four-year period and include transaction histories of all products and services. The results have been impressive:

  • BSI used the Oracle technology to deliver a single customer view to 2,000 workers in 60 countries, while enforcing best practices in data governance and management within the organization.
  • BSI eliminated inaccurate, incomplete, nonstandard, multi-format, and duplicate customer and transactional data for a client database that is growing 3% to 4% annually.
  • Better marketing campaigns have improved sales by boosting cross- and up-selling opportunities and, in turn, providing a more complete customer experience.
  • BSI now has standardized product categorizations and consistent coding and descriptions to accommodate a 20% annual expansion in data volume, without adding staff.
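The "golden record" consolidation described above can be illustrated with a deliberately naive survivorship rule. The sample records are invented, and real tools use much more sophisticated survivorship logic (source trust, recency, validation); here the most complete non-empty value simply wins:

```python
# Hypothetical duplicate records for one customer, from different sources.
sources = [
    {"name": "J. Smith", "email": "", "phone": "020 7946 0000"},
    {"name": "John Smith", "email": "j.smith@example.com", "phone": ""},
]

def golden_record(duplicates):
    """Merge duplicates into one record, keeping the longest value per field
    as a naive proxy for completeness."""
    fields = {f for record in duplicates for f in record}
    return {f: max((record.get(f, "") for record in duplicates), key=len)
            for f in fields}

print(golden_record(sources))
```

Neither source record is complete on its own, but the merged record carries the name, email, and phone number together, which is the single customer view the project was after.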

David Baum is a freelance business writer and marketing consultant with 25 years of experience covering the high tech industry.

