Data Quality Management Plan
Once data is captured, though, a new set of issues starts to creep in. Data gets out of date quite quickly: in the US, 15% of people move address annually according to the US Census Bureau (in the UK it is about 11%). How confident are you that all the companies and government departments that you interact with are racing to update that personal data of yours?
So far, though, we have been talking about pure data quality: is that address record right or wrong? There is a more insidious problem in large companies, and that is data mastering. According to a 2008 survey by my firm, the Information Difference, the average large company has six different systems holding supposedly “master” data about customers, nine in the case of product data, and 13% of survey respondents had over 100 such sources. No one intended this mess to unfold, but most large companies have dozens or even (more realistically) hundreds of separate applications, from ERP to sales force automation, from supply chain to marketing and many more – never mind the abundance of spreadsheets that drive so many companies.
When a new application is implemented, it is populated either from scratch, from a spreadsheet, or via a feed from an existing system. Ideally that feed will be a properly maintained interface, but it may be a one-off dump of data, with the copies drifting apart over time as they are maintained separately.
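To see how that drift plays out, here is a minimal sketch in Python that compares two copies of the same customer records and reports the fields that disagree. The system names, record keys and values are purely hypothetical illustrations, not data from any of the surveys mentioned here.

```python
# Minimal sketch: two copies of the same customer records,
# seeded from the same dump but edited independently since.
# All systems, keys and values below are hypothetical.

crm = {
    "C001": {"name": "Jane Smith", "address": "12 Oak Ave, Denver"},
    "C002": {"name": "Acme Ltd",   "address": "4 High St, Leeds"},
}

billing = {
    "C001": {"name": "Jane Smith",    "address": "98 Elm Rd, Boulder"},  # customer moved
    "C002": {"name": "ACME Limited",  "address": "4 High St, Leeds"},    # name re-keyed
}

def report_drift(a, b):
    """Compare two copies of the same records and list fields that disagree."""
    for key in sorted(a.keys() & b.keys()):
        for field in a[key]:
            if a[key][field] != b[key][field]:
                print(f"{key}.{field}: {a[key][field]!r} vs {b[key][field]!r}")

report_drift(crm, billing)
# C001.address: '12 Oak Ave, Denver' vs '98 Elm Rd, Boulder'
# C002.name: 'Acme Ltd' vs 'ACME Limited'
```

Neither copy is "wrong" in isolation; the problem only becomes visible when the two are compared, which is exactly why it goes unnoticed for so long.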
Even if this problem is avoided, companies buy other companies, and an acquired company's computer systems are not magically integrated overnight: it may take years for even a semblance of integration to take place. With a global company making several acquisitions a year, it is not hard to see that even the purest and best-organised technology architecture will quickly become prone to inconsistent data.
Enter master data management
This is where master data management (MDM) comes in. It is hardly a new subject, but over the last decade or so many technologies have been developed that provide dedicated hubs for managing master data (as distinct from transactional data). The idea is that these hubs provide a single, authoritative source of master data that feeds the other systems that need it. However, MDM, while hardly in its infancy, is barely a teenager in terms of maturity, and relatively few companies have fully and successfully implemented it across the complete scope of their enterprise and across all data domains.
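To make the hub idea concrete, here is a minimal sketch of one common consolidation approach, often called survivorship or building a "golden record": for each field, keep the most recently updated non-null value across the source systems. The sources, fields and the simple recency rule are all assumptions for illustration; real MDM hubs apply far richer matching and survivorship logic.

```python
# Minimal sketch of a hub consolidating one customer into a "golden record".
# Source systems, fields and the recency-wins rule are hypothetical.
from datetime import date

# The same customer as seen by three separate applications.
records = [
    {"source": "erp", "updated": date(2011, 3, 1),
     "name": "J. Smith",   "address": "12 Oak Ave, Denver", "phone": None},
    {"source": "crm", "updated": date(2012, 7, 9),
     "name": "Jane Smith", "address": "98 Elm Rd, Boulder", "phone": "555-0101"},
    {"source": "mkt", "updated": date(2010, 1, 5),
     "name": "Jane Smyth", "address": None,                 "phone": "555-0199"},
]

def golden_record(recs, fields):
    """For each field, survive the most recently updated non-null value."""
    golden = {}
    for field in fields:
        candidates = [r for r in recs if r[field] is not None]
        if candidates:
            best = max(candidates, key=lambda r: r["updated"])
            golden[field] = best[field]
    return golden

print(golden_record(records, ["name", "address", "phone"]))
# {'name': 'Jane Smith', 'address': '98 Elm Rd, Boulder', 'phone': '555-0101'}
```

The resulting record is then published back to the subscribing systems, so all of them see the same customer. The hard part in practice is not the merge itself but agreeing which source wins for which attribute, which is as much an organisational question as a technical one.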
What is apparent is that master data initiatives and data quality are intimately linked. In a 2010 Information Difference survey, respondents said they had allocated 10% of their MDM project budget to data quality activities, yet on average actually spent 30%, three times what they had intended. What is a little odd is how long it has taken many MDM suppliers to become fully aware of this.