Understanding the quality of data stored within your enterprise is important for any system implementation, whether it is Customer Relationship Management (CRM), Business Intelligence (BI), or Master Data Management (MDM). It is particularly important for reaping the benefits of an MDM implementation, where you are trying to consolidate redundant data that is stored across multiple systems.
The redundancy could be the result of different divisions employing separate databases, or of mergers and acquisitions combining the databases of multiple enterprises. However it arose, maintaining redundant data in multiple databases within a single enterprise degrades customer satisfaction, decreases efficiency, and increases costs. Implementing a solution from an MDM provider can reverse these negatives, but only after a successful data profiling process establishes and maintains initial data quality.
MDM implementations involve establishing a master data hub – such as a customer master, product master, or pricing master – along with a data quality matching engine. Before the engine can consolidate duplicate data across multiple systems, it must first be able to identify which records are duplicates.
A common mistake companies make during MDM implementation is assuming they already know how to identify those duplicate records. Such assumptions are often based on anecdote or institutional lore passed down over time. It is not until the source data is obtained and then matched against the master data hub that the company realizes the true state of its data.
By this point, valuable project time has been lost. Project teams must go back to the source systems to clean up inconsistencies within each source, such as missing, incomplete, or non-standard data. Matching rules within the master data hub must then be rewritten or adjusted to handle the true quality of the source data so that redundant and duplicate entries can be identified and consolidated.
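The duplicate-identification step described above can be illustrated with a simple match-key approach. This is a hypothetical sketch, not the matching logic of any particular MDM engine: real matching engines apply far richer rules (fuzzy, phonetic, and weighted comparisons), but the underlying principle of normalizing values and then grouping records on a derived key is the same. All field names and normalization rules here are illustrative.

```python
# Hypothetical sketch: identify duplicate customer records by building a
# normalized match key and grouping records that share it.
from collections import defaultdict

def match_key(record):
    """Build a crude match key from a normalized name plus a 5-digit ZIP."""
    name = record["name"].lower().strip()
    name = name.replace("corporation", "corp")   # illustrative normalization rule
    name = " ".join(name.split())                # collapse repeated whitespace
    return (name, record.get("zip", "").strip()[:5])

def find_duplicates(records):
    """Return groups of records that share the same match key."""
    groups = defaultdict(list)
    for rec in records:
        groups[match_key(rec)].append(rec)
    return [group for group in groups.values() if len(group) > 1]

# Illustrative source-system extract: the first two rows describe one entity.
sources = [
    {"name": "Acme  Corporation", "zip": "75001"},
    {"name": "acme corp",         "zip": "75001-1234"},
    {"name": "Globex",            "zip": "60601"},
]

duplicates = find_duplicates(sources)  # one group containing the two Acme rows
```

Matching rules like these are exactly what must be tuned once profiling reveals the true state of the source data; a rule set written against assumed-clean data will miss variants such as the spacing and suffix differences above.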
To avoid this rework and delay, eVerge Group recommends profiling source data at the start of every MDM implementation. Data profiling activities uncover data anomalies at the source, which can be corrected either through new business policies and procedures followed by end-users at the time of data capture, or through updates to the backend systems after data has been captured. The choice depends on the specific needs of the enterprise.
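A minimal data-profiling pass of the kind described above can be sketched as follows. The column names, sample records, and format rules are all hypothetical; the point is simply to count, per field, the missing and non-standard values that would otherwise surprise the project team at matching time.

```python
# Minimal data-profiling sketch: tally missing and non-standard values
# per field in a (hypothetical) customer extract before it is loaded
# into the master data hub.
import re

customer_records = [
    {"name": "Acme Corp", "phone": "214-555-0100", "state": "TX"},
    {"name": "Acme Corporation", "phone": "", "state": "Texas"},
    {"name": "", "phone": "5550100", "state": "tx"},
]

# Illustrative format rules: NNN-NNN-NNNN phones, two-letter state codes.
FORMAT_RULES = {
    "phone": re.compile(r"^\d{3}-\d{3}-\d{4}$"),
    "state": re.compile(r"^[A-Z]{2}$"),
}

def profile(records, fields=("name", "phone", "state")):
    """Return {field: {"missing": n, "non_standard": n}} for the extract."""
    report = {f: {"missing": 0, "non_standard": 0} for f in fields}
    for rec in records:
        for field in fields:
            value = rec.get(field, "").strip()
            if not value:
                report[field]["missing"] += 1
            elif field in FORMAT_RULES and not FORMAT_RULES[field].match(value):
                report[field]["non_standard"] += 1
    return report

report = profile(customer_records)
```

Running this over the sample extract flags one missing name, one missing and one non-standard phone, and two non-standard state values, giving the project team concrete anomaly counts to drive either data-capture policy changes or backend cleanup.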
The process to profile and correct data anomalies requires both business and IT teams to work together to understand the downstream impacts of the current data quality situation.
eVerge incorporates a data assessment work stream into every MDM implementation to ensure customers establish a proactive, repeatable process to continually evaluate, measure, and improve the quality of their business data. This process ensures customers fully reap the benefits of master data management from their selected MDM provider, as well as experience cleaner, more accurate data for downstream reporting and upstream sales and marketing capabilities. Contact eVerge to learn more about our PrecisionFit MDM methodology.