November 17, 2011 By Florencio Espejo
I previously held a simplistic view of Master Data Management (MDM): that its aim was to build and maintain a ‘Master Entity’ that is clean, unique and free of data quality problems, something that proves costly on most Data Warehouse Business Intelligence (DWBI) implementations. However, after attending William McKnight’s Master Data Management course, I now realise there is much more to it.
MDM is a more tightly integrated data solution than current DW implementations. Traditional DW solutions are a one-way street of information traffic: extracting, transforming and loading data from one system to another.
In this approach, latency and distribution are issues. MDM, on the other hand, allows consumption and update of source records in real time, while at the same time publishing quality data downstream to other applications such as DWs, analytics, and so on.
Master Data Management has emerged as much more than reference data management. MDM provides clean, quality information that can be ‘distributed’ throughout the enterprise, with full governance applied to that information. The distribution concept introduces new actors into the information management chain, such as ‘consumers’ and ‘subscribers’. The advocacy was for an MDM engine close to the source systems, applying data quality at the point of data entry.
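As a rough illustration of this consumer/subscriber distribution idea, the hub could be sketched as a simple publish/subscribe component that cleanses records on entry and pushes the ‘golden’ version to anyone who has registered interest. This is a hypothetical sketch of my own, not code from any MDM product; all class and method names are invented.

```python
from dataclasses import dataclass, field
from typing import Callable

@dataclass
class MasterHub:
    """Hypothetical MDM hub: applies data quality on entry, keeps the
    golden record, and publishes it to downstream subscribers."""
    subscribers: list[Callable[[dict], None]] = field(default_factory=list)
    golden: dict[str, dict] = field(default_factory=dict)

    def subscribe(self, callback: Callable[[dict], None]) -> None:
        # Downstream consumers (a DW feed, an analytics app, a CRM)
        # register here instead of each re-extracting the data.
        self.subscribers.append(callback)

    def upsert(self, record: dict) -> dict:
        # Data quality is applied at the point of entry: here just
        # trimming and normalising strings as a stand-in for real rules.
        cleaned = {k: v.strip().title() if isinstance(v, str) else v
                   for k, v in record.items()}
        if not cleaned.get("id"):
            raise ValueError("master record requires an 'id'")
        self.golden[cleaned["id"]] = cleaned
        for notify in self.subscribers:
            notify(cleaned)  # push the golden record downstream
        return cleaned

received = []
hub = MasterHub()
hub.subscribe(received.append)  # e.g. a data warehouse load process
hub.upsert({"id": "C001", "name": "  acme corp "})
```

After the upsert, every subscriber has seen the same cleansed record, which is the ‘build once, use many times’ pattern in miniature.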
MDM is blurring the boundaries of the Data Warehouse to some degree, calling its future relevance into question. According to McKnight, MDM subject areas enable analytical applications straight off the MDM data hub.
Although this is possible, I question the effort needed to implement an analytical MDM model that provides the same capabilities as a well-designed DWBI solution, given that the industry does not offer, for example, MDM data models that embody best practices.
From an implementation perspective, vendors push their own models hard, but in most cases the final cost of customising those models for a particular business is extremely high.
MDM advocates the role of a data steward: someone who is the business owner of corporate data, responsible for sign-off, approval and updates to its values. This is done through what McKnight calls the Master Data Workflow.
The workflow essentially passes the ball to the business, prompting them to decide whether a particular data item is fit for purpose and can be given the green light for enterprise-wide use.
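A minimal sketch of such an approval workflow might look like the following. This is my own illustration of the idea, not McKnight’s Master Data Workflow itself; the names and structure are assumptions made for the example.

```python
from enum import Enum

class Status(Enum):
    PROPOSED = "proposed"
    APPROVED = "approved"
    REJECTED = "rejected"

class StewardWorkflow:
    """Hypothetical sketch: proposed master-data changes are held
    until the business data steward signs off."""
    def __init__(self):
        self.pending = {}    # record id -> proposed change awaiting sign-off
        self.published = {}  # record id -> approved, enterprise-wide value

    def propose(self, record_id: str, change: dict) -> None:
        # A source system or user suggests a change; it is not yet live.
        self.pending[record_id] = {"change": change, "status": Status.PROPOSED}

    def decide(self, record_id: str, fit_for_purpose: bool) -> Status:
        # The data steward makes the fit-for-purpose call.
        item = self.pending.pop(record_id)
        if fit_for_purpose:
            item["status"] = Status.APPROVED
            self.published[record_id] = item["change"]  # green light enterprise-wide
        else:
            item["status"] = Status.REJECTED
        return item["status"]

wf = StewardWorkflow()
wf.propose("CUST-42", {"segment": "Enterprise"})
status = wf.decide("CUST-42", fit_for_purpose=True)
```

The point of the pattern is simply that nothing reaches the published, enterprise-wide store without an explicit business decision attached to it.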
As an Enterprise Master Hub, MDM enables data as a service (DaaS). This ultimately delivers the holy grail of quality information services: “build once, use many times”. On this front, MDM clearly provides great benefits both upstream and downstream of the data flow.
Obviously there are many other important points about MDM. It is not difficult to see why the value proposition of an MDM solution is compelling, particularly where businesses are embarking on the upgrade or decommissioning of their ERP and CRM applications.
MDM can be applied at any time; however, these situations represent prime opportunities to start the MDM journey. The context is perhaps more prohibitive, or at least more challenging, in private enterprises and government departments that already have mature DWBI capabilities. In these situations, it is wise to tackle the journey in baby steps.
Perhaps the simplistic view I started with is not such a bad idea after all. We should start by taking care of data quality problems in core enterprise entities, applying quality processes and governance around conformed dimensions so that they acquire ‘golden’ status: clean, unique and free of data quality problems. After a successful implementation, and with the backing of the business, we can then move on to the next stage towards a complete MDM solution.
About the author: Florencio Espejo is a Senior Consultant with Altis Consulting and has more than 10 years’ experience delivering IT solutions, specialising in Information Management. He has a varied background, working with organisations across a range of fields including manufacturing and healthcare services, as well as government.