Lessons learned from a master data management implementation


It’s clear what the benefits are: a single, correct source of the truth, created and maintained through collaboration, stored centrally and shared everywhere. If having the right information at the right time is key to your business success, there is a lot of sense in your whole company having access to (and contributing to the currency of) your most important data. However, this vision can be difficult to realise, and it requires more than just implementing a software solution.

Master Data Management (MDM) software companies tend to be quite clear about what an organisation needs to implement or have in place to successfully leverage an MDM solution. Vendors state the need for data governance, defined business rules, trusted sources and universal provisioning. But what perhaps isn’t always so clear is the effort needed to implement and maintain these practices. Considerable thought must be given not only to how to implement an MDM solution, but also to how it’s going to live and breathe well into the future.

3gamma recently helped a large client implement an MDM solution. These are our key learning points from the project:

  1. Iteration is the best way to move forwards
  2. Don’t overestimate the quality of existing data and recognize the value of humans in cleaning and maintaining it
  3. Beware of consumers driving your integration design
  4. You probably don’t need as much master data as you think you do

1. Iteration is the best way to move forwards

Attempting to get a committee of stakeholders to agree on what data they want to store (and what data they can live without) takes time. It’s easy to get stuck debating all the possibilities, but we need to move forward quickly and build interest and momentum. By taking an iterative approach we can leave the starting line faster and validate further needs in the real world. The solution then creates a compelling case for itself through continuous delivery and feedback. Planning for iterations, and ensuring your solution and methodology can absorb successive changes flexibly, is more important than getting it all captured the first time.

2. Don’t overestimate the quality of existing data and recognize the value of humans in cleaning and maintaining it

The project involved moving data from an existing set of disparate, non-data management systems (CRM tools, finance systems, data warehouses, spreadsheets etc.) as well as importing public data from commercial data providers into a single enterprise MDM system.

Because data is often incorrect at the source, you can’t simply map one field to another. Nor can the MDM system easily clean the data by spotting and correcting errors: without a lot of bespoke code it can’t logically distinguish between, say, an address and someone’s middle name. There may also be discrepancies between source systems.

So how do we resolve these conflicts without, crucially, losing any information?

Cleaning up data with a data-mapping process and an automated engine that calculates which value is correct will be imperfect and complex, and using the MDM system itself to do this will probably not work well either. There have been recent advances in machines’ ability to detect non-obvious relationships, but the insight of a human who understands how the data should be cleaned is often the most effective approach.

Therefore, consider placing the responsibility to clean and maintain the data on a “data steward”. An MDM solution should be seen as the tool that allows them to do that, not the tool that does it for them. This is about setting the right expectations for the technical solution and using the technology in the most appropriate way: automate the easy tasks, but don’t treat data management itself as easy.
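As a minimal illustration of this division of labour (the source systems, field names and values below are hypothetical, not from the project), automation can safely merge records only where the sources already agree, and should queue every genuine conflict for a data steward rather than guess:

```python
# Sketch: auto-merge only where sources agree; route conflicts to a steward.
# Source names ("crm", "finance") and fields are illustrative assumptions.

def merge_records(crm: dict, finance: dict, fields: list[str]):
    """Return (merged, conflicts). Conflicts are left for human review."""
    merged, conflicts = {}, {}
    for field in fields:
        a, b = crm.get(field), finance.get(field)
        if a == b:                      # sources agree: safe to automate
            merged[field] = a
        elif a is None or b is None:    # only one source has a value
            merged[field] = a if a is not None else b
        else:                           # disagreement: a steward decides
            conflicts[field] = {"crm": a, "finance": b}
    return merged, conflicts

merged, conflicts = merge_records(
    {"name": "ACME Ltd", "city": "London"},
    {"name": "ACME Limited", "city": "London"},
    ["name", "city"],
)
# 'city' merges automatically; 'name' lands in the steward's work queue
```

The point of the sketch is the split itself: the automated path handles only the unambiguous cases, and nothing is silently discarded, so no information is lost before a human has seen the conflict.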

3. Beware of consumers driving your integration design

You want every system that needs master data to consume it from the MDM database. But there is often debate over who gets to mandate the integration and data standards. Asking consumers how they want to consume the master data means you’ll get many differing requirements. When designing how an MDM solution will integrate, get high-level sponsorship stipulating that consuming systems will make parallel changes to fit the standard interface, not the other way around. This requires early engagement with the consuming systems and a clear vision of how the integration will work, so changes can be made in good time. It also requires a good understanding of the level of investment and change an MDM implementation involves.
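One way to make “consumers adapt to the standard interface” concrete is for the MDM team to publish a single canonical record format, with each consuming system owning its own adapter against it. The schema and field names below are illustrative assumptions, not the client’s actual model:

```python
# Sketch: MDM publishes one canonical schema; each consumer writes an
# adapter to it, rather than MDM emitting a bespoke feed per consumer.
from dataclasses import dataclass

@dataclass(frozen=True)
class CustomerMaster:          # the single, standard interface MDM exposes
    master_id: str
    legal_name: str
    country_code: str          # e.g. an ISO 3166-1 alpha-2 code

def to_billing_format(record: CustomerMaster) -> dict:
    """Adapter owned by the billing system's team, not by the MDM team."""
    return {"account_ref": record.master_id, "name": record.legal_name}

record = CustomerMaster("M-001", "ACME Ltd", "GB")
billing_row = to_billing_format(record)
```

The design choice this encodes is the one argued for above: the interface stays stable and singular on the MDM side, and the cost of variation sits with each consumer.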

4. You probably don’t need as much master data as you think you do

A first-draft data model based on asking the business what they want to store can contain thousands of fields, plus many more related data items. This calls into question what is master data and what is just data. Being clear, and getting agreement, on what is master data and what is associated information is important, and relies on going back to the problem the MDM solution is trying to solve.

Often, stakeholders will be concerned about losing the ability to store data they may someday require, which produces a large data model that is complex, difficult to maintain and full of empty fields. A better solution is to separate master data from additional data: create placeholders for relevant but non-critical information without including it in the core data set. Cutting out the need to store everything makes it easier to establish the match rules and drive towards a core set of relevant information. Handle the remaining data as a separate project, storing it in a relational database.
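The split might look like the following sketch (entities, field names and values are hypothetical): a slim master record carries only what the match rules need, while the “nice to have” attributes live in a separate store keyed by the master ID and are joined only on demand.

```python
# Sketch: keep the master record slim; park non-critical attributes in a
# separate store keyed by master_id. All names here are hypothetical.

master = {                      # core set: only fields the match rules use
    "master_id": "M-001",
    "legal_name": "ACME Ltd",
    "registration_no": "01234567",
}

extended = {                    # separate store, e.g. a relational database
    "M-001": {"twitter_handle": None, "preferred_greeting": "Dear Sir/Madam"},
}

def full_view(record: dict, ext: dict) -> dict:
    """Join core and associated data only when a consumer needs both."""
    return {**record, **ext.get(record["master_id"], {})}
```

Matching, deduplication and stewardship then operate on the small core set, while the extended attributes can grow (or stay empty) without complicating the master model.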

To conclude, having a single source of the truth is an incredibly powerful asset. If you are on a data management journey or about to start on one, my key advice is not to consider it as solely an IT challenge but also an organisational one.

About the author

Matt Williamson is a senior IT management consultant at 3gamma, with 14 years of experience from a wide variety of leadership roles in IT programme delivery and IT service management.
