Why fixing your data quality issues shouldn’t wait


Gartner’s recent report on data quality lists the five critical things you should do to make your data complete, accurate, and fit for purpose.

These are:

  • Scope
  • Governance
  • Processes
  • Roles & Responsibilities
  • Technologies

The report also carries a very strong message: focus your attention and energy on your data quality hotspots.

However, the killer point for any business looking to implement a data quality framework is hidden on one of the last pages of the report: fixing your data quality issues should not wait until the next big digital or data transformation programme or project comes along. It can, and must, start as soon as data quality issues begin cropping up in your data estate.

None of this is new, and there is no reason data quality issues cannot start to be fixed immediately. With a plan of action, the right processes, and the right people in place, work can begin straight away. In our experience, this does not always need an upfront technology investment to kick-start it. More often, the ask of the IT and technology teams is for practical, hands-on help in fixing issues, not configuring or installing expensive data quality tooling.

An excellent starting point for your plan of action is to log data quality issues as they arise. A Data Quality Issues register – a simple Excel spreadsheet will do – lets you record the key details of each issue: who found it, which system(s) it impacts, how many records are affected, whether there are any obvious compliance or regulatory implications, and what the upstream and downstream impacts for your business are or might be.
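If a spreadsheet later outgrows itself, the same register can be kept as a plain CSV file and appended to programmatically. The sketch below is purely illustrative – the field names, the `DataQualityIssue` class, and the `dq_register.csv` filename are assumptions, not anything prescribed by the Gartner or Oakland reports – but it captures the kind of details the register should hold per issue:

```python
import csv
from dataclasses import dataclass, asdict

@dataclass
class DataQualityIssue:
    """One row in the Data Quality Issues register (field names are illustrative)."""
    issue_id: str
    description: str
    found_by: str              # who found the issue
    systems_impacted: str      # which system(s) it affects
    records_impacted: int      # how many records are affected
    regulatory_implications: str
    business_impact: str       # up- and downstream impacts, as understood today
    status: str = "Open"       # Open / In Progress / Closed

def log_issue(issue: DataQualityIssue, register_path: str = "dq_register.csv") -> None:
    """Append one issue to the CSV register, writing a header if the file is new."""
    row = asdict(issue)
    with open(register_path, "a", newline="") as f:
        writer = csv.DictWriter(f, fieldnames=row.keys())
        if f.tell() == 0:  # brand-new register: write the header row first
            writer.writeheader()
        writer.writerow(row)
```

The point is not the tooling – any format your team can open and update will do – but that every issue gets the same structured set of questions answered when it is logged.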

The answers will only be as good as your understanding of the issue at that point in time, but that does not matter: it lets you start getting closer to the root cause of each issue. You will also find that some issues logged as separate turn out, once you investigate them, to share a common underlying cause.

As you record data quality issues you will add more content to your register. Make sure the register is visible to those involved in fixing the issues, and if you have a data forum, make it a standing agenda item. Being able to share with stakeholders, on a regular basis, how many issues have been opened, closed, and are in progress demonstrates empirically where the real data quality issues exist within your business – and helps dispel many of the myths and theories about where they are assumed to be!
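That regular opened/closed/in-progress summary falls straight out of the register. As a minimal sketch – assuming each register row carries a status field, which is our convention rather than anything mandated by the reports – a tally for the data forum is a one-liner:

```python
from collections import Counter

def summarise_statuses(register_rows):
    """Count register entries by status for a data-forum summary."""
    return Counter(row["status"] for row in register_rows)

# Illustrative register rows (field names and statuses are assumptions).
register = [
    {"issue_id": "DQ-001", "status": "Open"},
    {"issue_id": "DQ-002", "status": "Closed"},
    {"issue_id": "DQ-003", "status": "In Progress"},
    {"issue_id": "DQ-004", "status": "Open"},
]

print(summarise_statuses(register))
```

The same tally is a pivot table away in Excel; the value is in reporting the trend consistently, not in how it is computed.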

The good news is that this approach shows that fixing your data quality need not require a large budget or a wait for senior management approval. These activities can be done almost by stealth, under the radar of a wider data programme (although clearly the former will inform and benefit the latter!).

Fortunately, the Oakland Group recently produced an easy-to-read guide – Data Governance by Stealth – that sets out how a range of data governance tasks can be tackled in just this way, without the need for a bigger, and therefore more costly, data transformation project.

So, armed with a clear understanding of each issue from your Data Quality Issues register, a small team (maybe even just one individual!) willing and able to progress fixes – plus a little blood, sweat, and tears – can make a significant impact on your data quality. There will certainly be issues that cannot be solved through this low-cost route and must form part of a larger transformational project, but many can be fixed without that management and financial overhead. And having every issue logged on your register ensures the cost-benefit of fixing the more structural issues can be sized far better than before.

For anyone scratching their head in a business with data quality issues and no real idea where to start: take some time out and read both the Gartner and Oakland reports. They will give you the why, when, and how of fixing many of the less complex data quality issues that businesses undoubtedly face. Then, with a sleeves-rolled-up mindset, just one or two people in an organisation can make significant and sustained improvements to data quality and help pave a path towards fixing the more complex and challenging issues. Sorting out the simpler issues will also help inform whether spending budget on the more complex ones is justified, and you will have a small team that has learned, at very little cost, where and what to look for. Having seen this approach applied in a number of businesses, you come to realise that far less time and energy is needed than you might think once the root causes of your data quality issues are better understood.

Andrew Sharp is a Data Governance Expert at the Oakland Group
