Many data management professionals already have some form of tooling or platform to support their business initiatives, yet often find it hard to win buy-in for continued investment of time and resources in this area.
In a recent post by data management blogger Henrik Liliendahl on the passing of Larry English, he refers to English as having “pioneered the data quality – or information quality as he preferred to coin it – discipline.” Liliendahl looks at three main concepts that underpin all data and information quality technologies, and in a moving tribute, inspires us to “roll up our sleeves and continue what Larry started.”
So, for anyone involved at any stage in information management, it's worth taking the time to consider the impact English had on the industry and the lessons that still apply today.
Those three concepts are:
- Quantify the costs and lost opportunities of bad information quality
- Always look for the root cause of bad information quality
- Observe the Plan-Do-Check-Act cycle when solving information quality problems
Firstly then, on the costs and lost opportunities.
The “Data Doc” Tom Redman has a neat test that data managers can conduct on a Friday afternoon – maybe unsurprisingly called the “Friday Afternoon Measurement.” Instead of repeating it in detail here, head over to Harvard Business Review and take a look.
In short, assemble like-minded people who know the data and can quickly tell if it’s right or not, open a beer or two, take four quick steps and a small bit of cost estimation and – hey presto – your cost of bad information quality is right in front of your eyes.
What else could you have done with that money? Anyone in financial services knows it's easier to justify eliminating an operational cost than to win new budget, and as this very simple business case shows, that cost isn't actually hard to demonstrate after a beer on a Friday afternoon with the Data Doc.
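As a back-of-an-envelope illustration of the arithmetic involved (not Redman's exact method — the function name, the sample figures, and the 10x rework rule of thumb are all assumptions for the sketch), it might look something like this in Python:

```python
# Hypothetical sketch of the cost arithmetic behind a Friday Afternoon
# Measurement. Assumes colleagues have already marked each of ~100
# recent records as clean or flawed; all figures are illustrative.

def friday_afternoon_measurement(records_reviewed, records_flawed,
                                 records_per_year, unit_cost_clean,
                                 rework_multiplier):
    """Estimate the annual cost of bad data from a small sample.

    records_reviewed  -- sample size (Redman suggests ~100 recent records)
    records_flawed    -- how many of those the team flagged as flawed
    records_per_year  -- total records the process produces annually
    unit_cost_clean   -- cost to process one clean record
    rework_multiplier -- assumed cost factor when a record needs rework
                         (a common rule of thumb is ~10x; use your own)
    """
    error_rate = records_flawed / records_reviewed
    flawed_per_year = error_rate * records_per_year
    extra_cost_per_record = unit_cost_clean * (rework_multiplier - 1)
    return error_rate, flawed_per_year * extra_cost_per_record

rate, annual_cost = friday_afternoon_measurement(
    records_reviewed=100, records_flawed=27,
    records_per_year=50_000, unit_cost_clean=1.0,
    rework_multiplier=10)
print(f"{rate:.0%} of records flawed, ~{annual_cost:,.0f} a year in rework")
```

Even with rough inputs, the point of the exercise is that a defensible order-of-magnitude number lands on the table in an afternoon.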
Secondly, the root causes.
One of the best ways of discovering the root cause is the “five whys” pioneered by Toyota (there’s a good guide to it here).
Maybe after your Friday Afternoon Measurement, pick some of your data problems and set your people the task of asking “why” five times, with the clear instruction to get the causes as specific as possible. Summarise, prioritise and then look for ways to eliminate those problems.
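The "summarise and prioritise" step can be sketched as a simple tally: once each problem has been drilled down to a root cause, count how many problems share each cause and tackle the most common first. The problems and causes below are invented for illustration:

```python
from collections import Counter

# Hypothetical five-whys results: each data problem mapped to the root
# cause the team landed on after asking "why" five times.
root_causes = {
    "Duplicate customer records": "No unique customer ID at point of entry",
    "Addresses fail validation": "No validation rules in the intake form",
    "Stale account balances": "No validation rules in the intake form",
    "Missing consent flags": "No unique customer ID at point of entry",
    "Wrong product codes": "Reference data updated manually",
}

# Summarise and prioritise: causes behind the most problems come first.
priorities = Counter(root_causes.values()).most_common()
for cause, count in priorities:
    print(f"{count} problem(s): {cause}")
```

Eliminating one cause near the top of that list fixes several symptoms at once, which is the whole payoff of root-cause thinking over symptom-by-symptom cleansing.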
Lastly, the “Plan-Do-Check-Act” (PDCA) loop (again, there’s a super guide here) was pioneered by Dr W. Edwards Deming as a way of uncovering why some processes or products are underperforming. In data and information management, being able to measure the impact you’ve made is critical, both as a reporting avenue to senior stakeholders and as the first stage of the next PDCA loop.
Simply offering up dashboards and visualisations to show what has happened, or is happening, is only one part of it. The slickest Power BI dashboard might showcase where data quality stands today, but unless it also points to the next priorities and best-recommended actions, it’s not actually a loop of continuous improvement.
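Closing the loop means the "Check" step feeds a concrete "Act" into the next cycle rather than ending at a chart. A minimal sketch of that idea, assuming an invented completeness metric, target, and fix action:

```python
# Minimal sketch of the Check and Act steps of a PDCA cycle applied to
# a data quality metric. Metric names, thresholds, and actions are
# assumptions for illustration, not a real product feature.

def check_and_act(metric_name, measured, target, fix_action):
    """Check: compare the measurement against the plan.
    Act: turn the gap into a recommendation that seeds the next cycle."""
    if measured >= target:
        return (f"{metric_name}: {measured:.0%} meets the {target:.0%} "
                f"target; raise the bar next cycle")
    return (f"{metric_name}: {measured:.0%} vs {target:.0%} target; "
            f"next: {fix_action}")

print(check_and_act("Address completeness", 0.91, 0.95,
                    "add postcode validation at point of entry"))
```

The dashboard shows the first half of that string; a continuous-improvement loop insists on the second half too.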
In summary, it’s worth looking back at what Liliendahl said about Larry English: he “…pioneered the data quality – or information quality as he preferred to coin it – discipline.” Fundamentally it’s about being disciplined, but in the right areas!
How could reimagining data quality make a difference to your organisation? Hit me up on LinkedIn and let’s continue the conversation!
To learn more about how Self-Service Data Quality is the best approach to developing a next-gen data management strategy, catch our webinar from 2020 with key input from our CTO, Alex Brown (or read a blog post version here).