What is driving the need for Data Quality Metrics in Retail Banking?

Retail Banking Part 1 – Marathons present a titanic challenge that has long dominated the human psyche: the will to finish, to get over the line, to win. Whether in-person or virtual, the London Marathon is no different!


In previous years, the official app has included a feature to help friends, family and supporters locate their chosen runner by name or vest number, and so track their progress at every stage of the torturous 26.2-mile journey.

It’s said that many runners hit “the wall” at around 22 miles, so very often loved ones wait at that marker to help spur them on to the finish a little over four miles away. But imagine for a moment that there are no distance markers, no Fitbits, and no stopwatches. Over 38,000 runners crowding the starting line, with no way of knowing critical data dimensions such as where they are, the pace at which they’re running, or how long it will take to finish.

It seems absurd to think that anyone would choose that way of running such a race, yet this is the decision being made by any retail bank not currently using metrics to measure its customer and financial data quality. Understanding the condition, accuracy, quality and maturity of datasets across the vast array of products and service channels is impossible if these elements aren’t actually being measured. 

At this point, retail banks can quite rightly point to four major issues confronting them, each often seen as a barrier to that holy grail of data management: a timely, accurate and complete single view of a personal banking customer.

1. Size and scale

For starters, the amount of general public data to be measured is usually vast. One data record for a personal current account customer might include first name, middle name, last name, current address (five lines), previous address (another five lines), phone number, mobile number, email, employer and their address (five more lines), National Insurance number, date of birth, dependants, spouse… and that’s before their transactional data, banking IDs, credit card accounts and so on are included! The opportunities for data quality issues to exist are tremendous. 
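To make that breadth concrete, here is a minimal, purely illustrative sketch in Python of what such a record might look like. The field names and types are assumptions for illustration only, not any bank's actual schema.

```python
from dataclasses import dataclass, field
from datetime import date
from typing import List, Optional

# Purely illustrative: a simplified personal current account record.
# Field names are hypothetical, not any bank's real schema.
@dataclass
class CustomerRecord:
    first_name: str
    middle_name: Optional[str]
    last_name: str
    current_address: List[str]        # up to five address lines
    previous_address: List[str]       # another five lines
    phone_number: Optional[str]
    mobile_number: Optional[str]
    email: Optional[str]
    employer_name: Optional[str]
    employer_address: List[str] = field(default_factory=list)  # five more lines
    national_insurance_number: Optional[str] = None
    date_of_birth: Optional[date] = None
    dependants: int = 0
    spouse_name: Optional[str] = None
    # Transactional data, banking IDs and credit card accounts would
    # typically live in separate, linked datasets, multiplying the
    # places where quality issues can creep in.
```

Every one of those fields is somewhere a typo, an out-of-date value or a missing entry can sit unnoticed unless it is actually being measured.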

2. IT infrastructure

Further complicating matters, many retail banks operate systems which are decades old and highly immovable [1]. New products and services demand features old systems can’t deliver; mergers and acquisitions bring in whole new datasets; and separate, siloed systems rarely categorise data in exactly the same way. Yet the IT department is frequently held responsible for owning data quality, without specialist knowledge of what the data is, what it seeks to represent and what purposes the bank has for it [2]. They have no authority or budget to change or improve it, so even when they do report on it, the most they can usually say is that the data is deteriorating.

3. Operational processes

A single view of the customer is fine in theory, but it is continually compromised in reality: even if a customer’s data is accurate today and entered into the best system money can buy, unless it is measured and referenced regularly there is no way of telling whether it is better or worse than the rest of the data in the warehouse.

4. Regulation and the marketplace

On top of that, ‘big data’ remains big news and subject to never-ending scrutiny. Since the financial crisis, measuring data quality has swung from a nice-to-do to a must-do, with key dependencies increasingly assigned to compliance and risk functions. In retail banking, teams managing risk and compliance have grown by thousands of percent [3] as banks come under ever closer supervision from the Financial Conduct Authority and Prudential Regulation Authority.

This is in stark contrast with the 3,303 bank branches closed in the UK in the past five years [4], whilst challenger banks pursue the opposite strategy [5]. Put simply, if getting on top of big data is seen as too big and broad a problem, investment in data quality solutions quickly becomes a major, centralised IT infrastructure purchase with a correspondingly hefty price tag attached – and this multi-million, multi-year outlay makes it much harder to justify doing it at all.


Getting tactical

This is where targeted, tactical data quality metrics can provide genuine and demonstrable insight. Quick wins that utilise industry-standard definitions of data quality (such as the Enterprise Data Management Council’s Data Management Capability Assessment Model, DCAM™ [6]) work alongside enterprise data management routines to solve specific data problems, meet changing regulations and free up resources to help the bank develop market-leading customer propositions.

Instead of being hamstrung by an unwieldy data warehouse, banks can implement data quality metrics that visualise critical quality dimensions such as the conformity, completeness and integrity of datasets. This not only enhances compliance with regulatory obligations but also yields an accurate picture of the current landscape and how it changes over time.
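As a hedged sketch of what measuring two of those dimensions can look like in practice, the example below assumes a small pandas DataFrame with invented column names and format rules; it is a generic illustration, not the output of any particular data quality tool. It shows how completeness and conformity can each be expressed as a simple percentage.

```python
import pandas as pd

# Hypothetical extract of customer records; values and column names are invented.
df = pd.DataFrame({
    "email":    ["a.smith@example.com", None, "not-an-email", "b.jones@example.com"],
    "postcode": ["SW1A 1AA", "INVALID", None, "BT1 5GS"],
})

def completeness(series: pd.Series) -> float:
    """Proportion of records where the field is populated at all."""
    return series.notna().mean()

def conformity(series: pd.Series, pattern: str) -> float:
    """Proportion of populated records matching the expected format."""
    populated = series.dropna().astype(str)
    return populated.str.fullmatch(pattern).mean() if not populated.empty else 0.0

EMAIL_PATTERN = r"[^@\s]+@[^@\s]+\.[^@\s]+"
UK_POSTCODE_PATTERN = r"[A-Z]{1,2}\d[A-Z\d]? ?\d[A-Z]{2}"

print(f"email completeness:    {completeness(df['email']):.0%}")                        # 75%
print(f"email conformity:      {conformity(df['email'], EMAIL_PATTERN):.0%}")           # 67%
print(f"postcode completeness: {completeness(df['postcode']):.0%}")                     # 75%
print(f"postcode conformity:   {conformity(df['postcode'], UK_POSTCODE_PATTERN):.0%}")  # 67%
```

Other dimensions, such as integrity (for example, whether a customer’s accounts reconcile across systems), follow the same pattern: define the rule, measure the proportion of records that pass, and track the figure over time.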

Measuring in this way moves the needle from simply knowing whether a data element is right or wrong to intelligent analysis of how right or wrong it is, and whether its quality is improving or deteriorating. Insights like these are critical for retail and commercial banks (as well as other financial services firms such as investment banks and wealth managers) in identifying poor data quality and solving real-world problems.
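Continuing the sketch above (and again using invented figures), a trend is simply successive measurements of the same metric compared over time:

```python
import pandas as pd

# Hypothetical monthly completeness scores for one field, as captured by a
# recurring measurement job; the figures are invented for illustration.
history = pd.Series(
    [0.91, 0.92, 0.90, 0.94, 0.95],
    index=pd.period_range("2023-01", periods=5, freq="M"),
    name="email_completeness",
)

latest, previous = history.iloc[-1], history.iloc[-2]
direction = "improving" if latest > previous else "deteriorating" if latest < previous else "stable"
print(f"{history.name}: {latest:.0%}, {direction} (was {previous:.0%} the month before)")
```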

Thanks to ongoing data measurement, our marathon runners know where they are, how far they have to go, and how long it has taken them. They know how they stack up against their competitors, updated to the second through constant data analysis and review. Being just as diligent and demanding detailed metrics on the condition of vital datasets may seem daunting when it comes to retail banking data, but it has to be the cornerstone of any rigorous approach to data quality management.

This blog is Part 1 of a series looking at how data quality in consumer banking can be reviewed, monitored and remediated. The next part will cover how banks can utilise their SME knowledge to adopt a self-service approach to data quality improvement.

Click here for the latest news from Datactics, or find us on LinkedIn, Twitter or Facebook.

Get ADQ 1.4 today!

With Snowflake connectivity, SQL rule wizard and the ability to bulk assign data quality breaks, ADQ 1.4 makes end-to-end data quality management even easier.