Part 2: Self-service data improvement is the route to better data quality

It’s easy to say that planning a journey has been made far simpler since the introduction of live traffic information to navigation apps. You can now either get there faster, or at the very least phone ahead to explain how long you’ll be delayed.


It’s just as easy to say that we wouldn’t think of ignoring this kind of data. Last week’s blog looked at why measuring data quality is important for retail banks, but unless there is a strategy for reacting to the results, measurement is arguably meaningless.

Internal product owners and risk and compliance teams all need specific and robust data measurements: for analytics and innovation; to identify and serve customers; and to comply with the reams of rules and regulations handed down by regulatory bodies. Having identified a way of scoring the data, it would be equally bizarre to ignore the results.

However, navigating a smooth path in data management is hampered by a landscape that is vast, uncharted and increasingly archaic. Many executives of incumbent banks are rightly worried about the stability of their ageing systems and are finding themselves ill-equipped for a digital marketplace that is evolving with ever-increasing speed.

Key business aims – using data to deliver necessary cost savings and to grow revenues through intelligent analytics – snarl up against the sheer volume of human and financial resources that must be ploughed into these systems, both to meet stringent regulatory requirements and to reduce the customer impact, regulatory pressure and painful bad press caused by an IT outage.

Meanwhile, for those banks that have them, data quality metrics are revealing problems, and fixing those issues tends to become a one-off project that relies heavily on manual rules and even more manual re-keying into core systems. Very often, such projects have no capacity to continue the analysis, remediation or augmentation into the future, so over time data that was fixed at huge cost starts to decay again and the same cycle repeats.

But if your subject matter experts (SMEs) – your regulatory compliance specialists, product owners and marketing analytics professionals – could have cost-effective access to their data, perfecting it would sit in the hands of those who know what the data should look like and how it can be fixed.

If you deploy a targeted solution that can access external reference data sources, internal standards such as your data dictionary, and user- and department-level information to identify the data owner, your teams can self-serve to fix problems as they arise.
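To make that concrete, here is a minimal sketch of the kind of check described above: validating records against a data dictionary and external reference data, then routing each break to a named data owner. All of the names, fields and rules here are hypothetical illustrations, not Datactics APIs.

```python
import re

# Internal standard: a toy "data dictionary" of per-field format rules.
DATA_DICTIONARY = {
    "sort_code": re.compile(r"^\d{2}-\d{2}-\d{2}$"),
    "country":   re.compile(r"^[A-Z]{2}$"),
}

# External reference data, e.g. ISO 3166 country codes (abridged).
VALID_COUNTRIES = {"GB", "IE", "FR", "DE", "US"}

# Department-level ownership, so each break reaches the right SME.
DATA_OWNERS = {"sort_code": "payments-team", "country": "onboarding-team"}

def check_record(record: dict) -> list:
    """Return one data quality 'break' per field that fails a rule."""
    breaks = []
    for field, pattern in DATA_DICTIONARY.items():
        value = record.get(field, "")
        if not pattern.match(value):
            breaks.append({"field": field, "value": value,
                           "rule": "format", "owner": DATA_OWNERS[field]})
        elif field == "country" and value not in VALID_COUNTRIES:
            breaks.append({"field": field, "value": value,
                           "rule": "reference", "owner": DATA_OWNERS[field]})
    return breaks

print(check_record({"sort_code": "123456", "country": "GB"}))
# -> one 'format' break on sort_code, routed to payments-team
```

The point isn’t the code itself, but that each failure arrives with enough context – the rule that fired and who owns the field – for an SME to act on it without raising an IT ticket.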

This can be done through a combination of SME review and machine learning technology that evolves to apply remedial actions automatically: the rules created when correcting broken records contain the information required to fix other records that fail the same rules.
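A simple illustration of that idea, again hypothetical rather than the product’s actual mechanism: capture an SME’s one-off correction as a reusable remediation, keyed by the rule that failed, and replay it on any later record that breaks the same way.

```python
# Learned remediations: (rule, broken value) -> corrected value.
remediations = {}

def record_fix(rule: str, bad_value: str, good_value: str) -> None:
    """Capture an SME correction so it can be replayed automatically."""
    remediations[(rule, bad_value)] = good_value

def auto_fix(rule: str, value: str):
    """Apply a previously learned fix, if one exists for this break."""
    return remediations.get((rule, value))

# An SME corrects one broken record...
record_fix("country-code", "UK", "GB")

# ...and every later record failing the same rule is fixed automatically.
print(auto_fix("country-code", "UK"))   # GB
print(auto_fix("country-code", "XX"))   # None -> still needs SME review
```

Anything without a learned fix falls back to human review, which is exactly the combination of SME judgement and automation described above.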

It might sound like futuristic hype – because AI is so hot right now – but this is a very practical example of how new technology can address a real and immediate problem, and in doing so complement the bank’s overarching data governance framework.

It means that the constant push towards optimised customer journeys and propositions, increased regulatory compliance, and IT transformation can rely on regularly perfected data at a granular, departmental level, rather than lifting and dropping compromised or out-of-date datasets.

Then the current frustration at delays in simply getting to use data can be avoided, and cost-effective, meaningful results for the business can be delivered in days or weeks rather than months or years.

Head over to the next part: ‘Build vs Buy – Off-the-shelf or do-it-yourself?’ or click here for part 1 of this blog, covering the need for data quality metrics in retail banking.


Matt Flenley is currently plying his trade as chief analogy provider at Datactics. If your data quality is keeping you awake at night, check out Self-Service Data Quality™, our award-winning interactive data quality analysis and reporting tool, built to be used by business teams who aren’t necessarily programmers.

Get ADQ 1.4 today!

With Snowflake connectivity, a SQL rule wizard and the ability to bulk-assign data quality breaks, ADQ 1.4 makes end-to-end data quality management even easier.