Data will power the next phase of the economy: DMS USA lookback – Part 1

Last September, Kieran Seaward, our Head of Sales, delivered a keynote at the virtual DMS USA on a data-powered economy. In his keynote, he unpacked:

  • The impact of COVID on shifting business priorities, focusing on how existing data management processes and data architectures have changed – as well as problems encountered along the way
  • Case studies demonstrating best-practice approaches to managing data-driven processes, and how to create impact and add value in a post-COVID era

We are still living with the many challenges presented by COVID-19, including a wide range of changes to the way business can be conducted. At Datactics we have been really encouraged that engagement with the market remains strong; since March, and the start of many lockdowns, we have conducted many hundreds of calls and meetings with clients and prospects to discuss their data management and business plans. This article is based on the key findings from these calls and reflects the priorities of many data-driven firms.

Data will power the next phase of the economy – good or bad?

As global economies look to get back on their feet, it’s clear that data quality is more important than ever before. Whether it’s data for citizen health, financial markets, customer or entity data, or any other type, economies and firms will either be powered by a solid platform of high-quality, reliable data, or they will grow more slowly, built on poor-quality data. It is not an overstatement to say that what the next phase of economic growth looks like will depend entirely on the decisions and actions we take now.

Kieran’s keynote was underpinned by the fact that there has never been a greater need to get your data quality in order. With some firms really grasping this opportunity, change is visible. However, many have encountered the same old problems with the state of their data, and this can inhibit change.

Why is it necessary to get data quality under control?

Kieran highlighted that, pre-pandemic, MIT Sloan Management Review commented that poor data costs firms on average 15-25% of revenue – for a firm with $1bn in annual revenue, that is $150-250m a year. This figure reaffirms that there is no better time than now to improve the data quality foundation. The long-term future looks bleak if it is built on the quality of data that we have right now!

Why is a foundation of good data quality important?

Both before and since the financial crisis of 2008, there have been many conversations reiterating the importance of data quality foundations. Moving forward, particularly after COVID-19, a data quality foundation can not only help you get off to a positive start amidst uncertainty, it can also ensure resilience for the future.

Kieran referred to the many conversations that he has had with wealth management, asset management and insurance firms this year. Following hot on the heels of the most innovative investment banks, more and more of these firms are seriously considering what a data governance and data quality framework should deliver.

Redesigning those data architectures that aren’t delivering

Kieran went on to detail that a large number of firms have told him that the ‘waterfall’ process of logging tickets with software development and IT to improve data quality simply isn’t moving quickly enough for them, or with the right level of accuracy; as a result, they’re evaluating DataOps processes and tooling. Stuart Harvey, CEO of Datactics, spoke in depth on this at the European Summit, on having every part of the data management journey at the table on data issues, and this is an approach one client we signed during the UK lockdown is now putting in place as their data management foundation.

On the growth of DataOps-style approaches, Kieran said:

“We’ve been encouraged by the number of firms we’ve spoken to who are keen to start moving away from script-based rules. They’re federating data quality management out from centralised IT and into the business teams so that business users – who already know the data and what good looks like – can self-serve for high-quality, ready-to-use data in their line of work. This covers everything from building rules themselves, through interrogating the data itself in dashboards, right down to making or approving fixes in the data within their domain.”
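To make the contrast with script-based rules concrete, here is a minimal sketch of a rule expressed as configuration rather than as code – the kind of definition a no-code rule builder might produce. The field names, checks, and rule syntax below are hypothetical illustrations, not Datactics’ actual rule format:

```python
import re

# Rules expressed as data rather than scripts. Field names, checks
# and patterns are hypothetical examples for illustration only.
rules = [
    {"field": "lei", "check": "matches",
     "pattern": r"[A-Z0-9]{18}[0-9]{2}",
     "description": "LEI must be 20 characters (ISO 17442)"},
    {"field": "country", "check": "not_null",
     "description": "Country of incorporation must be populated"},
]

def run_rules(record, rules):
    """Return the descriptions of every rule the record breaks."""
    breaks = []
    for rule in rules:
        value = record.get(rule["field"])
        if rule["check"] == "not_null" and value in (None, ""):
            breaks.append(rule["description"])
        elif rule["check"] == "matches" and not re.fullmatch(rule["pattern"], value or ""):
            breaks.append(rule["description"])
    return breaks

record = {"lei": "5493001KJTIIGC8Y1R12", "country": ""}
print(run_rules(record, rules))
# -> ['Country of incorporation must be populated']
```

Because each rule is data rather than code, a business user can add or amend a check without raising an IT ticket, while the engine that evaluates the rules stays unchanged.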

What comes first? Data Governance or Data Quality?

Kieran recently wrote a blog on this topic, as he noted that it comes up in every client engagement! To illustrate the importance of data quality, he gave an analogy:

Imagine you are a kid, and your mother has asked you to tidy your bedroom. Your mum returns a few hours later and the room is still a mess – would you get away with saying, “Yes Mum, but look – I’ve made an inventory of where everything should be”? I imagine the response you’d get would be something along the lines of “well done, but can you please tidy your room now?!”

Kieran used this story to draw attention to the fact that it is vital to consider data quality first and foremost: a large inventory of disorganised data will still leave key data either inaccessible or difficult to find.

Taking a holistic approach to data management

There are a number of building blocks that make up a holistic approach to data management, including data quality, master data management, business glossary/data dictionary, metadata management, and so on. As Kieran reiterated in his keynote, using intelligent integration via APIs, it is now possible to build the next generation of data management platform by orchestrating ‘best of breed’ technology components from a multitude of capable vendors. For example, we have recently explored a more orchestrated approach with vendors like Solidatus, where Datactics provides the data quality and master data management pieces, and Solidatus provides governance and lineage.
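As a rough illustration of the kind of API-driven orchestration Kieran described, the sketch below pushes rule results from a data quality engine into a lineage graph, so governance users can see quality scores in the context of lineage. The endpoints, payloads, and function names are hypothetical placeholders, not the actual Datactics or Solidatus APIs:

```python
import requests  # third-party HTTP client: pip install requests

# Hypothetical endpoints for illustration only – not real vendor APIs.
DQ_API = "https://dq.example.com/api"            # data quality engine
LINEAGE_API = "https://lineage.example.com/api"  # governance/lineage tool

def sync_quality_scores(dataset_id):
    """Pull rule results from the DQ engine and attach each pass rate
    to the matching node in the lineage graph."""
    response = requests.get(f"{DQ_API}/datasets/{dataset_id}/results", timeout=30)
    response.raise_for_status()

    for result in response.json()["rules"]:
        requests.post(
            f"{LINEAGE_API}/nodes/{dataset_id}/attributes",
            json={"name": f"dq:{result['rule_name']}",
                  "value": result["pass_rate"]},
            timeout=30,
        ).raise_for_status()

if __name__ == "__main__":
    sync_quality_scores("customer_master")
```

The design point is that neither tool needs to know about the other’s internals: each exposes its own API, and a thin orchestration layer moves the relevant results between them.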

Start small, think big

Kieran’s session reinforced that if you are exploring introducing new capability or functionality, completing a proof of concept is a well-proven, low-risk means of demonstrating the efficacy of the software and associated professional services. If the scope is well considered and defined around a specific use case that is causing pain, the quick turnaround of results has the potential to create real impact, which ultimately makes the business case a lot stronger.

This is how Datactics has engaged with the vast majority of our clients, and we have successfully delivered remote proof of value projects during lockdown that are now being rolled into production.

To return to the analogy about tidying your room: once you start to clean your desk or wardrobe, you quickly see that cleaning the rest of the room isn’t such a daunting task if you break it into chunks.

In conclusion, at Datactics, we believe that it doesn’t matter where you start, so long as you make a start!

Getting your data quality under control is necessary, and the importance of data quality foundations has never been greater. Our platform can help you take away the hassle of internal “roadblocks” in IT administration, and can remove the headache of manually reviewing data records that fail or break rules.

Kieran Seaward, Head of Sales at Datactics

Our platform can help you, so get in touch today.

In the next blog, we will unpack more themes from Kieran’s keynote ‘A Data-Driven Restart’: taking a long-term view, seeking impact that matters, and finally the budget and implementation approach.

If you want to watch Kieran’s keynote in full, you can do so by checking out this link.

Get ADQ 1.4 today!

With Snowflake connectivity, a SQL rule wizard, and the ability to bulk-assign data quality breaks, ADQ 1.4 makes end-to-end data quality management even easier.