Tackling Practical Challenges of a Data Management Programme

“Nobody said it was easy,” sang Chris Martin in Coldplay’s love song from a scientist to the girl he was neglecting. The same could be said of data scientists embarking on a data management programme!


In his previous blog on Good Data Culture, our Head of Client Services, Luca Rovesti, discussed taking first steps on the road to data maturity and how to build a data culture. This time he’s taking a look at some of the biggest challenges of Data Management that arise once those first steps have been made – and how to overcome them.

One benefit of being part of a fast-growing company is the sheer volume and type of projects that we get to be involved in, and the wide range of experiences – successful and less so – that we can witness in a short amount of time.

Without a doubt, the most important challenge that rears its head on the data management journey is around complexity. There are so many systems, business processes and requirements of enterprise data that it can be hard to make sense of it all.

Those who get out of the woods fastest are the ones who recognise that there is no magic shortcut for the work that must be done.

A good example would be the creation of data quality rule dictionaries to play a part in your data governance journey.


Firstly, there is no way that you will know what your data-driven culture efforts need to cover until you take stock of what you have already got.

Although technology can give us a helpful hand in the heavy lifting of raw data, from discovery to categorisation of data sets (data catalogues), the definition of domain-specific rules always requires a degree of human expertise and understanding of the exception management framework.
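To make the idea of a data quality rule dictionary concrete, here is a minimal illustrative sketch in Python. The rule names, field mappings and check functions are hypothetical examples invented for this post, not Datactics product APIs; real domain rules would be defined by subject matter experts, as described above.

```python
# Illustrative sketch only: a tiny data quality rule dictionary.
# Rule names and checks are hypothetical, not a Datactics API.
import re

# Each entry maps a rule name to a human-readable description
# (the "dictionary" part) and a machine-executable check.
RULE_DICTIONARY = {
    "ISIN_FORMAT": {
        "description": "ISIN must be 12 characters: 2-letter country code, "
                       "9 alphanumerics, 1 check digit.",
        "check": lambda v: bool(re.fullmatch(r"[A-Z]{2}[A-Z0-9]{9}[0-9]", v or "")),
    },
    "NOTIONAL_POSITIVE": {
        "description": "Trade notional must be greater than zero.",
        "check": lambda v: isinstance(v, (int, float)) and v > 0,
    },
}

def run_rules(record, field_rules):
    """Apply each field's rules to a record; return the list of breaks."""
    breaks = []
    for field, rule_names in field_rules.items():
        for name in rule_names:
            rule = RULE_DICTIONARY[name]
            if not rule["check"](record.get(field)):
                breaks.append((field, name, rule["description"]))
    return breaks

trade = {"isin": "GB00B03MLX29", "notional": -100}
print(run_rules(trade, {"isin": ["ISIN_FORMAT"],
                        "notional": ["NOTIONAL_POSITIVE"]}))
```

Even in a toy like this, notice that the regular expression encodes domain knowledge (what an ISIN looks like) that no discovery tool can infer on its own, which is exactly why rule definition needs human expertise.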

Subsequently, getting data owners and technical people to contribute to a shared plan – one that takes into account how the data is used and where the technology will fit in – is a crucial step in detailing the tasks, problems and activities that will deliver the programme.

Clients we have been talking to are experts in their subject areas. However, they don’t know what “best of breed” software and data management systems can deliver. Sometimes, clients find it hard to express what they want to achieve beyond a light-touch digitalisation of a human or semi-automated machine learning process.


The most important thing that we’ve learned along the way is that the best chance of success in delivering a data management programme involves using a technology framework that is both proven in its resilience and flexible in how it can fit into a complex deployment.

From the early days of ‘RegMetrics’ – a version of our data quality software that was configured for regulatory rules and pushing breaks into a regulatory reporting platform – we could see how a repeatable, modularised framework provided huge advantages in speed of deployment and positive outcomes in terms of making business decisions.

Using our clients’ experiences and demands of technology, we’ve developed a deployment framework that enables rapid delivery of data quality measurement and remediation processes, providing results to senior management that can answer the most significant question in data quality management: what is the return on investing in my big data?

This framework has equipped us to provide expertise on the technology that complements our clients’ business knowledge:

  • Business user-focused low-code tooling connecting data subject matter experts with powerful tooling to build rules and deploy projects
  • Customisable automation that integrates with any type of data source, internal or external
  • Remediation clinic so that those who know the data can fix the data efficiently
  • “Chief Data Officer” dashboards provided by integration into off-the-shelf visualisation tools such as Qlik, Tableau, and PowerBI.
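The “remediation clinic” idea above can be illustrated with a short sketch: breaks are grouped by data domain and routed to whoever owns that domain, so the people who know the data are the ones who fix it. The owner mapping, break records and fallback address here are hypothetical, invented for illustration only.

```python
# Illustrative sketch only: routing data quality breaks to the people
# who own the data. The owner mapping and break records are made up.
from collections import defaultdict

DOMAIN_OWNERS = {
    "customer": "alice@example.com",
    "trade": "bob@example.com",
}

def assign_breaks(breaks):
    """Group breaks into per-owner remediation queues by data domain."""
    queues = defaultdict(list)
    for b in breaks:
        # Unowned domains fall back to a central data quality team.
        owner = DOMAIN_OWNERS.get(b["domain"], "dq-team@example.com")
        queues[owner].append(b)
    return dict(queues)

breaks = [
    {"domain": "trade", "rule": "NOTIONAL_POSITIVE", "record_id": 101},
    {"domain": "customer", "rule": "EMAIL_FORMAT", "record_id": 7},
    {"domain": "trade", "rule": "ISIN_FORMAT", "record_id": 102},
]
for owner, queue in assign_breaks(breaks).items():
    print(owner, len(queue))
```

Bulk assignment of breaks, as a product feature, amounts to running this kind of grouping over thousands of breaks at once rather than triaging each one by hand.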

Being so close to our clients also means that they have a great deal of exposure and involvement in our development journey.

We have them ‘at the table’ when it comes to feature enhancements, partnering with them rather than selling and moving on, and involving them in our regular Guest Summit events to foster a sense of the wider Datactics community.

It’s a good point to leave this blog, actually, as next time I’ll go into some of those developments and integrations of our “self-service data quality” platform with our data discovery and matching capabilities.

Click here for the latest news from Datactics, or find us on LinkedIn, Twitter or Facebook.

Get ADQ 1.4 today!

With Snowflake connectivity, SQL rule wizard and the ability to bulk assign data quality breaks, ADQ 1.4 makes end-to-end data quality management even easier.