Run (E)DMC – Where Data Quality & Data Governance Collide

Towards the end of 2018, our Head of Client Services, Luca Rovesti, was invited to speak at the Enterprise Data Management Council (EDMC)’s expanded Member Briefing sessions on Data Governance.

We took some time to catch up over an espresso in a caffè around the corner from his swanky new Presales Office in Milan (ok, it was over email) and asked him to share what he spoke about at the events: how banks and financial firms can truly make a standard like the Data Management Capability Assessment Model (DCAM) work in practice.

[Ed] Ciao Luca! Quindi cosa hai fatto negli eventi EDMC? (So, what did you get up to at the EDMC events?)

Ciao! But for the benefit of our readers perhaps we’ll do this in English?

Actually that’s easier for us too.

Great! At Datactics I have been the technical lead on a number of data management projects, which puts me in a good position to talk about the “practitioner’s view”: how does the DCAM framework play out in practice? There is one characteristic of Datactics’ business model that I believe makes our story interesting: we are a technology provider AND a consulting firm. This combination means we are not only able to advise on data programmes and overall strategy, but also able to implement that advice. We are not just talking about “how it should be”; we are playing an active role in the implementation phase.

I have lived through very successful projects…and less successful ones! Looking back at these experiences to work out what made certain engagements more successful than others is the basis of this blog post.

Ok, so what is the DCAM framework anyway?
The EDMC describes the Data Management Capability Assessment Model (DCAM) as “…the industry-standard, best practice framework designed to assist today’s information professional in developing and sustaining a comprehensive data management program.” You can find out more about it on their website.

Can you tell us a bit more about the data management frameworks you’ve encountered?
I’d like to borrow some of DCAM’s terminology to describe how the building blocks of this framework typically interact with each other, as became evident during our conversations.
When Datactics comes into the game, the high-level group data strategy is usually already laid out, and we are brought into the picture precisely because someone within the organisation is facing challenges in operationalising that strategy. This has been the case for every large organisation we have worked with: across all my engagements with Tier 1 banks, I have yet to see one without a theoretical framework for a data strategy. Clearly the BCBS 239 principles, published in 2013, had a prominent role in shaping these (you could even spot a bit of copy/pasting from the principles if you read these frameworks carefully!), but the idea is there: to manage the data lifecycle accurately and to capture the meaning of the information and the relationships between different data points.

So how does this compare with practice?
Translating theory into practice – the implementation phase – is where things get challenging. There are real business cases to be solved, and data programmes are set up for this purpose. Solving the business case is critical to proving the return on investment of data management activities: this is where they can bring measurable efficiency and help avoid regulatory fines, operational losses and dreaded capital lockdowns. Data programmes have people responsible for making things happen; how interconnected those people are within the organisation, and how acquainted they are with the real business pain points, can make a material difference to the success of the projects.

What we see is that there are two complementary approaches to servicing the data programme:

  • Data Governance is all about creating a data dictionary and understanding lineage, ownership and entitlement. This is the top-down approach, and it defines “how things should be”.
  • Data Quality is all about the measurement, accountability and remediation of data, according to test conditions and rules (see the short sketch below for a flavour). This is the bottom-up approach, and it defines “how the data actually is”.
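
To make that bottom-up idea concrete, here is a minimal, hypothetical sketch of rule-based data quality measurement, written in plain Python purely for illustration. The field names, rules and sample records are invented for the example and are not a description of Datactics’ own ADQ engine.

    # Illustrative only: generic "test conditions and rules" run over a handful
    # of counterparty-style records, reporting how the data actually is.
    import re

    records = [
        {"lei": "529900T8BM49AURSDO55", "country": "DE", "notional": 1_000_000},
        {"lei": "", "country": "IT", "notional": 250_000},      # missing LEI
        {"lei": "BAD-LEI", "country": "XX", "notional": None},  # invalid values
    ]

    LEI_PATTERN = re.compile(r"^[A-Z0-9]{18}[0-9]{2}$")
    KNOWN_COUNTRIES = {"DE", "IT", "GB", "US"}  # trimmed list for the example

    # Each rule is a named test condition returning True when a record passes.
    rules = {
        "lei_populated":    lambda r: bool(r["lei"]),                    # completeness
        "lei_well_formed":  lambda r: bool(LEI_PATTERN.match(r["lei"])), # validity
        "country_known":    lambda r: r["country"] in KNOWN_COUNTRIES,   # consistency
        "notional_present": lambda r: r["notional"] is not None,         # completeness
    }

    # Measure each rule: pass rate plus the failing rows ("breaks").
    for name, test in rules.items():
        results = [test(r) for r in records]
        pass_rate = 100 * sum(results) / len(records)
        breaks = [i for i, ok in enumerate(results) if not ok]
        print(f"{name}: {pass_rate:.0f}% pass, breaks at rows {breaks}")

In practice, each failing record (a “break”) would then be routed to the relevant data owner for remediation, which is where the accountability piece comes in.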

Do these two approaches, Data Quality and Data Governance, intersect neatly then?
We often get asked about the relationship between Data Quality and Data Governance. Can one exist without the other? Which one is more important? Which one should start first?
I mentioned I was going to give the “practitioner’s view”, so I’ll answer from past experience: in the most successful projects I have seen, both were there. Two parallel activities, with a certain degree of overlap, complementing one another: governance with policies and definitions, data quality with real metrics on how the data is evolving.

I like to think that governance is like a map, and data quality is the position marker that tells us where we are on the map, making it a lot more useful. The technology architecture is, of course, where all of this logic plugs in, connecting it with real information in the client’s systems or from external, open or proprietary sources.

Can you give us an example of a live deployment?
Sure! As promised, let’s see how things played out in a successful engagement. We typically engage with organisations to enable the Data Quality part of the data management strategy. This means being instrumental in measuring, understanding and improving the information that underpins data programmes, and I cannot stress enough the importance of the connection with the underlying business case.

I have seen different models work: separate programmes, each supporting a particular business case, or a single programme servicing them all.
In our most successful engagements at Tier 1 investment banks and major European banks, we have been able to leverage the data quality initiative to support a number of key business activities, such as regulatory reporting, KYC and AML, because all of these activities rely on complete, consistent and accurate data.

Is there anything else you’d add, something that connects the most successful projects perhaps?
Yes, definitely. The single most important thing that can drive a successful project is buy-in from the technical side. The best implementations we have worked on have depended on a close connection, during implementation, between technical (i.e. IT) teams and the owners of the business case. It is extremely rare for a financial institution to go through the level of technical restructuring that would allow a complete change of core technology just to support a business case for regulatory reporting, for example.

In the vast majority of cases the implementation is “plug-and-play” with the existing technology architecture, which suits our open architecture and deployment approach completely. And let’s face it: banks’ IT, DBA and infrastructure teams are always swamped! More than once, a project has been all ready to go except…for that one server…

But you roll with these punches, and you work on good relationships with the internal teams to help smooth out these wrinkles, because the downstream benefits of cleansed, perfected data are almost limitless in their use cases. I mean, these guys know their stuff and care hugely about the quality of their systems and data. So yeah, I’d say financial institutions where there is a close connection between the senior teams owning the business case, those responsible for the technology architecture, and the downstream users of the data who can convey its business meaning and measurable benefits are the perfect environment for getting projects off the ground and into production in rapid time.

Thanks, Luca! Where can the kindly folk of the internet find out more about you, or maybe meet you if they’re so inclined?
Well, they can always email me and I’ll gladly meet up. I’m going to Amsterdam in March as part of the Department for International Trade’s Fintech Mission to the Netherlands, so I’d be more than happy to catch up in person there if that suits. I’m pretty easy-going, especially if coffee’s involved…

Get ADQ 1.4 today!

With Snowflake connectivity, a SQL rule wizard and the ability to bulk-assign data quality breaks, ADQ 1.4 makes end-to-end data quality management even easier.