Data at the Heart of Smart Cities: Brussels 2022

On the 23rd June, Matt Flenley visited the UK Ambassador’s Residence in Brussels as part of a delegation of UK tech firms to Belgium. Here, he recounts his pitch on Smart Cities from the event:


Hello and thank you for inviting us to take part in this session today. I am here representing Datactics, a Belfast-based software and services company specialising in AI-enabled data quality solutions.

Our platform measures and improves data quality at major global financial firms, including ING in the Netherlands and Neuberger Berman in New York. It is also used to automatically classify and standardise crime reporting across the UK's police forces as part of a major nationwide data quality improvement programme.

As a software firm, we know our coffee! And I know that Belgium does too. For instance, I read in my preparation that the Port of Antwerp is the world's largest coffee handler, typically stocking enough coffee beans at any one time to make at least six cups for every human being on the planet. Or to keep Dave and our DevOps team going for about six months.

You might be like me, the kind of person who needs those six cups to really get the day going! In the UK, the coffee brand Costa ran a campaign recently to show the amazing things people had planned for later in their day… such as this guy, who was getting ready to show off his truly amazing new trousers… but first, they’d need a coffee. And that coffee would be Costa.

For us, it’s the same thing with smart cities and data. There are so many exciting initiatives in the Smart Cities ecosystem which are the equivalent of those fantastic gold trousers, but unless they’re built on good data, it’ll be like trying to start your day without a coffee. Definitely do your exciting initiatives, but first, Datactics.

This mission has been named “Data at the Heart of Smart Cities”, and this is why I’m delighted to have been invited to take part today.

Delivering the big dreams of smart cities requires big data. And performing analytics on that data requires a huge amount of standardisation and cleaning, a task often left to data scientists.

That’s why at Datactics we like to say – don’t build your smart city on dumb data!

Take for example the recent announcement of the Climate Neutrality project, an exciting venture for the four cities in Belgium selected by the European Commission. The ability to measure the reduction of CO2, mitigating factors and innovations applied to achieve climate neutrality will ultimately come down to a lot of data – across buildings, energy, transport, waste management and investment plans.

The Smart City organisers will have to be able to measure that the data is valid, complete, timely, unique (not duplicated), consistent and accurate. These are standard definitions of data quality that the Datactics platform is pre-configured to measure and report.
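To make those dimensions concrete, here is a minimal, generic sketch of how a batch of records might be scored against a few of them. This is an illustration only, not the Datactics platform; the field names and thresholds are invented for the example.

```python
from datetime import date

# Hypothetical CO2 readings from a smart-city data feed (illustrative only).
records = [
    {"id": "A1", "co2_ppm": 412, "reported": date(2022, 6, 20)},
    {"id": "A2", "co2_ppm": None, "reported": date(2022, 6, 21)},  # incomplete
    {"id": "A1", "co2_ppm": 412, "reported": date(2022, 6, 20)},  # duplicate
    {"id": "A3", "co2_ppm": -5, "reported": date(2022, 6, 22)},   # invalid
]

def quality_report(rows, as_of=date(2022, 6, 23), max_age_days=7):
    """Score a batch against four standard data quality dimensions."""
    total = len(rows)
    complete = sum(1 for r in rows if r["co2_ppm"] is not None)
    valid = sum(1 for r in rows if r["co2_ppm"] is not None and r["co2_ppm"] >= 0)
    unique = len({(r["id"], r["reported"]) for r in rows})
    timely = sum(1 for r in rows if (as_of - r["reported"]).days <= max_age_days)
    return {
        "completeness": complete / total,   # how many values are populated
        "validity": valid / total,          # how many pass a sanity rule
        "uniqueness": unique / total,       # how many records are distinct
        "timeliness": timely / total,       # how many are recent enough
    }

print(quality_report(records))
```

Each score is a simple ratio per batch; a production platform would track these per rule, per source and over time, but the principle is the same.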

Our software connects, typically via API or JSON file, to a wide array of legacy data silos and systems as well as more modern platforms. We profile, cleanse, standardise and de-duplicate data, and measure it against those industry dimensions I mentioned before.
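The standardise-then-de-duplicate step can be sketched in a few lines. Again, this is a generic illustration rather than Datactics' actual engine; the record fields are made up for the example.

```python
# Raw records that refer to the same entity in inconsistent ways.
raw = [
    {"name": "  Port of Antwerp ", "city": "ANTWERP"},
    {"name": "Port of Antwerp", "city": "Antwerp"},
    {"name": "Stad Leuven", "city": "Leuven"},
]

def standardise(record):
    # Collapse whitespace and normalise case so equivalent records compare equal.
    return {k: " ".join(v.split()).title() for k, v in record.items()}

def deduplicate(records):
    seen, out = set(), []
    for rec in map(standardise, records):
        key = tuple(sorted(rec.items()))  # order-independent record fingerprint
        if key not in seen:
            seen.add(key)
            out.append(rec)
    return out

print(deduplicate(raw))  # the two Antwerp variants collapse into one record
```

Real-world matching is fuzzier than exact-key comparison (misspellings, abbreviations, transliterations), which is where rule-based and AI-assisted matching earn their keep.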

It’s all performed in a no-code environment aimed at data scientists rather than programmers, and it integrates with dashboarding tools such as Microsoft’s Power BI to help business users understand where data is failing and how to fix it, connecting into our remediation clinic to repair the broken data elements.

And by embedding AI in our platform, we can suggest rules that might apply to data elements based on the content of the data itself.

The platform can handle data in any context, as long as it’s structured or semi-structured.

Typically, we’re installed within a client’s private cloud, rather than being offered as software-as-a-service. This is usually due to the sensitive nature of the data being collected and used, and it aids compliance with data protection regulations.

In this instance, we would work with the municipal authorities to sit across the data stores and sources and provide a consistent, standardised view of all data, enabling analytics and data preparation for AI and ML across a wide range of smart city use cases.

We’re already working in the UK public sector on crime reporting and in supply chain procurement for the National Health Service.

We’d welcome the opportunity to explore your smart city initiatives, and see how we can become your indispensable partner for your data quality and smart cities ambitions.

But first, Datactics.

Thank you.

And for more from Datactics, find us on LinkedIn or Twitter.

Get ADQ 1.4 today!

With Snowflake connectivity, SQL rule wizard and the ability to bulk assign data quality breaks, ADQ 1.4 makes end-to-end data quality management even easier.