As featured in the recent A-Team webinar, we’ve been strong advocates of a self-service approach to data quality (SSDQ), especially when it comes to regulated data types and wide-ranging demands on a firm’s data assets.
This whitepaper on SSDQ, authored by our CTO Alex Brown, goes deeper into why this approach is in such demand and explores the functionality a fully self-service environment needs in order to give business users rapid access to high-quality data.
The Changing Landscape of Data Quality

Demand for higher data quality has grown steadily in recent years. Highly regulated sectors, such as banking, have faced a tsunami of financial regulations – BCBS 239, MiFID, FATCA and many more – that stipulate or imply exacting standards for data and data processes.
Meanwhile, a growing number of firms are becoming Data and Analytics (D&A) driven, taking inspiration from the likes of Google and Facebook to monetize their data assets. This increased focus on D&A has been accelerated by easier, lower-cost access to artificial intelligence (AI), machine learning (ML) and business intelligence (BI) visualization technologies.

However, as the hype around these technologies wanes, there comes the pragmatic realization that without a foundation of good-quality, reliable data, insights derived from AI and analytics may not be actionable. With AI and ML becoming commodities – a level playing field – the differentiator is the data itself and the quality of that data.

To read more, see the whitepaper above.