In this AI whitepaper, authored by our Head of AI, Fiona Browne, we provide an overview of Artificial Intelligence (AI) and Machine Learning (ML) and their application to data quality.
We highlight how tools in the Datactics platform can be used for key data preparation tasks including cleansing, feature engineering and dataset labelling for input into ML models.
Through an Entity Resolution Use Case, we present a real-world application of ML as an aid to improving consistency in manual processes.
In this case study we show how using ML reduced manual intervention tasks by 45% and improved data consistency within the process.
Good quality, reliable and complete data gives businesses a strong foundation for decision making and the knowledge to strengthen their competitive position. It is estimated that poor data quality costs an institution $15 million annually on average.
As we move further into the era of real-time analytics, AI and ML, the role of quality data will continue to grow. To remain competitive, companies must have flexible data management practices in place, underpinned by quality data.
AI/ML are being used for predictive tasks ranging from fraud detection to medical analytics. These techniques can also improve data quality itself when applied to the accuracy, consistency and completeness of data, along with the data management process.
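To make the data quality dimensions above concrete, here is a minimal sketch (not Datactics platform code) of how completeness and consistency checks might look on a toy record set, using pandas; the column names and alias mapping are illustrative assumptions:

```python
import pandas as pd

# Hypothetical customer records exhibiting common data quality issues:
# a missing name, inconsistent casing, and two spellings of one country.
df = pd.DataFrame({
    "name": ["Acme Ltd", "ACME LTD", None, "Beta Corp"],
    "country": ["UK", "United Kingdom", "UK", "US"],
})

# Completeness: fraction of non-null values per column.
completeness = df.notna().mean()

# Consistency: normalise casing/whitespace and map known country aliases.
df["name_clean"] = df["name"].str.strip().str.upper()
df["country_clean"] = df["country"].replace({"United Kingdom": "UK"})

print(completeness["name"])           # 0.75 -- one of four names missing
print(df["country_clean"].nunique())  # 2 distinct countries after normalisation
```

Checks like these are typically run as profiling rules before data is fed into downstream ML models.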
In this whitepaper we provide an overview of the AI/ML process and show how Datactics tools can be applied to cleansing, deduplication, feature engineering and dataset labelling for input into ML models. We highlight a practical application of ML through an Entity Resolution Use Case which addresses inconsistencies around manual tasks in this process.
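The core idea behind entity resolution can be sketched with a simple fuzzy-matching rule: score a record against candidate entities and route low-confidence results to manual review. This is a minimal illustration using Python's standard-library `difflib`, not the Datactics implementation; the function names, candidate list and threshold are assumptions for the example:

```python
from difflib import SequenceMatcher

def similarity(a: str, b: str) -> float:
    """Normalised string similarity in [0, 1], case-insensitive."""
    return SequenceMatcher(None, a.lower(), b.lower()).ratio()

def resolve(record: str, candidates: list[str], threshold: float = 0.7):
    """Return the best-matching candidate above the threshold.

    A None result means no confident match was found; in a production
    pipeline such records would be routed to manual review -- the queue
    that ML-assisted matching aims to shrink.
    """
    best = max(candidates, key=lambda c: similarity(record, c))
    return best if similarity(record, best) >= threshold else None

candidates = ["Acme Limited", "Beta Corporation", "Gamma Holdings"]
print(resolve("ACME Ltd.", candidates))   # -> "Acme Limited"
print(resolve("Delta Bank", candidates))  # -> None (manual review)
```

In practice an ML classifier trained on labelled match/non-match pairs replaces the fixed threshold, which is how manual intervention can be reduced while keeping match decisions consistent.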