Content Copyright © 2019 Bloor. All Rights Reserved.
It is a truism that governments continuously make more laws and regulations, and that industries become ever more heavily regulated. This is obvious from the introduction of GDPR and the forthcoming CCPA. However, the trend is not limited to concerns over personal data.
Back when Basel II and MiFID II were first promulgated, I posited that where regulators had previously focused on organisations having auditable processes in place (Sarbanes-Oxley), they would in future put more emphasis on data assurance. While I am not aware of specific regulations – outside of those in GDPR – that explicitly require data to be accurate and properly provisioned, anecdotal evidence suggests that regulators and auditors are increasingly requiring a more dutiful approach to data assurance.
Partly for this reason we will shortly be publishing a hyper-report on Data Assurance. This consists of a synthesis of all the research we have done over the last couple of years in the areas pertinent to this subject. It covers data discovery, data profiling, data quality, data catalogues, data preparation, master data management and single customer view, spreadsheet governance, data governance, data lineage, knowledge graphs, and bias detection and remediation for training data.
One of the conclusions of this hyper-report is that “there are NO complete and comprehensive data quality or data governance tools.” The reason for this is that the data management vendors, which make up the bulk of the companies discussed in our report, do not have capabilities with respect to end user computing (EUC) assets. On this point, I want to return to the issues surrounding regulators and data assurance. The trend now is for auditors to want companies to prove not just the lineage of their financial reporting but to provide evidence that reported figures are actually derived from the production version of the data, and moreover that that production version is up to date. In addition, whereas EUCs used to consist primarily of spreadsheets and Access databases, regulators are widening this remit to include such things as Tableau dashboards, PowerBI reports and chatbots. If these contain data that is used to inform financial or other regulated information, then assuring the provenance of that data is increasingly regarded as mandatory.
From the point of view of those companies specialising in managing EUC-based data, this means a change in emphasis. Historically, they have focused on identifying critical EUC assets and then implementing version control. This is a simplistic description, but they have essentially been about data management. They are now having to evolve towards being about data assurance.
I am going to be discussing this in more detail in a couple of webinars on 12 November (one timed for a European audience and one for the States). I will be joined by Gavin Spencer of Apparity, an EUC management/assurance vendor that has already taken this particular bull by the horns.
You can now register for either of these webinars:
It’s a Trust Issue! In the UK and Europe, Companies are Facing a New Data Assurance Gap – This Time Across End User Computing. 12 November, 2019, 11:00 AM GMT [REGISTER HERE]
It’s a Trust Issue! In the US and Canada, Companies are Facing a New Data Assurance Gap – This Time Across End User Computing. 12 November, 2019, 1:00 PM EST [REGISTER HERE]