How much do you trust your data?

This article was originally published 19 March 2019 and edited 23 August 2021.

If you were to quantify your level of confidence in your data quality, what would it be? 80 per cent? 40 per cent? Even less? Or do you ‘blindly trust’ that it’s ok?

Were the Board of Directors to ask you for a number, could you give them one? What would you tell them? Do you expect them to trust a response that hasn’t been quantified, and therefore can’t be benchmarked or monitored?

'Blind faith' and 'gut feel' shouldn’t be concepts we rely on when describing the quality of our data in this day and age. It’s too important. Our KPIs and dashboards are built from data, and our directors and managers make significant decisions based on it.

A great deal can go wrong when this data is wrong. Imagine, for example, patient data that is inaccurate, incomplete, missing, corrupt or unrecognisable to the operations and processes that consume it. That could affect patient safety, the accuracy of reimbursement for service provision, and many other aspects of healthcare delivery.

Alternatively, rather than a cataclysmic event resulting from poor data quality, a gradual erosion of trust in the data can mean businesses don’t use it in any meaningful way.

What is trust?

Trust is a feeling of confidence and security; the hope and expectation that something is true. Essentially, it’s an emotional state.

Sometimes we trust without any grasp of probability or any precise prediction. It simply involves a positive feeling about something; to trust, we need to feel good about it.

Ideally, trust should also involve a logical element, based on data or evidence, where the validity and accuracy have been assessed and rated. In other words, proof.

Gathering your proof

This step is essential, so that when the Board of Directors inevitably ask:

  • “What evidence do you have that our data quality is high?” 
  • “How high is high enough?”, and
  • “How does that compare to industry benchmarks?” 

you will have the answer, and the proof to back it up.

Not having this proof available on demand is no longer acceptable in this modern day, when:

  • it’s straightforward to check 
  • the risk impact is high, and 
  • it’s expected by the Board.

We all know how long it takes to build trust, and how quickly it can be broken. Evidence is the foundation to build on. 

So, where do we start? The relative value of our data and the criticality of the systems that utilise it should be assessed, so we can prioritise our efforts on the data that needs it first. With our precious resources, we won’t be able to (nor should we) focus the same amount of attention on all of our data; some of it won’t be worth the effort. 

We also need to assess the maturity of our existing data management processes against an appropriate maturity model, along with the quality of our data against our metrics, so that we know what our current state is.

Once this has been done, we can then determine how high to set the quality benchmark and begin the process of improvement.
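To make the idea of a benchmark concrete, here is a minimal sketch of how per-dimension quality scores could be rolled up into a single figure that the Board can monitor over time. The dimensions, weights, scores and benchmark below are all hypothetical, illustrative values, not an industry standard.

```python
# Hypothetical per-dimension scores: the fraction of records that
# passed each quality check in the latest review.
DIMENSION_SCORES = {
    "accuracy": 0.92,
    "completeness": 0.88,
    "timeliness": 0.75,
}

# Hypothetical weights reflecting the relative importance of each
# dimension to the business; they sum to 1.0.
WEIGHTS = {
    "accuracy": 0.5,
    "completeness": 0.3,
    "timeliness": 0.2,
}

# Illustrative target agreed with the Board.
BENCHMARK = 0.85

# Weighted average across the dimensions gives one benchmarkable number.
overall = sum(DIMENSION_SCORES[d] * WEIGHTS[d] for d in WEIGHTS)

print(f"Overall quality: {overall:.1%} (benchmark {BENCHMARK:.0%})")
print("Meets benchmark" if overall >= BENCHMARK else "Below benchmark")
# prints: Overall quality: 87.4% (benchmark 85%)
#         Meets benchmark
```

A single weighted figure like this is easy to track and report period-on-period, which is exactly what makes it possible to answer "how high is high enough?" with evidence rather than gut feel.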

Data quality can be improved as a once-off, but the real benefits come when regular processes are put in place, particularly for critical data. 

Some of those processes that should be in place are:

  • Regular reviews embedded into reporting systems, so issues can be identified and potentially corrected as near to real-time as possible
  • An independent annual review of the key metrics used in data quality reporting, the assumptions made in determining them and the credibility of trends in the data from the previous review
  • Targeted periodic deep-dives into specific datasets that may need it (e.g. from a new or significantly changed system). 

Maintaining high data quality is important to organisations because it underpins critical business systems: business intelligence, data warehousing, customer relationship management, inventory management and supply chain management. It is one of the most important responsibilities of Chief Data Officers and an essential part of good governance and data management practices.

There are a number of frameworks and approaches to maintaining data quality, but checks on accuracy, completeness, validity, consistency, timeliness and availability of the data will usually be included.  
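As a sketch of what two of those checks might look like in practice, the snippet below measures completeness and validity across a tiny, made-up patient dataset. The field names (`patient_id`, `dob`, `postcode`) and the rules applied are illustrative assumptions only; a real framework would define these against the organisation's own data standards.

```python
from datetime import date

# Hypothetical rule: these fields must be present and non-empty.
REQUIRED_FIELDS = ("patient_id", "dob", "postcode")

# Made-up records for illustration.
records = [
    {"patient_id": "P001", "dob": date(1980, 5, 1), "postcode": "3000"},
    {"patient_id": "P002", "dob": None,             "postcode": "3131"},
    {"patient_id": "P003", "dob": date(2030, 1, 1), "postcode": "ABCD"},
]

def is_complete(rec):
    """Completeness: all required fields are present and non-empty."""
    return all(rec.get(f) not in (None, "") for f in REQUIRED_FIELDS)

def is_valid(rec):
    """Validity: values conform to simple illustrative rules
    (4-digit postcode, date of birth not in the future)."""
    postcode = str(rec.get("postcode", ""))
    ok_postcode = postcode.isdigit() and len(postcode) == 4
    ok_dob = rec.get("dob") is not None and rec["dob"] <= date.today()
    return ok_postcode and ok_dob

completeness = sum(is_complete(r) for r in records) / len(records)
validity = sum(is_valid(r) for r in records) / len(records)
print(f"completeness: {completeness:.0%}, validity: {validity:.0%}")
# prints: completeness: 67%, validity: 33%
```

Running simple checks like these on a schedule, and trending the results between reviews, is what turns data quality from a gut feel into a measurable, benchmarkable metric.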

Directors are being encouraged to ask tough questions to get to the truth. They need to feel comfortable with the adequacy of information that is provided to them, so they can assess the relevant risks, including those associated with data.

To further discuss data management, data governance, data quality or data valuation, please get in touch with our team.