<img height="1" width="1" style="display:none" src="https://www.facebook.com/tr?id=155514168445921&amp;ev=PageView&amp;noscript=1">

There's a Gap Between Data Quality Confidence and Data Quality Reality


Stumbling across the Banking Technology Vision 2018 report, we noticed a common data quality trend across the banking industry. In many industries, data stakeholders are routinely confident in the quality of their data, accepting its validity and accuracy at face value. A deeper dive, however, reveals that this confidence is often misguided. Here's a quick look at why feeling good about your data doesn't mean your data is reliable, accurate, or valid.

Misplaced Data Quality Confidence

The findings of the survey are a bit startling for those of us working in the data quality industry, although they're far from unexpected. For instance, 94% of the bankers who responded said they're confident in the integrity of the sources of their data. This confidence persists despite the fact that, as the survey notes, banks receive consumer data from notoriously unstructured outside sources. Further insights reveal that this confidence in data integrity is optimistic at best: 11% of the bankers surveyed view their data as reliable but do nothing to validate that reliability. As data professionals, would you blindly accept the accuracy and reliability of datasets you've never validated? We certainly wouldn't recommend doing so. Additionally, 16% try to validate their data but aren't convinced of its overall quality, and 24% say they fully validate their data yet still lack an accurate picture of whether it is valid, timely, and accurate. In short, even those who take the time to validate their data fail to incorporate crucial components of data quality into their process.

Poor Data Quality: A Compounding Problem

As we've demonstrated in the past with the 1-10-100 Rule of Data Quality, poor data quality isn't an isolated issue: it costs roughly $1 to verify a record at the point of entry, $10 to cleanse it later, and $100 to deal with the consequences of never fixing it at all. When leaders make business decisions based on flawed, incomplete, or invalid data, the consequences are often detrimental to the business as well as its consumers. So, while a full 84% of respondents stated they're increasingly using data to drive decision making at their institutions, the fact that few organizations actually validate their data to begin with should be extremely disconcerting.

What can we take away from this survey? I'd argue that it's imperative to validate your data rather than simply hoping it's correct. Additionally, taking steps to improve the quality of data as it enters your databases and applications can go a long way toward catching problems before they make their way downstream into your analytics.
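To make that point-of-entry idea concrete, here's a minimal sketch in Python of what such validation might look like. The field names, rules, and thresholds are illustrative assumptions rather than anything prescribed by the survey or by Naveego DQS; the takeaway is simply that completeness, validity, and timeliness checks can run before a record ever reaches your warehouse or analytics.

```python
# A minimal sketch of point-of-entry validation. Field names ("account_id",
# "balance", "updated_at") and thresholds are hypothetical, for illustration only.
from datetime import datetime, timedelta, timezone

REQUIRED_FIELDS = ("account_id", "balance", "updated_at")

def validate_record(record: dict) -> list:
    """Return a list of data quality problems found in one incoming record."""
    problems = []

    # Completeness: every required field must be present and non-empty.
    for field in REQUIRED_FIELDS:
        if not record.get(field):
            problems.append("missing required field: " + field)

    # Validity: the balance must parse as a number.
    try:
        float(record.get("balance", ""))
    except (TypeError, ValueError):
        problems.append("balance is not numeric")

    # Timeliness: flag records that have not been refreshed in 30 days.
    try:
        updated = datetime.fromisoformat(record.get("updated_at", ""))
        if updated.tzinfo is None:
            updated = updated.replace(tzinfo=timezone.utc)
        if updated < datetime.now(timezone.utc) - timedelta(days=30):
            problems.append("record is stale (older than 30 days)")
    except (TypeError, ValueError):
        problems.append("updated_at is not a valid ISO timestamp")

    return problems

# Usage: load only records that pass every check; quarantine the rest for review.
incoming = [
    {"account_id": "A-1001", "balance": "2500.00",
     "updated_at": datetime.now(timezone.utc).isoformat()},
    {"account_id": "", "balance": "n/a", "updated_at": "2017-01-15"},
]
clean = [r for r in incoming if not validate_record(r)]
quarantined = [r for r in incoming if validate_record(r)]
print(len(clean), "clean,", len(quarantined), "quarantined")
```

Even a simple gate like this moves validation from "hoping the data is correct" to knowing which records passed which checks, and it gives you a quarantine pile to investigate instead of silently flawed analytics.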

Be sure to check out more on this study here. You can also follow us on Facebook, LinkedIn, and Twitter for more industry insights and company news.

 

Start trusting all your data with Naveego DQS

Derek Smith

Derek is Co-founder and CTO of Naveego, Inc. His belief that leading-edge analytics should be within reach for everyone led to the development of the Naveego business intelligence platform. Naveego brings all of a business' data together, visually and in real time, with an end-to-end solution that includes a built-in data warehouse, dashboards, analytics and reports.

