Learning the 1-10-100 Rule of Data Quality

The Data Quality Metric That Matters

It’s easy to get lost in the myriad of statistics surrounding the importance of data quality. But as insightful as those statistics may be, we start most of our conversations with clients interested in a data quality solution with a much simpler metric. This metric, which we call the 1-10-100 Rule, depicts the high and compounding costs of running your business on bad data.

Deconstructing the 1-10-100 Rule of Data Quality

The basis of the rule is very simple: it takes $1 to verify a record as it is entered, $10 to cleanse it after the fact, and $100 if nothing is done, because the ramifications of the mistake are felt over and over again for as long as the data goes unaddressed. It’s this last part of the rule, the financial impact of leaving quality issues in place, that should raise the eyebrows of any business leader or department head. It highlights what many fail to realize about data quality: poor data quality isn’t a one-off problem. It’s an issue that compounds lost energy, time, and money every single day it goes unaddressed.
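To make the arithmetic concrete, here is a minimal sketch in Python. The $1, $10, and $100 per-record costs come straight from the rule; everything else (the record counts, the `cost_of_bad_records` helper, and the choice to charge the $100 impact once per period a record goes unaddressed) is an illustrative assumption, not something from Naveego’s methodology.

```python
# A minimal sketch of the 1-10-100 Rule. The per-record costs ($1/$10/$100)
# come from the rule itself; the record counts and the idea of charging the
# $100 impact once per period unaddressed are illustrative assumptions.

COST_VERIFY_AT_ENTRY = 1    # $1: verify the record as it is entered
COST_CLEANSE_LATER = 10     # $10: cleanse the record after the fact
COST_UNADDRESSED = 100      # $100: impact felt for each period it is ignored

def cost_of_bad_records(num_records: int, strategy: str, periods: int = 1) -> int:
    """Estimated cost of handling `num_records` flawed records.

    `strategy` is one of "verify", "cleanse", or "ignore"; for "ignore",
    the $100 impact recurs for every period the records go unaddressed.
    """
    if strategy == "verify":
        return num_records * COST_VERIFY_AT_ENTRY
    if strategy == "cleanse":
        return num_records * COST_CLEANSE_LATER
    if strategy == "ignore":
        return num_records * COST_UNADDRESSED * periods
    raise ValueError(f"unknown strategy: {strategy}")

# 1,000 flawed records: $1,000 to verify at entry, $10,000 to cleanse later,
# and $100,000 per period, compounding, if nothing is done.
print(cost_of_bad_records(1_000, "verify"))              # 1000
print(cost_of_bad_records(1_000, "cleanse"))             # 10000
print(cost_of_bad_records(1_000, "ignore", periods=4))   # 400000
```

Even in this toy model, the gap between catching errors at entry and letting them linger is two orders of magnitude, and it widens with every period the data goes unaddressed.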

 


The Financial Impact of Bad Data

IBM estimates that in 2016, U.S. businesses lost roughly $3.1 trillion due to unaddressed data quality issues. That’s not a huge surprise, considering that some industry experts say 32% of any given business data set doesn’t meet basic data quality criteria. The Harvard Business Review paints an even starker picture, reporting that only 3% of companies’ data meets basic quality standards.

Based on this information, two simple but important conclusions follow. First, poor data quality affects almost every business in some way. Second, as the 1-10-100 Rule shows us, unaddressed data quality issues only get more expensive over time.

 

