
What is Data Quality?

Companies live and die by the data they compile. Information on everything from the buying habits of preferred customers to the technology trends used to forecast the success of future product releases is crucial to a business's longevity and profitability. If the compiled data is inaccurate, the results can be catastrophic.

Data Quality Defined

The term data quality refers to the reliability of information with regard to the specific purposes for which it is compiled. Poor data quality costs corporations billions of dollars annually. Not only must the data be accurate and complete, but the way an organization gathers, enters, stores, and manages that data also plays a critical role in determining its quality. Information is a powerful asset, but a somewhat intangible one.

Unlike more concrete assets, data is difficult for many business executives to price, which is a primary reason so many companies today are slow to implement data quality management programs. If an organization cannot easily assign a dollar value to the constantly updating information flowing through its databases, how does an executive decide how much to spend on an effective management solution? After all, everyone wants a significant return on their investment.

What is Data Quality Management?

Thanks to advances in cloud-based services, mobile software, and other Big Data technologies, businesses now have more information entering their organizations than ever before. This volume of data is impossible to analyze without an automated, data-driven decision-making process. Rather than focusing on how the company compiles, stores, or manages the data, business intelligence systems focus on how the organization shares that information so that key players or individual departments can make accurate decisions based on their own criteria.

In other words, data means different things to different people. The marketing department might look at last year's sales figures to determine which advertising campaigns were most effective, while Human Resources might use the very same data to decide whether hiring new employees is justified. Poor data quality can set off a domino effect of bad decisions throughout every department of the organization.

Roles, Responsibilities, and Procedures

Every piece of data that enters the system must be accurate, reliable, and instantly accessible by a wide range of decision-makers within the company. Data Quality Management refers to the establishment and deployment of different roles, each with their own responsibilities and procedures regarding the acquisition, maintenance, and dissemination of the related data. Within a business intelligence environment, there are several roles involved in an effective and efficient data quality management operation.


Program Leader

Sometimes referred to as the Project Manager, this person oversees the individual projects and the overall business intelligence program, managing day-to-day operations within scheduling, budgetary, and scope constraints. His or her primary role is to work with other company decision-makers to establish the requirements for data quality. What makes one piece of data “quality” and another “subpar”? The answer varies by department.


Change Agent

Sometimes referred to as the Organization Change Manager, this position oversees revisions to the overall business intelligence protocols as they relate to different issues and their potential impacts. No organization operates in a vacuum. There will always be challenges in the collection, entry, storage, or distribution of high-quality data. The Change Agent helps the organization as a whole to understand the importance of dealing with these inaccuracies quickly.


Data Analyst

Sometimes referred to as the Business Analyst, this role is largely responsible for establishing the guidelines that define the minimum quality levels for incoming data. He or she then ensures that the business intelligence systems reflect these data quality requirements.
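In practice, these minimum quality levels are often expressed as simple rule checks over incoming records. The sketch below is purely illustrative (the field names, rules, and thresholds are hypothetical, not taken from any particular product) and shows one way an analyst's guidelines might be encoded:

```python
# Hypothetical data-quality rules: each rule names a field and a predicate
# that a record must satisfy to count as "quality" data.
RULES = {
    "customer_id": lambda v: v is not None and str(v).strip() != "",   # completeness
    "email":       lambda v: v is not None and "@" in str(v),          # basic validity
    "order_total": lambda v: isinstance(v, (int, float)) and v >= 0,   # range check
}

def quality_score(records):
    """Return the fraction of records that pass every rule."""
    passed = sum(
        1 for r in records
        if all(rule(r.get(field)) for field, rule in RULES.items())
    )
    return passed / len(records) if records else 0.0

records = [
    {"customer_id": "C001", "email": "a@example.com", "order_total": 42.50},
    {"customer_id": "",     "email": "b@example.com", "order_total": 10.00},  # fails completeness
    {"customer_id": "C003", "email": "not-an-email",  "order_total": -5},     # fails validity and range
]

print(quality_score(records))  # 1 of 3 records passes every rule
```

A real program would compare scores like this against the thresholds the Data Analyst defines, flagging any feed that falls below the agreed minimum.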


Data Steward

This is the “go-to” person within the organization for all queries or issues related to the compiled data. For example, if a department head has a particular problem to solve but does not know which data will lead to an accurate solution, the Data Steward is the person most likely to have the answer. If an employee wants to validate the accuracy, reliability, or completeness of data within a specific business context, the Data Steward is the person to ask.

Data Quality Management Challenges

Establishing a data quality management program is not an easy task. Even defining the threshold guidelines for classifying which data is “quality” and which is not can result in a series of time-consuming conversations that involve all levels of organizational management. Some of the primary reasons that many companies fail to implement a data quality management program include:

  • “Too many cooks in the kitchen.” Everyone has a different definition of data quality.
  • Even though everyone has an opinion, few people step up to take control of the implementation process for the data quality management program.
  • Executing such a process requires company leaders to officially recognize existing problems.
  • Return on investment can be difficult to quantify.

Another big problem is finding a qualified person to oversee the entire program once it is in place. Most employee job descriptions do not include responsibility for data quality. Once data has been entered into the databases, most employees wash their hands of it almost immediately, and if anything goes wrong, they simply blame the IT Department.

IT should not be creating the guidelines for defining data quality. They should not be involved in the individual decision-making processes that result from the data either. The role of the IT Department is to ensure that the provided data quality protocols are implemented and enforced.

Unfortunately, many of today's top-level business executives fail to make this distinction. Even though the World Wide Web is 25 years old, technology is advancing so rapidly that too many executives fear implementing a system that may soon be obsolete. However, the corporate world is finally discovering that data quality management systems can scale and expand along with the ever-changing needs and demands of the business. Data quality management is no longer a concept of the future; it is an essential component of the success and profitability of any organization in today's highly competitive marketplace.


Try Naveego Data Quality Services (DQS) and see how it can help you automate your data quality process.



About Derek Smith

Derek is Co-founder and CTO of Naveego, Inc. His belief that leading-edge analytics should be in reach for everyone led to the development of the Naveego business intelligence platform. Naveego brings all of a business’ data together, visually and in real time, with an end-to-end solution that includes a built-in data warehouse, dashboards, analytics and reports.
