4 Steps to Take Now to Determine the Quality of Your Data

With executive confidence in data dwindling and poor data carrying high costs, it’s crucial to ask: How bad is your data quality? More importantly, how can it be quantified? US businesses lose an estimated $611 billion annually to data quality problems, and fewer than 33% of companies trust their own data’s quality.

Understanding your data’s quality is essential. Check out our previous post on the cost of bad data for more insights. This post will guide you through a quick method to measure your data quality using a simple yet effective approach.

We recommend the Friday Afternoon Measurement (FAM) method to assess data quality (DQ). This method produces a clear, actionable score for your data quality. According to the Harvard Business Review, 47% of newly created data records contain at least one critical error, and only 3% of data quality scores were rated as “acceptable,” even by the loosest standards. These poor scores span all business sectors, both private and public.

How to Use the FAM Method

Here’s how you can apply the FAM method in four straightforward steps to get a DQ score.[1]

Step 1: Gather the last 100 data records your team used, such as setting up a customer account or delivering a product.

Step 2: Invite two or three colleagues who understand the data for a two-hour meeting.

Step 3: Review each record with your colleagues, marking obvious errors. This process should be quick, usually taking no more than 30 seconds per record. In some cases, you may need to discuss whether an item is incorrect, but typically, errors like misspelled customer names or misplaced information will be immediately apparent.

Step 4: Summarize the results in a spreadsheet. Add a “record perfect” column, marking “yes” if there are no errors and “no” if there are any.
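The tally in Step 4 is simple enough to sketch in a few lines of Python. The record IDs and error notes below are made-up illustrations, not part of the FAM method itself:

```python
# Hypothetical Step 4 summary: each reviewed record gets a list of the
# obvious errors found during the two-hour review (empty list = perfect).
reviewed = [
    {"record": "CUST-001", "errors": []},
    {"record": "CUST-002", "errors": ["misspelled customer name"]},
    {"record": "CUST-003", "errors": []},
    {"record": "CUST-004", "errors": []},
]

# "Record perfect" column: yes if no errors were marked.
perfect = sum(1 for r in reviewed if not r["errors"])
dq_score = perfect / len(reviewed) * 100  # percent of error-free records

print(f"{perfect} of {len(reviewed)} records perfect (DQ score: {dq_score:.0f}%)")
```

With the full sample of 100 records, `dq_score` is simply the count of error-free records, which makes the extrapolation in the next step immediate.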

To interpret the data, extrapolate from the errors. For example, if only 40 out of 100 records are error-free, you have a 40% DQ score and a 60% error rate. The cost of this error rate can be estimated using the rule of ten, which states that it costs ten times as much to complete a unit of work with defective data as with perfect data.

For instance, if your team must complete 100 units per day at a cost of $1.00 per unit with perfect data, the daily cost is $100. However, with only 40% perfect data, the total cost would be:

$$\text{Total cost} = (40 \times \$1.00) + (60 \times \$1.00 \times 10) = \$40 + \$600 = \$640$$

As shown, the cost increases more than sixfold when the DQ score is not 100%. Cutting the error rate in half in this scenario (from 60% to 30%) would reduce daily costs by roughly 42%. Imagine the savings your organization could achieve by improving data quality.
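The rule-of-10 arithmetic above generalizes into a small cost model. This is a sketch using the article’s numbers; the function name and `penalty` parameter are illustrative, not part of the FAM method:

```python
# Rule-of-10 cost model: defective records cost `penalty` times as much
# to work with as perfect ones.
def daily_cost(total_units, dq_score, unit_cost=1.00, penalty=10):
    """Daily cost of processing `total_units` records at a given DQ score."""
    perfect = round(total_units * dq_score)
    defective = total_units - perfect
    return perfect * unit_cost + defective * unit_cost * penalty

baseline = daily_cost(100, 0.40)   # the worked example: $40 + $600 = $640
improved = daily_cost(100, 0.70)   # error rate halved: 60% -> 30%
savings = (baseline - improved) / baseline

print(f"${baseline:.0f} -> ${improved:.0f} ({savings:.0%} daily cost reduction)")
```

Running this reproduces the figures in the text: $640 per day at a 40% DQ score, falling to $370 (about a 42% reduction) when the error rate is halved.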

By following these steps, you can gain a clearer understanding of your data quality and take actionable steps to improve it, saving time and resources for your organization.

[1] Thomas Redman, “Assess Whether You Have a Data Quality Problem,” Harvard Business Review.
