Friday Afternoon Measurement CONTEST!!!
Many, many people start their data quality programs with their first measurement. It can be scary, but I encourage everyone to do so. To help those who may need a bit of inspiration, we’re running a contest. To enter you must:
Complete a Friday Afternoon Measurement. Full instructions on how to complete one can be found here
Turn in your results by March 20, 2020 to info@dataqualitysolutions.com
We’ll hold a drawing of all valid submissions on March 24 and the winner will receive a free day of my training or consulting.
Those pesky details:
Your FAM Measurement: Follow the steps outlined here.
Your Submission: Please include the following:
Your Name and email
Company/Government Agency Name
Industry Sector (e.g., finance, oil & gas) or Government
Country in which you are based
Company/Agency Size: Small (1000 or fewer employees); Medium (1001 to 5000 employees); Large (more than 5000 employees)
The type of data you examined (e.g., employee data, customer data)
Your DQ Score
What is the most important thing you learned along the way?
May we use your country, industry, data type and DQ score results in analyses and summaries going forward?
Note: We will NOT use your name or your company name without obtaining explicit permission.
****Please send an email to info@dataqualitysolutions.com if you have any questions****
The Prize: The person/company whose name is drawn will receive one free day of consulting or training from Dr. Tom Redman. If you are located within two hours of New York City, this can be done on-site. If you are located outside this area, the consulting/training will be delivered via the web.
Part 3: Coming to Grips with the Implications:
Update to “Only 3% of Companies’ Data Meets Basic Quality Standards”
Thomas C. Redman, January 28, 2020
In data quality, measurement is an important step, but it is only the first. This last installment of the three-part update captures some next steps.
When they are making their Friday Afternoon Measurements, most people grow concerned as they see more and more red on their spreadsheets (see this link to understand why) and are none-too-happy when they tote up their scores. A few people, unfortunately, dismiss their results. But most reflect more deeply on their scores and the implications for themselves, their departments, and their companies or government agencies. It is almost as if they “wake up” to data quality. The following quotes convey the range of emotions.
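The toting-up step above is simple to sketch. In the FAM exercise, each of the (recommended) 100 recent records is checked attribute by attribute, obvious errors are marked red, and the DQ score is the number of records with no errors at all. A minimal sketch in Python, assuming each record has already been reduced to per-cell error flags (the sample data and the `dq_score` helper name are hypothetical, not part of the published method):

```python
# Minimal sketch of the Friday Afternoon Measurement scoring step.
# Each record is a list of booleans: True = the cell was flagged
# ("marked red") as an obvious error. Sample data is made up.
def dq_score(records):
    """Count of perfect records (no flagged cells) in the sample.

    With the recommended sample of 100 records, this count doubles
    as the DQ score expressed as a percentage.
    """
    return sum(1 for record in records if not any(record))

# Hypothetical spreadsheet: 3 records x 4 attributes.
sample = [
    [False, False, False, False],  # perfect record
    [False, True,  False, False],  # one flagged cell -> not perfect
    [False, False, False, False],  # perfect record
]
print(dq_score(sample))  # 2 of the 3 records are perfect
```

With a full sample of 100 records, the same count read as a percentage is the score people compare against the 97%-perfect threshold discussed in the original article.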
Physician: “The FAM exercise was interesting. The biggest takeaway for me was how willing I have become to accept flaws in data. Even data flaws that certainly contribute to worse patient care, inefficiency, dyscoordination, etc. are so commonplace that I have ceased getting worked up about all but the most egregious. The flaws, as you might expect, are ubiquitous: patient contact information being wrong, medications being incorrectly documented in some way, metrics being mis-presented or miscalculated, blood pressure measurements being incorrectly drawn or recorded, etc.”
Senior executive: “If I were being honest, I would have expected the results to have shown close to 99% perfect records–however that was not the case. In fact I was shocked by how many records were imperfect.”
Clinical Nursing Manager: “Unfortunately there was a serious data problem. Only 46% of the data are perfect records. This is not acceptable data and it does not conform to standard.”
Manager: “Confirmed our underlying fears that in spite of the checks and audits in place a high degree of inaccuracy persists in the data.”
Clinical Risk Manager: “While ≥83% of each attribute was populated, the requisite values were not always of adequate depth and breadth. The perfect score of 3% is concerning, particularly as the data forms part of a national dataset. Therefore, the FAM highlighted data quality issues, which will impact analysis undertaken at a local and national level to identify risk clusters. This has the potential to cause reputational damage.”
Of course, many of those making their first data quality measurement then pick something to improve, finding and eliminating root causes and increasing their data quality scores for good. I’m especially proud of these people. And I’ll take up their work another time.
This three-part update is the prelude to the contest we’ll announce in a couple of days.
Part 2: Two Deeper Analyses:
Update to “Only 3% of Companies’ Data Meets Basic Quality Standards”
Thomas C. Redman, January 21, 2020
This article builds on an earlier one, summarizing data quality statistics gathered using the so-called Friday Afternoon Measurement method. Table 2 presents a head-to-head-to-head comparison of Ireland, the US, and Israel, and Table 3 compares healthcare and non-healthcare results.
I present Table 2 because some people in the US have discounted results from Ireland, claiming, “Well, results may be poor in Ireland, but we’re different in the US.” While there is not a lot of US data, this claim simply does not hold up. Indeed, initial indications are that there are no practical differences in measured data quality between the US and Ireland.
While the results from Israel appear to be a bit better than those from Ireland and the US, be careful about reading too much into this: the sample size from Israel is small (17), the students were MBA candidates, and other factors may complicate direct comparison. Most importantly, even “slightly better” scores are still low, with high attendant costs to companies and government agencies.
Shifting gears a bit, I’ve not run into a single person who claimed that “data quality doesn’t matter to my company or job.” But those in health care are especially adamant. They tell me, “When our data is bad, people die.” Hence Table 3.
I don’t see any meaningful difference between DQ scores in healthcare and other industries. To my friends in that industry, I ask, “What is going on?”
I’ll conclude this three-part summary with a series of quotes as people come to grips with the implications of their data quality scores.
Part 1: Pre and Post Analysis:
Update to “Only 3% of Companies’ Data Meets Basic Quality Standards”
Thomas C. Redman, January 14, 2020
Two years ago, Tadhg Nagle, Dave Sammon, and I published “Only 3% of Companies’ Data Meets Basic Quality Standards.” That article attracted a lot of attention. And rightly so--it confirmed that most data was in far worse shape than people and companies thought. It’s time for an update--actually three updates, which I’ll provide over the next several weeks.
Before jumping in, the most important headline hasn’t changed: ONLY 3% OF COMPANIES’ DATA MEETS BASIC QUALITY STANDARDS!
Tadhg, Dave, and I continue to ask people to make data quality measurements using the Friday Afternoon Measurement (FAM) technique (fully explained here-link to video) in classes and seminars (I also ask people to use FAM in my advisory practice, though those results are NOT included here). Our database now includes 195 measurements from a diverse array of companies/government agencies and data types--customer data, operations data, financial data, and so forth. While the original seventy-five measurements were made in Ireland, the 195 now include 17 measurements made in Israel and 19 made in the United States.
This article compares pre- and post-September 2017 measurements from Ireland (75 measurements pre, 84 post). Table 1 provides the summary stats.
It is clear enough that there has been no practical change in the data quality scores of Irish companies since September 2017. The headline result holds. And the impact is enormous! While there is high company-to-company variation, expect the costs of bad data to be about 20% of revenue for a typical corporation.
Next up: two more comparisons of DQ statistics.