Healthcare providers knew that COVID-19 would threaten the lives of their patients, but few understood the greater ripple effects across their business and industry as a whole. For providers, two significant COVID-19-induced challenges arose: analytic strain and resource limitations. These challenges highlighted the critical importance of data quality.
Healthcare leaders can improve data quality throughout their organizations by understanding the data quality lessons learned from COVID-19. Five guidelines from these lessons will help organizations prepare for the next pandemic or significant analytic use case:
1. Assess data quality throughout the pipeline.
2. Do not leave analysts to firefight.
3. Look outside the four walls of the organization.
4. Data context and purpose matter.
5. Use a singular vision to scale data quality.
COVID-19’s onset was unexpected, and its rapid spread across the globe was unprecedented. Healthcare providers knew the disease presented a threat to their patients’ health, but few fully grasped the extent to which it would affect other facets of their business.
Early in the pandemic, providers faced COVID-19-induced challenges that largely fell into two categories: analytic strain and resource limitations. Analytic strain came from new COVID-19-specific value sets, urgent lab code updates driven by outbreaks, and ever-evolving guidelines published by the Centers for Disease Control and Prevention (CDC) and other public health entities. Compounding these issues were resource limitations, which arose as providers furloughed staff due to canceled or postponed surgeries and office visits. Non-furloughed staff often had to work from new remote environments, straining communications and reporting structures.
Pandemic-driven urgency, variety of data, and a lack of resources have highlighted the critical importance of data quality as a prerequisite for any analytic use case. The COVID-19 pandemic will not be the last of its kind. Organizations must prepare for the next large-scale emergency by committing to a systemwide data quality strategy that produces accurate data at all organizational levels.
Data quality describes the state of qualitative or quantitative pieces of information; quality is high when the data helps users make quick and accurate decisions. Healthcare providers must cultivate or adopt a systemwide approach to achieve and maintain this level of quality. For example, health data users can base their quality approaches on the Toyota Total Quality Management approach, a widely accepted framework that integrates customer-centric quality into every facet of the business. In a healthcare adaptation, all aspects of a health system work together to ensure the free flow of data across the organization, and all are accountable for the quality of that data.
The fast and furious nature of COVID-19 has highlighted areas for improvement in healthcare data quality that organizations must address in preparation for future analytic use cases. To get started, healthcare providers can follow the guidance of five data quality lessons learned from COVID-19.
1. Assess data quality throughout the pipeline.

End users discover most data quality issues too late: at the end of the pipeline. At that point in the process, an analyst or subject matter expert (SME) must engage in a time-consuming root cause analysis to determine where things went awry, further delaying the delivery of accurate and actionable results.
Report writers, analysts, and SMEs can move quality up the pipeline by assessing their data and inserting quality checks on top of the model wherever they add data. For example, imagine a report is created that lists COVID-19 patients and their primary care providers (PCPs). When an analyst looks at that report, she sees that each patient is listed with every PCP they have ever encountered, as opposed to just their current PCP.
The analyst knows patient and PCP should be a 1:1 relationship, not a one-to-many relationship, so she kicks off a root cause analysis to determine where the error entered the model. She may find that when the patient and PCP tables joined, the analyst hadn’t included a time component. She is then able to build a data quality check on top of the model to ensure that if that 1:1 relationship is broken, it sets off an alert.
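The join bug and the resulting quality check described above can be sketched with `sqlite3` from Python's standard library. The table and column names (`patient_pcp`, `effective_to`, and so on) are illustrative assumptions, not a real provider schema:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.executescript("""
CREATE TABLE patient (patient_id INTEGER, name TEXT);
CREATE TABLE patient_pcp (
    patient_id     INTEGER,
    pcp_name       TEXT,
    effective_from TEXT,
    effective_to   TEXT   -- NULL means currently active
);
INSERT INTO patient VALUES (1, 'Alice');
-- Alice has had two PCPs over time; only one is current.
INSERT INTO patient_pcp VALUES (1, 'Dr. Old',     '2018-01-01', '2020-01-01');
INSERT INTO patient_pcp VALUES (1, 'Dr. Current', '2020-01-01', NULL);
""")

# Buggy join (no time component): returns every PCP the patient ever had.
buggy = cur.execute("""
    SELECT p.name, pp.pcp_name
    FROM patient p JOIN patient_pcp pp USING (patient_id)
""").fetchall()

# Corrected join: restrict to the currently effective relationship.
fixed = cur.execute("""
    SELECT p.name, pp.pcp_name
    FROM patient p JOIN patient_pcp pp USING (patient_id)
    WHERE pp.effective_to IS NULL
""").fetchall()

# Quality check built on top of the model: flag any patient with more
# than one current PCP, i.e., any break in the 1:1 relationship.
violations = cur.execute("""
    SELECT patient_id, COUNT(*) AS n
    FROM patient_pcp
    WHERE effective_to IS NULL
    GROUP BY patient_id
    HAVING n > 1
""").fetchall()

print(len(buggy), len(fixed), violations)  # 2 1 []
```

Run routinely, a check like the last query alerts the team the moment the 1:1 rule breaks, instead of leaving the discovery to a report reader at the end of the pipeline.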
2. Do not leave analysts to firefight.

Analysts are not firefighters. Organizations shouldn't rely on them to quickly address analytic emergencies when they arise, though this is often the case. The onset of COVID-19 created human resource shortages while also increasing requests for metrics and reports. Analysts need a framework that allows them to focus on data quality but should not carry the full burden of quality maintenance. SMEs and report writers access the data in the later stages of its lifecycle and must contribute to its quality by analyzing it from their unique perspectives and implementing quality checks as needed.
3. Look outside the four walls of the organization.

When analyzing data, team members should make an effort to look outside their organizational silo with an eye on the two Vs, verification and validation:

- Verification: confirm that the data faithfully reflects the source systems it came from.
- Validation: confirm that the data is plausible and fit for its intended use, for example by comparing it against external benchmarks.
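As a rough illustration of the two Vs, the sketch below pairs a verification check (warehouse agrees with source) with a validation check (local value is plausible against an external benchmark). The function names, counts, rates, and tolerance are hypothetical:

```python
def verify(warehouse_count: int, source_count: int) -> bool:
    """Verification: does the warehouse agree with the source system?"""
    return warehouse_count == source_count

def validate(local_positivity_rate: float, regional_rate: float,
             tolerance: float = 0.10) -> bool:
    """Validation: is the local value plausible against an external
    benchmark, here a hypothetical regional positivity rate?"""
    return abs(local_positivity_rate - regional_rate) <= tolerance

# A nightly job might run both checks and flag discrepancies for review.
checks = {
    "verified":  verify(warehouse_count=1482, source_count=1482),
    "validated": validate(local_positivity_rate=0.08, regional_rate=0.11),
}
print(checks)  # {'verified': True, 'validated': True}
```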
4. Data context and purpose matter.

Both context and purpose matter when determining whether data quality is sufficient to support decision making, but they become increasingly important as data travels up the framework. Table 1 shows a four-level data quality framework.
| Level | Definition | Context Dependent? | Purpose Dependent? |
|---|---|---|---|
| Level 1 – Structural | Database constraints are enforced, including data types, NULLs, primary keys, and referential integrity. | No | No |
| Level 2 – Content: Single Subject Area | Values are reasonable within the context of the domain. | No | No |
| Level 3 – Content: Multiple Subject Areas | Values are reasonable across multiple domains. | Yes | No |
| Level 4 – Utility | Values represent information empirically demonstrated to support better decisions. | Yes | Yes |
Table 1: Four levels of a data quality framework.
The data quality framework suggests that providers think of data as a product and address its quality as it traverses the system. Analysts and data users should address structural data quality before moving on to more complex challenges. They can then work with the SMEs who use the data to define single subject area and multisubject area data quality use cases, resulting in a data quality coalition across the organization. Overall, the framework helps providers maintain quick access to accurate data for use in typical day-to-day operations or extreme cases such as the response to COVID-19.
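A minimal sketch of how the lower levels of the framework might be automated as data traverses the system. The record fields and plausible-range bounds are assumptions for illustration only:

```python
def level1_structural(record: dict) -> list:
    """Level 1: structural checks (types and NULL constraints)."""
    issues = []
    if record.get("patient_id") is None:
        issues.append("patient_id is NULL")
    if not isinstance(record.get("age"), int):
        issues.append("age is not an integer")
    return issues

def level2_content(record: dict) -> list:
    """Level 2: values reasonable within a single domain."""
    issues = []
    if isinstance(record.get("age"), int) and not 0 <= record["age"] <= 120:
        issues.append(f"age {record['age']} outside plausible range")
    return issues

# Structural checks run first; content checks build on a sound structure.
record = {"patient_id": 42, "age": 130}
issues = level1_structural(record) + level2_content(record)
print(issues)  # ['age 130 outside plausible range']
```

Levels 3 and 4 would follow the same pattern but, per the table, require input from SMEs to encode cross-domain context and decision-making purpose.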
5. Use a singular vision to scale data quality.

Team members in different departments across the organization likely spend a small percentage of their time addressing data quality, resulting in a low level of commitment and impact across all resources. When those team members adopt a single vision and framework, they pool their efforts, turning those small percentages into a scalable, organization-wide commitment to data quality that produces cohesive insights.
COVID-19 has highlighted the crucial importance of embracing a systemic approach to healthcare data quality. Analysts and SMEs are necessary pieces of the puzzle, but they are not the whole solution, and just-in-time data is simply too late when lives are on the line. By applying these lessons learned from COVID-19, organizations can build a reliable data quality framework, preparing them to save jobs and lives when the next urgent analytic use case arrives.