Dec 24, 2021

3 Steps Providers Can Take To Fix the Data Quality Issues That Hurt Their Patients and Practices

This article was written by Dr. Oleg Bess and published by Healthcare Business Today.

Data quality is essential both to the delivery of quality care and to the financial health of a hospital, health system or private practice.

Not all data is created equal. High-quality data is accurate, usable and actionable, enabling insights and processes that benefit patients, providers and payers. Conversely, low-quality data – such as the duplicate records, missing patient names and obsolete information that fill far too many electronic health records (EHRs) – creates obstacles to care delivery and billing/payment. This not only harms providers of all sizes, but also creates waste and inefficiencies across the entire healthcare ecosystem.

If data is the new gold, healthcare providers are sitting on top of a goldmine. Unfortunately, healthcare has lagged behind virtually every other industry in leveraging data for strategic and operational advantage – despite having access to more patient data than ever as medical equipment, digital devices, apps and other data sources continue to proliferate. 

Yet provider organizations too frequently fail to leverage patient data. Like gold ore, it remains unrefined and thus of little to no value. This does a huge disservice to patients and other healthcare stakeholders – including hospitals, labs, private practices, accountable care organizations, health information exchanges and payers – by depriving them of value that is real and attainable. 

Impact on patient outcomes and the revenue cycle

Poor data quality creates problems throughout healthcare, imperiling patient outcomes and organizational efficiency. It starts in the physician’s office, virtual visit or clinic exam room, where the lack of aggregated patient data leaves clinicians knowing only part of a patient’s story.

When a doctor sees a patient at the hospital, for example, that physician needs the patient’s ambulatory (outpatient) records to make sound, evidence-based decisions about appropriate treatment. Too often, though, data that could inform clinicians at the point of care is trapped in silos scattered across the healthcare landscape. Without knowing a patient’s medication history or whether (and when) a patient had a specific test, physicians run the risk of prescribing medications that could cause an adverse reaction or ordering tests that have already been conducted. These are just two examples of how poor data quality affects health outcomes and healthcare costs.

Healthcare has been late to adopt artificial intelligence (AI) and machine learning (ML), two technologies that are transforming virtually every other industry. While the industry’s low adoption rate is due in part to these technologies being out of reach for many provider organizations, a bigger obstacle is the data itself: the overall quality of health data is so poor that intelligent machines would struggle to process and analyze it. “Garbage in, garbage out” isn’t just a glib phrase.

If you feed quality data to AI/ML, however, clinicians at the point of care will be equipped with information that enables them to educate patients about their specific conditions, refer them to appropriate specialists and suggest new medications – all of which improves outcomes. The right data and insights can even help a clinician match a patient with a clinical trial that could be lifesaving.

Low-quality data is a huge problem for payers because they must make decisions based on data, whatever its condition. This, as any provider will tell you, inevitably creates billing and payment problems. The longer a course of treatment is delayed by bad data, the longer it takes for providers to submit invoices to payers. Something as simple as a duplicate patient record can create confusion for payers that leads to denials and delays. When this happens, a patient’s medical condition may worsen and require more expensive interventions that could have been avoided if high-quality patient data had been available and utilized.

3 steps to ensuring quality data

To achieve high-quality data, provider organizations must follow three specific steps. First, they must make sure they can access patient data. This in and of itself is a great challenge because healthcare organizations still struggle with interoperability and data sharing, despite some progress in recent years. It costs money and takes time for providers to replace legacy systems that may hold up to 80% of their medical records on in-house servers. Nonetheless, as healthcare continues migrating toward value-based care models that rely on quality data to improve outcomes and control costs, data must be made accessible via a central database. This necessitates a cloud-based approach to data storage.
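
To make step one concrete, here is a minimal sketch under stated assumptions: each source system can export records as simple dictionaries, and the central store imposes one shared schema. The source systems, identifiers and field names below are hypothetical, used only for illustration.

```python
# Minimal sketch: pull records from two simulated source systems into one
# central store with a shared schema. Sources and field names are hypothetical.

hospital_ehr = [
    {"mrn": "H-1001", "name": "Jane Doe", "dob": "1980-03-14", "source": "hospital"},
]
lab_system = [
    {"patient_id": "L-77", "name": "DOE, JANE", "dob": "1980-03-14", "source": "lab"},
]

def to_central_record(raw: dict) -> dict:
    """Map a source-specific record onto the central store's shared schema."""
    return {
        "source": raw["source"],
        "source_id": raw.get("mrn") or raw.get("patient_id"),
        "name": raw["name"],
        "dob": raw["dob"],
    }

central_store = [to_central_record(r) for r in hospital_ehr + lab_system]
for record in central_store:
    print(record)
```

Note that aggregation alone is not enough: the two records above likely describe the same person, which is exactly the problem the second step addresses.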

The second step involves identity management. Much of healthcare data’s value lies in its ability to document longitudinal change in individual patients. A single lab result, for example, gives clinicians only a snapshot of a patient’s condition. That is helpful up to a point, but to track the progress of an illness or gauge the effectiveness of a medication, clinicians need longitudinal data that can reveal trends and patterns.
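
A small illustration of why this matters, using hypothetical HbA1c results: three individual snapshots say little on their own, but ordering them by collection date reveals a trend a clinician can act on.

```python
from datetime import date

# Hypothetical HbA1c results for one patient, arriving from different
# sources out of order. Individually each is a snapshot; sorted by
# collection date, they become a longitudinal trend.
results = [
    {"test": "HbA1c", "value": 7.9, "collected": date(2021, 6, 1)},
    {"test": "HbA1c", "value": 8.4, "collected": date(2021, 1, 15)},
    {"test": "HbA1c", "value": 7.2, "collected": date(2021, 11, 20)},
]

for r in sorted(results, key=lambda r: r["collected"]):
    print(f"{r['collected']}: {r['test']} = {r['value']}")
# 8.4 -> 7.9 -> 7.2: the improvement is only visible longitudinally.
```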

However, even if providers can extract data from disparate sources and place that data into a single database, they may struggle to match data to patients. Effective identity management makes it possible to create a quality longitudinal patient record that can bring substantial clinical and efficiency benefits.
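
As a rough sketch of what identity management involves, the hedged example below links records on an exact date-of-birth match plus a fuzzy name comparison. Real master patient index systems weigh many more fields (address, phone, sex, identifiers) and use calibrated probabilistic scoring; the threshold and fields here are illustrative assumptions.

```python
from difflib import SequenceMatcher

def normalize_name(name: str) -> str:
    """Lowercase, collapse whitespace, and reorder 'Last, First' to 'first last'."""
    name = " ".join(name.lower().split())
    if "," in name:
        last, first = [part.strip() for part in name.split(",", 1)]
        name = f"{first} {last}"
    return name

def same_patient(a: dict, b: dict, threshold: float = 0.85) -> bool:
    """Link two records when DOBs match exactly and names are close enough."""
    if a["dob"] != b["dob"]:
        return False
    ratio = SequenceMatcher(None, normalize_name(a["name"]),
                            normalize_name(b["name"])).ratio()
    return ratio >= threshold

rec_a = {"name": "Jane Doe", "dob": "1980-03-14"}
rec_b = {"name": "DOE, JANE", "dob": "1980-03-14"}  # same person, different format
rec_c = {"name": "Jane Doe", "dob": "1979-03-14"}   # DOB differs: not a match

print(same_patient(rec_a, rec_b))  # True
print(same_patient(rec_a, rec_c))  # False
```

In practice, pairs that fall just below the matching threshold would typically be queued for human review rather than merged automatically.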

Once providers put data into the right patient’s chart and create a longitudinal record, they must make sure it is organized and easy for clinicians to locate and read. This is the third step: a process called data normalization. Health data comes from multiple sources (hospital and physician EHRs, labs, pharmacy systems, etc.), which may use different codes for the same medical procedure, different names for the same test, or even different language to categorize genders and ethnic groups. By normalizing data, provider organizations create a common terminology that facilitates the semantic interoperability necessary to make that data actionable.
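
A minimal sketch of normalization under these assumptions: each source’s local test label is mapped onto one shared term via a lookup table. The mappings below are illustrative placeholders, not a real terminology; production systems map to standards such as LOINC or SNOMED CT.

```python
# Minimal sketch: translate each source's local test label into one shared
# term via a lookup table. The mappings are illustrative placeholders, not
# a real terminology standard.

LOCAL_TO_COMMON = {
    ("hospital", "GLU"): "glucose_serum",
    ("lab", "Glucose, Serum"): "glucose_serum",
    ("clinic", "Blood sugar"): "glucose_serum",
}

def normalize_test(source: str, local_label: str) -> str:
    """Return the shared term for a source-specific label, or flag it as unmapped."""
    return LOCAL_TO_COMMON.get((source, local_label), "UNMAPPED")

print(normalize_test("lab", "Glucose, Serum"))  # glucose_serum
print(normalize_test("clinic", "Blood sugar"))  # glucose_serum
print(normalize_test("lab", "Sodium"))          # UNMAPPED: needs mapping work
```

Flagging unmapped labels rather than guessing is deliberate: silently mismapped terms are exactly the kind of low-quality data the article describes.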

Conclusion

Providers simply can’t wave a magic wand to improve healthcare data quality. Attaining consistent and widespread data quality will require continued development of interoperability standards, industrywide collaboration and the innovation inherent in the free market. Solutions will emerge to address pain points along the data journey – from accessibility to patient identification to normalization – while innovators will build quality databases that healthcare organizations can access to apply advanced analytics. Once provider organizations harness these powerful digital tools, they will have the data quality they need to successfully implement value-based care.

4medica Can Clean Your Patient Data Records. We Guarantee a 1% Duplication Rate or Less!

Talk With An Expert About Our Health Data Quality Solutions
