How the big data tools ACA and HITECH enable will improve care

By Roger Foster, Senior Director at DRC's High Performance Technologies Group

The cost to the U.S. healthcare system from provider performance inefficiencies and medical errors has been estimated in the range of $75-100 billion annually. The lack of good care coordination across healthcare services has been estimated to cost an additional $25-50 billion annually. These care delivery inefficiencies from individual providers are a large component of the $600-850 billion in excess healthcare spending.

Medical errors and their impact on patient safety have been recognized as a significant issue since at least the 1960s. The Institute of Medicine report "To Err Is Human: Building a Safer Health System" (1999) estimated that medical errors cause between 44,000 and 98,000 deaths annually in the U.S., at an estimated cost of $17 billion to $29 billion annually. The high error rate also leads some patients to delay medical treatment, allowing their conditions to worsen, out of concern about becoming the victim of a medical error.

[Big data and public health, part 4: Reducing administrative inefficiencies with big data tools.]

Diagnostic errors, prescription errors, and performance variance across medical sites are all embedded inefficiencies that ultimately increase cost and decrease the overall quality of public health. The inability to easily share medical records across care providers and institutions also causes duplication of services and redundant costs. Big data is a critical tool for improving provider performance, reducing medical errors and improving the coordination of healthcare.

Bolstering provider performance
Hospitals are capital-intensive businesses that must make efficient use of their assets to remain viable in a cost-constrained healthcare environment. Improvements include better use of professional staff extenders, such as nurse practitioners and physician assistants, to take on routine work previously performed at higher cost by physicians.

They also include better utilization of facilities and equipment, such as addressing the low utilization of expensive imaging equipment and scheduling operating rooms and support teams efficiently. Most hospitals still struggle with over-utilization of acute care (emergency room) services and intensive care units.

From a big data perspective, hospitals need to get control of their operational information. That means knowing what data they have, linking that data to financial drivers, and building teams of data analysts who can review hospital data from an operations perspective to benchmark performance. Finally, the data must be put into a format that supports executive decision-making.
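To make that pipeline concrete, here is a minimal Python sketch of the benchmarking step, using a hypothetical operational extract; the column names, figures and the benchmark rule are invented for illustration and are not a real hospital schema.

    # A minimal sketch of the benchmarking step described above, using
    # hypothetical operational data; columns and values are illustrative
    # assumptions, not a real hospital schema.
    import pandas as pd

    # Hypothetical extract: one row per operating room per day.
    ops = pd.DataFrame({
        "site":        ["A", "A", "B", "B", "C", "C"],
        "room":        ["OR1", "OR2", "OR1", "OR2", "OR1", "OR2"],
        "staffed_hrs": [10, 10, 10, 10, 10, 10],
        "used_hrs":    [8.5, 6.0, 9.2, 7.1, 4.3, 5.0],
        "revenue":     [42000, 30000, 45000, 36000, 21000, 24000],
    })

    # Link utilization to a financial driver (revenue per staffed hour).
    ops["utilization"] = ops["used_hrs"] / ops["staffed_hrs"]
    ops["rev_per_staffed_hr"] = ops["revenue"] / ops["staffed_hrs"]

    # Benchmark each site against the system-wide median utilization.
    by_site = ops.groupby("site")[["utilization", "rev_per_staffed_hr"]].mean()
    benchmark = by_site["utilization"].median()
    by_site["below_benchmark"] = by_site["utilization"] < benchmark

    print(by_site)  # a simple executive-ready summary table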

Reducing errors
Traditionally, medicine has relied on the mind of the individual physician as the repository of medical knowledge. With enormous stores of individual electronic records, huge population databases and the latest clinical literature now available, it is possible to integrate individual patient data with comprehensive medical knowledge and population health data to support an enhanced clinical decision-making process – and that, in turn, promises to reduce medical errors.

According to Dr. Larry Weed (Medicine in Denial, Weed & Weed, 2011), two things need to happen to support the integration of patient data with medical knowledge:

  1. The most relevant data need to be selected and their implications understood by the care team to inform the decision process before a clinical judgment is made.
  2. The data need to be organized to manage multiple problems over time, particularly in complex cases involving multiple medical issues.

The critical big data problem is to identify the relevant data and present the implications of those data to the healthcare team. That approach couples patient-specific information with the best medical knowledge and relevant population data to enable better diagnosis – and does so for both simple single-problem cases and complex cases with multiple medical issues.
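As a toy illustration of that coupling, the Python sketch below ranks candidate conditions by their overlap with a patient's recorded findings. The findings and knowledge-base associations are invented for illustration; a real clinical decision support system would draw on curated, validated knowledge sources and far richer patient data.

    # A toy sketch of coupling patient data to medical knowledge.
    # Findings and associations are invented for illustration only.
    patient_findings = {"polyuria", "polydipsia", "weight_loss", "fatigue"}

    knowledge_base = {
        # condition -> findings that suggest it (hypothetical associations)
        "diabetes_mellitus": {"polyuria", "polydipsia", "weight_loss"},
        "hypothyroidism":    {"fatigue", "weight_gain", "cold_intolerance"},
        "depression":        {"fatigue", "weight_loss", "insomnia"},
    }

    def rank_candidates(findings, kb):
        """Score each candidate condition by overlap with observed findings."""
        scores = {}
        for condition, expected in kb.items():
            overlap = findings & expected
            scores[condition] = (len(overlap) / len(expected), sorted(overlap))
        return sorted(scores.items(), key=lambda kv: kv[1][0], reverse=True)

    for condition, (score, evidence) in rank_candidates(patient_findings,
                                                        knowledge_base):
        print(f"{condition}: score={score:.2f}, supporting={evidence}")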

Coordinating care
Patients don’t use a single healthcare provider. Yet, today, patient records remain the proprietary information of the provider, not the individual. Medical records are still not easily exchanged between institutions; when they are, it is often via printed records that are hand-carried or faxed. The difficulty of sharing information between providers results in duplicated tests and diagnostic effort. In critical care situations it can also result in inappropriate treatment and adverse drug reactions because medical record information is missing.

[Part 3: Top 9 fraud and abuse areas big data tools can target.]

Better standardization of the medical record, increased use of health information exchanges, and patients’ ability to access, control and manage their own medical information will improve this situation. An improved ability to share information across providers will yield better population data and enable more efficient use of healthcare resources. Both will produce better big data analysis results.
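As a rough sketch of what record standardization buys, the Python snippet below validates and serializes a record in one agreed JSON layout before exchange. The field set loosely echoes the HL7 FHIR style, but the schema and required fields here are illustrative assumptions, not a real FHIR profile.

    # A minimal sketch of exchanging a standardized record between providers.
    # The layout loosely follows the HL7 FHIR style, but fields and the
    # required set are illustrative assumptions, not a real FHIR profile.
    import json

    record = {
        "resourceType": "Patient",
        "id": "example-123",
        "name": [{"family": "Doe", "given": ["Jane"]}],
        "birthDate": "1970-01-01",
    }

    REQUIRED = {"resourceType", "id", "name", "birthDate"}

    def validate(resource):
        """Reject records missing the fields both providers agreed to share."""
        missing = REQUIRED - resource.keys()
        if missing:
            raise ValueError(f"record missing required fields: {sorted(missing)}")
        return resource

    # Serialize once, in one agreed format, instead of printing and faxing.
    payload = json.dumps(validate(record))
    received = json.loads(payload)  # the receiving provider parses the same schema
    print(received["name"][0]["family"])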

Effective information sharing will require medical record interoperability not only between the large government healthcare providers, such as the Department of Veterans Affairs (VA) and DoD’s Military Health System (MHS), but also across all major third-party commercial systems. The agencies need new tools that allow greater electronic record interoperability across different electronic record systems. Additionally, patients need access to their health records so that they can personally participate in their own healthcare treatment decisions, which is where the joint VA-DoD iEHR project comes in, as does the “Blue Button” initiative that enables veterans to download copies of their personal health records.
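To illustrate what patients could do with such a download, here is a toy Python parser for a plain-text personal health record export. The section markers and entries are invented for illustration and do not reflect the actual Blue Button file format.

    # A toy sketch of parsing a downloaded personal health record.
    # The markers and entries below are invented for illustration and
    # are not the actual Blue Button file format.
    SAMPLE_EXPORT = """\
    -- MEDICATIONS --
    Lisinopril 10mg daily
    Metformin 500mg twice daily
    -- ALLERGIES --
    Penicillin
    """

    def parse_sections(text):
        """Split a plain-text export into {section: [entries]}."""
        sections, current = {}, None
        for line in text.splitlines():
            line = line.strip()
            if line.startswith("--") and line.endswith("--"):
                current = line.strip("- ").title()
                sections[current] = []
            elif line and current:
                sections[current].append(line)
        return sections

    print(parse_sections(SAMPLE_EXPORT))
    # {'Medications': [...], 'Allergies': ['Penicillin']}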

HITECH and ACA will enable big data tools
Two major legislative initiatives have supported healthcare reform over the last several years: the Health Information Technology for Economic and Clinical Health (HITECH) Act, initiated under the American Recovery and Reinvestment Act of 2009, and the Patient Protection and Affordable Care Act (ACA) of 2010.

HITECH focuses on the supply side of healthcare services by laying the foundation for a de facto nationwide, interoperable, private and secure electronic health information system. It requires healthcare providers to make meaningful use of their electronic records, and it financially rewards hospitals, physicians and healthcare professionals for using EHRs through the Medicare and Medicaid reimbursement system. HITECH also includes the necessary standards and certifications for electronic records, without which EHRs would do little to improve the actual health of individuals or to achieve efficiency in overall care across our health system. These standards enable health data to flow freely, privately and securely.

[Big data and public health, part 2: Reducing unwarranted services.]

The ACA, for its part, gets at the demand side of healthcare. It focuses on making sure that nearly all Americans are covered by affordable health insurance and have access to quality care. Its goal is to reduce the amount of cost-shifting that goes on within the healthcare system to cover the acute emergency care of individuals without health insurance and without the ability to pay for their healthcare costs.

Together, HITECH and the ACA are enabling a private healthcare system that provides standards for healthcare record interoperability and will enable the use of big data tools to deliver innovative improvements in health IT that reduce cost and improve the quality of healthcare. While both Acts are important to the current national effort at healthcare reform, it is primarily HITECH – which is not part of the Supreme Court case – that is driving the implementation of next-generation health IT and big data systems through its mandated meaningful use criteria for EHRs.

Use case: VA transforming care via informatics
The VA, as a major provider of healthcare delivery, is interested in reducing diagnostic errors, prescription errors and performance variance across its healthcare delivery sites. In September 2010, the VA began a major initiative, established by Secretary Shinseki, to transform healthcare delivery through health informatics. The initiative will provide the foundational IT and informatics components for VA’s transition from a medical model to a patient-centered model of care. Big data tools will be critical to improving quality and lowering the cost of care.

A key part of this initiative is to build a Health Management Platform (HMP) to transform patient care. HMP will integrate health informatics and IT into the next generation of browser-based EHRs. This platform will be the basis for VA’s ability to use big data tools across the entire VA EHR.

[Part 1: How to harness Big Data for improving public health.]

The new HMP will comprise infrastructure and clinical functionality focused on increasing patient engagement and satisfaction, enhancing team capability and adaptability for patient-centered care, and addressing population-based and health-system aspects of care. The system will be open and extensible, with a published Collaborative Development Environment (CDE) to expose data from the Veterans Health Information Systems and Technology Architecture (VistA) enterprise EHR and other sources. The CDE will allow true interoperability and decision support, and it will support big data longitudinal studies of veterans’ health data.
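As a sketch of the kind of longitudinal study the CDE is meant to enable, the Python snippet below computes a per-patient yearly trend from repeated lab readings. The schema and values are hypothetical stand-ins for data that would, in practice, arrive through the published CDE interfaces.

    # A sketch of a longitudinal query of the sort the CDE could support.
    # The observation schema and values are hypothetical stand-ins.
    import pandas as pd

    # Hypothetical extract: repeated A1c readings per veteran.
    obs = pd.DataFrame({
        "patient_id": [1, 1, 1, 2, 2, 2],
        "date": pd.to_datetime(["2010-01-05", "2010-07-12", "2011-01-20",
                                "2010-02-11", "2010-08-03", "2011-02-15"]),
        "a1c": [8.1, 7.6, 7.0, 9.0, 9.2, 9.5],
    })

    def yearly_trend(group):
        """Change in A1c per year between first and last reading."""
        g = group.sort_values("date")
        years = (g["date"].iloc[-1] - g["date"].iloc[0]).days / 365.25
        return (g["a1c"].iloc[-1] - g["a1c"].iloc[0]) / years

    trends = obs.groupby("patient_id").apply(yearly_trend)
    print(trends)  # positive values flag patients whose control is worsening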

Healthcare organizations and government agencies must use their vast collections of patient and population data to improve the overall quality of medical services and reduce the costs associated with variability in performance across the healthcare system.

In the next article, I will address how big data can be used to target preventable conditions and avoidable care, including the early detection of disease, the identification of drug and product failures, and helping individuals manage their overall health.
 

Roger Foster is a Senior Director at DRC’s High Performance Technologies Group and advisory board member of the Technology Management program at George Mason University. He has over 20 years of leadership experience in strategy, technology management and operations support for government agencies and commercial businesses. He has worked on big data problems in scientific computing in fields ranging from large astrophysical data sets to health information technology. He has a master’s degree in Management of Technology from the Massachusetts Institute of Technology and a doctorate in Astronomy from the University of California, Berkeley. He can be reached at rfoster@drc.com, and followed on Twitter at @foster_roger.