Privacy not the barrier to health IT

By Deven McGraw

Deven McGraw, director of the Health Privacy Project at the Center for Democracy & Technology, testified before the U.S. House of Representatives Committee on Science and Technology, Subcommittee on Technology and Innovation. Below is an excerpt of her remarks.

Survey data consistently show the public supports health IT but is very concerned about the risks health IT poses to individual privacy. Contrary to the views expressed by some, privacy is not the obstacle to health IT. In fact, appropriately addressing privacy and security is key to realizing the technologyʼs potential benefits.

Simply stated, the effort to promote widespread adoption and use of health IT to improve individual and population health will fail if the public does not trust it.

To build and maintain this trust, we need the “second generation” of health privacy — specifically, a comprehensive, flexible privacy and security framework that sets clear parameters for access, use and disclosure of personal health information for all entities engaged in e-health.
Such a framework should be based on three pillars:
• Implementation of core privacy principles, or fair information practices;
• Adoption of trusted network design characteristics; and
• Strong oversight and accountability mechanisms.

This requires building on – and in some cases modifying – the privacy and security regulations under the Health Insurance Portability and Accountability Act (HIPAA) so that they address the challenges posed by the new e-health environment. It also requires enacting new rules to cover access, use and disclosure of health data by entities outside of the traditional healthcare system and stimulating and rewarding industry implementation of best practices in privacy and security.

In a digital environment, robust privacy and security policies should be bolstered by innovative technological solutions that can enhance our ability to protect data. This includes requiring that electronic record systems adopt adequate security protections (such as encryption, audit trails and access controls), but it also extends to decisions about infrastructure and how health information exchange will occur.

For example, when health information exchange is decentralized (or “federated”), data remains at the source (where there is a trusted relationship with a provider) and is then shared with others for appropriate purposes. These distributed models show promise not just for exchange of information to support direct patient care but also for discovering what works at a population level to support health improvement. We will achieve our goals much more effectively, and with the trust of the public, if we invest in models that build on the systems we have in place today rather than creating new, large centralized databases that expose data to greater risk of misuse or inappropriate access.

We are in a much better place today in building that critical foundation of trust than we were two years ago. The privacy provisions enacted in the stimulus legislation – commonly referred to as HITECH or ARRA – are an important first step to addressing the gaps in privacy protection. However, more work is needed to assure effective implementation and address issues not covered by (or inadequately covered by) the changes in ARRA.

In my testimony below, I call for:
• Establishing baseline privacy and security legal protections for personal health records (PHRs);
• Ensuring appropriate limits on downstream uses of health information;
• Strengthening protections against re-identification of HIPAA de-identified data;
• Encouraging the use of less identifiable data through the HIPAA minimum necessary standard;
• Tightening restrictions on use of personal health information for marketing purposes;
• Strengthening accountability for implementing privacy and security protections; and
• Strengthening accountability for implementing strong security safeguards.