
Epic is an Electronic Health Record (EHR) system used by thousands of hospitals across the United States. There’s a very good chance that your physician uses Epic software in their everyday practice. Among many other tasks, Epic’s software helps manage your patient record, data, and laboratory results.

Epic opposes a proposed rule from the U.S. Office of the National Coordinator for Health IT (ONC) that would support patients’ ability to access their data. The rule would allow patients, and the health apps they choose, to more directly access all of their own clinical data locked up in proprietary systems like Epic.

The problem with Epic’s stance is that it takes a traditional paternalistic view toward health data — that Epic knows best, and patients can’t be trusted with their own health data.

Trust Us! Epic Knows Best

Epic says:

By requiring health systems to send patient data to any app requested by the patient, the ONC rule inadvertently creates new privacy risks. According to a recent study, 79% of health care apps resell or share data, and there is no regulation requiring patient approval of this downstream use. There are two highly likely patient privacy risks.

To which I reply, “So what?” People already have willingly exposed their personal data — including sensitive financial data — to apps for years. This data is regularly bundled, “anonymized”, and resold to data brokers.

It is up to each of us to protect our own data. It is NOT up to some company most people have never heard of and know nothing about to take it upon themselves to act as the data police — the sole arbiter of which apps are appropriate or meet whatever arbitrary private, proprietary standards they set.

I can’t imagine a company in any other industry saying, “Sorry, no, we don’t trust you enough to send you your own data in whatever format or to whatever app you want.” I’ve never had a bank deny me access to my bank statements, a credit card company deny me access to my transactions, or a university deny me access to my transcripts.

Fake Scary Examples for Fake Use Cases

Epic then provides a few made-up examples of possible misuse of health data that it wants to protect patients from. They are meant to be scary, but they have no apparent basis in reality.

After surgery, Jim’s doctor wants to prescribe an opioid for Jim during his recovery. Jim prefers not to take an opioid because his brother Ken struggles with addiction. The doctor makes a note about that in Jim’s medical record. When Jim’s health data is sent to an app, and that data is used, shared, or sold, Ken’s addiction status may become public without Ken’s knowledge or permission.

I can’t imagine any app sharing private health information in such a way that it magically becomes “public without Ken’s knowledge or permission.” What would be the business model of a company that did that?

It boggles the mind that this is one of the best examples Epic could come up with.

A wellness app offers Liz a cholesterol study and asks her to approve sending the app her lab results. Liz does not realize that the app has gathered all of her lab results, including sensitive information such as her pregnancy status and STD testing results. She does not know that the app will sell that data. Once her health information is out, she cannot pull it back.

An app that resells a user’s data without that user’s permission is already violating multiple laws and engaging in illegal activity. No new rule is going to help with this, since it’s already illegal.

Individual Responsibility Lies with People, Not Companies

Granted, not every user is fully aware of the terms of use of the apps they download, but that is each person’s responsibility — not some paternalistic, proprietary software company’s responsibility. We are each individually responsible for how and when we share data that belongs to us. And nothing belongs to us more than data about our own health — doctor’s notes, laboratory and other data, CT scans, X-rays, the works.

That is the whole purpose of the terms of use and privacy policies that companies are required to have: to give the end user an understanding of how their data will and will not be used or shared. The user can then make an informed decision about whether or not to use the app, based on those terms and policies.

In fact, Epic themselves have extensive terms of use and privacy policies governing how their software can and cannot be used. Apparently those are good enough for a company like Epic, but not good enough for app developers.

An ‘Epic’ Failure of Understanding

Epic is simply a large software company, albeit one with a lot of influence over hospitals and hospital systems across the country. Yet few patients know who they are, or care.

Epic would do well to listen to the patient voice in this discussion: e-patients overwhelmingly support greater, transparent access to their own data, and the proposed ONC rule is completely in line with that goal. Epic is on the wrong side of this particular fight.


Learn more

Stat News: Epic’s call to block a proposed data rule is wrong for many reasons

Take action: Sign the patient letter supporting the ONC rule