Epic is an Electronic Health Record (EHR) system used by thousands of hospitals across the United States. There’s a very good chance that your physician uses Epic software in their everyday practice. Among many other tasks, Epic’s software helps manage your patient record, data, and laboratory results.
Epic opposes a proposed rule by the U.S. Office of the National Coordinator for Health IT (ONC) that would support patients’ ability to access their data. The rule would allow health apps and patients to more directly access all of their own clinical data locked up in proprietary systems like Epic.
The problem with Epic’s stance is that it takes a traditional paternalistic view toward health data — that Epic knows best, and patients can’t be trusted with their own health data.
Trust Us! Epic Knows Best
Epic says:
By requiring health systems to send patient data to any app requested by the patient, the ONC rule inadvertently creates new privacy risks. According to a recent study, 79% of health care apps resell or share data, and there is no regulation requiring patient approval of this downstream use. There are two highly likely patient privacy risks.
To which I reply, “So what?” People have willingly exposed their personal data — including sensitive financial data — to apps for years. This data is regularly bundled, “anonymized”, and resold to data brokers.
It is up to each of us to protect our own data. It is NOT up to some company most people have never heard of, and know nothing about, to appoint itself the data police — the sole arbiter of which apps are appropriate or meet whatever arbitrary, private, proprietary standards it sets.
I can’t imagine a company in any other industry that would take it upon itself to say, “Sorry, no, we don’t trust you enough to send you your own data in whatever format or to whatever app you want.” I’ve never had a bank deny me access to my bank statements, a credit card company deny me access to my transactions, or a university deny me access to my transcripts.
Fake Scary Examples for Fake Use Cases
Epic then provides a few fabricated examples of possible misuse of health data that it wants to protect patients from. They are meant to be scary, but they have no apparent basis in reality.
After surgery, Jim’s doctor wants to prescribe an opioid for Jim during his recovery. Jim prefers not to take an opioid because his brother Ken struggles with addiction. The doctor makes a note about that in Jim’s medical record. When Jim’s health data is sent to an app, and that data is used, shared, or sold, Ken’s addiction status may become public without Ken’s knowledge or permission.
I can’t imagine any app sharing private health information in such a way that it magically becomes “public without Ken’s knowledge or permission.” What would be the business model of a company that did that?
It boggles the mind that this is one of the best examples Epic could come up with.
A wellness app offers Liz a cholesterol study and asks her to approve sending the app her lab results. Liz does not realize that the app has gathered all of her lab results, including sensitive information such as her pregnancy status and STD testing results. She does not know that the app will sell that data. Once her health information is out, she cannot pull it back.
No app in the world could resell a user’s data without that user’s permission, because doing so would violate multiple laws. Any app that does so anyway is already engaged in illegal activity, and no new rule is going to help with that, since it’s already illegal.
Individual Responsibility Lies with People, Not Companies
Granted, not every user is fully aware of the terms of use of the apps they download, but that is each person’s responsibility — not some paternalistic, proprietary software company’s responsibility. We are each individually responsible for how and when we share data that belongs to us. And nothing belongs to us more than data about our own health — doctor’s notes, laboratory and other data, CT scans, X-rays, the works.
That is the whole purpose of the terms of use and privacy policies that companies are required to have: to give the end user an understanding of how their data will and will not be used or shared. The user can then make an informed decision about whether to use the app, based on those terms and policies.
In fact, Epic themselves have extensive terms of use and privacy policies governing how their software can and cannot be used. Apparently such policies are good enough for a company like Epic, but not good enough for app developers.
An ‘Epic’ Failure of Understanding
Epic is simply a large software company, albeit one with a lot of influence among hospitals and hospital systems across the country. Few patients know who they are — or care.
Epic should listen to the patient voice in this discussion, because e-patients overwhelmingly support greater, more transparent access to their own data. The proposed ONC rules are completely in line with that goal. Epic is on the wrong side of this particular fight, and it would do well to listen.
Learn more
Stat News: Epic’s call to block a proposed data rule is wrong for many reasons
Take action: Sign the patient letter supporting the ONC rule
Thanks, John. What was especially weird to me was that this big company chose to spray this message onto its home page! Not on a press release page, not as a headline with link, but the WHOLE PAGE was taken over by this statement … on the same day that HHS Secretary Azar said, “Scare tactics are not going to stop the reforms we need.”
The other thing, completely weird on a different dimension, is that the company has expressed no such concerns (particularly the “brother Ken” scenario) about companies like Google buying access to EVERY PATIENT’s data in Epic and other systems.
Well, maybe not weird – maybe disingenuous.
Yes, absolutely! They should listen to their patients. Their main role is to understand the patient and to help clear up any confusion.
In the best case, such conversations are simply useless, and in the worst – harmful. Not later than today, I saw a post, the author of which claimed that a responsible person should take responsibility at his own expense for everything that happens to him. Others confuse responsibility with the ability to act. Someone even thinks that responsibility is such a form of social burden.
What are you talking about?? What “useless conversations”? And what the heck do you mean by “not later than today”?? NOTHING later than today has happened yet.
OH WAIT – you’re a spammer! You posted two weird comments on different posts, linking to Florida Rehab Experts! Nice try – busted :-) (I’ve removed the link, so your “client” won’t get anything for your work, except maybe a ding to its reputation for using you :-))