For these meetings, one needs to submit prepared remarks in advance, for the committee to digest. And from what I’ve learned so far, there’s a lot to chew on, and people of all stripes (that’s you) can probably provide valuable input. At the very least you can express yourself.
[Update 1 pm ET Thursday: I haven’t been able to convert the recording mentioned below (which is in RealPlayer format) to display it here, but if you have RealPlayer installed you can play it yourself at http://real.welch.jhu.edu/ramgen/DHSI/Dec182009.rm. The slides are often out of sync with the audio but they catch up. Skip the first 9 minutes; the talk starts around 9:15 and goes 30 minutes, followed by 30 minutes of Q&A. The meat of it is in Dr. Koppel’s talk, but the Q&A has more juice.]
The meeting subject is Health IT Safety. This sounded like an odd topic: considering all the things that go wrong because data is NOT computerized, what safety risks could there be WITH the technology?
I know I’m in over my head, being “just a patient.” But the more I learned tonight, the more I thought maybe that’s a good thing.
Because tonight I learned there are truly massive problems with the workflow and flow of information in today’s systems. Later Thursday I hope to post a link to a 30-minute webcast (with 30 minutes of Q&A) from December, documenting some horrifying failures in today’s big EMR systems. You’ll judge for yourself, but imagine…
- …if a system you had to use every day often displayed unreliable information, while insisting you do things that make no sense.
- …that the maker of the system insists on a “hold harmless” clause, so when something goes wrong, it’s not their fault. Legally, there’s no consequence for failures.
- …that nobody involved in the malfunctions talks about them, and many people are not allowed to talk about them, much less collect a bug list.
Examples of software issues in these multi-million dollar systems, as documented in the webcast:
- The user interface (UI) may not highlight what needs the user’s attention. (Ever dealt with a web page where you can’t spot the info you need? That’s what these systems can be like.)
- Sometimes, values can’t be sorted in numerical order. The computer system cannot be programmed to do this, so workers have to hunt through a list to find the number they want. (I’m not making this up.) (Imagine a website where a list of states was unsorted and couldn’t be fixed.)
- Units of measure get intermingled: a patient’s weight that must be entered in kilograms on one screen might be displayed on another screen without the units showing, so sometimes it’s interpreted as pounds (because pounds are used elsewhere in the same system). Imagine the consequence on medication dosage.
- Workflows are sometimes set up to insist on certain actions (even if they’re wrong), so users have to create torturous workarounds. (Remember, “users” means the doctors and nurses who are trying to take care of your mother. Part of their attention is consumed by coping with a system that doesn’t work.)
- Attempts at quality control (for instance bar codes for prescriptions) can be thwarted by real-world circumstances that weren’t tested: babies who chew off their bar code bracelets, bar codes on prescription bottles that aren’t durable enough to withstand normal handling, etc. In one study cited, 4.2% of all bar code bracelets didn’t work! How’s that for a safety feature?
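For the software people reading along, two of the failure modes above are textbook programming pitfalls. Here’s a minimal sketch (illustrative only, not from any actual EMR codebase) of how numbers stored as text get scrambled by a text-based sort, and how a weight displayed without units invites a kilograms/pounds mix-up:

```python
# Sketch of two failure modes described above (illustrative only,
# not taken from any real EMR system).

# 1. Numbers stored as text sort lexicographically, not numerically.
lab_values = ["9", "80", "100", "25"]
print(sorted(lab_values))             # ['100', '25', '80', '9'] -- scrambled
print(sorted(lab_values, key=float))  # ['9', '25', '80', '100'] -- correct

# 2. A weight shown without its unit invites misinterpretation.
weight_kg = 70.0                      # entered in kilograms on one screen
displayed = f"{weight_kg}"            # shown elsewhere with no unit label
misread = float(displayed)            # a clinician assumes it's pounds...
implied_kg = misread / 2.20462        # ...which would mean a ~32 kg patient
print(round(implied_kg, 1))           # a dose computed for 31.8 kg, not 70 kg
```

The first bug is a one-line fix for a programmer who is allowed to fix it; the point of the webcast is that in these systems, apparently, nobody is.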
I’m not making this up; you’ll judge for yourself when the video is up. Pitfalls we wouldn’t tolerate in the simplest word processing program are commonplace in million-dollar medical systems. Ending up with erroneous data is not unusual.
And we want to transmit this data? Yikes.
For the purpose of this post:
- Imagine that the system running your hospital might be full of crap. (Yes, that word appears in the webcast: a major hospital executive famously rejected a major EMR system and replaced it with another, which he described as “the cream of the crap.”)
- Given this situation, what do we ask our government to do?
Our answers will go to the specific people who will recommend national policy on this. What advice can we give them on how to certify a system so that it qualifies for Federal stimulus money?
You software system people out there: what would you do? You don’t get to demand a different reality; we have to start where we’re at. What do we do?
You clinicians – the people who have to use the systems: what would you want the government to do?
Here’s one radical idea:
What if a system could only get certified (and thus get the stimulus money) if the people who use it say it basically works?
(Can a system be meaningful if the users say it doesn’t work?)
Guiding principle: ask the workers who are directly impacted whether the system screws up.
Another idea: since we can’t wave a wand and fix everything instantly, prioritize collecting failure data so we can figure out what needs fixing, and we can prove that a fix has worked. (Software tools to do this are common in high tech.)
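To make the “collect failure data” idea concrete: here is a hedged sketch of the kind of record common bug-tracking tools capture (all field names are hypothetical, not from any actual system). The point is simply that counting and ranking reports tells you what to fix first, and that marking a report resolved proves a fix worked:

```python
# Minimal sketch of a failure-report record, the kind of data that
# ordinary high-tech bug trackers collect. Field names and severity
# labels are hypothetical, not from any real system.
from dataclasses import dataclass, field
from datetime import datetime

@dataclass
class FailureReport:
    system: str             # which EMR system malfunctioned
    description: str        # what the clinician observed
    severity: str           # e.g. "no harm", "near miss", "patient harm"
    reported_at: datetime = field(default_factory=datetime.now)
    resolved: bool = False  # flipped only once a fix is verified

reports = [
    FailureReport("EMR-A", "weight shown without units", "near miss"),
    FailureReport("EMR-A", "list cannot be sorted numerically", "no harm"),
    FailureReport("EMR-B", "weight shown without units", "near miss"),
]

# Rank the near misses first, and count what still needs fixing.
worst_first = sorted(reports, key=lambda r: r.severity != "near miss")
open_count = sum(1 for r in reports if not r.resolved)
print(open_count)  # 3 open reports awaiting fixes
```

Nothing here is exotic; the novelty would be requiring hospitals and vendors to collect and share this data at all.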
Two guiding principles here:
- Lives are at stake. I can imagine no valid excuse for interfering with this effort.
- Let doctors and nurses do their jobs. If a system interferes with my nurse practitioner, to me that’s a problem. We must stop systems from getting in the way; good systems don’t.
I imagine this has to be combined with amnesty for errors. If people get punished for reporting a mistake, the reporting won’t happen. (I’ve heard the FAA has such a policy, and it’s helped greatly in reducing the causes of crashes.)
A third idea, which would need to be thought out: Allow a second set of eyes to check for obvious mistakes. An obvious resource here is the patient or family or advocate. But given that the systems can be awkward for professionals to use, I’m not sure how to approach this.
I do know, though, that no stakeholder is more motivated. And as cancer widow and 73 cents artist Regina Holliday made abundantly clear in December, you might be surprised what a motivated “just a high school graduate” can spot that’s a useful contribution. (And free.)
What else can we say about achieving safe, reliable data? What policies should they recommend, to cope with the cream of the crap? Comment please.
Some background “footnotey” details follow.
This workgroup’s position in the hierarchy:
- The Health IT Policy Committee is chaired by Dr. David Blumenthal, the National Coordinator for health IT. The committee is tasked with
- recommending a “policy framework for … a nationwide health information infrastructure, including standards for the exchange of patient medical information.” (That’s your medical data, your mother’s, etc. etc.)
- “recommendations on standards, implementation specifications, and certifications criteria in eight specific areas.” In other words, they get to say what’s acceptable and what’s not, when it comes to health IT.
- Inside that committee, this is a meeting of the Certification/Adoption Workgroup, which will make recommendations about “certified electronic health records that support meaningful use, including issues related to certification, health information extension centers and workforce training.”
- Got that? That’s “How do we certify that a given health IT system is reliable, so it contains – and transmits – accurate data?”
- They even get to recommend issues of workforce training. If you’ve ever implemented a new computer system in your workplace, you know how important that is.
This meeting’s agenda:
Adoption/Certification Workgroup Meeting
Omni Shoreham Hotel, 2500 Calvert Street, NW, Washington, DC
Thursday, February 25, 2010, 9:00 a.m. to 3:00 p.m. Eastern Time
9:00 a.m. Call to Order/Roll Call – Judy Sparrow, Office of the National Coordinator
9:05 a.m. Meeting Objectives and Outcomes: Health IT Safety
– Paul Egerman and Marc Probst, Co-Chairs
9:15 a.m. Identifying the Issues
- Ross Koppel, University of Pennsylvania
- David Classen, CSC
- Gil Kuperman, Columbia University
- Alan Morris, Intermountain Healthcare
10:45 a.m. Stakeholders
- Dave deBronkart, ePatientDave
- Justin Starren, Marshfield Clinic
- Jeanie Scott, Veterans Health Administration
- Susan Edgman-Levitin, National Patient Safety Foundation [invited]
- Gay Johannes, Cerner
- Carl Dvorak, Epic
12:15 p.m. LUNCH BREAK
1:00 p.m. Possible Approaches
- Jeff Shuren, FDA/medical devices
- William Munier, AHRQ
- James Walker, Geisinger
- Edward Shortliffe, AMIA
2:30 p.m. Summary Comments from the Workgroup
2:45 p.m. Public Comments
3:00 p.m. Adjourn