One of the principal areas to be understood and developed as we expand participatory medicine is decision making. As patients become “responsible drivers of their care, and providers welcome and value them as full partners,” patients participate in decisions – and inevitably start to understand the decision process that used to be the domain of the clinician. So we’ve written often about shared medical decision making.
And whether you’re the clinician or the patient, it turns out illusions are a real challenge. The thing is, even when you know it’s an illusion, the illusion’s still there. And it’s really hard to persuade people that they’re not seeing something they know they’re seeing. Example:
If you can’t see the presentation below, view it here: Optical illusion examples
If that doesn’t work, try the no-animations Slideshare version.
Update 1/4/2014: Be sure also to see the new videos in the comment below.
The three books cited here – Nudge, Switch, and Thinking, Fast and Slow – have earned popularity in health circles. Thinking is heavier than the others, literally and metaphorically; together, they make the case that presenting the brain with facts isn't much use in overcoming an illusion.
And that’s really hard to accept if, like most clinicians, you’ve been trained to respond only to facts. In this case though, the ultimate fact is that it’s not that simple: information doesn’t change behavior.
Thinking, Fast & Slow (Oct. 2011) is by Nobel-winning economist Daniel Kahneman. It’s hefty and chewy, packed with 30+ years of research into the usual quick-thinking mind (fast, which he calls “System 1”) and the methodical “stop & think” mind (slower, “System 2”). The decades of evidence are pretty compelling that System 2 isn’t involved in nearly as many wants and decisions as we’d like to think: System 1 runs almost everything, and is quite prone to illusions and mirages. Even when System 2 knows it.
The other two books are at a much more popular level. At first they struck me as potentially air-headed or light (“oy, another trendy one-word title”), so I didn’t read them. But Kahneman took me to school and I get it.
And here’s the problem: clinicians and policy people alike try to persuade using facts and rationale, but it doesn’t work. That approach speaks to “System 2,” which is often overruled by what System 1 swears is actually important or actually happening. And it’s an illusion of the worst kind: the rational mind swears it’s thinking, but other factors are more compelling.
So what do we do?
2010’s Switch describes this as “the elephant, the rider and the path.” The rider is the rational one, but the elephant has all the power. You can talk logic all you want, but if the elephant gets spooked or excited, it does what it wants. So there’s more leverage, Switch says, by shaping the path that the elephant walks – the environment, the way choices are presented, and so on.
2008’s Nudge (with elephants on the cover!) is along similar lines and talks particularly about presenting choices in ways that “nudge” us to better decisions, which Switch would call “shaping the path.” For a good quick understanding of Nudge, I recommend this Amazon review.
_____
For cross-discipline subjects like this, I always try to ask, “What are the takeaways for e-patients? For participatory clinicians?” This time that’s a rich question, because we’re changing the culture of medicine, which cares a lot about what’s true, what’s valid, etc. Plus, we’re asking clinicians to make wrenching changes in how they do their jobs and run their businesses; we try to persuade with information, when the real challenge may be people’s beliefs and expectations – the elephant’s home.
(If you get irked by this, have another look at those slides. This is a human issue, not an individual one.)
Some suggested takeaways:
- Information is not a sure path to behavior change.
- Don’t be surprised if information isn’t persuasive.
- Don’t try to solve it with more of the same.
- Don’t interpret resistance as a moral failure.
- Look for ways to relate to the elephant.
Others, anyone?
Great post, Dave. Thanks for summarizing Thinking, Fast and Slow. Halfway through it my System 1 rebelled against my System 2 and said “Get to the point! Somebody must have written an article summarizing all these studies.” I stopped reading, and have been searching for a summary ever since.
One thing I think is not sufficiently treated in these discussions of decision making is the influence on System 1, if you will, of a lifetime of immersion in a culture that constantly reinforces specific patterns of behaving and deciding. The decider (homage to GW Bush here) isn’t just trying to reach a rational decision; they’re trying to decide without running contrary to the interpersonal and societal context that’s giving them cues every moment. Our elephant wants to stick to well-worn paths carved out by our companions. I think there’s more sociology involved in decision making than is generally acknowledged.
Your five points are great. I wish they had been in the health education curriculum when I was in school 40 years ago. It would have saved us a lot of beating our heads against the wall.
And in this election year keeping those ideas in mind might take some of the venom out of the rhetoric we’ll be exposed to.
Thanks, David. We gotta have coffee someday. :)
I had the same experience with Thinking. It’s a hot seller but I bet a quiz on the back half of the book would draw a bunch of goose eggs. BUT, I got it in audio too and listened to it on the treadmill.
And I have to say, the depth and chewiness is necessary, because its ultimate message is so irrational, and so unacceptable to the thinking mind, that only overwhelming evidence will be convincing.
(When I can’t get through a book like that, I go to Amazon and read the highest-rated reviews. For me that still wouldn’t have been convincing enough, this time – only the weight of all the evidence was enough.)
Thanks for the discussion, Dave, and the motivation to finish the book! I haven’t come across Nudge or Switch, but I’ll seek them out now.
Healthcare is not my deal. I am applying the thinking to teams I work with in business. Your discussion is helpful.
CMC
Can’t help but ask, Charlie, if healthcare isn’t your schtick, what brought you here?
A superb new example of the power of illusions just popped up on Facebook: http://distractify.com/fun/amazing-t-rex-optical-illusion/
After you watch the first one (about T Rexes), watch some of the others.
Thanks for another great post, Dave (which I am now seeing, very late in the game, for the very first time)! What a cool way to help folks open up their minds when it matters most – identify assumptions they might not be aware of, and prepare the mind for literally life-changing options and outcomes.
I’d love to see these optical illusions side-by-side in a presentation with specific medical examples. Some good ones that come to mind: what patients think EHRs do (though of course they don’t), and what MDs and other clinicians think on the same subject; what patients think when they hear about measures to combat the Opioid Epidemic, vs. what MDs and gov’t officials think, and what the actual stats show, by illness and Rx; what patients think about the different types of mammography, breast sonography, and breast MRIs…what doctors think…gov’t regulators…insurers…and what those tests really do.
So many great examples just waiting to bring a huge audience to its knees, laughing and crying at the striking contrasts. And, no illusion here – when they’re laughing and crying, that’s when you’ve really achieved something and set the stage for real change. Unless they really feel something, they won’t remember whatever you are saying, and unless they remember, they will not take action. Feel – Remember – Act! (Always have liked the sequence of those three letters anyhow – those of you with alphabetically-oriented brains will not wonder why :)
Hi Francie – somehow I just saw this comment now.
This summer I’ve been fascinated with another behavioral economist, Dan Ariely, author of Predictably Irrational, which I now consider indispensable for anyone who wants to understand this field.
He’s the most intriguing writer and speaker I’ve found in the field, with several TED Talks. For instance, see the visual illusions a couple of minutes into this talk.
Ariely’s concepts are important to understand and, like good writing, can be used to reveal and illuminate, rather than manipulate, patient and doctor/nurse/HC provider choices.
In approaching chemotherapy, for example, some patients are asked to choose between “the Cadillac of chemo” [an inappropriate, insensitive real-life MD phrase describing the regimen that hits cancer cells hardest, but with increased risk of congestive heart failure and leukemia]; a middle path with somewhat less impressive statistical results and many possible side effects, but not all of the super-scary potential side effects of option one; and a sharply less aggressive regimen, with less impressive results for the cancer not returning, in which patients might not even lose their hair.
Needless to say, Option 2 looks pretty good when described in this limited manner (the info patients get really is amazingly limited! Maybe someday an IBM Watson-like chatbot could supplement it with more info – patients have a lot of questions that go unanswered, even when the answer might be “Science doesn’t know yet” or “Good question, let’s check on that and find out”).
Is this a good thing or a bad thing? I don’t think it’s an accident that the info presented on chemo options in the example above is so limited. Doctors/hospitals are trying to help patients choose – but for patients who are the thinking type, it’s obvious that a big decision is being confronted with very few facts, and lots of legal waivers of responsibility – a situation which naturally ignites a state of nerves, not confidence.
It will be great when personalized medicine and clinical use of big data become an entrenched, accessible reality, with patients/providers being exposed to info more like “For patients like you in X, Y, Z and Q ways, this treatment tended to produce the following results” on their way to making decisions informed by facts as much as possible.
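To make that “patients like you in X, Y, Z and Q ways” idea a bit more concrete, here’s a minimal sketch in Python. The dataset, column names, matching criteria, and thresholds are all hypothetical, invented purely for illustration; it’s a toy, not any real clinical decision-support system.

```python
# Hypothetical sketch: summarize outcomes for "patients like you".
# All data, column names, and thresholds below are made up for illustration.
import pandas as pd

# Imagine a de-identified outcomes table with one row per treated patient.
patients = pd.DataFrame({
    "age":            [62, 58, 71, 64, 60, 67],
    "stage":          ["II", "II", "III", "II", "II", "III"],
    "her2_pos":       [True, True, False, True, True, False],
    "regimen":        ["aggressive", "middle", "middle", "aggressive", "middle", "gentle"],
    "recurrence_5yr": [False, True, True, False, False, True],
})

def patients_like_me(df, age, stage, her2_pos, age_window=5):
    """Filter to patients who match on a few chosen characteristics."""
    return df[
        df["age"].between(age - age_window, age + age_window)
        & (df["stage"] == stage)
        & (df["her2_pos"] == her2_pos)
    ]

cohort = patients_like_me(patients, age=61, stage="II", her2_pos=True)

# For each regimen, how many similar patients were treated,
# and what fraction saw a 5-year recurrence?
summary = cohort.groupby("regimen")["recurrence_5yr"].agg(["count", "mean"])
print(summary.rename(columns={"count": "n_patients", "mean": "recurrence_rate"}))
```

A real system would need far richer matching and careful validation, but the shape of the question is the same: filter to a comparable cohort, then report outcomes by treatment.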