My last post discussed how the building blocks of expertise and experience – stored memory collections of events and ideas – are the repository from which we make diagnostic and therapeutic decisions. Now, let’s consider three key strategies that may help us be more analytical in our decision making, counterbalancing availability bias (confusing memorability with probability). Before we do, remember that intuitive thinking (heuristics, or “pattern matching”) serves us well most of the time. We can make faster decisions with less information, and we are often correct. This post addresses the smaller number of cases in which heuristics and expert intuition lead us to a cognitive error called “premature closure” – selecting the first plausible diagnosis that comes to mind and seems to match the contextual information.
Strategies to prevent premature closure hinge upon deliberate effort to analyze the situation and to regularly seek decision support (whether via technology, cognitive aids, consultation with colleagues, or all of these).
Strategy: The Rule of Three – This rule is simple, and purposefully so. It requires developing a new habit of always considering at least three possible causes for any medical perturbation, even when one cause seems obvious or is very common. (For example, it is extremely common for blood pressure to drop as a result of anesthetic induction agents, but that is not the only possible cause of hypotension in the operating room. What about anaphylaxis, myocardial ischemia, etc.?) This habit ensures that we are mentally exercised to consider a variety of options, and primes us to recognize and rescue from an error sooner than we otherwise might. Not restricted to diagnostic possibilities, the Rule of Three also mandates that you never repeat a therapeutic maneuver more than three times without pausing to consider whether the therapy (and presumed diagnosis) are correct. As an example, giving a third dose of the same vasopressor for hypotension should trigger consideration of whether a different pharmacologic agent might be more appropriate, and perhaps whether the presumed cause of hypotension is fully understood. It may turn out that you are indeed correct, and more of the same is appropriate. The focus is on cultivating a habit of analysis and establishing an easy mental trigger for reassessment – the number three.
Strategy: Prospective Hindsight – This strategy (adapted from military tactics) asks you to pause in the middle of a clinical scenario and metaphorically look into the future: if it is discovered later that you are wrong right now, what will people say were the “obvious red flags” you missed? Hindsight and outcome bias allow us to identify data others should have seen or steps they should have taken, and to criticize others for what seems so very obvious after the fact. Of course, things are rarely so clear as events are evolving. The Prospective Hindsight exercise is intended to harness the value of hindsight in the present moment. So, seek disconfirming evidence, and ask yourself, “Could this be something else?” and “What am I missing now?” If you engage in this kind of thinking and are satisfied that you are indeed on the right track, and you cannot identify evidence to the contrary, you can proceed. You might still be wrong, as we all are from time to time, but you will at least have considered whether all of the pieces truly fit. You will also be able to justify your decisions if they are questioned, because you will have thought about the situation thoroughly.
Strategy: “Ten Seconds for Ten Minutes” – Marcus Rall suggests that in emergencies we act too quickly and make rash decisions. He proposes that instead of making a snap-judgment diagnosis, we take ten seconds to summarize the data aloud, which in itself may be enough to trigger thoughts of alternate possibilities. Going further, use the ten seconds to solicit input from team members and craft an initial treatment plan. Many of us are familiar with Law #3 from the book “The House of God”: “At a cardiac arrest, the first procedure is to take your own pulse.” I believe Dr. Rall’s advice is in line with this law: slow down and be thoughtful rather than impulsive. Impulses are often right, but we must balance them with careful consideration, even if we ultimately carry out the plan our instincts suggested.
These three strategies are intended to counterbalance the natural state of expertise – that is, the natural inclination to “know” what something is based on a recognizable pattern that fits past experiences and learning. Heuristics lead to correct answers most of the time, but since there is no way to know we are wrong at the time we are wrong (here I have to recommend Kathryn Schulz’s book on “wrongology”), we must practice strategies that deliberately add both internal analysis and external decision support to routine decisions. Doing so as a habit when we are right will ensure we are also doing so when we are wrong.
PS – it is worth considering whether we can actually succeed in overriding our intuitive thinking, and if we do, whether that would in fact improve diagnostic accuracy. It is perhaps just as likely that we would be interrupting and sabotaging the expert decision making that has been serving us well most of the time. On balance, will we be better off?
Note: This post is the second part of a series intended to complement my upcoming lecture as faculty for the Stanford School of Medicine course “Medical Education in the New Millennium: Innovation and Digital Disruption.”
What is your favorite strategy for ensuring you are making the right decisions?