There have been several recent posts about cognitive biases, and I thought I’d weigh in with some ideas to consider. It's hard to fit all of this into 140 characters, so bear with me. Ping: @TheSGEM @srrezaie @sherbino @sandnsurf @TChanMD @precordialthump 1/
In the spirit of furthering evidence-based #FOAMed, I think we as a community need to be careful about accepting the "truisms" of cognitive biases without thinking critically about where they come from and the evidence supporting their impact on clinical care. 2/
The story of cognitive biases in medicine is largely based upon the work of Kahneman & Tversky, who demonstrated that individuals think using heuristics rather than calculating actual probabilities. 3/
These experiments were largely done in novices (not experts) and were “rigged” so that heuristics produced incorrect answers; this was HOW they demonstrated that participants were using heuristics. 4/
This does NOT mean that heuristics are maladaptive in all settings, yet the idea that humans are inherently fallible and bias-prone became a common translation of these experiments. There are holes in this logic, nicely outlined by Lopes: https://bit.ly/2Kb3HPz 5/
Translations of Kahneman & Tversky’s work into the medical world are based almost entirely on conjecture ( https://bit.ly/2Kb3HPz ) or retrospective reviews of adverse events ( https://bit.ly/2FcE25E ). 6/
Remember that biases are SUBconscious, so we should be skeptical when individuals claim to have found biases such as ‘anchoring’ and ‘availability’ in themselves or others by examining past actions (it often means they have looked back at the medical record). 7/
If learning about biases is to be useful for practice, we should be able to use this approach in REAL TIME, when we can still correct our behaviors. This means that our identification of biases should be INDEPENDENT of future outcomes, which are unknowable in these decision-making moments. 8/
Yet when clinicians review hypothetical cases w/ BAD outcomes, they identify TWICE as many biases as clinicians reviewing the SAME cases (IDENTICAL management decisions) w/ GOOD outcomes. A clear demonstration of hindsight bias. https://bit.ly/2K8TTWm 9/
And when expert clinicians were asked WHICH biases they thought were in play, there was ZERO agreement about which biases influenced behavior. https://bit.ly/2K8TTWm 10/
Taken together, these findings suggest that we cannot identify specific biases in the moment, and that our assumptions about biases rest on information not available to us in those moments. 11/
Knowing about biases may still be useful. It highlights the non-probabilistic ways that we think. But bringing the intricacies of our subconscious mind into conscious awareness in real time is likely not possible. 12/
And @sherbino will likely agree that focusing our teaching on approaches towards identifying and circumventing SPECIFIC biases may lead to misguided efforts ( https://bit.ly/2Hoe0lL ) 13/
Instead, there is much stronger evidence that focusing our teaching on information gathering, knowledge, and knowledge application will have a greater impact on our diagnostic performance ( https://bit.ly/2Fcjt9p ) 14/
While it is compelling to think of the human mind as inherently fallible and error-prone, the message that ‘intuition is bad thinking’ and that ‘we should overcome our intuition’ neglects the centrality of heuristics in expert performance. 15/
General approaches to ‘slow down’ and ‘think harder’ in ALL cases have not been shown to improve performance ( https://bit.ly/2JlTgYh ); in fact, moving efficiently through cases where you are confident is a marker of strong Dx performance ( https://bit.ly/2HruFB0 ) 16/
Instead, as @moultonca has suggested, “slowing down when you should” is likely a more adaptive approach ( https://bit.ly/2qWlm4R ). These slowing-down moments are a useful way to trigger metacognitive thought when it matters, gather more data, and ask for help. 17/
If conceptualizing the mind as fallible/biased helps individuals recognize that they should pay attention to moments when they are feeling uncertain, I can buy the argument that learning about the potential failings of our subconscious has utility. 18/
Just don’t expect that you’ll be able to put words to what’s actually going on in your head in the moment. It is, after all, your SUB-conscious. 19/
Reflecting in the moment still has utility, not because you have identified a bias, but because you have identified complexity and uncertainty. These are the times when analytic, conscious thought (slowing down) is likely to help. 20/
And reflecting on your performance and potential biases may help you to perform better in the FUTURE. This is how knowledge is shaped by our experiences. 21/
And finally, remember that most of this discussion about bias pertains to DIAGNOSTIC accuracy. Dx is only part of expertise in CLINICAL reasoning. There are many other considerations (risk, patient preferences, resource utilization) that factor into patient-centered care. /fin