Wednesday, August 29, 2012

Anchors Aweigh! Cognitive Bias - Where IS This Ship Headed?

The Gist:  We all succumb to cognitive errors from time to time.  Identifying these errors in our medical decision making through exercising metacognition may improve patient safety, and it may also make us better clinicians.  NB:  This is not a comprehensive review of all types of cognitive biases, which one can find here in "List 1."  Rather, this is a synopsis of some of the commonest cognitive biases, which I've learned about first-hand.

Anchoring Bias - when a first impression or one piece of evidence exerts undue influence on the diagnostic process.
  • Case:  Listen to a great new blogger/podcaster, Dr. Bob Stuntz, present a case on Anchoring Bias in which he gives an excellent example of the patient who comes in complaining of "I have a kidney stone."
  • Solution: wait until information about the case is complete before forming an impression or selling yourself on a diagnosis.  Note:  clearly in some critical situations one must act before information is complete.   
Triage Cueing - bias initiated by the patient's initial triage level, assuming that a patient can't be sicker than their triage level.  
  • Case:  A patient was placed in the "minor care" area for a "sore throat."  The patient's PMH included hypertension and the history elucidated that the patient's complaint was more of a dry throat (drinking copious amounts of water) coupled with a yeast infection that wouldn't go away.  She also generally felt terrible and weak.  The attending was initially wary of the idea of a fingerstick glucose level but acquiesced after discussing that polydipsia and intractable yeast infections are harbingers of uncontrolled diabetes.  The result = 587 mg/dL.  Chemistry demonstrated that the patient was in mild DKA, with newly diagnosed diabetes.
  • Solution:  Recognize that patients have the potential to be sick regardless of initial triage level.  Triage cueing may also set up another cognitive bias, Diagnostic Momentum, where a patient's workup is based solely on one diagnosis or label (hand-offs at sign out serve as notorious examples).  
Premature closure - when one accepts a diagnosis before it has been verified.
  • Case: A 48 year old male is transferred from an outside hospital for CHF.  He presented with acute dyspnea accompanied by some pinkish phlegm.  He denied chest pain, pressure, leg swelling, travel, or cough. He was slightly hypertensive, satting 93% on 3L (non-smoker). Troponin was negative, and the ECG showed a potentially new LBBB. The patient was given furosemide and a diagnosis of CHF. Upon arrival to our ED he had a BNP of 52 and was found on further workup to have a PE, since his story of CHF didn't seem to fit with our independent evaluation.
  • Solution:  Look at the evidence that both supports and refutes the diagnosis and, if lacking, obtain appropriate evidence.
Confirmation bias - the tendency to look for evidence that confirms the hypothesis rather than searching for evidence that refutes it.
  • Case:  A 40-something year old patient presents to the hospital with nausea, vomiting, and epigastric pain. The patient has a history of pancreatitis and Helicobacter pylori in addition to the all-American trio:  diabetes, hypertension, and hypercholesterolemia.  When the lipase came back over 300 and the bilirubin and transaminases were also fairly elevated, the presumed diagnosis of pancreatitis seemed confirmed, and the patient was observed until his pain and nausea were controlled and he passed a PO challenge.  The patient bounced back within 24 hours in heart failure from a sizable MI.
  • Solution:  Look at incoming data objectively before selecting out certain pieces.
Search Satisfying - the tendency to cease looking for other findings/disease processes once something is found.
  • Case:  A patient presents s/p motorcycle crash with right arm pain.  Exam demonstrates an avulsion injury over the patient's right elbow.  X-rays were negative, the wound was repaired, and the patient was readied for discharge.  Upon an additional exam, the patient had tenderness in the anatomic snuff box and we found the following:
Scaphoid Fracture!
  • Solution:  Ask yourself - Is there anything else going on here? ATLS has helped decrease the tendency for search satisfying bias in trauma situations through algorithms.
Availability and Non-availability - the greater a diagnosis's prevalence (in the ED, literature, community, news, etc.), the more likely we are to think of and pursue it (and the converse also holds true).
  • Case:  A 4 year old male presents with several days of fever.  He also had cracking of his lips, a maculopapular rash, and cervical lymphadenopathy.  The patient was diagnosed with Kawasaki Disease after a day.  The next several patients that came in with more than a few days of fever got complete workups/evaluation for Kawasaki. 
  • Solution:  Ask yourself - Is the diagnosis based on the case and data or based on something you're comfortable with?  Uncommon things happen too - keep these in mind as well (I'm biased as I love a good Zebra!).
Ascertainment Bias - one sees what one expects to see (a self-fulfilling prophecy).
  • Case: A 34 year old male, well known to the ED and EMS for frequent overdoses, presents with AMS and respiratory depression after his friends watched him shoot up heroin.  The patient was brought in with the diagnosis of overdose and his initial workup and treatment revolved around that one diagnosis.  Eventually the patient required intubation and upon further exam was noted to have unequal pupils.  Although this "frequent flyer" did have some level of overdose going on, the label as a "frequent flyer" and "overdose" initially obscured the fact that he was actively herniating due to a large subdural hematoma.
  • Solution:  Realize that patients who abuse drugs or have "red flag" diagnoses or allergies get sick, too. Look at each patient with fresh eyes.
As a student, I force myself to generate 5 items on a differential before I present.  Sometimes this is ridiculous and a clear stretch, but I use that tiny bit of time to think about why I'm thinking the way I am and potentially identify some of my cognitive biases.  As information comes in from further evaluation, diagnostics, etc., I look at the data and integrate it into my leading diagnosis, as well as my differential.  It really doesn't take additional time if I force myself to do it every time.

We will never be able to eliminate all errors, especially since systems errors play an enormous role in medical errors (and contribute to cognitive error), but perhaps we can train ourselves to reduce those that are in our control.  Life in the Fast Lane provides some succinct case-based insight into cognitive errors with these case scenarios.  There's some argument that recognition of these biases may not translate into meaningful patient outcomes, but I still think it's good form to think with intention and act when necessary.  Also, if you haven't yet, check out Dr. Patrick Croskerry's free lectures on the subject (most span all of clinical decision making and errors).

References: 
Croskerry, P.  The Importance of Cognitive Errors in Diagnosis and Strategies to Minimize Them.  Academic Medicine.  August 2003, Vol 78, Issue 8, p775-780.
Croskerry, P.  Achieving Quality in Clinical Decision Making: Cognitive Strategies and Detection of Bias.  Academic Emergency Medicine.  Nov 2002, Vol 9, No 11.
Jepson, Zak.  University of Massachusetts.  Medical Student Lecture. August 15, 2012.

1 comment:

  1. Awesome post. Below are some of my personal notes that I included in my residency application essays; thought you might enjoy:

    Heuristic traps are prevalent in many other realms:
    Occupational safety, ski patrol, mountain guiding, avalanche control, dive mastering, expedition guiding
    Heuristic (Gr. "find or discover"): experience-based techniques that help in problem solving, learning, and discovery. A heuristic is a method for rapidly coming to a solution that is hoped to be close to the best possible answer, or "optimal solution." Also known as "rules of thumb," educated guesses, intuitive judgments, or simply common sense. One can utilize soft clues and global awareness to gain extrasensory input.

    Examples: The acceptance trap - the beginner mistake of believing someone simply because they are more experienced, discounting one's own instinct just because one cannot entirely justify it.

    The familiarity trap - the expert mistake of having seen something many, many times and thus missing subtle differences that may exist in the environment.

    The relying-on-the-expert trap - ceasing to pay attention in a dangerous environment just because you are with an expert.
