Monday, August 27, 2012

Thinking About Thinking

The Gist:  Clinical reasoning in the EM setting differs from that in other arenas: emergency physicians (EPs) must often make life-and-death decisions (and take action) with limited information and even less time.  Our patients are often undifferentiated, and nearly any ailment exists as a possibility.  Furthermore, EPs are constantly juggling multiple patients and responsibilities.  As a result, cognitive errors are common and play a role in clinical decision making (although this applies to all fields of medicine).  Thinking about thinking, or "metacognition," may help reduce these cognitive errors and allow EPs to improve diagnostic and treatment decision making.

The case that made me care about cognitive errors:  A 40-something-year-old patient presented to the hospital with nausea, vomiting, and epigastric pain, stating "this feels a lot like my pancreatitis."  The patient had a history of pancreatitis and Helicobacter pylori infection, in addition to the all-American trio: diabetes, hypertension, and hypercholesterolemia.  The labs, workup, and patient all seemed to proclaim pancreatitis.  Unfortunately, the patient bounced back within a day in heart failure, status-post massive MI.  Ever since, I've been exceptionally wary of diabetics with nausea and vomiting, and I've developed a fascination with the thinking of an EP.

In medical school, we're overtly taught clinical reasoning through data mining in the form of the history, physical exam, and diagnostic testing.  I suppose we're subtly and indirectly taught to think about the way we integrate and assimilate these data into a coherent picture of the patient and an accurate diagnosis.  We subconsciously use heuristics, or cognitive shortcuts, to inform clinical gestalt.  My formal medical education, however, did not include any discussion of cognitive bias or metacognition until a resident at a program where I rotated gave a brief presentation on the subject.
  • Apparently, we can reduce errors if we step back for a moment and think:  What doesn't fit? What have I failed to consider (perhaps a zebra? or a different horse?)?  What biases may be present?  What is leading me to think that this patient has X?
This is so important that I wanted to disseminate this information to help others who strive to be excellent thinkers and swell clinicians.

Basic systems of thinking, between which we have the capability to toggle (and probably should more often).  Check out this lecture.
  • "System 1" or "Fast":  This is instinctive, intuitive, adaptive, associative, quick thinking.  We use this system when we say, "this patient just looks sick" or "I have this gut feeling the patient has ______."  Caution: medical students and trainees should not rely on this type of thinking, although it is often key in EM (think clinical gestalt).
  • "System 2" or "Analytic":  This is a slower process of thinking which is more deliberate and analytic.
Common EM thinking patterns:
  • Hypothetico-Deductive Model:  One of the most common cognitive pathways used in medicine - useful in non-critical situations, as algorithms such as ATLS and ACLS tend to dominate in the more critical circumstances.  
    • Main steps:  Generation, Evaluation, Refinement, Verification.  Errors can be present in any step.  
    • Error in Generation - failure to consider a potential diagnosis (influenced by disease prevalence, atypical presentations, etc).  
    • Error in Evaluation - problems in gathering data, interpreting and assimilating the data, and putting the data in the proper context.
    • Error in Verification - failure to ensure that the final diagnosis fits with the clinical picture and established data/workup.
  • Pattern Recognition Model:  This dominates when an experienced clinician uses clinical gestalt to inform the diagnosis rather than generating a complete differential diagnosis (and is thereby prone to error from failure to consider alternative diagnoses).
  • Rule Out Worst-Case Scenario Model:  Some clinicians employ this most of the time, while others tend to apply it when particularly high-risk diagnoses are on the table.  This method of reasoning is expensive and exposes the patient to excess harm through extensive investigations.
Why may the ED be a breeding ground for cognitive error?
  • High levels of diagnostic uncertainty
  • High decision density and cognitive load
  • High levels of activity
  • Inexperience of some providers (and students)
  • Interruptions and distractions
  • Shift changes
  • Many of these factors, combined, produce fatigue
  • These errors occur in every facet of medicine, but in EM there is a certain expectation that EPs not miss the badness (or anything).
The papers to read (note:  Dr. Patrick Croskerry is a world-renowned expert in this arena.  If you're looking for even more to read, check out his plethora of articles.  He also has several free talks, which can be converted to podcasts within iTunes.)

Life in the Fast Lane provides some succinct case-based insight into cognitive errors with To Err is Human 1 and To Err is Human 2

Achieving quality in clinical decision making: cognitive strategies and detection of bias - Croskerry

This article, also by Croskerry, has a rather complete list of cognitive errors in the tables embedded in the free full-text article.

Other References:
Jepson, Zak. University of Massachusetts.  Medical Student Lecture. August 15, 2012.

Comments:
  • I found this a really good presentation that has definite practical applications to problem solving in general, and to clinical problem solving in particular.
  • An excellent book covering this, though from a general perspective, is Daniel Kahneman's "Thinking, Fast and Slow". I would highly recommend it for an in-depth analysis of heuristic biases, cognitive errors, and the psychology behind them.
  • Thank you for sharing this post, and your experience. Another great read is "How Doctors Think" by Jerome Groopman, written for the lay person; it gives insight into doctors' cognitive errors and advocates for patient-centered care. The good news is: we can prevent cognitive errors by being aware of them, and by teaching and practicing routine cognitive forcing strategies.
  • The Coursera Clinical Problem Solving Course is also very good! It is a six-week introduction to clinical reasoning that explicitly works through decision making.