
One of the biggest safety concerns across healthcare settings is diagnostic error, and unfortunately it is relatively common.

It accounts for 17% of preventable errors in hospitalized patients, and one autopsy study spanning 40 years found that 9% of patients had important medical conditions that went undetected.  Multiple factors contribute to diagnostic error: flaws in the way we communicate and work together, lab and other diagnostic test errors, various members of the healthcare team working in isolation from each other (the “silo effect”), and time pressure.

But our natural human tendencies, and their influence on how we arrive at a diagnosis, can also lead to errors.  Those of us responsible for making a diagnosis and creating a treatment plan must gather and process information such as symptoms, duration of illness, physical exam findings, environment, past medical history, current clinical circumstances and individual patient characteristics, then identify a potential or “provisional” diagnosis that adequately explains all of this information and that will guide additional testing and treatment.  It turns out that we tend to develop mental shortcuts, or “rules of thumb,” to arrive at those conclusions.  In the lingo of cognitive psychology those shortcuts are called heuristics, and we are especially prone to using them when the presenting symptoms are common.  The Patient Safety Network recently listed some common areas of concern, referred to as cognitive biases, in our use of heuristics.  They include:

  • Availability heuristic – the diagnosis applied to a current patient is unduly influenced by experience with past cases, especially if the diagnostician had a positive or negative emotional experience with those cases
  • Anchoring heuristic – relying too heavily on initial impressions, sometimes developed before even talking to the patient (based on chief complaint, initial vital signs and labs, past experiences with the patient)
  • Framing effect – undue impact of subtle cues and collateral information (for example, staff tell you that the patient always exaggerates their symptoms)
  • Blind obedience – giving exaggerated deference to a specialist’s opinion, or relying heavily on a test result

As you might imagine, there is a growing body of literature on this topic and much still to be learned.  But there are some strategies for mitigating the impact of cognitive bias.  These include regular feedback on performance and open discussion of diagnostic errors when they are identified, more autopsies (once far more common, these were important learning opportunities, though a resurgence is unlikely), and the development and use of evidence-based decision support software.  The creation of teams of clinicians who share in the care of patients will also be an important improvement.  But right now, each of us involved in the care of our patients can reflect on our own processes for diagnosing, on how our team functions, and on where we can work to limit the negative impacts of cognitive bias.

Noah Nesin, MD

Dr. Nesin, Vice President of Medical Affairs for PCHC, is a family doctor with 30 years of experience.
