
Clinical Reasoning

January 2018


Applying the Science of Human Fallibility to Patient Safety

When a clinician makes a mistake that harms a patient, the excuse “I’m only human” doesn’t fly. This is why Joe Grubenhoff, MD, is beginning to examine the science of clinical reasoning. He aims to better understand the inherent shortcuts the human brain takes that lead to errors in judgment.

Dr. Grubenhoff is creating a measurement system at Children’s Hospital Colorado to track this type of harm. His goal is to reduce the likelihood of the same error in reasoning happening again. 

According to Dan Hyman, MD, chief quality and patient safety officer, Children’s Colorado is at the forefront of examining clinical decision making to improve patient safety.

“Although errors in diagnostic reasoning were not originally included in the national initiatives to reduce patient harm, we are increasingly aware of the opportunities we have to improve clinical reasoning,” said Dr. Hyman. 

Creating a model for mistakes in clinical reasoning 

Dr. Grubenhoff has been focused on known misdiagnoses that occur in the emergency department. 

“The time pressure, lack of history, and limited patient information provided to doctors in the ED mean we often rely on mental shortcuts to make diagnostic decisions. So often, these shortcuts lead to the right answer. I’m interested in the times they don’t,” Dr. Grubenhoff said. 

As no model for tracking this type of error existed, Drs. Grubenhoff and Hyman determined that a successful program would need three main components: education, culture shifts, and measurement.

Teaching providers about decision making 

“Training in clinical reasoning or formal decision making is not a central part of medical education,” explains Dr. Grubenhoff. “We don’t teach our residents and students how we, as humans, decide.” 

He cites dual process theory, which describes two systems of thinking we use to make decisions: System 1 thinking is rapid and intuitive, relying on mental shortcuts; System 2 thinking is deliberate and analytical, weighing each piece of data to come to a conclusion.

“In the ED, System 1 thinking often leads to the right answer. But it’s also prone to error,” said Dr. Grubenhoff. “System 2 thinking is time-consuming and inefficient; it’s impossible to rely exclusively on this in busy clinical settings.”

Examining poor outcomes that result from System 1 thinking can help us better understand the types of mental shortcuts that happen most frequently. 

“If we can understand the failure points, we can come up with directed solutions to help avoid such mistakes,” said Dr. Grubenhoff. He cites an example of a patient who had been diagnosed with migraine headaches in the extended urgent care network. 

“The patient was suffering from an intracranial abscess, but he never got the right care because no one made the diagnosis. It’s clear from reviewing the records that he suffered from diagnostic errors. It had to do with diagnosis momentum: the first provider’s diagnosis getting passed along from provider to provider.”

Dr. Grubenhoff uses case studies like this, as well as the science of human decision making, to educate clinicians. 

“My hope is for clinicians to understand that their errors in judgment have a scientific basis. That is, it’s not a failure of you as an individual provider, or of insufficient knowledge or training; rather, it’s an error human beings make in general,” he said. “Knowing there’s a system that predictably fails can help us all feel more comfortable owning our own errors.”

That’s why the second goal of the program is shifting the culture around medical mistakes.

Shifting the “shame and blame” culture surrounding errors 

“Recently we surveyed a few of our sections, and we found that most clinicians feel that it’s important for us to learn about diagnostic errors,” said Dr. Grubenhoff. “But we also found that clinicians as individuals were uncomfortable talking about their own errors. So how do we get clinicians to the place where they are comfortable and eager to have these conversations?” 

Dr. Grubenhoff believes shifting the culture is closely tied to education. 

“We aim to give clinicians a scientific framework to understand decision making. This allows the conversation to be less about decisions made by individuals and more about the systems involved in making decisions,” he said. 

Measuring errors in clinical decision making 

“No one nationally has figured out the best way to measure this,” said Dr. Grubenhoff. “The movement in this diagnostic area is pretty nascent: how do you measure something inside a person’s head or a team’s dynamic? It’s a lot harder to get into a clinician’s head to learn how they came to the decision.”

To develop a system, Dr. Grubenhoff is investigating the concept of the safe diagnostic journey. Using a dataset captured by the hospital’s Clinical Effectiveness team during an effort to reduce the number of CT scans used to diagnose appendicitis, he examined the differences between cases of appendicitis diagnosed at the first visit and those diagnosed at a second visit.

Through this examination, he found examples of confirmation bias that contributed to a missed diagnosis.

“When a patient presents with abdominal pain and fever, a rapid strep test may be performed. If the test comes back positive for strep, you think about treating the strep throat even though the patient’s primary complaint was a belly ache,” Dr. Grubenhoff said. 

“You can’t get in the clinician’s head on a large scale. But we’re finding there’s the potential that certain practices, such as ordering unnecessary lab tests, might predispose us to making wrong decisions.” 

Dr. Grubenhoff believes that once we understand the mental shortcuts that lead to failure points, directed solutions to help avoid such errors can be implemented. 

Clinician response to examining errors in decision making 

The physicians Dr. Grubenhoff speaks with typically react in two ways. 

Some attendees are fascinated by what he teaches about the foundations of human reasoning. They understand that they are prone to errors in decision making and want to learn more. The other reaction is one of fear and skepticism. This group is hesitant to have their errors made public. Legal liabilities and fear of reputational damage are legitimate concerns. 

That’s why it’s so helpful for Dr. Grubenhoff to walk clinicians through actual case studies to examine how the decision was made. During a recent presentation, he walked attendees through a case of delayed diagnosis. 

“At first, the reaction from the attendees was ‘the ED should have known better,’” he said. “But once I walked them through the steps taken during the decision-making process, it became obvious to them why that decision had been reached. These examinations defuse the finger-pointing that can creep into conversations between disciplines.”

Above all, Dr. Grubenhoff wants everyone to understand that he’s not interested in who made the error. 

“I want to go a layer deeper and find patterns in the errors we as humans make,” he said. “My hope is that if clinicians understand their errors in judgment have a scientific basis, everyone will feel more comfortable owning up to their errors, knowing they happen as the result of a system that predictably fails.”