Article | July 30, 2018

Cognitive Bias and How it Affects Health Care Providers' Decision Making

Humans make 35,000 decisions a day. How do we know if they're the right ones? Here's a look at cognitive bias and what children's hospitals can do about this issue.

By Darcie Reeson

Turning left onto your street, it hits you: you don't remember the drive home. Somehow, you got there safely. How? You're an experienced driver. You're familiar with the car and comfortable with the route. Your brain recognized the pattern and acted automatically—it simplified how it processed information to conserve cognitive energy. Contrast this with driving a rental car in a city you don't know, an unfamiliar task that requires more deliberation.

It's estimated that humans make 35,000 decisions each day. Experience gives the brain the option of drawing from heuristics, or mental shortcuts that help you make decisions quickly and efficiently. And that works perfectly—most of the time.

In medicine, this fast thinking can give rise to cognitive bias, which skews decisions and judgments, leading to diagnostic errors and, potentially, patient harm and financial consequences. Human brains are conditioned to find the simple answer, and finding that answer may mean we stop looking for alternatives.

Infographic: 7 Tips to Reduce Risk of Diagnostic Errors

Human judgments in health care

During a training session at Texas Children's Hospital, clinicians were split into two groups and asked a question few people know the answer to: Is the population of Ethiopia more or less than a given number? One group was given 45 million as the reference point; the other, 75 million. Then, each group was asked to estimate the population. The estimates tended to cluster around 45 million or 75 million, depending on which group people were in—the individuals were primed, or anchored, to the initial number, even though it had no particular significance. That's cognitive bias at work, specifically the anchoring bias.

Biases are normal for everyone, but for health care providers, the stakes are high. The Joint Commission estimates that 20 percent of diagnostic errors can be attributed to cognitive error, yet awareness of the issue in health care lags behind that in other industries.

In Cognitive Bias Mitigation: Becoming Better Diagnosticians, Patrick Croskerry, M.D., Ph.D., writes that because the topic isn't part of traditional medical training, many clinicians are simply unaware of, and unfamiliar with, cognitive biases and how they affect thinking—especially because they occur in obscurity. "The challenge of these biases is they're subconscious," says Andrew Olson, M.D., FAAP, FACP, assistant professor of medicine and pediatrics at The University of Minnesota Medical School in Minneapolis. "They happen in silence. We don't even know we're doing these things."

There's no argument that cognitive biases exist. Researchers have identified more than 30 types of bias that may lead to diagnostic errors in health care. But there is debate on how to mitigate the effects. How can we change bias if it's hardwired into us? And what if some of the same processes that lead us to err also help us get it right?

Jumping to conclusions

Joe Grubenhoff, M.D., associate medical director of clinical effectiveness at Children's Hospital Colorado in Aurora, was a new resident when he first saw cognitive bias in action. He didn't have a word for it 15 years ago, but he knew something wasn't right. The patient had a dermal sinus tract, an abnormal connection between the skin and the spinal cord that causes cord tethering.

A culture ordered by her primary care physician led to an initial diagnosis of a urinary tract infection (UTI), which fit the symptoms. After failing to improve with oral antibiotics, the patient was admitted to the hospital for IV antibiotic therapy. When she didn't improve, the admitting physician, still convinced the cause was a UTI, ordered a kidney ultrasound.

Although the team knew the dermal sinus tract put the patient at high risk for meningitis, they didn't consider it as a possible source of her infection. "There was something else going on. She should've gotten better with the treatment," Grubenhoff says. The team eventually discovered she had meningitis caused by unusual bacteria, which led to a three-week hospital stay. "This caused a gnawing frustration in the back of my head," he says. "Why do we try to simplify these problems and fit things into stories when they don't fit?"

For the diagnosing physician, the search-satisfying bias meant the UTI fit. For Grubenhoff, it wasn't enough. "It's important to find the opportunity to say, 'I'm going to think about this differently. I'm not going to let the simple story sway me. I need to keep looking at this.'"

Diagnostic error is elusive

When the Institute of Medicine (IOM) released its landmark report, "To Err Is Human: Building a Safer Health System," in 1999, medication errors were mentioned 77 times, while diagnostic errors were mentioned twice. In 2015, a follow-up IOM report, "Improving Diagnosis in Health Care," focused exclusively on diagnostic errors, calling them a "blind spot in the delivery of quality health care." Serious safety events reported to the Child Health Patient Safety Organization in 2017 revealed failure to recognize as the top cause of serious harm in children's hospitals.

Today, while there is more national attention on the topic, little is known about how to define or measure the effect of cognitive biases that lead to diagnostic errors, especially in pediatrics, Olson says. "We have a road map if we want to decrease wrong-site surgery, catheter-associated UTIs or central line-associated infections," he says. "We know the best practices. It's still evolving with diagnostic safety. No one knows the right way to do it or how often we're right or wrong."

Grubenhoff agrees it's difficult to see these errors happening, compared to errors related to well-established clinical practices. "You can see where the failure points are when you're watching the process of properly inserting a bladder catheter. It's not so easy with a clinician's reasoning."

Training takes over

In Thinking, Fast and Slow, author Daniel Kahneman writes, "As you become skilled in a task, its demand for energy diminishes. Studies of the brain have shown the pattern of activity associated with an action changes as skill increases, with fewer brain regions involved."

Medical education teaches clinicians to look for patterns of disease and match patient history with the illness scripts that fit those patterns. What takes intense thought and practice for medical students learning how to make diagnoses will eventually morph into quick decisions. "No one tells you in five years, you're not going to think about this anymore; you're just going to decide," Olson says.

Once they become comfortable with those patterns, clinicians may jump to an early conclusion because it seems simple and straightforward. They feel confident in their diagnostic reasoning, and typically, that's a successful way of making a diagnosis. But there are flaws.

Ellis Arjmand, M.D., chief of Otolaryngology at Texas Children's Hospital in Houston, reviewed a case where a toddler was evaluated for an esophageal foreign body after swallowing a Lego piece. When the exam showed it wasn't in his esophagus, he was sent home. The child returned several weeks later with a persistent cough and fever. A second evaluation revealed a tracheal foreign body, requiring immediate surgery.

Looking back, this was an example of the framing effect: the initial team had the right answer to the wrong question, "Is it in the esophagus?" When the answer was no, they didn't ask, "Where else could it be?" This put the child at significant risk of pneumonia.

"The heart of being a clinician lies in making the correct diagnosis," Grubenhoff says. "We learn through standardized testing and shelf exams and board exams there's always a right answer, there's only one right answer, and we by ourselves are responsible for having that correct answer."

There's a place for shortcuts

Working in a high-stress environment like a children's hospital creates the perfect nesting place for cognitive bias, increasing the possibility of diagnostic errors. "After years of training and patient care, you're looking for the answer that fits the pattern," Arjmand says. "And when people are under pressure, there's a tendency to default to a more rapid mode of decision making. It's hard to slow down and embrace more uncertainty."

Consider the typical hospital environment: a busy clinic day, a full emergency department waiting room, half a dozen patients to round on. "We're forced by our environment to rely on shortcuts because they are time savers," Grubenhoff says. "Most of the time they get us to the right conclusions."

Anika Kumar, M.D., FAAP, pediatric hospitalist at Cleveland Clinic Children's Hospital in Ohio, agrees. "Burnout and fatigue make us rely heavily on our heuristics," she says. "But those aren't always perfect. Nine times out of 10, we're right, but there's one chance that something's different than it appears. Are we lumping everything together with the bias that most children only have one problem? We have to be able to believe we could be missing something."

Using shortcuts reflects a normal cognitive process. The peril, says Arjmand, is defaulting to the simpler model, System 1 thinking, at the wrong time. The key, he says, is to be aware of when you're relying on it and to regulate your thinking accordingly.

The other way of approaching decisions—System 2 thinking—is analytical and resource intensive. Both systems have a place in health care, but when is the right time to use them? Emergencies require a reliance on an acute care algorithm to deliver immediate and lifesaving care, which is different from seeing a patient when there is time to assess the situation. "Do we process every small decision with a great deal of deliberation? That's not the solution," Arjmand says. "The solution is to recognize when it's okay to make a quick judgment and when it isn't."

Support is critical

Well-educated, well-trained and experienced professionals can make diagnostic errors. But the strategies typically used to address those errors or perceived performance issues don't get at the underlying problem. One approach is to provide more education, based on the idea that someone made a mistake because he or she didn't know enough or needs to be mentored. The other is centered on discipline.

However, these approaches don't account for normal brain function that predisposes people to certain judgments. And focusing on the wrong thing contributes to the shame clinicians feel, which further buries the issue. "When you tell someone their reasoning was wrong, it's interpreted as, 'You're not a good provider.' They fear their knowledge base is being judged," says Grubenhoff. "If you're concerned about your peers judging you, it makes it hard to have those conversations. If we can't bring them out in the open and talk about them, we're not going to move forward."

Cognitive bias deserves serious attention

This evolving area of medicine shows potential for improving quality of care, which energizes Grubenhoff. "We have an opportunity to define a new area of health care and scientific inquiry," he says. "But there will be stops and starts because we're trying to reframe 200,000 years of evolution that led us to use these hardwired systems of decision making."

Physicians already actively involved in this area agree it's a difficult topic to tackle. "But at the end of the day, recognizing our biases helps us take care of our patients because no one wants to miss a diagnosis, and no one wants to do anything that could potentially hurt a patient," says Kumar. "And that makes the hard work worth it."

Send questions or comments to magazine@childrenshospitals.org.