Do you believe what you see or do you see what you believe? 

Confirmation Bias affects the way we search for, interpret, and recall information. It is the powerful human tendency to confirm pre-existing beliefs: we give more weight to data that supports what we already believe and less weight to data that refutes it. Francis Bacon said,

The human understanding when it has once adopted an opinion … draws all things else to support and agree with it. And though there be a greater number and weight of instances to be found on the other side, yet these it either neglects or despises, or else by some distinction sets aside and rejects…

If you believe in ghosts, then strange sounds in your house at night, artifacts in pictures, and other unexplained observations are taken as “proof” that ghosts exist. If you don’t believe in ghosts, then the same phenomena are just noisy pipes, bad lighting, and other things that all have a rational explanation. If you believe that a particular medical therapy works, say magnesium sulfate for treatment of preterm labor, then you will look for, interpret, and recall data in a way that supports your belief. Your search of the literature in PubMed will be worded in a way that finds studies that agree with you (and is less likely to find articles that do not); you will analyze any study with a favorable finding less critically, and be more likely to criticize papers that do not support the belief; and you will recall dozens or hundreds of success stories from years of experience where magnesium “worked” while being less likely to recall the times when it did not.

Some people are more prone to confirmation bias than others. For example, people with poor self-confidence are more likely to seek out only data that supports their beliefs compared to people with higher levels of self-confidence.

When researchers do studies, they tend to design and frame them in a way that will provide “evidence” that their theories are correct, even though the scientific approach should be to design the most rigorous study possible to disprove their theories. We prefer “positive” or confirming tests to “negative” tests. For example, if you believe that ghosts are causing the strange noises in your house, you are more likely to look for ghosts with cameras or “paranormal” noise detectors and less likely to examine your noisy pipes. If you perform more positive tests than negative tests, then you are more likely to generate false-positive results that affirm the belief than false-positive results that refute it.

Doctors do this not just in the interpretation of literature and in the design of research studies, but also in the diagnosis and treatment of patients. If a woman presents to the ED with pelvic pain, and the physician’s theory is that an ovarian cyst is the cause of the pain (or at least that the pain is gynecologic in origin), then if an ultrasound shows a 3 cm ovarian cyst, the case is closed (this tendency is called Premature Closure). Instead, the scientific approach would be to try to disprove that the ovarian cyst is the source of pain and then, finding no evidence against the theory, accept it as a working diagnosis until new data comes along.

Patients, like all humans, are driven by the Confirmation Bias. If a patient’s preset tendency is to distrust the mainstream medical community, he will look for “evidence” such as websites, blogs, and “studies” that tend to undermine or discredit what he perceives as mainstream medical opinions. Unfortunately, search engines are good at pandering to this tendency, which makes how we search for information on search engines or electronic databases very important.

Let’s look at some examples. Let’s perform a Google search in two different ways that explore the evidence about the (non)link between vaccines and autism.

  • First, let’s search for “proof that vaccines cause autism.” On the first page of results, there are six links that claim to provide evidence that vaccines cause autism, one that is unclear (without reading it), and three that deny any link. What’s more, the webpages are bold, with titles like “Vaccines DO cause autism – Undeniable scientific proof” and “22 medical studies that show vaccines cause autism.” Google could also tell by the way the question was asked that the searcher already believed that vaccines caused autism; in fact, Google suggested, in the middle of the search results, that I should search for “Which vaccines cause autism?”
  • Next, let’s search for “evidence that vaccines don’t cause autism.” With this query, all 10 results on the first page boldly claim absolutely no link between the two, with titles like: “Massive Study Provides the Best Proof Yet That Vaccines Don’t Cause Autism” and “75 studies that show no link between vaccines and autism.” And Google did not suggest that I search for “Which vaccines cause autism?” because it knows that I don’t care.

So two people with two vastly different beliefs go to Google to do “research,” and both find exactly what they need to confirm their beliefs. Worse, in most instances, Google knows your search history and uses your previous searches to tailor the results of your next query.

What about the scientific literature? Let’s repeat the experiment with PubMed.

  • If we first search for “Benefits of Vitamin D” and limit our search to “Review Articles,” then on the first page of search results, 16 out of 20 results are relevant, and all of those claim some benefit of Vitamin D for diseases ranging from multiple sclerosis to rheumatoid arthritis.
  • If we search instead for “Harms of Vitamin D,” we find that 18 of the 20 results are relevant and about half of these are frankly critical of Vitamin D usage and screening in contexts outside of the prevention of osteoporosis.

Pick almost any medical topic and you can get similar results on Google and PubMed. Do you search for “Fetal benefits of magnesium sulfate” or “Fetal harms of magnesium sulfate”? The truth is, neither search method is appropriate. A better query might be to search for specific outcomes, like “magnesium sulfate AND cerebral palsy” or “magnesium sulfate AND neonatal mortality.” A generalized version of the search might be, “magnesium sulfate AND neonatal outcomes.”
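
If you prefer to run these neutral, outcome-focused queries programmatically, here is a minimal sketch against PubMed’s public E-utilities “esearch” endpoint. It assumes Python with the requests library; the query terms are simply the ones discussed above, not a recommended search set.

```python
# Minimal sketch: count PubMed results for neutral, outcome-focused queries
# using NCBI's E-utilities "esearch" endpoint. Assumes the requests library.
import requests

ESEARCH_URL = "https://eutils.ncbi.nlm.nih.gov/entrez/eutils/esearch.fcgi"

def pubmed_count(term: str) -> int:
    """Return the number of PubMed records matching a query term."""
    params = {"db": "pubmed", "term": term, "retmode": "json", "retmax": 0}
    response = requests.get(ESEARCH_URL, params=params, timeout=30)
    response.raise_for_status()
    # esearch's JSON response reports the total hit count as a string.
    return int(response.json()["esearchresult"]["count"])

# Neutral queries name an exposure and an outcome rather than a conclusion.
for term in ("magnesium sulfate AND cerebral palsy",
             "magnesium sulfate AND neonatal outcomes"):
    print(term, "->", pubmed_count(term), "results")
```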

What happens if we repeat the Google search with this strategy? Searching for “autism AND vaccines” (no bias implied) actually produces the best results yet: of 12 results, 10 are excellent rebuttals of the purported link, accompanied by suggestions from Google Scholar for a number of excellent scientific studies. But we also see two results that make the case for a link between vaccines and autism, giving balance to both sides.

Debiasing Strategies

So what can we do to minimize our susceptibility to the Confirmation Bias? I’ll suggest the following debiasing strategies for the three cognitive arenas where this bias is important: search, interpretation, and recall.

  • Search
    • Perform searches with neutral language, as discussed above.
      • Avoid qualitative words that reflect an underlying bias, as we explored.
    • After finding data that supports a certain conclusion, explicitly search for data to refute that conclusion.
      • Knowing what supports the alternative viewpoint will help strengthen the certainty of your position, and being aware of the weaknesses of your viewpoint will stimulate additional questions that need to be answered (and help you anticipate potential criticism of your position).
  • Interpretation
    • Examine evidence for and against a position with the same rigor and process.
      • All papers should be critically analyzed with the same fervor, not just the ones that disagree with your belief.
      • This is made easier by using a standard, routine method each time, such as mine.
    • Be cognizant of the bias of the researchers, journals, or other sources of evidence that you interpret.
      • Really, this is part of a proper examination of a study; but the reality is that evidence has to be weighted differently depending on the source.
    • Don’t overestimate or underestimate the quality and strength of evidence, both for and against your position.
      • Understanding levels of evidence and strengths and weaknesses of various types of studies aids in this.
    • Don’t overestimate or underestimate the magnitude of effect of a particular intervention or risk factor.
      • Antioxidants may be shown to eliminate free radicals in a good clinical trial with a P-value < 0.05, but if the amount of antioxidants necessary to eliminate free radicals in a clinically meaningful way is more than a human is able to consume, then stop being so passionate about antioxidants.
  • Recall
    • Don’t base your beliefs solely on your own experiences or anecdotal evidence.
      • A variety of cognitive biases tend to affect our recall. Most of your most vivid memories are inaccurate and self-serving. We exaggerate our accomplishments and minimize our failures.
    • Keep a database of your patient outcomes.
      • How good is your surgical technique? How many wound infections or re-operations or bounce-back readmissions have you had? Your memory alone cannot answer these questions. What is your primary cesarean section rate? Only a comprehensive spreadsheet or database of your outcomes can provide the objective answers to these questions (see the sketch after this list).
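
As a minimal sketch of what tallying such a log might look like, here is a small Python example that computes outcome rates from a CSV file. The file name, the “outcome” column, and the outcome labels are hypothetical placeholders for whatever your own database actually records, and it assumes one recorded outcome per case.

```python
# Minimal sketch: tally outcome rates from a personal CSV log of cases.
# Column and file names are hypothetical; adapt them to your own records.
import csv
from collections import Counter

def outcome_rates(path: str) -> dict:
    """Count each outcome in a CSV log and return its rate per case."""
    counts = Counter()
    total = 0
    with open(path, newline="") as f:
        for row in csv.DictReader(f):
            total += 1
            counts[row["outcome"]] += 1  # e.g. "primary_cesarean", "wound_infection", "none"
    return {outcome: n / total for outcome, n in counts.items()}

# Example usage with a hypothetical log of deliveries:
# rates = outcome_rates("deliveries.csv")
# print(rates.get("primary_cesarean", 0.0))
```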

Confirmation Bias is the first of many types of bias that we will talk about. Being aware of our biases is key to reducing medical errors and waste related to cognitive bias.