Imagine waking up tomorrow morning and finding a world you don’t know. Nothing is as it should be. Things that should be good are now bad. Things that should make people happy instead make people sad. Your fundamental ideas about the world, nature, and people are all sorely wrong. Even things you know to be true about yourself no longer hold up under examination. Imagine that you discover that almost everything you know to be true is in fact false!

The experience of this realization would be termed cognitive dissonance, and we might call the anxiety and stress induced by such a realization an existential crisis. We all want a feeling of internal consistency; we want to feel that what we believe comports with the world around us. We run away from situations or facts that show us that what we know to be true might not be true.

Indeed, the most important invention of the human mind is its ability to protect itself from evidence that it is wrong. We lie to ourselves almost constantly, filtering out any evidence that we might be incorrect and quickly grasping for any data that tend to support our personal worldview (confirmation bias). When some proof of our misbelief does sneak in, we either change our belief (rarely) or we discredit, disregard, ignore, rationalize, reject, or misinterpret the evidence (commonly). The more deeply held the belief, the greater the magnitude of the cognitive dissonance and, in turn, the more profound the rejection.

“The lady doth protest too much, methinks.” – Queen Gertrude, Hamlet

Typically, the louder or more banal the reaction, the greater the cognitive dissonance. We really start fighting when our worldview is under attack. Emotional reactions and passionate responses often betray a lack of confidence in the belief. In other words, people who hold a well-supported position rarely need to resort to immature defenses, since logic and evidence are on their side.

That unfamiliar world I asked you to imagine waking up in is actually the world that surrounds you now. But your brain has protected you from this stark reality by convincing you otherwise. How many fundamental beliefs do you hold that are wrong or are at least poorly supported by evidence? How many assumptions that guide your daily life are actually invalid? Is it a scary thought to realize that most things you believe are wrong, incomplete, or at least poorly evidenced? This includes many things of which you are absolutely convinced, even though history will look back at those ideas and judge them as foolish and inane.

We do this on a very personal level. Do you believe that you are really awesome? You may suffer from the Dunning-Kruger effect, which occurs when we overestimate our own skills. Incompetent people often fail to recognize their own inadequacies and lack of skill, and they usually fail to recognize incompetence in others, despite plenty of external evidence of that incompetence. The more incompetent you are, the worse you are at recognizing your own (or anyone else’s) incompetence.

Do you think you are really terrible at everything? You may suffer from Imposter Syndrome, which is essentially the opposite of the Dunning-Kruger effect: highly competent, high-achieving people are unable to internalize and believe that they are competent, despite external evidence of their awesomeness. These phenomena don’t occur because you’re awesome or because you’re incompetent; they occur because you hold a view of yourself that is not compatible with reality, and your brain can’t make sense of the mismatch. They occur because you are either an arrogant tool or because you lack self-esteem.

Science is not immune to the consequences of cognitive dissonance. In fact, science perpetuates the condition by lending rhetorical credence to virtually any idea. If you use PubMed enough, you’ll know that you can find literature to support almost any scientific belief you might have, including two exactly opposite beliefs.

Do you believe that eating processed or red meats increases or has no effect on your risk of pancreatic cancer? Well, you’re right. Heinen et al. in 2009 found no evidence among 120,852 patients of a link between the risk of pancreatic cancer and eating red meats or processed meats. Hurray! Pass me the hot dogs. Of course, Nöthlings et al. in a 2006 study of 190,545 patients found an increased risk of pancreatic cancer among those who consumed red meat or processed meats. Ah, snap! What are we to do? Pick whichever study best comports with your preexisting worldview (at least, that is what people actually do, whether it is right or not).

I can give hundreds of examples like this one. But alas, this is not science – this is confirmation bias. Yet “science” is invoked by both sides, like the Oracle at Delphi, as the authority (an appeal to authority) that justifies the potentially false belief. Nor does a consensus or plurality of experts, or a mass of evidence, qualify as science. This too is mere Scientism. Every correct idea starts out condemned as heresy or fallacy until, finally faced with an overwhelming burden of evidence, the majority accepts the idea as valid.

Okay, but at least we can all agree that elevated LDL cholesterol is associated with an increased risk of mortality. Well, except that this trial in JAMA from 1994 found that elevated LDL and low HDL were not associated with an increased risk of total mortality, coronary heart disease mortality, or hospitalization for myocardial infarction or unstable angina. This 2003 study found that high levels of LDL may protect against atherosclerosis. In fact, according to this 2002 study, low LDL and total cholesterol levels were associated with higher mortality in patients with heart failure. Worse, this 2007 study of over 309,000 people found that lower LDL levels achieved with the use of statin medications were associated with a doubling of the risk of cancer. Hmm.

That feeling in your head right now is called cognitive dissonance. I am not asking you to draw any conclusions about what you think about cholesterol or statin drugs (I kind of am, I guess, and so has science, as the “cholesterol hypothesis” is being replaced), but I am demonstrating that it is easy to pick and choose conclusions from the scientific smörgåsbord. Your mind is quickly rationalizing and reconciling these papers without any real reason to do so, since you’ve never read them (“That’s a bad paper…”, “The increased cancer risk is outweighed by all the good that statins do…”, “That’s only true in people over 70…”, etc.).

We are all guilty of this. It is hardwired into our brains. It is unavoidable. Whole communities of people are guilty of it in a self-perpetuating and synergistic fashion. This large-scale cultural dissonance leads large groups of people to seek to harmonize their beliefs with their environments. We need to rationalize the world we live in to avoid cultural dissonance.

Can you imagine a society where the fathers of young teenage boys routinely allow older adult men to have sex with their sons and, worse, consider it an honor to do so? Well, that was ancient Greek society, and the practice was encouraged and normalized by the greatest minds in Greece, from Aristophanes to Socrates. Why did that society as a whole not defend the young boys and end the shameful practice? Cultural dissonance. We are good at defending what we believe in, and we are good at defending what is commonplace, status quo, or widely accepted. We are reluctant to deviate from what we are used to. I call this cognitive inertia or normalcy inertia. We see the world not as it is, but as we need it to be.

We are slow to change our views about anything we are deeply invested in. We are comfortable with what we know, what is familiar, what is near, and what is common. We are biased towards these things. This is cognitive inertia. We are attracted to what we are told is normal or what seems normal. This leads to a normalcy inertia. It is comfortable to think that the people around us, our parents and our friends, have a good grasp of reality. If our parents, our elders, our leaders, or those whom we view as smarter than us are wrong about the world, what chance do we stand? We are taught culturally what is normal, and we can hardly see the world otherwise.

Cultural dissonance and cognitive inertia exist in science in what is termed prevailing bias. If the prevailing scientific bias is that the sun revolves around the earth, then all evidence (even evidence that refutes the idea) will be seen as supporting that prevailing belief.

It was, of course, Copernicus who, in 1543, made the first serious, modern suggestion that the earth revolved around the sun, and not the other way around. In doing so, he challenged the prevailing view (the prevailing bias) of Ptolemy that had reigned relatively unchallenged for nearly 1400 years. Copernicus did not provide “proof” of his theory, but he did do something that I think is very interesting: he showed that the same celestial observations (the same data) which were accounted for by the Ptolemaic system could also be accounted for by his system. In other words, he showed that there was more than one hypothesis that could fit the data. Tycho Brahe soon added yet a third hypothesis that was compatible with the data.

At the moment that Copernicus first showed that all known data could be explained by his system as well as by the Ptolemaic system, “science” should have immediately stopped and admitted that, barring any new evidence, both explanations were equally likely. This is a foundational principle of Bayesian updating. But, of course, that is not what happened. His contemporaries couldn’t objectively view the issue due to the prevailing bias and the cognitive inertia that went with it, just like you probably didn’t take seriously the very credible articles cited above about cholesterol. In fact, wide acceptance of the Copernican theory as even potentially valid didn’t really occur until after 1700.
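
To put that claim in explicitly Bayesian terms (this is my gloss on the reasoning, not anything Copernicus wrote): Bayes’ rule says that the posterior odds of two hypotheses equal their prior odds multiplied by the ratio of how well each hypothesis predicts the observations:

P(Copernican | data) / P(Ptolemaic | data) = [ P(data | Copernican) / P(data | Ptolemaic) ] × [ P(Copernican) / P(Ptolemaic) ]

Copernicus showed that both systems accounted for the same observations, so the likelihood ratio in the middle is roughly one and the posterior odds collapse to the prior odds. After 1543, in other words, any remaining preference for Ptolemy was carried entirely by prior conviction rather than by the data.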

Nevertheless, by 1610 Galileo had discovered the moons of Jupiter and Kepler had shown that the planetary orbits were elliptical rather than circular, and by 1639 Zupi had discovered the phases of Mercury. All of these were crucial developments and discoveries. If science had properly held heliocentrism to be at least as likely as geocentrism in 1543, it should have accepted it as the most likely explanation by 1610, and as all but proven by 1639.

But none of this happened. It took nearly another hundred years, and not because science is slow and careful; it is because science is biased and imperfect and only gives up prevailing ideas when forced to, or when those who are emotionally invested in those ideas die. This is literally what happened in the Copernican revolution. The diehards resisted and rationalized and denied until those diehards … well … died. This is cognitive inertia.

An example that immediately comes to mind in obstetrics is magnesium sulfate tocolysis. There is absolutely no scientific evidence that the practice works or is beneficial, and this has been the case for over 30 years. Yet it persists, and its true believers simply cannot be convinced otherwise. It is a prevailing bias.

It doesn’t need to be this way. The problem in all of these examples is attachment. We are too attached to the ideas and beliefs we hold; they define us, and they give us meaning, purpose, and comfort. As good as humans are at rationalizing away evidence that contradicts their beliefs, we are also pretty good at determining what is true and what isn’t … as long as we are not attached to the outcome. Indeed, our brains are remarkably good engines for Bayesian updating, and therefore for making decisions and weighing the epistemic value of evidence.

If you don’t care whether Chevy or Ford is better, and you are presented with objective data about a truck from each manufacturer, you will likely make a good decision. But we care too much about Chevy vs. Ford, so this is impractical … unless we blind ourselves. Blinding is helpful in science expressly because of issues like prevailing bias. Yet as much as we talk about blinding in scientific studies, we don’t blind where it really matters. We should blind study designers to what is being studied. We should blind statisticians to what the data represent. We should blind peer reviewers to the subject of the paper. Sounds crazy? It isn’t, and it is all possible and effective.
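
To make the statistician example concrete, here is a minimal sketch in Python of what blinding the analyst could look like, with data, variable names, and group labels entirely of my own invention: someone other than the analyst recodes the treatment groups as neutral labels and withholds the key until the analysis is finalized. It is an illustration of the idea, not a description of any real trial’s workflow.

```python
import random

# Hypothetical raw records: each subject has a real treatment label and an outcome.
# (The drug, variable names, and outcomes here are invented for illustration.)
records = [
    {"subject_id": 1, "treatment": "magnesium_sulfate", "delivered_preterm": True},
    {"subject_id": 2, "treatment": "placebo", "delivered_preterm": False},
    {"subject_id": 3, "treatment": "magnesium_sulfate", "delivered_preterm": False},
    {"subject_id": 4, "treatment": "placebo", "delivered_preterm": True},
]

# A coordinator (not the statistician) builds a random key mapping the real labels
# to neutral codes and keeps the key sealed until the analysis plan is locked.
real_labels = ["magnesium_sulfate", "placebo"]
codes = ["A", "B"]
random.shuffle(codes)
blinding_key = dict(zip(real_labels, codes))

# The statistician only ever sees the coded version of the data.
blinded = [
    {
        "subject_id": r["subject_id"],
        "group": blinding_key[r["treatment"]],
        "delivered_preterm": r["delivered_preterm"],
    }
    for r in records
]

# The analysis compares "Group A" with "Group B"; no one running it knows which
# code is the drug, so there is no prevailing belief to confirm.
for group in sorted({r["group"] for r in blinded}):
    rows = [r for r in blinded if r["group"] == group]
    rate = sum(r["delivered_preterm"] for r in rows) / len(rows)
    print(f"Group {group}: preterm delivery rate = {rate:.2f}")
```

The same recoding trick extends to the other suggestions above: a peer reviewer can be sent a methods-and-results section with the intervention names masked, and a study designer can specify endpoints and analyses before knowing which exposure is being tested.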

We have all learned that an “appeal to authority” is a logical fallacy, but why is it? Shouldn’t authorities be the ones most adept at deciding the truth regarding a particular issue? Shouldn’t expert peer reviewers be the first choice to review papers in their respective fields? Shouldn’t those who have dedicated their lives to a particular area of medicine be the ones who make the practice guidelines about that subject? Well … no. An appeal to authority is a fallacy because it is an appeal to bias. (I realize that some texts restrict the appeal-to-authority fallacy to appeals to an unqualified authority, but those authors miss the point and find themselves in disagreement with the history of the concept.)

Those who are most invested personally in a field are the ones least likely to make objective decisions; they are the most biased and the most emotionally invested; they are the ones most likely to have cognitive inertia; they are the ones most likely not to see the forest for the trees. What’s more, they are really good at defending their position. A rhetorician and logician no less than Socrates himself defended pederasty, and he was certainly a qualified authority. But his ability to make a convincing argument that justified his worldview doesn’t qualify as evidence of the moral rightness of pederasty.

The experts rebuffed and scorned Copernicus for over 100 years; yet a schoolboy, uninvested in how the planets work, presented with an argument for heliocentrism by Galileo and an argument for geocentrism by one of his critical contemporaries, would easily have decided that Galileo was right. In the same way, a medical student shown the body of research for and against magnesium sulfate tocolysis will dismiss the idea in mere seconds, while a high-risk obstetrical specialist with forty years’ experience clings to the falsehood like his life depends on it.

In other words, we typically don’t let facts speak for themselves; rather, we speak for them.

Some of the experts we depend on to make important scientific decisions would not be allowed on a jury to decide a case if it were revealed how invested and therefore biased they are regarding the subject matter. Yet we freely and liberally allow their opinions to go unchallenged on an endless number of scientific subjects.

The idea of prevailing bias has been studied in the scientific literature. John Ioannidis has stated that in some fields of research, like nutrition, “the extent that observed findings deviate from what is expected by chance alone would be simply a pure measure of prevailing bias.” Remember the question about whether red meats and processed meats are associated with an increased risk of pancreatic cancer? The answer according to some very poor meta-analyses of the subject is yes (here’s one), but this yes answer (of a very small, negligible magnitude) likely represents just the prevailing bias in the nutrition field that red meat and processed foods are bad. How do I know? Because of the poor design quality of the original studies, the poor quality of the derivative meta-analyses, and the very small magnitude of the effect. But try telling all of that to a vegan. The vegan would accept, without question, even the poorest of evidence that red meat increases the risk of cancer, but would fight tooth and nail against any claim, even from a high-quality study, that vegetables cause cancer (I’m not claiming that vegetables cause cancer!).

This recent study examined how likely a person is to change a belief when presented with incontrovertible evidence that the belief is wrong. Study participants were happy to change their minds about subjects like “Thomas Edison” or “reading early predicts intelligence” when confronted with contradictory evidence, but they were very unlikely to change their minds about “abortion” or “gay marriage” given equally strong evidence. In general, they were very unlikely to change their minds about political or religious beliefs (since these are deeply held) yet likely to change their minds about things like Thomas Edison, because their beliefs about Thomas Edison don’t define them; their sense of self doesn’t depend on facts related to Edison’s life. Objectively, they should have been willing to change any belief with equal probability when contradictory evidence was presented – but they did not.

So what’s the moral of all of this? Well, I’ll let you decide, because you already have anyway. If I say something you agree with, you’ll say ‘Amen,’ and if I say something you disagree with, you’ll attack my character and my halitosis. But such is life.