Have you ever tried to convince someone that she should vaccinate her children? Or that she should have a vaginal birth after cesarean in the hospital rather than at home? Have you ever tried to convince someone to vote a different way? Or to take a prescribed medicine rather than an alternative medicine? Have you ever tried to convince someone to change based on new data or evidence? Or convince him that something he believes is false?
If you’ve ever tried these things and failed, keep reading.
(If you want the bottom line of what works, start reading past the picture of the arrow below.)
I spend a lot of time thinking about why people misunderstand or, perhaps, don’t care about “facts.” Or, even, more simply, what “facts” are in the first place. Why do physicians do things which are clearly contradicted in the medical literature or contradicted by widely-established guidelines? Why do we continue to repeat myths and perpetuate ignorance? Why do people see evidence and almost immediately and thoughtlessly discard it if it doesn’t agree with what they already believe?
We can describe these behaviors when we talk about bias, like confirmation bias, and other cognitive biases that affect how we respond to information. A lay term for the phenomenon might be “close-minded,” meaning that a person is not open to new ideas or ideas that challenge what he already believes. A lot of research has demonstrated that people don’t have good awareness that they are biased or close-minded and also, perhaps not unexpectedly, that teaching them about bias or pointing out their biases doesn’t help change their minds. In fact, it may even make them less likely to change their minds. Many people who are aware of cognitive bias use that awareness as a tool to more successfully convince themselves that they are right and everyone else in the world is wrong. It is easy to see everyone’s bias but our own.
If there were a verb meaning “to believe falsely,” it would not have any significant first person, present indicative. – Ludwig Wittgenstein
People are reluctant to change their minds because they can’t possibly be wrong, and anything that contradicts their correctness must itself be wrong. C’est la vie.
So, a more interesting question to me is this: why will many physicians not budge from a long-held practice or belief despite good evidence, yet accept a new practice rapidly with almost no evidence? There clearly is a difference in the way we handle evidence depending on whether we already have an established opinion about the subject matter. For example, if a person is of the opinion that magnesium is an effective tocolytic, apparently no amount of good data will convince him that it is not; on the other hand, if he doesn’t have an opinion about whether betamethasone should be given to patients between 34 and 37 weeks gestational age, he will immediately adopt this practice on the weakest of evidence presented to him. In the former example, he had a well-established opinion, so the evidence was irrelevant; in the latter case, he had no opinion, so any evidence at all was seemingly sufficient.
A person who believes that vaccines are harmful will persist in this false belief despite overwhelming and consistent evidence to the contrary; but that same person might start using apple cider vinegar to treat some ailment without any evidence whatsoever that it is effective. Someone may persist in holding a long-believed political view that is not compatible with any evidence, refusing to alter her opinion about the idea, while at the same time taking up a new view or opinion without any compelling evidence simply because a favored political leader has suggested it. So, it’s not that people don’t change their minds or evolve their thinking; quite the contrary. It’s just that they don’t do so based on evidence or reason or logic.
No evidence is powerful enough to force acceptance of a conclusion that is emotionally distasteful. – Theodosius Dobzhansky
Notice that in each example I gave, the level or quality of evidence is irrelevant in the decision, but the difference in each is whether an opinion is already present in the mind of the person who is choosing to believe or disbelieve. In many cases, the authority that disseminates the data or guideline is also irrelevant (though there is some evidence that some trusted figures, like politicians or media personalities, can be effective in changing minds). The American Congress of Obstetricians and Gynecologists (ACOG) says that we should consider using antenatal betamethasone from 34-37 weeks in pregnancies at risk of early delivery, and a physician might cite the authority of ACOG to support adoption of this novel practice; but even though ACOG also says not to use a tocolytic in this same circumstance, the provider might very well do so, ignoring ACOG and its authority in this instance because it disagrees with his long-established practice of using a tocolytic at the same time as a steroid.
Long-held beliefs are usually part of a narrative that helps us explain the world around us. It is the narrative that has so much permanence. If you see the world in a certain way, consistent with a narrative that helps your brain maintain order, then you will only see evidence that agrees with you and you will ignore data that disagrees with you. This is the essence of the confirmation bias and it is why facts don’t seem to matter when making decisions. It is hard to deconstruct a long-held narrative, even though it might be entirely false. What’s more, different narratives can rely on the exact same, undisputed facts; if you look for it, you will see this in political discussion all the time. As I have written about before, the exact same “facts” support that Obama was the greatest jobs-creating president of all time and that he was the worst of all time. The facts are the same, the narrative is different. Both sides can sleep easy at night knowing that their world views are supported by the same facts.
Samuel Arbesman has described three types of facts:
- facts that don’t change,
- facts that change constantly,
- and what he calls “mesofacts,” or a slowly changing fact that is somewhere in between the first two.
I’m not sure that the category “facts that don’t change” is quite right; as an example, Arbesman cites the height of Mount Everest or the capital of the United States. Both of those are facts that could change, though. You probably learned in grade school that Everest was 29,029 feet tall, but the most recent and modern measurement from 2005 recorded the height at 29,017 feet, and, worse, due to tectonic shifts, it probably grows by about 4 mm every year. So, that might not be the best example of a fact that doesn’t change.
At the other end of the spectrum are facts that change quickly, like the temperature outside or a stock price. Indeed, Arbesman observed that people are very comfortable with these types of quickly-changing facts since there is no expectation that they should be fixed.
Mesofacts, on the other hand, change slowly enough that people think of them more like the height of Mount Everest than like the stock price of the Gander Mountain Company. Arbesman cites the example of the world population; at some point you memorized the population of the world for school, but that number was out of date on the very day you learned it.
Most “facts” are mesofacts, and the problem is that people assign too much permanence to them. Certainly, most medical knowledge could be described as “mesofacts.” These mesofacts form the building blocks of complex narratives in our brains that describe the world around us; our brains fill in the blanks in our knowledge to avoid cognitive dissonance. This means we make up new facts, which serve to smooth out our narrative, and these new facts are themselves dependent on mesofacts that may no longer be true.
Eventually, we have a hard time knowing which facts are which, and the whole house of cards comes tumbling down. Except, it doesn’t. Our brains put up a fight and don’t allow the cards to tumble. We dig in deeper, supporting with all our might the false narrative based upon false beliefs and other made-up facts. Worse, we don’t even know this is all happening. We usually don’t critically think about evidence but rather we rationalize evidence in a way that agrees with our preexisting narrative. We confuse this rationalizing for critical thought.
When one “fact” changes, it often has downstream effects because of the house of cards built atop the assumption that the original fact was true. When that fact is falsified by new data, it causes cognitive dissonance, and our brains rationalize away the new data (ignoring it in most cases). You might call this being “close-minded,” that is, being unable to fairly assess evidence that challenges our opinions or even to admit that such evidence exists.
Still, how valid an opinion might be is certainly related to how good the original data is. Even long-held beliefs start somewhere; if they are based on good facts to begin with, that’s better than if they are just based on what your parents or teachers or attendings or celebrities told you. But because most facts are mesofacts, these facts almost always change. How well your view of the world (your internal narrative) comports with reality is directly related to how many of your mesofacts are up to date.
You were probably taught as a child that there were 9 planets and 7 continents; those are mesofacts. Today, we recognize 8 planets and 8 continents (sorry, Pluto, and welcome, Zealandia!). We all know that spinach is a great source of iron (yay, Popeye!), but it isn’t at all (squash has about 6 times as much iron per serving – the myth that spinach had a lot of iron was based on a 19th century mistake). Doctors love telling patients who they think need more potassium to eat a banana, but the average potato has at least three times the amount of potassium as the average banana. Oh, and if the doctor was recommending the potassium because of leg cramps, well, that’s a myth too (or maybe we should call it an old fact or a mesofact).
In this way, complex narratives and assumptions about the world are created. Because of two old mesofacts (that leg cramps are caused by potassium deficiency and that bananas are a good source of potassium), a whole false narrative is constructed, and these narratives die very hard. I don’t have the data, but I’ll bet good money that most doctors tell patients with leg cramps to eat bananas, and that many still recommend that anemic patients eat spinach. Certainly, many obstetricians are still using magnesium as a tocolytic. I saw someone yesterday at the gym doing cardio wearing a hoody so that he could “burn” fat more effectively (based again on a false fact). Despite undisputed evidence of the falsity of these narratives, the incorrect advice continues.
We encourage students to learn these mesofacts. Our multiple-choice tests, which emphasize recall of facts stripped of context, reward the students who can best remember the height of Mount Everest or how many continents there are, even though there is little actual value in knowing these “facts.” Yet we give them value by teaching them and testing them, and that is the real lesson that the student receives – that this fact is important, valued, and permanent. A mesofact, once learned in this way and reinforced, gains permanence in the mind of the student. Students construct narratives around the facts as a memory tool; the narrative itself may be false even if the fact is correct.
Medical school and residency do a great job of rewarding students who regurgitate mesofacts and punishing those who cannot. We assign value to many practices not just through good or bad test scores; we also attach visceral value to them by encouraging fear of litigation or other serious professional repercussions.
Why do obstetricians continue to use magnesium as a tocolytic despite overwhelming and incontrovertible evidence that it lacks efficacy? Because the magnesium-as-a-tocolytic “fact,” perhaps a prevailing mesofact in 1982, hasn’t been abandoned yet: too much value and importance has been assigned to it (“This is saving babies’ lives;” “If you don’t give it and the baby has a bad outcome, you will be sued and you will lose;” etc.). It was hammered into young, impressionable minds and was never presented as something that lacks certainty – it is a fact!
The basic sciences in medical school present prevailing theories as if they are established and unchanging “facts” and students are rarely told that those facts are subject to change. You can tell what year a person graduated medical school simply by what bag of basic science “facts” she carries. You can tell what year someone graduated residency similarly by what bundle of practices she uses.
In Obstetrics, we see many prevailing narratives which establish many practice patterns based on fundamental misperceptions. For example, “contractions cause labor,” “meconium causes meconium-aspiration syndrome,” “embolized amniotic fluid causes amniotic fluid embolism syndrome,” “mechanical weakness causes cervical insufficiency,” or “late decelerations cause hypoxic ischemic encephalopathy.” These narratives, based on false or incomplete underlying mesofacts, in turn lead to practices like tocolytic usage, amnioinfusion to dilute meconium-stained amniotic fluid, avoidance of amniotomy during labor, cervical cerclage placement for women with a prior cold knife conization of the cervix, or cesarean delivery every time a late deceleration pops up on the monitor. A non-obstetric example might be the lipid hypothesis of heart disease, which, even though it has no good scientific support, still dominates the preventative health recommendations and practices of most primary care physicians.
Once a narrative is constructed which neatly explains the world around us or gives us some comfort when we are scared or uncertain, it is hard to move away from it. Deconstructing the narrative creates too many gaps in understanding, too much fear, or too much doubt, and this causes us discomfort. These narratives are the substance of our biases, and we become incapable of even seeing evidence that contradicts what we already believe. Our brains are expert rationalizers. The biases are the filters through which everything in the future is sifted.
In medicine, these narratives are deeply entrenched in medical school and residency. Most of the “facts” that are taught to residents or medical students are just the facts for that brief time; medical knowledge changes at a rate closer to stock market prices than to mountain heights. But even worse, many of these “facts” should never have been accepted as facts in the first place. There was never, ever, a shred of data that said that magnesium was an effective tocolytic; it is only through its adoption as a mesofact that its use has persisted as long as it has.
This is how propaganda works: introduce a “fact,” repeat it many times, make it part of a complex narrative (that is dependent upon the “fact” being true), use that narrative to explain the world to your audience, and, voila!, your new “fact” will enjoy a long life because of its attachment to the narrative. This type of argument accounts for 99% of all political discourse but also a surprising amount of medical and scientific discourse. If you say something enough, if you repeat a fact or a falsehood enough, people will believe it. We tend to believe things that are familiar because our brains take shortcuts. It doesn’t matter why it is familiar though. Rejecting old information requires a lot of effort, and our brains are just lazy.
Herbert M. Shelton said,
It is always a much easier task to educate uneducated people than to re-educate the mis-educated.
Shelton should know – he was one of the biggest quacks and purveyors of woo who ever lived, and he made his living conning and deceiving people. His lies, ranging from naturopathy to fruitarianism to vaccine denial, harmed millions of people. But his fiction seldom convinced a person who already had established beliefs to the contrary. Unfortunately, we face the same challenges when we try to re-educate the mis-educated.
In many ways, it is our desire to construct a narrative that explains the world to us that is our biggest cognitive weakness. It is during the construction of narratives that facts are memorialized, gaps are filled in with fiction, and, by projecting out an incorrect view of the world from our minds onto reality (the Mind Projection Fallacy), we put up roadblocks to ever being able to perceive the truth and weigh evidence fairly. These false narratives make us close-minded. We can’t even understand our own biases or see our blind spots.
Just as your brain has cleverly filled in for you the missing part of the world you can’t see due to the retinal blind spot, so too your brain has constructed actual delusions in your thought processes so that the world comports with your mind in the most comfortable and efficient way. Psychologists call this “motivated reasoning,” meaning that you are motivated to interpret information in a manner that is consistent with your predetermined tendencies. Motivated reasoning sounds good, but it is not.
Our brains seek to know and understand, but many things in the world, particularly many things as complex as human health or social policy, are fraught with uncertainty. Rather than accept uncertainty, we again construct a comfortable narrative that explains the world. Right or wrong, never in doubt is the mantra of most of our minds. Remember, there are few things as dangerous as human intuition; most intuition is just delusion.
Not only do these narratives alter our ability to learn new information, they also lead to the persistence of ideas – false ideas. Essentially, because of mesofacts, your world view may be well out of date. You see this in physicians who practice medicine like they are stuck in the 1980s – even though they themselves likely ridiculed their then contemporaries who practiced medicine like they were stuck in the 1950s. This problem is worsened by what I call the normalcy fallacy – the belief that the world as it exists during your early life is the way the world should always exist and should have always existed. Each generation thinks that it has arrived and conquered all prior generations, and each views the world in which it develops as the way the world should always and forever be. Each generation condemns all prior generations for their ignorance, close-mindedness, and lack of progress. Then, when that newest generation matures, it condemns the one that follows for its ignorance, close-mindedness, and lack of progress.
But the world does change. So can minds change? Or can we change minds? Can we move forward from falsities?
So, what can we do? How do we teach an old dog some new tricks? How can we be self-aware of our universal tendencies to do these things and avoid them? How do I know if I’m doing them?
There aren’t any easy answers.
Brendan Nyhan, from Dartmouth, has contributed to and summarized most of the literature related to challenging false beliefs. You can read his white paper about the question here, but I’ll summarize some of the important points and expand upon them.
Again, the question is, How can we change minds? How can we prevent people from believing falsities? You might want to convince your patients to vaccinate their children or stop using homeopathy to treat their medical conditions; I want to convince you to stop using magnesium as a tocolytic. So how?
Educational level doesn’t help; the more educated people are, the more empowered they are to “rationalize” data to fit their prior beliefs. It might be a hard pill to swallow, but often those most guilty of holding false beliefs and being close-minded are the most educated in society. This is the group of folks that Nassim Nicholas Taleb calls Intellectuals Yet Idiots, and his brilliant essay here is worth a read (if you read it, ignore the GMO-bashing – though I support Taleb’s intellectual right to have his opinion more than I support Jenny McCarthy’s right to criticize vaccines). Academia is a perverse place where pseudo-intellectuals, convinced of their correctness, use their assumed position of authority and intellectual superiority to end arguments and despise outsiders; of course, that appeal to authority is a logical fallacy.
Information delivered from a high-profile or believable source doesn’t seem to make a difference either in most cases. People disregard data that they want to disregard, regardless of who delivers it. At the same time, respected politicians or other famous, compelling figures can alter public perception. If a politician already has your attention by supporting a number of narratives you believe in, then that politician can introduce other ideas, and, by attachment, you will likely come to believe the new narrative as strongly as the old one, without any good reason to. In politics and in life, narrative – not facts – is everything.
You probably will immediately recognize who made the following statement because the narrative is so firmly entrenched in your mind (even though you may not have heard the speech):
After years of neglect, this administration has taken a strong stand to stiffen the protection of our borders. We are increasing border controls by 50 percent. We are increasing inspections to prevent the hiring of illegal immigrants. And tonight, I announce I will sign an executive order to deny federal contracts to businesses that hire illegal immigrants.
Let me be very clear about this: We are still a nation of immigrants; we should be proud of it. We should honor every legal immigrant here, working hard to become a new citizen. But we are also a nation of laws.
That’s right: President Bill Clinton. If you thought Donald Trump was the one who said it and you feel a little weird right now, that’s called cognitive dissonance. Okay, try this one:
I believe marriage is between a man and a woman. I am not in favor of gay marriage.
Correct. That was Barack Obama. I’ll stop now, but the point is that narrative, not facts, is the only important thing in politics. Famous and influential people can promote narratives and then nudge the facts or evidence that support those narratives one way or another.
Remember, it wasn’t a Nobel Prize-winning scientist who raised the alarm about vaccine safety; it was a Playboy Playmate. People love celebrity, and they are more influenced by a well-known person than by an authoritative person.
Framing a false perception or belief as something that is socially unacceptable (in other words, using peer pressure) probably does not work well either. In fact, it likely causes people to dig in and more strongly support the belief as they seek to protect their egos. An example of this is presenting anti-vaxxers as a fringe or marginalized group of nutjobs; that type of ad hominem attack usually results only in a stronger belief against vaccines. It is also patently absurd; they aren’t crazy – they suffer from the same mental block that you likely suffer from, just about a different issue than you. So, if they are crazy, we all are.
Unfortunately, this type of rhetoric is far too common. Debates on the Internet are characterized by memes and trolls. Most such rhetorical devices rely on ad hominem attacks and they don’t change anyone’s mind, but they are popular with supporters who feel morally superior because they have a different opinion than someone else. But if you are actually interested in persuasion, stay away from such nonsense.
People also resist arguments which threaten their identity or sense of self or which might threaten their preferences for a particular policy. So, for example, an environmental activist might not be comfortable with these facts:
- CO2 emissions have fallen dramatically over the last decade in the US;
- Fewer species are going extinct in America now than ever before;
- There’s less pollution today than ever before.
An environmental activist isn’t likely to believe these facts because they threaten her sense of self and they might lead to less focus (politically) on the things that she believes are important. But here’s a lesson: if you want to persuade her, start your presentation of these facts like this, “Because of environmental activists like you and good public policy, we can say today that…” Boom. It’s all about the narrative.
Nyhan found that educating people about the autism-vaccine myth did lead to fewer people believing in it but also led to fewer vaccinations! The narrative isn’t that vaccines cause autism, the narrative is that vaccines are bad. Simply updating one mesofact doesn’t change the whole narrative. Similarly, Nyhan and Reifler found that disabusing people of the idea that the flu vaccine can cause the flu also led to less flu vaccine utilization. The narrative is that the flu vaccine is bad; the idea that the flu shot causes the flu is just one excuse. Challenging people with facts just encouraged more resistance. So, the lesson here too is to stop assaulting people who have opinions different than your own with facts. You have to do something more than that.
For many of the same reasons, fact-checking doesn’t seem to work either. For one thing, people just go to the fact-checker that they know has the same bias that they do, so fact-checkers become little more than an echo chamber that strengthens false ideas. In politics today, fact-checking has devolved into playground theatrics, with ad hominem after ad hominem. None of this noise persuades anyone; it is just red meat for the wolves.
Fact-checking happens in medicine, too. Doctors look things up in textbooks, PubMed, or other places. But they construct their searches in such a way that they are likely to find evidence of what they already believe to be true; when they do find evidence, they stop looking. This is called search satisficing.
As an aside, one of the major pitfalls of science today is that scientists violate one of the most fundamental principles of the scientific method almost every day: Science is about disproving what you believe, not proving it. When you design a study, you should do so in the best possible way you can imagine to disprove your pet theory. When you do a Google or PubMed search, you should be looking for evidence that disproves your idea. But we almost never do. If you are a conservative, use a liberal fact-checker, and if you are a liberal, do the opposite. You don’t have to believe them, but you do need to get used to having your ideas challenged. If you cannot defend your idea against the best counterarguments, your idea is probably wrong. But unless you deliberately look for counterarguments, your idea will never be thoroughly challenged. I have heard this type of discussion among “scientists” designing new studies: “Well, we want to prove such and such, so we should probably do such and such.” This is a major reason why so many published research findings are false.
What else can we do to change minds? We do know that an alternative narrative offered by someone who is ideologically sympathetic has benefit (Berinsky). So, if you are a liberal, for example, you are more likely to believe something that Rachel Maddow tells you than something Rush Limbaugh tells you, even if it is the same thing and even if it disagrees with your current belief. In medicine, this means that refusing to see patients who don’t vaccinate their children is probably not the right way to convince them to vaccinate their children.
Instead of refusing to see patients who don’t take your advice (sounds very childish stated that way), try this next time:
“I can understand why you have concerns about vaccination. My partner and I also struggled with the idea. There’s so many things out there that you hear. Lots of scary stories. But we spent a lot of time researching and reading and ultimately came to the decision that vaccination is absolutely the right thing to do. We want our children to be as safe as possible and I’m sure you do too.”
That wasn’t so hard, was it? Find common ground. Empathize with them. It amazes me that some pediatricians don’t care at all about seeing mothers who smoke around their babies or parents who just don’t really care about their kids, yet a mother who is genuinely concerned and who has invested a lot of time and energy into trying to do what is best for her child isn’t welcome in their practice. The same goes for obstetricians who condescend to patients who are interested in home births or other birthing options.
Patients like this have a narrative, and you must build upon that narrative and fix the faulty parts.
“I’m glad you are interested in birthing alternatives; we have definitely over-medicalized the birth process and as a result babies and mothers are being harmed. The cesarean rate is way too high. We do way too many interventions. Here are my thoughts about how to fix some of this…”
See? Same narrative, gently nudged.
Often, we compete with very powerful counter-narratives. Have you ever watched a YouTube video of a child who was legitimately harmed by a vaccine? Yes, vaccines do kill and injure people. The videos are devastating and heartbreaking. Rather than minimize the emotions that are generated by this type of story, you should embrace them and carry them forward. But change the conclusion. Remember, the same facts can support opposite narratives. Accept the facts and change the narrative. Focus the genuine emotions of your concerned parents towards stories of children who were harmed or killed because they were not vaccinated. Give specific examples. Know the names of the children. Show the videos.
When people are forced to counter-argue against your version of the “facts,” the exercise of making the argument further reinforces their beliefs. Challenging people by arguing about facts rarely works. Telling people that they are wrong rarely works.
Here are two ways a vaccine conversation might go. First:
Parent: “We have decided not to vaccinate our child. We are just worried that her immune system can’t handle all of those antigens at once.”
Physician: “That’s just not true at all. The immune system is designed to handle thousands of antigens every day. Vaccines are the most valuable thing we do for children. They don’t cause autism or any other crazy crap you read on the Internet. In fact, they are so valuable, if you refuse vaccines, we will not see your child in this practice.”
The result of this conversation is that the patient leaves, child unvaccinated, and may not have access to a doctor in the community for when her baby gets sick. Here’s another way it could have gone:
Parent: “We have decided not to vaccinate our child. We are just worried that her immune system can’t handle all of those antigens at once.”
Physician: “Okay. I understand that there are a lot of scary things out there about vaccines. I have seen some heartbreaking stories. Tell me what your understanding of the risks and benefits of vaccines is.”
The parent will then say something. She will feel heard and validated. She will get to make her point. Then the physician will have a chance to clarify some key misperceptions and nudge the narrative. She considers herself to be well-informed and educated about the issue. She didn’t make the decision not to vaccinate her child just because she saw a Facebook meme. How can you counter her position if you don’t even know what it is? Ask open-ended questions first. Don’t lead her to dig in to her ideas even more.
You don’t need to win every point; the simpler the story the better. You just need to win one point and take control of the narrative. The more emotional and empathic the story, the better.
A man with conviction is a hard man to change. Tell him you disagree and he turns away. Show him facts or figures and he questions your sources. Appeal to logic and he fails to see your point. – Leon Festinger
The most important thing we can do is nudge or replace the narrative. Rather than getting caught in the mud of the facts, for which there is a never-ending back and forth of evidence for and against, just present a powerful, emotional narrative.
“Vaccines have saved the lives of millions of people in the last 100 years. In 1850, half of all children died before the age of 5. Can you imagine what it must have been like to be a parent then? But vaccines have made that unthinkable today. Yes, there are some risks but fortunately they are rare and vaccine safety continues to improve.”
There are many ways to do it, but that new narrative provides a new perspective; it’s powerful, it’s compelling.
Make sure you understand the patient’s narrative in detail. Part of her narrative might be a distrust of Big Pharma and a fear that others are pumping her children full of poison in order to profit from her. You need to replace that narrative too:
“It’s disgusting what Andrew Wakefield did. Lawyers paid him over half a million dollars to do dangerous experiments on children and publish false research findings. What he did to those 12 children in the study just to make some money from lawyers is unconscionable. I’m glad they took his license away. It’s staggering how many children have been harmed by his greed.”
Focus on the facts that you think are important, not the myths that the patient might believe. You don’t have to argue about falsehoods; you just have to provide facts to construct a strong counter-narrative, one that is easier to believe and more compelling. This is the only way to displace the current narrative. When you focus on the falsehoods, you risk making the person feel stupid and forcing her to put up defenses that make your job harder.
When you fail to change her mind, don’t worry. Reach what consensus you can and provide compassionate care. She will see that you care (more than a stranger on the Internet) and eventually she will learn to trust you and your opinions.