Transcript for Facts Aren't Enough: The Psychology of False Beliefs


(SOUNDBITE OF ARCHIVED NPR BROADCAST)

SHANKAR VEDANTAM, HOST:

This is HIDDEN BRAIN. I'm Shankar Vedantam. In 2012, Maranda Dynda had a lot on her plate. She was 18. All her friends were getting ready for college and busy being teenagers, but Maranda was on a different path. She was pregnant.

MARANDA DYNDA: So I was very nervous, and I felt really uneasy but excited at the same time.

VEDANTAM: To calm her anxieties, Maranda did some research on childbirth.

DYNDA: I decided I wanted a home birth.

VEDANTAM: For that, she would need a midwife. She searched for months for someone who could come to her apartment in rural Pennsylvania. Eventually, she found a woman who seemed perfect. We'll call her M, her first initial, to protect her privacy. From the very first moment they met, Maranda knew everything was going to be fine.

DYNDA: She said she had been a home birth midwife for over a decade.

VEDANTAM: She was no-nonsense, the mother of eight.

DYNDA: She was very open and honest and friendly, so she definitely seemed like someone who I could trust.

VEDANTAM: About an hour into that first meeting, M brought up a question that struck Maranda as strange.

DYNDA: She said, have you ever considered not vaccinating?

VEDANTAM: Maranda hadn't. Vaccines had never crossed her mind.

DYNDA: So I asked her - you know, I was very confused. I was like, what do you mean you - why would I consider that?

VEDANTAM: M explained that, years ago, something bad had happened after she vaccinated her first child. She went on to describe a progression of events that leads some parents to a powerful but faulty conclusion. M told Maranda that right after her son got his shots, his development regressed. One minute, he was fine; the next, he was autistic. She said the light had left his eyes, so M decided not to vaccinate her other children.

DYNDA: And she very much implored me to do the same and to look into it, so I did.

(SOUNDBITE OF MUSIC)

VEDANTAM: Maranda started on Google. It led her to Facebook groups.

DYNDA: It's very easy to find them. So, yeah, even if you just Google, you know, support groups for parents who don't vaccinate, you will find a lot.

VEDANTAM: The moms in these groups echoed what M had told her.

DYNDA: And they welcomed me with open arms and immediately were just practically bombarding me with information, telling me your midwife's right, this is why you shouldn't vaccinate, this is why I don't vaccinate, this is what happened to my child who I did vaccinate versus my child who I didn't vaccinate - things like that.

VEDANTAM: Everyone was caring and attentive. They didn't just talk about vaccines. They talked about regular mom stuff, things that Maranda found hard to talk about with anyone else.

DYNDA: Diapers and birth plans and hospitals and midwives and breast pumps and stuff like that.

VEDANTAM: Maranda trusted them.

DYNDA: To me, it seemed so clear. It seemed like I had just found this secret information that only some people come across. And I thought, why would I not use this information? Why would I not use this to my benefit, to my child's benefit? So it did not take me very long at all before I was solidly saying I will not vaccinate my child when she is born.

(SOUNDBITE OF SONG, "TO RAMONA")

BOB DYLAN: (Singing) Ramona, come closer, shut softly your watery eyes.

VEDANTAM: She named her daughter Ramona after the song by Bob Dylan.

(SOUNDBITE OF SONG, "TO RAMONA")

DYLAN: (Singing) The pangs of your sadness will pass as your senses will rise.

DYNDA: Ramona, as a newborn, she was very active. She was very bright. She was very happy. She was a great baby, honestly. She was a wonderful baby.

VEDANTAM: When the doctors said it was time to vaccinate Ramona, Maranda was ready. She had a script she'd been practicing in her head for months.

DYNDA: And I said, no, thank you. We - I have decided that I do not want to vaccinate. Please, respect my opinions. Thank you very much.

VEDANTAM: For the next two years, Maranda continued to say no to vaccines. Occasionally, when she encountered information that conflicted with her decision - a pamphlet at the doctor's office, a website - she dismissed it.

DYNDA: I just very quickly went, that's not true. I don't agree with that. And I moved on.

VEDANTAM: At some point, though, her conviction started to waver. Those doting moms on Facebook, they had some weird beliefs.

DYNDA: People denying that AIDS exists, people saying that the reason there's gay people is vaccines - on and on and on with really crazy conspiracy theories.

VEDANTAM: And then it hit her. If she didn't believe those ideas, why was she trusting them on vaccines?

DYNDA: And I stepped back. I stopped going to the Facebook group as much, and I decided I needed to look at this issue from a purely logical perspective - no emotion in it, no, oh, my God, what if something happens to my baby? And I completely readdressed the issue all over again pretty much from the start.

(SOUNDBITE OF MUSIC)

VEDANTAM: Maranda started seeking out perspectives that the moms had urged her to avoid - information from the Centers for Disease Control and medical journals.

DYNDA: I started reading all kinds of things, basically opening my mind to more than just vaccines are bad to the other side of the coin.

VEDANTAM: It didn't take her long to change her mind. She got Ramona vaccinated. Looking back, Maranda can't believe how easy it was to embrace beliefs that were false.

DYNDA: And what I would say to someone who's about to become a new mom, especially if they're a young mom, is don't try to confirm your own fears online. It is so, so easy to Google, what if this happens, and find something that's probably not true that confirms your fear, that confirms your anxieties. Don't do that.

(SOUNDBITE OF MUSIC)

VEDANTAM: Maranda's story tells us a lot about the psychology of false beliefs, how they spread and how they persist even in the face of conflicting information. This week, we look at how we rely on people we trust to shape what we believe and why emotions can be more powerful than facts.

(SOUNDBITE OF MUSIC)

VEDANTAM: Tali Sharot is a professor at University College London. She is the author of "The Influential Mind: What The Brain Reveals About Our Power To Change Others." Tali is a mom and so she understands on a personal level why Maranda was so worried about the safety of her child. A few years ago, when her baby was just a few weeks old, Tali was listening to a Republican presidential debate.

(SOUNDBITE OF ARCHIVED RECORDING)

PRESIDENT DONALD TRUMP: You take this little, beautiful baby and you pump - I mean, it looks just like it's meant for a horse, not for a child.

VEDANTAM: Candidate Donald Trump had been asked a question about the safety of childhood vaccines.

(SOUNDBITE OF ARCHIVED RECORDING)

TRUMP: And we've had so many instances, people that work for me, just the other day, 2 years old, 2 1/2 years old, a child, a beautiful child, went to have the vaccine and came back and a week later got a tremendous fever, got very, very sick, now is autistic.

TALI SHAROT: So when I was listening to Trump at that debate, it really tapped into this fear that I had and the anxiety that I already had. And when he talked about this huge syringe, a horse-sized syringe that was going to go into the baby, in my mind, I could imagine this syringe inserted into my small, little child and all the bad things that could happen. And this was a very irrational reaction on my end because I know that there is not an actual link between autism and vaccines. But it's not enough to have the data. Ben Carson - Dr. Ben Carson - was on the other end.

(SOUNDBITE OF ARCHIVED RECORDING)

BEN CARSON: Well, let me put it this way. There has - there have been numerous studies, and they have not demonstrated that there is any correlation between vaccinations and autism.

SHAROT: But that wasn't enough because the data is not enough. And even if the data is based on very good science, it has to be communicated in a way that would really tap into people's needs, their desires. If people are afraid, we should address that.

VEDANTAM: I'm curious. When you sort of contrasted, you know, the weight of the evidence on the one hand and this very powerful image of the horse syringe and your 7-week-old baby on the other hand, how did you talk yourself into trusting the data over that emotional image?

SHAROT: What really helped is that I understood what was happening to me. Because this is what I study, I knew what my reaction was. I knew where it was coming from. I knew how it was going to affect me. And I think that awareness helped me to put it aside and say, OK, I know that I am anxious for the wrong reasons, and this is the action that I should take. It's a little bit like when you're on a plane and there's turbulence and you get scared, but telling yourself, I know that turbulence is not actually anything that's dangerous, I know the statistics on safety in planes and so on - it helps. It helps people reduce their anxiety.

(SOUNDBITE OF MUSIC)

VEDANTAM: But facts don't always relieve our anxieties. Sometimes they harden our views. Some time ago, Tali wanted to test how people update their beliefs when confronted with new information. So she presented statements to two kinds of people - those who believed that climate change was real and those who denied it. She found that for both groups, when a statement confirmed what they already thought, it strengthened their beliefs. But when it challenged their views, they ignored it. Tali says it's because of a powerful phenomenon known as confirmation bias.

SHAROT: Confirmation bias is our tendency to take in any kind of data that confirms our prior convictions and to disregard data that does not conform to what we already believe. And when we see data that doesn't conform to what we believe, what we do is we try to distance ourselves from it. We say, well, that data is not credible, right? It's not good evidence for what it's saying. So we're trying to reframe it to discredit it. All right. So I give an example in my book where if someone comes in and says, I just saw pink elephants flying in the sky, and I have a very strong belief, obviously, that no pink elephants fly in the sky, I would then think that they're either delusional or they're lying. And there's good reason for me to believe that. So it's actually the correct approach to assess data in light of what you believe.

There are four factors that determine whether we're going to change our beliefs - our old belief, our confidence in that old belief, the new piece of data and our confidence in that piece of data. And the further away the piece of data is from what you already believe, the less likely it is to change your belief. And on average, as you go about the world, that is not a bad approach. However, it also means that it's really hard to change false beliefs. So if someone holds a belief very strongly but it is a false belief, it's very hard to change it with data.
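Those four factors can be read as a kind of confidence-weighted averaging. Here is a minimal sketch in Python - an editorial illustration of the idea, not a model from Sharot's book, with all names and numbers invented - in which surprising data is discounted in proportion to its distance from the prior before being averaged in:

```python
import math

def update_belief(old_belief, old_confidence, data, data_confidence):
    """Precision-weighted average of an old belief and a new piece of data."""
    total = old_confidence + data_confidence
    return (old_confidence * old_belief + data_confidence * data) / total

def biased_update(old_belief, old_confidence, data, data_confidence, scale=1.0):
    """Confirmation bias as discounting: the further the data sits from the
    old belief, the less confidence it is granted before the update."""
    distance = abs(data - old_belief)
    discounted_confidence = data_confidence * math.exp(-distance / scale)
    return update_belief(old_belief, old_confidence, data, discounted_confidence)

# Data close to the prior shifts the belief noticeably...
print(biased_update(0.0, 1.0, data=1.0, data_confidence=1.0, scale=2.0))   # ~0.38
# ...while data far from the prior is discredited and barely registers.
print(biased_update(0.0, 1.0, data=10.0, data_confidence=1.0, scale=2.0))  # ~0.07
```

With these illustrative numbers, the distant data point barely moves the belief even though it is the bigger news; the discounting plays the role that confirmation bias plays in Sharot's account.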

VEDANTAM: So if data and facts don't work, what does?

(SOUNDBITE OF MUSIC)

VEDANTAM: How do you get people to buy the truth? Well, you could try scaring them.

(SOUNDBITE OF MONTAGE)

RUDY GIULIANI: The vast majority of Americans today do not feel safe.

HILLARY CLINTON: On Sunday, Americans woke up to a nightmare that's become mind-numbingly familiar.

TRUMP: This could be the great Trojan horse of all time.

VEDANTAM: Politicians use fear to get us to vote. TV programs use fear to get us to keep watching. Public health officials use fear to get us to quit smoking. I asked Tali whether fear might be an effective way to persuade people to change their minds and maybe even their behavior.

SHAROT: Fear works in two situations. It works when people are already stressed out. And it also works when what you're trying to do is get someone not to do something, an inaction. For example, if you try to get someone not to vaccinate their kids, fear may work. If there's, you know, an apple that looks bad, I don't eat it. Fear is actually not such a good motivator for inducing action, while hope is a better motivator, on average, for motivating action.

VEDANTAM: You talk about one study in your book where a hospital managed to get its workers to practice hand hygiene - to wash their hands regularly. But it turned out the most effective thing wasn't frightening the staff about the risks of transmitting infections. It was something else.

SHAROT: So in a hospital on the East Coast, a camera was installed to see how often medical staff actually sanitized their hands before and after entering a patient's room. And the medical staff knew that the camera was installed, and yet only 1 in 10 medical staff sanitized their hands before and after entering a patient's room. But then an intervention was introduced - an electronic board that was put above each door, and it gave the medical staff positive feedback in real time. It showed them the percentage of medical staff who washed their hands in the current shift and the weekly rate as well. So any time a member of the medical staff washed their hands, the numbers would immediately go up, and there would be positive feedback saying, you know, good job. And that affected the likelihood of people washing their hands significantly. It went up from 10% to 90%, and it stayed there.

Instead of using the normal approach, instead of saying, you know, you have to wash your hands because otherwise you'll spread the disease - basically instead of warning them of all the bad things that can happen in the future, which actually results in inaction, they gave them positive feedback.

(SOUNDBITE OF MUSIC)

VEDANTAM: I wrapped up my conversation with Tali by exploring another idea about how we might convince others to listen to views that conflict with their own. It had to do with a study of Princeton students who got their brains scanned while they listened to stirring, emotional speeches.

SHAROT: What they found was the brains of the different people listening to those speeches started synchronizing. So if we all listen, for example, to Kennedy's famous moon speech, our brains would likely look very much alike.

(SOUNDBITE OF ARCHIVED RECORDING)

JOHN F KENNEDY: Those who came before us made certain that this country rode the first waves of the Industrial Revolution.

SHAROT: And this is not only in regions that are important for language and hearing. It's also in regions that are important for emotion, in regions that are important for what's known as theory of mind - our ability to think about what other people are thinking - in regions that are important for associations. And you try to think, well, what's common to all these influential speeches that can cause so many people's activity to synchronize? And one of the most important things is emotion. If the storyteller or the person giving the speech is able to elicit emotion in the other person, then he's actually having somewhat of a control over that person's state of mind.

(SOUNDBITE OF ARCHIVED RECORDING)

KENNEDY: And this generation does not intend to founder in the backwash of the coming age of space. We mean to be a part of it. We mean to lead it.

(APPLAUSE)

SHAROT: So think about it like this. If you're very sad and I'm telling you a joke, well, you're sad, so you're not going to perceive the joke as I perceive it when I'm happy. But if I'm able to first make you happy and then tell you the joke, well, then you'll perceive it more from my point of view. So by eliciting emotion, what you're able to do is change the perception of everything that comes after - to perceive information as the person who's giving this speech wants you to perceive it.

VEDANTAM: So you can see how this coupling - this idea that the audience's mind and the speaker's mind are in some ways coupled together - could potentially be used to spread good information. You know, you have a great teacher in high school and you're captivated by the teacher and you're being pulled along by the story the teacher is telling you, maybe about history or maybe about geography. But you can also see equally how the same thing can work in the opposite direction, that you could be listening to a demagogue or you could be listening to somebody who has a very sort of captivating rhetorical style. And this person could also lead you astray in just the same way that the great teacher can lead you to knowledge and to positive things.

SHAROT: Absolutely. All the different factors that affect whether we will be influenced by one person or ignore another person are the same whether the person has good intentions or bad intentions, right? The factors that affect whether you're influential include: Can you elicit emotion in the other person? Can you tell a story? Are you taking into account the state of mind of the person that's in front of you? Are you giving them data that conforms to their preconceived notions? All those factors that make one speech more influential than another or more likely to create an impact can be used for good and can be used for bad.

VEDANTAM: Tali Sharot, I want to thank you for joining me on HIDDEN BRAIN today.

SHAROT: Thank you so much for having me.

(SOUNDBITE OF MUSIC)

VEDANTAM: We've been exploring why we cling to beliefs. After the break, we look at how we spread them, from person to person to person. We'll talk to a mathematician about the power of social networks to circulate ideas. Stay with us.

(SOUNDBITE OF MUSIC)

VEDANTAM: This is HIDDEN BRAIN. I'm Shankar Vedantam. During the Middle Ages, word spread to Europe about a peculiar plant found in Asia. This plant had a long stalk with heavy pods attached. When you cut those pods open, inside you would find a tiny little lamb.

(SOUNDBITE OF LAMB BLEATING)

CAILIN O'CONNOR: Complete with flesh and wool like a live animal.

(SOUNDBITE OF LAMB BLEATING)

VEDANTAM: This creature, half plant, half animal, came to be known as the Vegetable Lamb of Tartary.

O'CONNOR: Various travel writers wrote that they had either heard about this or that they had eaten one of these lambs. And many of them said they had shorn the kind of downy wool from the lamb.

VEDANTAM: When these narratives made their way to Europe, people felt they had a view of a different world. Of course, no one in Europe had ever seen the Vegetable Lamb of Tartary because there was no such thing. But for centuries, people kept talking about this fantastical creature as if it were real. It even came up in scholarly works right next to pictures of oak trees and rabbits.

O'CONNOR: If people hadn't been telling each other about these things, nobody would believe that there were vegetable lambs because nobody had ever seen them, right?

(SOUNDBITE OF MUSIC)

O'CONNOR: And this is, by no means, a unique happening at that time.

VEDANTAM: At that time. Of course, we would never fall for vegetable lambs. We live in an era of science, of evidence-based reasoning, of calm, cool analysis.

(SOUNDBITE OF MUSIC)

VEDANTAM: But maybe there are vegetable lambs that persist even today even among highly trained scientists, physicians and researchers. Maybe there are spectacularly bad ideas that we haven't yet recognized as spectacularly bad.

(SOUNDBITE OF MUSIC)

VEDANTAM: Cailin O'Connor is a philosopher and mathematician at the University of California, Irvine. She studies how information, both good and bad, can pass from person to person. She is co-author with James Weatherall of the book "The Misinformation Age: How False Beliefs Spread."

Cailin, welcome to HIDDEN BRAIN.

O'CONNOR: Oh, thank you for having me.

VEDANTAM: So one of the fundamental premises in your book is that human beings are extremely dependent on the opinions and knowledge of other people. And this is what creates channels for fake news to flourish and spread. Let's talk about this idea. Can you give me some sense of our dependence on what you call the testimony of others?

O'CONNOR: So one reason we wrote this book is that we noticed that a lot of people thinking about fake news and false belief were thinking about problems with individual psychology - the way we have biases in processing information, the fact that we're bad at probability. But if you think about the things you believe, almost every single belief you have has come from another person. And that's just where we get our beliefs because we're social animals. And that's really wonderful for us. That's why we have culture and technology. You know, that's how we went to the moon. But if you imagine this social spread of beliefs as opening a door, when you open a door for true beliefs to spread from person to person, you also open the door for false beliefs to spread from person to person. So it's this kind of double-sided coin.

VEDANTAM: And what's interesting, of course, is that if you close the door, you close the door to both, and if you open the door, you open the door to both.

O'CONNOR: That's right. So if you want to be social learners who can do the kinds of cultural things we can do, it has to be the case that you also have this channel by which you can spread falsehood and misinformation, too.

VEDANTAM: So as I was reading the book, I was reflecting on the things that I know or the things that I think I know, and I couldn't come up with a good answer for how I actually know that it's the Earth that revolves around the sun and not the other way around.

O'CONNOR: Yeah. That's right. Ninety-nine percent of the things you believe probably you have no direct evidence of yourself. You have to trust other people to find those things out, get the evidence and tell it to you. And so one thing that we talk a lot about in the book is the fact that we all have to ground our beliefs in social trust. So we have to decide what sources and what people we trust and therefore what beliefs we're going to take up because there's just this problem where we cannot go verify everything that we learned directly. We have to trust someone else to do that for us.

VEDANTAM: We trust the historian who teaches us about Christopher Columbus. We trust the images from NASA showing how our solar system is organized. Now, we say we know Columbus was Italian, and we know the Earth revolves around the sun. But, really, what we mean to say is we trust the teacher, and we trust NASA to tell us what is true.

O'CONNOR: And the social trust and ability to spread beliefs, I mean, it's remarkable what it's let humans do. You know, no other animal has this ability to sort of transfer ideas and knowledge dependably from person to person over generation after generation to accumulate that knowledge. But you do just see sometimes very funny examples of false beliefs being spread in the same way.

(SOUNDBITE OF MUSIC)

VEDANTAM: As a philosopher of science, Cailin studies how scientists communicate and share information. If we rely on scientists to tell us what to believe, who do they rely on? Turns out, other scientists. Now, showing that this is the case isn't easy. The process by which scientists change their minds on questions such as the spread of disease or the movement of objects through space is very complex. Studying it can be mind-boggling. Say, for instance, Dr. A...

UNIDENTIFIED PERSON #1: Hello?

VEDANTAM: ...Talks to Dr. B one day about her research.

UNIDENTIFIED PERSON #2: Hello.

VEDANTAM: It also turns out that Dr. B is collaborating with Dr. C who recently met Dr. D at a conference. Now, Dr. D frequently reads Dr. A's papers but doesn't know about Dr. C's research. A couple of years later, Dr. E reads what Dr. B has written about what Dr. A said in an article that Dr. C cited before Dr. F had even published her results.

(SOUNDBITE OF MUSIC)

O'CONNOR: Empirically, it's hard to study scientists because things like theory change will happen over the course of 10 or 20 years and involve thousands and thousands of interactions between different scientists. You know, how would you ever study that?

VEDANTAM: How would you ever study that? Because Cailin can't follow all these interactions, she recreates them in a computer simulation.

O'CONNOR: You'd want to think of it as a really kind of simplified representation of what's happening in the real world.

VEDANTAM: She creates groups of fictional scientists, and she gives them a series of rules, like who they can talk to and who they trust. These simulated scientists collect data and discuss their simulated research. Cailin sits back and watches what happens.

O'CONNOR: So one thing we find sometimes in these models is that one agent or scientist will get data supporting the false belief. They'll share it with the entire community of scientists. And then everyone will come to all believe the false thing at once and sort of ignore a better theory. And part of what happens there is this social spread of knowledge and belief causing everyone to turn away from a good theory. So if you have almost too much social influence within a community, that can be really bad because everyone can stop gathering data since the entire community is exposed to the same spurious results.

VEDANTAM: If I hear you correctly, what you're saying is that psychological factors can have an effect, but you can have the spread of bad information even in the absence of biases or stupidity.

O'CONNOR: Yeah. So one way that the models we look at are really useful is that you can kind of pare away things that are happening in the real world and ask, well, suppose we didn't have any psychological biases. Suppose we were perfectly rational. Would we always come to the right answer in science and in our day-to-day lives? And we see that the answer is no.
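The models Cailin describes can be sketched in a few dozen lines. The following toy version is in the spirit of the "bandit" models used in network epistemology - agents choose between a familiar theory and a genuinely better one, share their results, and update - but it is an editorial sketch; every parameter, name and design choice here is illustrative rather than taken from her book:

```python
import random

TRUE_RATE_B = 0.6  # the new theory really is better; the old one succeeds at 0.5
TRIALS = 5         # experiments each agent runs per round

def simulate(n_agents=10, rounds=100, seed=None):
    rng = random.Random(seed)
    # Each agent's credence in theory B is tracked as beta counts
    # (successes, failures), starting from arbitrary small priors.
    beliefs = [[rng.randint(1, 4), rng.randint(1, 4)] for _ in range(n_agents)]

    for _ in range(rounds):
        shared = []  # results broadcast to the whole community this round
        for successes, failures in beliefs:
            if successes / (successes + failures) > 0.5:
                # Only agents who currently favor B bother to test it.
                wins = sum(rng.random() < TRUE_RATE_B for _ in range(TRIALS))
                shared.append((wins, TRIALS - wins))
        if not shared:
            break  # nobody tests B anymore: the community has locked in
        for belief in beliefs:
            # Everyone sees everyone's data (a fully connected network).
            for wins, losses in shared:
                belief[0] += wins
                belief[1] += losses

    return all(s / (s + f) > 0.5 for s, f in beliefs)

# Most communities converge on the better theory, but some share an early
# streak of spurious results and abandon it for good.
runs = sum(simulate(seed=i) for i in range(200))
print(f"{runs} of 200 simulated communities settle on the true theory")
```

Because agents stop testing theory B the moment their credence dips below the old theory's known success rate, an early streak of spurious results broadcast across the whole network can freeze the entire community on the worse theory - the lock-in she describes above.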

(SOUNDBITE OF MUSIC)

VEDANTAM: Coming up - case studies from the world of supposedly rational scientific communities that show how good information sometimes fails to spread and how bad information can metastasize. I'm Shankar Vedantam, and this is NPR.

(SOUNDBITE OF MUSIC)

VEDANTAM: This is HIDDEN BRAIN. I'm Shankar Vedantam. Mathematician and philosopher Cailin O'Connor studies how information spreads through social networks. People who know and trust one another efficiently pass information back and forth and learn from one another. Unfortunately, the same rules of social trust can sometimes be a roadblock for the truth. Mary Wortley Montagu learned this lesson hundreds of years ago. She was an English aristocrat who found herself living for a while in what is modern-day Turkey.

O'CONNOR: So Mary Montagu seems to have been really enchanted by Turkish culture. You know, she was coming from England, an aristocratic culture there. In Turkey, she discovered these beautiful shopping centers, bath houses. She seemed to be - have been enchanted by bath houses where there would be a lot of women sort of lounging naked going in the hot water, drinking hot drinks together.

VEDANTAM: Another thing that struck Mary about Turkish women - they used an innovative technique to limit the spread of smallpox. It was called variolation.

O'CONNOR: What this involved - I mean, it's a bit like vaccination now. You would scratch maybe the arm of a patient and take pus from a smallpox pustule and put that pus into the scratch. So what would happen after you did that is that the patient would get a very mild smallpox infection. Some small percentage of patients would die but many, many fewer than would die of an actual smallpox infection. And after they had that more mild infection, they would actually be immune to smallpox. So this was practiced commonly in Turkey - basically unheard of in England at the time.

Mary Montagu had herself had smallpox and survived when she was younger. She had lost a brother to smallpox. And so when she encountered variolation in Turkey, she decided, well, you know, why don't we do this in England? She had her own son variolated, and she decided she was going to try to spread this practice in her native country.

VEDANTAM: So when she returns to Britain, in some ways, Mary Montagu here functions like one of your agents in your computer models because you have, you know, one cluster over here in Turkey and one cluster over here in Britain. And essentially, you have an agent walking over from Turkey to Britain. And Mary Montagu says here's this wonderful idea. We can limit the spread of smallpox in Britain. Britain, in fact, at the time, was actually facing a smallpox crisis. How were her ideas received?

O'CONNOR: So her ideas were not received very well when she first came back. One thing we talk a lot about in the book is that almost everyone has what you might call a conformist bias. We don't like to publicly state things that are different from the people in our social networks. We don't like to have beliefs that are different from the people around us. It's somehow very socially uncomfortable to do that. And we don't like our actions to not conform with the people who we know and love.

So when she got back to England, you know, it was already the case that all these physicians in England didn't believe in variolation. They thought this was a crazy idea. And none of them were going to stand out from the pack of physicians and say, yeah, I'm the person who's going to try this or going to believe that this practice works because they were all busy conforming with each other.

VEDANTAM: And, of course, these ideas were coming from another country, a country with very different cultural practices that seemed in some ways very foreign. The idea and the country itself seemed very foreign.

O'CONNOR: That's right. So it's not just that it's a weird, new idea that none of them believe in their kind of in-group. It's also that it's coming from Turkey. And furthermore, it's coming from women in Turkey, so it was a practice mostly done by women. And a woman is bringing it to England as well, so they also don't really trust her as a woman and someone who's not a physician.

So social trust is a really important aspect in understanding how people form beliefs. Because we can't go out and figure out ourselves whether the things people tell us are true, usually we just have to decide who to trust. And people have little shortcuts in how they do this. They tend to trust those who are more like them. They also tend to trust those who share beliefs and values and practices with them. So, for example, if you are a physician, you might tend to trust a physician. If you believe in homeopathy, you might tend to trust someone who believes in homeopathy. We all use these kinds of tricks. So what we saw in the variolation case with Mary Montagu, the physicians aren't going to trust this woman who doesn't share their beliefs and practices, who isn't much like them.

(SOUNDBITE OF MUSIC)

VEDANTAM: Now, you could argue that the physicians who rejected Mary Montagu's ideas were not behaving like real scientists. They weren't being dispassionate. They weren't being objective. They were bringing psychological biases into the picture - sexism, xenophobia, tribalism. In the real world, misinformation spreads because of some combination of network effects and psychological and cognitive biases.

You see the same thing in the case of the Hungarian physician Ignaz Semmelweis. He was an insider, a man and a doctor. He even had scientific evidence to support his claims. But it turned out even these were not enough to overcome the barriers that confront the truth.

O'CONNOR: Ignaz Semmelweis was a physician living in Vienna. He was put in charge of this clinic, the first obstetrical clinic in Vienna. Next door was the second obstetrical clinic of Vienna. He was in charge of training new doctors in obstetrics, and at the second clinic, they were training midwives. And shortly after he took over, he realized that something really terrible was going on because in his clinic, 10% of the women were dying, mostly of childbed fever, while at the midwives' clinic next door - which, presumably, they would have thought had less expertise - only 3 to 4% of the patients were dying. So Semmelweis was obviously really worried about this. He had patients who would be begging on their knees to be transferred to the other clinic.

He had this kind of breakthrough moment when a colleague of his was conducting an autopsy and accidentally cut himself. And then shortly thereafter, he died of something that looked a lot like childbed fever. Semmelweis realized, well, I've got all these physicians who are conducting autopsies on cadavers and then immediately going and delivering babies. And he thought, well, maybe there's something transferred on their hands, and he called these "cadaverous particles." Of course, now we know that that is bacteria, but they didn't have a theory of bacteria at the time. So he started requiring the physicians to wash their hands in a chlorinated solution, and the death rate in his clinic dropped way down.

VEDANTAM: And, of course, the way we think about science, we say, all right, we have - someone's discovered something wonderful. Everyone must have instantly adopted this brilliant, new idea.

O'CONNOR: You would think, right? And he has this wonderful evidence, right? It was 10%, he introduced the practice, goes down to 3%. But that's not what happened. So he published his ideas, and the other gentleman physicians did not take them up. In fact, they found them kind of offensive. They thought this is - you know, he's writing that we have dirty hands, we have unclean hands, but in fact, we're gentlemen. They also thought it was just really far out of the range of theories that could possibly be true, so they didn't believe him despite the really good evidence and the deep importance. You know, people's lives were really at stake. And it took decades for his handwashing practice to actually spread.

VEDANTAM: In fact, I understand that Semmelweis himself eventually suffered a nervous breakdown. How did his own story end?

O'CONNOR: So the way the story goes - though this is a little hard to verify - is that he was so frustrated that people weren't adopting his handwashing practice that he had a nervous breakdown as a result. He was put into a Viennese mental hospital where he was beaten by guards and died of blood poisoning a few weeks later.

(SOUNDBITE OF MUSIC)

VEDANTAM: We've seen how being an outsider or breaking with tradition can be a barrier to the spread of good scientific information, but you could argue that these examples were from a long-gone era of gentlemen physicians and amateur scientists. But even in the modern era of science, where researchers demand hard evidence before they are convinced, it turns out that false, inaccurate and incomplete information can still take hold. In 1954, E.D. Palmer published a paper that changed how doctors thought about stomach ulcers.

O'CONNOR: So what he did was look at a lot of stomachs, I believe somewhere in the range of a thousand, and he found that there were no bacteria whatsoever in the stomachs that he investigated. A lot of people at that time had been arguing over whether stomach ulcers were caused by stomach acid or some kind of bacteria. This was taken as really decisive evidence showing that, OK, it can't be bacteria because everyone thought Palmer's study showed there are no bacteria in stomachs, so it absolutely must be stomach acid.

VEDANTAM: And, of course, in this case, Palmer was not trying to fabricate his data or make up data. He was sincerely arriving at what he thought was a very good conclusion.

O'CONNOR: That's right. And it seems that it just was a problem with his methodology. Of course, there are bacteria in our stomachs. He just didn't see them because of the way he was doing his particular experiment. This was not a fabrication at all.

VEDANTAM: One of the things that's interesting about this episode involving Palmer and the stomach ulcers is that as individuals came around to believing what Palmer was telling them, a consensus started to grow. And as each new person added to the consensus, it became a little bit stronger, which made it even harder to challenge.

O'CONNOR: Yeah. So although they had been arguing for decades about whether ulcers were caused by acid or by bacteria, at this point people started to share Palmer's results. Pretty much everybody saw them. And this consensus was arrived at. OK, it's acid. And everyone who had been studying the possibility that bacteria caused stomach ulcers stopped studying that.

VEDANTAM: Well, not everyone. Fast-forward a few decades to the early 1980s. In Australia, a physician named Barry Marshall grew skeptical of the acid theory. His experiment suggested that ulcers were caused by bacteria, not stomach acid. But this theory was met with stony-faced resistance. He couldn't even get his articles published. Scientists sniped at him behind his back even though, as it turned out, his data was far better than the stomach studies by E.D. Palmer.

(SOUNDBITE OF MUSIC)

VEDANTAM: Coming up - what Barry Marshall did to fight misinformation and what we can learn from his story about how to spread the truth.

(SOUNDBITE OF MUSIC)

VEDANTAM: Barry Marshall was frustrated that no one seemed willing to listen to his findings.

(SOUNDBITE OF ARCHIVED RECORDING)

BARRY MARSHALL: People were bleeding in my practice and dying from ulcers in my hospital. I could see it.

VEDANTAM: So he figured out a way to get everyone's attention.

(SOUNDBITE OF ARCHIVED RECORDING)

MARSHALL: The only person in the world at that time who could make an informed consent was me. So I had to be in my own experiment.

O'CONNOR: And so he did this demonstration.

VEDANTAM: He took bacteria from the stomach of one of his sick patients.

(SOUNDBITE OF ARCHIVED RECORDING)

MARSHALL: So we cultured a patient with gastritis.

VEDANTAM: He stirred it into a broth and then...

(SOUNDBITE OF ARCHIVED RECORDING)

MARSHALL: I drank the bacteria - 10 to the ninth colony-forming units.

O'CONNOR: He gave himself stomach ulcers. And then he later cured them with antibiotics in this publicity stunt, almost, to convince people that, in fact, ulcers were caused by bacteria.

(SOUNDBITE OF MUSIC)

VEDANTAM: Eventually, Barry Marshall and his collaborator Robin Warren went on to win the Nobel Prize in medicine for their discoveries.

(SOUNDBITE OF MUSIC)

VEDANTAM: Mary Montagu, the woman who faced resistance in bringing variolation to England, never won a prestigious prize, but she also found a way to spread the truth. Like Barry Marshall, she found it had more to do with her sales pitch than with the evidence.

O'CONNOR: So in the end, she did something really smart, which took advantage of the ways that we use our social connections to ground our beliefs and our trust. She ended up convincing Princess Caroline of Ansbach to variolate her own two small daughters and to do it in this kind of public way. So she got one of the most influential people in the entire country to engage in this practice. That did two things. No. 1, because she did it in this kind of public way and her daughters were fine, it gave people evidence that this is, in fact, a safe practice and a good idea. But it also made clear to people that if they want to conform to the norm, if they want to share a practice with this really influential person, then they should do the same thing. And after Princess Caroline did this, variolation spread much more quickly, especially among people who had a personal connection to either Mary Montagu or to the princess.

VEDANTAM: What's fascinating here is that this wasn't, in some ways, a rational way to solve the problem. It wasn't saying, look; there's really convincing evidence here. You're almost using a technique that's pretty close to propaganda.

O'CONNOR: It is a propaganda technique. Absolutely. So propagandists tend to be very savvy about the ways that people use their social connections to ground trust and knowledge and choose their beliefs. And they take advantage of those. In this case, it was using those - that social trust for good. But in many cases, people use it for bad. And if you look at the history of industrial propaganda in the U.S. or if you look at the way Russia conducted propaganda before the last election, people have taken advantage of these kinds of social ties and beliefs to try to convince us of whatever it is they're selling.

VEDANTAM: One last idea in how you counter bad information. Semmelweis, as we saw, did not succeed in persuading other doctors during his lifetime to wash their hands thoroughly before treating patients. But, of course, now that idea is widely adopted. What does that tell us, Cailin, about how science in some ways might be self-correcting? It might not be self-correcting at the pace that we want, but over time, it appears that good ideas do beat out the bad ones.

O'CONNOR: Yeah, so we have thousands and thousands of examples in science of exactly that happening, of good ideas beating out the bad ones. Of course, now we can look back and say, oh, well, that good idea won out and that good idea won out. We can't actually look at the present and know which of the ideas we believe now are the correct ones or the good ones. So there are actually philosophers of science like Larry Laudan and Kyle Stanford who argue for something called the pessimistic meta-induction, which is something like this - because scientific theories in the past have always eventually been overturned, we ought to think that our theories now will probably be overturned as well.

But there is actually an optimistic side to this, which is that if you look at many theories in the past, ones that were overturned, often the reason people believed them is that even if they were wrong, they were a good guide to action. Even the theory of stomach acid causing ulcers - well, if you treat stomach acid, it actually does help with ulcers. You know, it wasn't a completely unsuccessful theory. It's just that it wasn't totally right, and it wasn't as successful as the bacteria theory of ulcers because antibiotics do better.

VEDANTAM: One of the interesting implications of all of this is how we should think about the truth. And in some ways, I think the picture that I'm getting from you is a picture that says the truth is not a binary question. It's not, you know, is it true, is it false? I mean, some questions, of course, perhaps can be reduced to is it true, is it false? But, really, science is in the business of producing probability estimates for various claims. And I think what you're saying is that for us to actually be on the right side of the misinformation/information divide, it's helpful for us to think in probabilistic terms rather than in binary terms.

O'CONNOR: Yeah, that's absolutely right. So we do think it's really important to think about belief in terms of degrees and evidence and believing something strongly enough. And part of the reason is that there has been this strategy where people who are trying to subvert our beliefs will say, but we're not sure about something. They'll say, evolution is just a theory or there's some doubt about global warming.

But, ultimately, not being sure about something is not what matters. We're never really 100% sure about anything. I mean, think about any belief you could have - you know, that the sun will come up tomorrow. Well, it always has in the past, but that doesn't mean we're 100% sure it will tomorrow. There's a really good chance it will tomorrow. We shouldn't be looking for certainty. Instead, we need to be asking ourselves, when do we have enough evidence to make good decisions?
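O'Connor's sunrise example has a classical form, Laplace's rule of succession: after n sunrises and no failures, a uniform prior puts the probability of another sunrise at (n + 1)/(n + 2) - very high, never 1. Here is a minimal sketch, paired with the kind of evidence threshold she is gesturing at (the decision rule and its numbers are editorial illustrations, not from the episode):

```python
def rule_of_succession(successes, failures=0):
    """Laplace's estimate: P(next success) given the observed record."""
    return (successes + 1) / (successes + failures + 2)

# Ten thousand sunrises in a row: extremely confident, never certain.
print(rule_of_succession(10_000))  # 0.9999000...

def enough_evidence(p_true, benefit_if_true, cost_if_false):
    """Act when the expected benefit of acting outweighs the expected cost -
    a threshold on degree of belief, not a demand for certainty."""
    return p_true * benefit_if_true >= (1 - p_true) * cost_if_false

# Illustrative numbers only: at 99.9% confidence, act even if a mistake
# would cost 100 times what being right gains.
print(enough_evidence(0.999, benefit_if_true=1.0, cost_if_false=100.0))  # True
```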

VEDANTAM: Cailin O'Connor is a philosopher and mathematician at the University of California, Irvine. She studies how social networks can spread both good information and bad. Along with James Weatherall, she is co-author of the book "The Misinformation Age: How False Beliefs Spread." Cailin, thank you for joining me today on HIDDEN BRAIN.

O'CONNOR: Oh, thank you so much for having me.

(SOUNDBITE OF MUSIC)

VEDANTAM: This week's show was produced by Maggie Penman, Camila Vargas Restrepo and Laura Kwerel. Our team includes Parth Shah, Jenny Schmidt and Thomas Lu. Our supervising producer is Tara Boyle. This week, our unsung hero is Bryan Moffett of National Public Media. NPM is the group that sells our sponsorship messages. Bryan's a great mix of liberal and conservative. He's always liberal in his encouragement and conservative in his promises. As the saying goes, he underpromises and overdelivers. Thanks, Bryan.

For more HIDDEN BRAIN, you can find us on Facebook and Twitter. You can find information about the research we discuss on this show on our website, npr.org/hiddenbrain. If you like this episode, please think of one friend who might enjoy our show and share it with them. I'm Shankar Vedantam, and this is NPR. Transcript provided by NPR, Copyright NPR.