Programmed to believe: Why people buy into falsehoods
The saying goes that there’s one born every minute, but according to census data it’s really more like every eight seconds. In other words, we’re all suckers. Psychologists say it takes more of our mental resources to identify a statement as false than to label it as true, a phenomenon that they chalk up to our evolutionary history as socially cooperative animals, ones who, for the most part, speak honestly with each other. Politicians and advertisers often try to exploit this bias toward belief, through techniques such as repetition, which can add an air of truth to a statement over time. Nobody, it seems, is completely immune from factual errors about the world. “Of course I'd love people to be more skeptical,” says Gus Cooney, a psychologist at Harvard University in Cambridge, Mass. “But I think being a little more humble and a little more tolerant might be the right answer.”
Why We Wrote This
In a “post-truth” world, the media has struggled with how to address false political statements. Our reporter took a step back and examined why falsehoods persist.
If you’ve ever held a political conviction, you’ve probably found yourself asking, “How on Earth can those people be so gullible?”
Not your people, of course, the reasonable and well-informed ones who share your politics. No, we mean the millions on the other side, the ones who have chosen to stick their tinfoil-covered heads in the sand to swallow the tripe-flavored Kool-Aid. Those people.
Except it turns out that, under the right conditions, any of us can be played. Behavioral scientists say that many of our beliefs about the world, even the accurate ones, owe less to our power to discriminate truth from fiction than to other forces. And that’s not necessarily a bad thing.
“There's significant evidence that we evolved as a cooperative species, and trust is central to that,” says Pamela Paxton, a sociologist at the University of Texas at Austin who studies social capital. “There are a lot of benefits that we get, even in terms of efficiency, from high levels of trust.”
A bias toward belief
It all has to do with the way we process information. Until relatively recently, most psychologists implicitly adopted a model of thought developed in the 17th century by French philosopher René Descartes, who, in his “Fourth Meditation,” split the act of believing between two distinct faculties. First, what he called the “intellect” comprehends an idea. Next, the intellect presents the idea to the “will,” which then attempts to determine whether the idea is true or not.
Descartes’s model makes intuitive sense, but in the late 1980s and early 1990s, psychologists began to see flaws in it. In a series of experiments led by psychologist Daniel Gilbert, participants were distracted while being asked to identify sets of true and false sentences. The distraction interfered with their ability to identify the false statements, but not the true ones.
Dr. Gilbert proposed a different model of believing, one consistent with that of Descartes’s rival, the Dutch philosopher Baruch (Benedict) Spinoza, who proposed that comprehending an idea and believing it to be true are actually the same process. Rejecting an idea, Spinoza said, requires a second step, one that uses up additional mental resources.
Put another way, a belief is like an automatic email newsletter: you have to go out of your way to figure out how to unsubscribe.
From an efficiency standpoint, this arrangement makes sense. “If you're trying to determine whether something you heard is true or false,” says Lisa Fazio, a psychologist at Vanderbilt University in Nashville, Tenn., “you can do this kind of labor-intensive search through your long-term memory to see what you know about the topic and whether or not it seems reasonable, or you can just go with this kind of gut-level feeling of, ‘That seems true.’ ”
And your gut works just fine, in most cases. “Most of the time, most humans are trying to tell each other things that are true and useful to them,” says Gus Cooney, a researcher at Harvard University in Cambridge, Mass., who studies the psychology of conversations. Dr. Cooney says he suspects that this was even more true in our distant evolutionary past, when humans lived in small, stable groups and there was no advertising.
In the modern world, however, our natural credulity can be hacked. “The problem comes when we take that rule that’s useful in one domain and then we misapply it in a domain where people are trying to intentionally mislead us,” says Cooney.
Say it again
The top three most common hacks, exploited by politicians for centuries, are repetition, repetition, and repetition. The more times we hear an assertion, say psychologists, the less effort it takes to wrap our minds around it, and humans, for some reason, confuse that ease with truthfulness.
“If you just repeat things,” says Professor Fazio, “you can make them gain this processing fluency while not actually changing the truth of the statement.”
So does this mean that the solution to political misinformation is for people to become less trusting of each other? Probably not. “Trust eases interactions between people,” says Professor Paxton, “so you don't have to rely on third parties to ensure or watch or regulate.”
Indeed, in 2010 economists found that places that rank high in interpersonal trust, such as the Nordic countries, generally have fewer barriers to starting a business than low-trusting countries, such as Brazil and Uganda. More importantly, people living in more trusting societies generally say they are happier.
In the United States, trust has been steadily declining over the past 40 years, a phenomenon that Paxton attributes in part to headline-grabbing scandals. “I suspect it may just be that, with more and more news,” she says, “we're just more aware constantly of negative information about institutions.”
Instead of harboring mistrust, Fazio suggests that, when you’re presented with a statement, you can “pause and think about it.”
“The one thing that we found that helps people to notice these errors is to really slow down and read the article, like a fact checker where they have to circle the errors in the story,” she says. “And those things work.”
But even this method, says Fazio, won’t catch every factual error. There’s no perfect solution, which means that we’ll have to learn to live with some of us being wrong sometimes.
“Of course I'd love people to be more skeptical, and approach the world like empiricists and scientists,” says Cooney. “But I think being a little more humble and a little more tolerant might be the right answer.”
Spinoza, who lived through the end of a bloody religious war that consumed most of Europe, likely would have agreed. In his “Ethics,” written in 1664 or 1665, he wrote that his doctrine that seeing and believing are one and the same “teaches us to hate no man, neither to despise, to deride, to envy, or to be angry with any.”