The Power of Confirmation Bias and the Credibility of Belief

david_myers
Author

The conservative New York Times columnist Ross Douthat spoke for many in being astounded by “the sheer scale of the belief among conservatives that the [2020 presidential] election was really stolen,” which he attributed partly to “[a] strong belief [spurring] people to go out in search of evidence” for what they suppose.

Douthat alluded to confirmation bias—our well-established tendency, when assessing our beliefs, to seek information that supports rather than challenges them.

What’s the basis for this big idea, which has become one of social psychology’s gifts to public awareness? And should appreciating its power to sustain false beliefs cause us to doubt our own core beliefs?

In a pioneering study that explored our greater eagerness to seek evidence for rather than against our ideas, psychologist Peter Wason gave British university students a set of three numbers (2-4-6) and told them that the series illustrated a rule. Their task was to discover the rule by generating their own three-number sequences, each of which Wason would confirm either did or didn’t conform to the rule. After testing enough sequences to feel certain they had the rule, they were to announce it.

Imagine being one of Wason’s study participants. What might you suppose the rule to be, and what number strings might you offer to test it?

The outcome? Most participants, though seldom right, were never in doubt. Typically, they would form a wrong idea (such as “counting by twos?”) and then test it by searching for confirming evidence: “4-6-8?” “Yes, that conforms.” “20-22-24?” “Yes.” “200-202-204?” “Yes again.” “Got it. It’s counting by twos.” To discover Wason’s actual rule (any three ascending numbers), the participants should also have attempted to disconfirm their hunch by imagining and testing alternative ideas.
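For readers who want to see the logic concretely, here is a minimal sketch in Python (an illustrative simulation, not part of Wason’s study; the probe sequences are hypothetical examples in the spirit of those above):

```python
# A sketch of Wason's 2-4-6 task. The hidden rule is "any three ascending
# numbers," but a tester who probes only hunch-confirming sequences never learns that.

def conforms(seq):
    """Wason's actual hidden rule: any three ascending numbers."""
    a, b, c = seq
    return a < b < c

def fits_hunch(seq):
    """The participant's hunch: counting by twos."""
    a, b, c = seq
    return b - a == 2 and c - b == 2

# Confirming strategy: probe only sequences that fit the hunch.
for probe in [(4, 6, 8), (20, 22, 24), (200, 202, 204)]:
    assert fits_hunch(probe)
    print(probe, "->", conforms(probe))   # all True: the wrong hunch survives

# Disconfirming strategy: probe sequences that violate the hunch.
for probe in [(1, 2, 3), (10, 20, 35), (6, 4, 2)]:
    assert not fits_hunch(probe)
    print(probe, "->", conforms(probe))   # True, True, False: only "ascending" matters
```

Only the second loop yields informative surprises: sequences that break the hunch yet still conform, which point to the simpler rule.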

Confirmation bias also affects our social beliefs. In several experiments, researchers Mark Snyder and William Swann asked participants to pose questions that would reveal whether another person was extraverted. The participants’ typical strategy was to seek information that would confirm extraversion. They were more likely to ask “What would you do if you wanted to liven things up at a party?” than “What factors make it hard for you to really open up to people?” Those assessing introversion did the reverse. Thus, participants typically detected in a person whatever trait they were assessing. Seek and ye shall find.

In everyday life, too, once having formed a belief—that vaccines cause autism, that people can choose or change their sexual orientation, that the election was rigged—we prefer and seek information that verifies our belief.

The phenomenon is politically bipartisan. Across various issues, both conservatives and liberals avoid learning the other side’s arguments about topics such as climate change, guns, and same-sex marriage. If we believe that systemic racism is (or is not) rampant, we will gravitate toward news sources, Facebook friends, and evidence that confirms our view, and away from sources that do not. Robert Browning understood: “As is your sort of mind, / So is your sort of search: you’ll find / What you desire.”

Confirmation bias complements another idea from social psychology—belief perseverance, a sister form of motivated reasoning. In one provocative experiment, a Stanford research team led by Craig Anderson invited students to consider whether risk-takers make good or bad firefighters. Half viewed cases of a venturesome person succeeding as a firefighter, and a cautious person not succeeding; the other half viewed the reverse. After the students formed their conclusion, the researchers asked them to explain it. “Of course,” one group reflected, “risk-takers are braver.” To the other group, the opposite explanation seemed equally obvious: “Cautious people have fewer accidents.”

When informed that the cases they’d viewed were fake news made up for the experiment, did the students now return to their pre-experiment neutrality? No—because after the fake information was discredited, the students were left with their self-generated explanations of why their initial conclusion might be true. Their new beliefs, having grown supporting legs, thus survived the discrediting. As the researchers concluded, “People often cling to their beliefs to a considerably greater extent than is logically or normatively warranted.”

So, does confirmation bias + belief perseverance preclude teaching an old dogma new tricks? Does pondering our beliefs, and considering why they might be true, close us to dissonant truths? Mindful of the self-confirming persistence of our beliefs (whether true or false), should we therefore doubt everything?

Once a belief has formed, it does take more compelling persuasion to change it (“election fraud was rampant”) than it took to create it. But there are at least two reasons we need not succumb to a nihilistic belief in nothing.

First, evidence-based critical thinking works. Some evidence will change our thinking. If I believe that Reno is east of Los Angeles, that Atlanta is east of Detroit, and that Rome is south of New York, a look at a globe will persuade me that I am wrong, wrong, and wrong. I may once have supposed that child-rearing techniques shape children’s personalities, that the crime rate has been rising for years, or that traumatic experiences get repressed, but evidence has shown me otherwise. None of us is an infallible little god; thankfully, we all have at least some capacity for intellectual humility.
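Since that paragraph turns on checkable facts, here is a toy Python sketch of the globe-check (illustrative only; the coordinates are approximate):

```python
# Approximate city coordinates: (latitude °N, longitude °E); west longitudes negative.
coords = {
    "Los Angeles": (34.05, -118.24),
    "Reno":        (39.53, -119.81),
    "Detroit":     (42.33,  -83.05),
    "Atlanta":     (33.75,  -84.39),
    "New York":    (40.71,  -74.01),
    "Rome":        (41.90,   12.50),
}

def east_of(a, b):
    """True if city a lies east of city b (greater longitude)."""
    return coords[a][1] > coords[b][1]

def south_of(a, b):
    """True if city a lies south of city b (smaller latitude)."""
    return coords[a][0] < coords[b][0]

print(east_of("Reno", "Los Angeles"))   # False: Reno is actually west of LA
print(east_of("Atlanta", "Detroit"))    # False: Atlanta is actually west of Detroit
print(south_of("Rome", "New York"))     # False: Rome is actually north of New York
```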

Moreover, seeking evidence that might disconfirm our convictions sometimes strengthens them. I once believed that close, supportive relationships predict happiness, that aerobic exercise boosts mental health, and that wisdom and emotional stability grow with age—and the evidence now enables me to believe these things with even greater confidence. Curiosity is not the enemy of conviction.

Second, explaining a belief does not explain it away. Knowing why you believe something needn’t tell us anything about your belief’s truth or falsity. Consider: If the psychology of belief causes us to question our own beliefs, it can also cause others to question their opposing beliefs, which are themselves prone to confirmation bias and belief perseverance. Psychological science, for example, offers both a psychology of religion and a “psychology of unbelief” (an actual book title). If both fully complete their work—by successfully explaining both religion and irreligion—that leaves open the question of whether theism or atheism is true.

Archbishop William Temple recognized the distinction between explaining a belief and explaining it away when he was challenged after an Oxford address: “Well, of course, Archbishop, the point is that you believe what you believe because of the way you were brought up.” To which the archbishop replied, “That is as it may be. But the fact remains that you believe that I believe what I believe because of the way I was brought up, because of the way you were brought up.”

Finally, let’s remember: If we are left with uncertainty after welcoming both confirming and disconfirming evidence, we can still venture a commitment. As French author Albert Camus reportedly said, sometimes life beckons us to make a 100 percent commitment to something about which we are 51 percent sure—to a cause worth embracing, or even to a belief system that helps make sense of the universe, gives meaning to life, connects us in supportive communities, provides a mandate for morality and selflessness, and offers hope in the face of adversity and death.

So yes, belief perseverance solidifies newly formed ideas as invented rationales outlast the evidence that inspired them. And confirmation bias then sustains our beliefs as we seek belief-confirming evidence. Nevertheless, evidence-based thinking can strengthen true beliefs, or at least give us courage, amid lingering doubt, to make a reasoned leap of faith. As St. Paul advised, “Test everything; hold fast to what is good.”

(For David Myers’ other essays on psychological science and everyday life, visit TalkPsych.com; follow him on Twitter: @DavidGMyers.)

About the Author
David Myers has spent his entire teaching career at Hope College, Michigan, where he has been voted “outstanding professor” and has been selected by students to deliver the commencement address. His award-winning research and writings have appeared in over three dozen scientific periodicals and numerous publications for the general public. He also has authored five general audience books, including The Pursuit of Happiness and Intuition: Its Powers and Perils. David Myers has chaired his city's Human Relations Commission, helped found a thriving assistance center for families in poverty, and spoken to hundreds of college and community groups. Drawing on his experience, he also has written articles and a book (A Quiet World) about hearing loss, and he is advocating a transformation in American assistive listening technology (see www.hearingloop.org).