Brain Illusions: The Backfire Effect from a Neuroscience Perspective
What Is the Backfire Effect?
One of the most fascinating cognitive biases is the backfire effect, which is a consequence of the broader psychological phenomenon known as group polarization. Group polarization occurs when people with opposing views interpret new information in a biased way. The interpretation of facts depends on each person’s prior beliefs and attitudes. As a result, when confronted with objective reality, people’s opinions can diverge even further.
While the effects of this cognitive bias are well-documented, scientists have sought to understand its underlying mechanisms. What happens in the brain when someone encounters facts that contradict their beliefs? Why do people sometimes reject facts and become even more entrenched in their views, displaying the backfire effect?
Group Polarization Explained
A classic experiment on opinion polarization works as follows. Participants are told that one basket contains 60% red balls and 40% black balls, while another basket holds the opposite ratio. Balls are then drawn one at a time, including a ball of a third color (e.g., white), and participants estimate which basket each ball came from. One group must state its estimate aloud after each draw, while another group gives its opinion only at the end.
The experiment showed that the first group became increasingly confident that the white balls came from a specific basket, leading to stronger polarization. The “silent” group did not show this effect. Researchers believe that polarization occurs when people are required to state their opinions publicly, making those opinions more extreme than decisions made privately. Some experts also suggest that people can reinforce their beliefs simply by thinking about them, even without new evidence.
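The probabilistic core of the basket task can be sketched as a simple Bayesian update. The helper below is illustrative only, not the experimenters' procedure; it assumes draws of the red and black balls the baskets are described as containing, and a 50/50 prior over the two baskets:

```python
# Illustrative sketch (not the original experiment's procedure) of the
# Bayesian updating the basket task implies. Basket A is assumed to hold
# 60% red / 40% black balls and basket B the reverse.

def posterior_basket_a(draws, p_red_a=0.6, prior_a=0.5):
    """Probability that the observed draws came from basket A."""
    p_red_b = 1 - p_red_a
    like_a = like_b = 1.0
    for color in draws:
        if color == "red":
            like_a *= p_red_a
            like_b *= p_red_b
        else:  # "black"
            like_a *= 1 - p_red_a
            like_b *= 1 - p_red_b
    num = like_a * prior_a
    return num / (num + like_b * (1 - prior_a))

# Each additional red draw shifts the posterior further toward basket A:
print(round(posterior_basket_a(["red"]), 3))         # 0.6
print(round(posterior_basket_a(["red", "red"]), 3))  # 0.692
```

A rational observer's confidence grows smoothly with the evidence; the polarization found in the experiment goes beyond what this updating alone would produce.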
The Backfire Effect in Action
The backfire effect is a cognitive bias that occurs in an individual brain, either during group polarization or independently. The term “backfire effect” was first used by Brendan Nyhan and Jason Reifler in their 2006 paper “When Corrections Fail: The Persistence of Political Misperceptions,” with a revised version published in Political Behavior in 2010 (doi: 10.1007/s11109-010-9112-2).
In one experiment, participants read a news article containing a false claim: that Iraq possessed weapons of mass destruction before the U.S. invasion. One group received only the false information, while another group received the same article with a correction appended at the end. Participants were then asked factual questions and for their opinions. The article included a real quote from President Bush in October 2004 that implied the existence of such weapons.
Another study tested whether public support for the Iraq invasion was driven by fear of death after 9/11 and repeated media references to mortality (“mortality salience”).
The results largely confirmed the backfire effect. Corrections had little to no effect on average, but when political views were considered, a clear polarization emerged. People with very liberal views were less likely to believe the false claim after seeing the correction, while conservatives paradoxically became even more convinced that Iraq had weapons of mass destruction. In other words, official corrections only strengthened their original beliefs.
Corrections had no statistically significant effect on people with moderately liberal or centrist views. Researchers highlighted the surprising effect corrections had on conservatives—those whose core beliefs were challenged. This is a clear demonstration of the backfire effect.
Experts interpret this as a difference in trust toward information sources. People who experience the backfire effect likely trust the source of the false information more than the source of the correction. Receiving new, truthful information from a less-trusted source only reinforces their trust in the original, false source and their pre-existing beliefs.
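This trust-based account can be illustrated with a toy Bayesian model (our own illustration, not a model from the paper). If a listener believes a source asserts true statements with probability t, then a denial from a source with t below 0.5 is, mathematically, evidence *for* the claim:

```python
# Toy model (not from Nyhan & Reifler's paper) of how low trust in the
# correcting source can make a correction backfire. A source with trust t
# is assumed to assert a true statement with probability t.

def update(prior, source_affirms, trust):
    """Posterior probability the claim is true after the source speaks."""
    if source_affirms:
        like_true, like_false = trust, 1 - trust
    else:  # the source denies the claim
        like_true, like_false = 1 - trust, trust
    num = like_true * prior
    return num / (num + like_false * (1 - prior))

belief = 0.7  # prior belief in the false claim
# A distrusted corrector (trust 0.3) denies the claim:
print(round(update(belief, source_affirms=False, trust=0.3), 3))  # 0.845
```

Here the denial from a distrusted source raises belief in the claim from 0.7 to about 0.845; a trusted corrector (trust above 0.5) would lower it instead.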
Subsequent experiments have confirmed the backfire effect as a cognitive bias, especially among people with deeply held beliefs. When confronted with information that contradicts their views, they become even more entrenched.
fMRI Results: The Brain’s Response to Contradictory Information
In 2016, neuroscientists Jonas T. Kaplan, Sarah I. Gimbel, and Sam Harris from the University of Southern California’s Brain and Creativity Institute conducted an fMRI study on people with strong political beliefs. Participants were shown facts that contradicted their beliefs while their brain activity was monitored. The researchers found that the same brain areas activated as when a person faces a physical threat. The results were published in Scientific Reports (doi: 10.1038/srep39589).
In the published brain scans, one set of areas was activated when participants encountered facts contradicting their political beliefs, and a different set when non-political beliefs were challenged.
In simple terms, during political arguments, the rational part of the brain can “shut down.” When people are faced with the possibility that their political beliefs might be wrong, they react instinctively, as if facing a physical threat.
“The reaction we see in the brain is very similar to what would happen if someone encountered a bear in the woods,” explained Sarah Gimbel in the podcast You Are Not So Smart — 93. The Backfire Effect — Part One. “Your brain generates an instant, automatic fight-or-flight response, and your body prepares to defend itself.”
According to the researchers, some values are so central to a person’s identity that the brain treats abstract ideas as threats to physical existence. “Remember, the brain’s first and foremost job is protection,” says Jonas Kaplan. “The brain is a big, complex, and sophisticated self-defense machine—not just for physical protection, but for psychological self-defense as well. Once something becomes part of our psychological self-identity, I think it falls under the same protective mechanisms the brain uses for the body.”
Modern psychology and neuroscience have studied how neutral facts can become part of a person’s psychological self-identity, sometimes intentionally through state ideology. Even technical or scientific topics, like air temperature or CO2 levels, can become politicized and trigger these mechanisms.
How the Brain Changes Beliefs
The USC researchers also studied what happens in the brain when beliefs change. They found a small area in the orbitofrontal cortex whose activity correlated positively with the degree of belief change, and an area in the dorsomedial prefrontal cortex whose activity correlated negatively with it. The average degree of belief change also varied by topic.
Despite individual differences, the threat response to political beliefs was similar across participants. For non-political beliefs, those more resistant to changing their minds showed greater activity in the dorsal anterior cingulate cortex and the amygdala when processing contradictory information. Activity in the posterior insula and ventral anterior areas did not show a significant correlation with belief change.
Interestingly, the fMRI study did not show clear signs of the backfire effect. After being presented with facts, participants temporarily showed a slight decrease in conviction on political topics and a more significant decrease on non-political topics. A follow-up survey weeks later found that only the effect on non-political topics persisted.
Possible Therapy and Practical Advice
In the future, scientists may learn to help people with deeply entrenched political beliefs, such as those convicted of politically motivated crimes, by stimulating specific brain regions while providing truthful information. This could shift political beliefs out of the brain’s reflexive self-defense zone and engage rational thinking.
It’s important to understand the nature of cognitive biases and remember that the brain’s primary function is self-protection, not logical reasoning. If you encounter someone whose physiological defense mechanisms are activated, it’s crucial to reassure them that they are safe. This can reduce stress and normalize hormone levels.
During further conversation, avoid topics that are central to the person’s psychological self-identity and may trigger a defensive response. Instead, discuss pleasant or neutral topics that activate brain areas responsible for pleasure, memory, and rational thinking.
Researchers note that extreme cognitive inflexibility in the face of new information is not always maladaptive. There can be benefits to protecting one’s most useful beliefs, and changing mental models without good reason can also cause problems.