The fact is, though, we're not controlled solely by the higher-cognitive parts of our brains. We are also at the mercy of our emotions and biases, not to mention a set of perceptual apparatuses that work well enough most of the time but are hardly without their own faults and (sometimes literal) blind spots.
This is where the backfire effect comes in. Psychologists Brendan Nyhan and Jason Reifler found that people, after being confronted with evidence against their prior beliefs, often espouse those beliefs even more strongly:
Nyhan and Reifler found a backfire effect in a study of conservatives. The Bush administration claimed that tax cuts would increase federal revenue (the cuts didn't have the promised effect). One group was offered a refutation of this claim by prominent economists that included current and former Bush administration officials. About 35 percent of conservatives told about the Bush claim believed it. The percentage of believers jumped to 67 when the conservatives were provided with the refutation of the idea that tax cuts increase revenue. (from The Skeptic's Dictionary)

As a blogger, this makes it hard to know how to approach controversial topics. By calmly and dispassionately citing evidence against silly claims, am I having the effect of making the True Believers double down on their position? If so, how could I approach things differently?
A study published this week in The Proceedings of the National Academy of Sciences suggests an answer. To convince people of the error of their ways, agree with them, strenuously, following their beliefs to whatever absurd end they drive you, and without once uttering a contrary word.
Psychologists Eran Halperin, Boaz Hameiri, and Roni Porat of the Interdisciplinary Center Herzliya in Israel were looking for a way to alter attitudes between Israelis and Palestinians -- a goal as monumental as it is laudable. Given the decades that have been spent in futile negotiations between these two groups, always approached from a standpoint of logic, rationality, and compromise, Halperin, Hameiri, and Porat decided to try a different tack.
One hundred fifty Israeli volunteers were split into two groups -- one was shown neutral commercials; the other, video clips that tied the Israeli/Palestinian conflict back to the values that form the foundation of Israeli self-identity. In particular, the clips were built on the ideas that Israel has a God-given right to exist and is the most deeply moral society in the world. But instead of taking the obvious approach -- that attacks against Palestinians (including innocent civilians) called into question the morality of the Israeli stance -- the videos followed these concepts to their logical conclusion: that the conflict should continue, even if innocent Palestinians died, because of Israel's inherent moral rectitude.
And attitudes changed. The authors of the study report that members of the experimental group showed a 30% higher willingness to reevaluate their positions on the issue, as compared to the control group. They showed a greater openness to discussion of the opposing side's narrative, and a greater likelihood of voting for moderate political candidates. And the attitude change didn't wear off -- the subjects still showed the same alteration in their beliefs a year later. Hameiri writes:
The premise of most interventions that aim to promote peacemaking is that information that is inconsistent with held beliefs causes tension, which may motivate alternative information seeking. However, individuals—especially during conflict—use different defenses to preserve their societal beliefs. Therefore, we developed a new paradoxical thinking intervention that provides consistent—though extreme—information, with the intention of raising a sense of absurdity but not defenses.

So apparently, Stephen Colbert is on the right track.
I find the whole thing fascinating, if a little frustrating. Being a science-geek-type, I have always lived in hope that rational argument and hard data would eventually win.
It appears, however, that it doesn't always win. It may be that for the deepest, most lasting changes in attitude, we have to take the beliefs we are trying to change, push them to their logical ends, and hope that the absurdity speaks for itself.