Skeptophilia (skep-to-fil-i-a) (n.) - the love of logical thought, skepticism, and thinking critically. Being an exploration of the applications of skeptical thinking to the world at large, with periodic excursions into linguistics, music, politics, cryptozoology, and why people keep seeing the face of Jesus on grilled cheese sandwiches.

Monday, May 16, 2016

Analyzing the backfire

One of the most frustrating phenomena for us skeptics is the backfire effect.

The backfire effect is the documented tendency of people, when confronted with logic or evidence against their beliefs, to hold those beliefs more strongly afterward.  Being presented with a good argument, apparently, often has exactly the opposite effect from what we'd want.

This is understandably difficult for people like me, whose writing centers around getting people to reconsider their understanding of the universe using the tools of rationality and critical thinking.  But some recently released research has given us at least some comprehension of why the backfire effect occurs.

Entitled "Identity and Epistemic Emotions during Knowledge Revision: A Potential Account for the Backfire Effect," by Gregory J. Trevors, Krista R. Muis, Reinhard Pekrun, Gale M. Sinatra, and Philip H. Winne, the research was published a couple of months ago in the journal Discourse Processes.  The researchers designed an intriguing test to demonstrate not only that the backfire effect occurs (something that has, after all, been known for years) but to give us some understanding of what causes it.

Prior research had suggested that the resistance we have to changing our understanding comes from the fact that being challenged brings up the whole network of why we had those beliefs in the first place.  In effect, it reminds us of why we think what we do instead of triggering us to reconsider.  The result is a mental arms race -- a contest between what we already believed and the new information.  Given that the new information is usually understood more weakly, the old framework usually wins, and the fact of its having been considered and retained gives it the sense of being even more strongly correct than it was before.

The new research by Trevors et al. focuses on a different facet of this frustrating tendency.  What their study shows is that it is the emotion elicited by being challenged that triggers the backfire effect.  When we feel that our beliefs, and therefore (on some level) our core identity, are being attacked, the negative emotions that arise cause us to shy away and cling to our prior understanding.

Specifically, what the team did was to look at people's attitudes toward GMOs, a subject rife with misinformation and sensationalistic appeals to fear.  They first assessed the participants' attitudes toward GMOs, then gave them an assessment to gauge how strongly they felt about the issue of dietary purity.  They then gave the participants a passage to read that argued against the anti-GMO position, and afterwards asked them questions designed to measure not only how (or if) their ideas had changed, but how they responded emotionally while reading the passage.

[image courtesy of photographer Rosalee Yagihara and the Wikimedia Commons]

Perhaps unsurprisingly, the anti-GMOers who ranked dietary purity as a strong motivator were the most angered by reading the passage -- and they experienced the backfire effect the most strongly.  The weaker the emotional response, even if the participant was anti-GMO to begin with, the smaller the backfire.

I'm not sure that this is heartening.  So many of the ideas that we skeptics fight are deeply ingrained in people's ideas about how the world works -- and therefore, on some level, entangled with their core identity.  To quote the Research Digest of the British Psychological Society, which reviewed the study:
If persuasion is most at risk of backfire when identity is threatened, we may wish to frame arguments so they don’t strongly activate that identity concept, but rather others.  And if, as this research suggests, the identity threat causes problems through agitating emotion, we may want to put off this disruption until later: Rather than telling someone (to paraphrase the example in the study) "you are wrong to think that GMOs are only made in labs because…", arguments could firstly describe cross-pollination and other natural processes, giving time for this raw information to be assimilated, before drawing attention to how this is incompatible with the person's raw belief – a stealth bomber rather than a whizz-bang, so to speak.
Which is hard to do when the emotional charge on both sides is strong, as is so often the case.  The bottom line, though, is that we humans are fundamentally not particularly rational creatures -- something worth remembering when we are trying to change minds.

1 comment:

  1. I've been reading Scott Adams' (of Dilbert fame) blog for the past several months, as he has an interesting perspective on Donald Trump. He calls him a Master Persuader. Apparently, Scott has studied persuasion and hypnosis, and he recognizes that Trump is doing things right in the persuasion area. One of the things Scott mentioned when discussing how persuasion works is that identity beats reason. So this study is right in line with what Scott says. Being a skeptic myself, I wonder how we as skeptics might use this information to start changing the messages we put out so that we can persuade more people.