Skeptophilia (skep-to-fil-i-a) (n.) - the love of logical thought, skepticism, and thinking critically. Being an exploration of the applications of skeptical thinking to the world at large, with periodic excursions into linguistics, music, politics, cryptozoology, and why people keep seeing the face of Jesus on grilled cheese sandwiches.

Saturday, September 2, 2017

Political backfires

The good news from yesterday's post, wherein we learned some ways of fighting the backfire effect and convincing people to change their minds, was immediately counterbalanced by a new (and discouraging) study out of Denmark showing that for politicians, the more data they have access to, the worse the backfire effect becomes.

A team at Aarhus University led by Martin Baekgaard was studying motivated reasoning, which is the thought process we engage in when we are presented with information either supporting or refuting our prior beliefs.  In the first part of the experiment, test subjects were given test score data from two schools, A and B, and asked to evaluate which was more successful.  A different set of test subjects was given the same data, but one of the two schools was labeled "Public School A" and the other "Private School B" -- as in the United States, the relative merits of public vs. private schools are a topic of heated debate.

This first bit of research generated results that were unsurprising.  When the two schools were given anonymous tags, the data was evaluated fairly by both people who supported public schools and those who supported private schools.  When they were labeled, however, the backfire effect kicked in, and the test subjects' prior opinions skewed their analysis of the results.

So far, nothing we didn't already know.  But the second part of the experiment also varied the quantity of data provided, and compared the results of 1,000 test subjects from a variety of professions with those of 954 career politicians.  And this gave some results that were, to put it mildly, interesting.  Let me give it to you in the authors' own words:
Does evidence help politicians make informed decisions even if it is at odds with their prior beliefs?  And does providing more evidence increase the likelihood that politicians will be enlightened by the information?  Based on the literature on motivated political reasoning and the theory about affective tipping points, this article hypothesizes that politicians tend to reject evidence that contradicts their prior attitudes, but that increasing the amount of evidence will reduce the impact of prior attitudes and strengthen their ability to interpret the information correctly.  These hypotheses are examined using randomized survey experiments with responses from 954 Danish politicians, and results from this sample are compared to responses from similar survey experiments with Danish citizens.  The experimental findings strongly support the hypothesis that politicians are biased by prior attitudes when interpreting information.  However, in contrast to expectations, the findings show that the impact of prior attitudes increases when more evidence is provided.
Yes, you read that right.  Politicians, like other people, are prone to falling into the backfire effect.  But unlike the rest of us, the more data they're given, the worse the backfire effect becomes.  Show a politician additional evidence, and all you're doing is making sure that (s)he digs in even more firmly.

Baekgaard et al. propose a reason for this result, and I suspect they're correct: most politicians are, by their very nature, partisan, and have been elected because they strongly support a particular political agenda.  Since the backfire effect occurs when people double down on their beliefs because they feel threatened, it stands to reason that politicians -- whose jobs depend on their beliefs being right -- would experience a greater sense of threat upon finding they're wrong than the rest of us do.

But that leaves us with the rather alarming result that the people who are directing policy and making decisions for an entire electorate are going to be the ones whose response to the data is worst.

"The Great Presidential Puzzle" by James Albert Wales (1880) [image courtesy of the Wikimedia Commons]

And, of course, this result is borne out by what we see around us.  Here in the United States, it seems like every time new studies are performed and new data generated, the determination of politicians to shout "damn the facts, full speed ahead!" only gets stronger.  That may explain why any number of crazy policies have been implemented, ones that fly in the face of every rational argument there is.

But in the words of Charlie Brown, "Now that I know that, what do I do?"  And my answer is: beats the hell out of me.  As I said in a previous post, I think nothing's going to change until the voters wise up, and that won't happen until we have a more educated citizenry.

And heaven only knows what it'll take for that to come about.
