Skeptophilia (skep-to-fil-i-a) (n.) - the love of logical thought, skepticism, and thinking critically. Being an exploration of the applications of skeptical thinking to the world at large, with periodic excursions into linguistics, music, politics, cryptozoology, and why people keep seeing the face of Jesus on grilled cheese sandwiches.

Thursday, August 30, 2018

Going to the source

One of the hardest things for skeptics to fight is the tendency of some people to swallow any damnfool thing they happen to see online.

I had credited this tendency to gullibility.  If you see a catchy meme claiming that drinking a liter of vinegar a day will cure your arthritis ("Doctors hate this!  Get well with this ONE WEIRD TRICK!") and find it plausible, it's simply because you don't have the background in science (or logic) to sift fact from fiction.

It turns out, the truth is apparently more complex than this.

According to a trio of psychologists working at the Johannes Gutenberg University Mainz and the Université Libre de Bruxelles, the problem isn't that silly ideas sound plausible to some people; it's that their mindset causes them to weight all information sources equally -- that one guy's blog is just as reliable as a scientific paper written by experts in the field.

(And yes, I'm fully aware of the irony of One Guy writing that in his blog.)

[Image licensed under the Creative Commons Karen Thibaut, Belmans in labo, CC BY-SA 3.0]

The paper, "Using Power as a Negative Cue: How Conspiracy Mentality Affects Epistemic Trust in Sources of Historical Knowledge," was written by Roland Imhoff, Pia Lamberty, and Olivier Klein, and appeared in the Personality and Social Psychology Bulletin a couple of months ago.  The authors write:
Classical theories of attitude change point to the positive effect of source expertise on perceived source credibility and persuasion, but there is an ongoing societal debate on the increase in anti-elitist sentiments and conspiracy theories regarding the allegedly untrustworthy power elite.  In one correlational and three experimental studies, we tested the novel idea that people who endorse a conspiratorial mind-set (conspiracy mentality) indeed exhibit markedly different reactions to cues of epistemic authoritativeness than those who do not: Whereas the perceived credibility of powerful sources decreased with the recipients’ conspiracy mentality, that of powerless sources increased independent of and incremental to other biases, such as the need to see the ingroup in particularly positive light.  The discussion raises the question whether a certain extent of source-based bias is necessary for the social fabric of a highly complex society.
So people with a "conspiracy mentality" fall for conspiracies not because they're ignorant or gullible, but because their innate distrust of authority figures causes them to trust everyone equally -- they often frame it as being "open-minded" or "unbiased" -- regardless of the credentials, background, expertise, or (even) sanity of the source.

In an interview in PsyPost, study co-author Roland Imhoff explained the angle they took on this perplexing social issue:
The very idea for the study was born in a joint discussion with my co-author Olivier Klein at a conference of social psychological representations of history.  We were listening to talks about all kinds of construals, biases and narratives about what happened in the ancient or not so ancient past.  Having the public debate about ‘alternative facts’ from after Trump’s inauguration still in the back of our minds, we wondered: how do we even know what we know, how do we know who to trust when it comes to events we all have not experienced in first person?
While previous research had insisted that this is predominantly a question of trusting ingroup sources (i.e., my government, my national education institutions), we had a lingering suspicion that people who endorse conspiracy theories might have a different system of epistemic trust: not trusting those who are in power (and allegedly corrupt).
Which points out a problem I'd always found baffling -- why, to many people, is "being an intellectual elite" a bad thing?  It was one of the (many) epithets I heard hurled at Barack Obama -- that being Harvard-educated, he couldn't possibly care about, or even be aware of, the problems of ordinary middle-class America.  This card was played the other way by George W. Bush.  He was a "regular guy," the type of fellow you could enjoy having a beer with on Saturday night and discussing the latest sports statistics.

And my thought was: don't you want our leaders to be smarter than you are?  I mean, seriously.  I know that I and the guys I have a beer with on Saturday night aren't qualified to run the country.  (And to my bar buddies, no disrespect intended.)  There's no way in hell I'm smart enough to be president.  One of the things I want in the people we elect to office is that they are smart -- smart enough to make good decisions based on actual factual knowledge.

That, apparently, is not the norm, which the election of Donald Trump -- clearly one of the least-qualified people ever to hold the highest office in the land -- illustrated with painful clarity.  But it wasn't only a flip of the middle finger at the Coastal Elites that got him there.  The study by Imhoff et al. suggests it was also driven by a pervasive tendency to treat all sources of information as if they were equal.

"[T]he data consistently suggests [people with a conspiracy mentality] just ignore source characteristics," Imhoff said.  "To them a web blog is as trustworthy as an Oxford scholar.  As we have formulated, they have terminated the social contract of epistemic trust, that we should believe official sources more than unofficial ones."

I blame part of this on people like Rush Limbaugh, Sean Hannity, Ann Coulter, and (of course) Alex Jones, who have gone out of their way for years to convince everyone that the powers-that-be are lying to them about everything.  Now, the powers-that-be do lie sometimes.  Also, being an Oxford scholar is no guarantee against being wrong.  But if you cherry-pick your examples, and then act as if those instances of error or dishonesty are not only universal, but are deliberate attempts to hoodwink the public for nefarious purposes -- you've set up a vicious cycle where the more facts and evidence you throw at people, the less they trust you.

As I've pointed out before: if you can teach people to disbelieve the hard data, it's Game Over.  After that, you can convince them of anything.

******************************************

This week's Skeptophilia book recommendation is from one of my favorite thinkers -- Irish science historian James Burke.  Burke has made several documentaries, including Connections, The Day the Universe Changed, and After the Warming -- the last-mentioned an absolutely prescient investigation into climate change that came out in 1991 and predicted damn near everything that would happen, climate-wise, in the twenty-seven years since then.

I'm going to go back to Burke's first really popular book, the one that was the genesis of the TV series of the same name -- Connections.  In this book, he looks at how one invention, one happenstance occurrence, one accidental discovery, leads to another, and finally results in something earthshattering.  (One of my favorites is how the technology of hand-weaving led to the invention of the computer.)  It's simply great fun to watch how Burke's mind works -- each of his little filigrees is only a few pages long, but you'll learn some fascinating ins and outs of history as he takes you on these journeys.  It's an absolutely delightful read.

[If you purchase the book from Amazon using the image/link below, part of the proceeds goes to supporting Skeptophilia!]



