Skeptophilia (skep-to-fil-i-a) (n.) - the love of logical thought, skepticism, and thinking critically. Being an exploration of the applications of skeptical thinking to the world at large, with periodic excursions into linguistics, music, politics, cryptozoology, and why people keep seeing the face of Jesus on grilled cheese sandwiches.
Showing posts with label Roland Imhoff. Show all posts

Thursday, August 30, 2018

Going to the source

One of the hardest things for skeptics to fight is the tendency by some people to swallow any damnfool thing they happen to see online.

I had credited this tendency to gullibility.  If you see a catchy meme implying that if you drink a liter of vinegar a day, your arthritis will be cured ("Doctors hate this!  Get well with this ONE WEIRD TRICK!"), and think it sounds plausible, it's just because you don't have the background in science (or logic) to sift fact from fiction.

It turns out, the truth is apparently more complex than this.

According to a trio of psychologists working at the Johannes Gutenberg University Mainz and the Université Libre de Bruxelles, the problem isn't that silly ideas sound plausible to some people; it's that their mindset causes them to weight all information sources equally -- that one guy's blog is just as reliable as a scientific paper written by experts in the field.

(And yes, I'm fully aware of the irony of One Guy writing that in his blog.)

[Image licensed under the Creative Commons Karen Thibaut, Belmans in labo, CC BY-SA 3.0]

The paper, "Using Power as a Negative Cue: How Conspiracy Mentality Affects Epistemic Trust in Sources of Historical Knowledge," was written by Roland Imhoff, Pia Lamberty, and Olivier Klein, and appeared in the Personality and Social Psychology Bulletin a couple of months ago.  The authors write:
Classical theories of attitude change point to the positive effect of source expertise on perceived source credibility persuasion, but there is an ongoing societal debate on the increase in anti-elitist sentiments and conspiracy theories regarding the allegedly untrustworthy power elite.  In one correlational and three experimental studies, we tested the novel idea that people who endorse a conspiratorial mind-set (conspiracy mentality) indeed exhibit markedly different reactions to cues of epistemic authoritativeness than those who do not: Whereas the perceived credibility of powerful sources decreased with the recipients’ conspiracy mentality, that of powerless sources increased independent of and incremental to other biases, such as the need to see the ingroup in particularly positive light.  The discussion raises the question whether a certain extent of source-based bias is necessary for the social fabric of a highly complex society.
So people with a "conspiracy mentality" fall for conspiracies not because they're ignorant or gullible, but because their innate distrust of authority figures causes them to trust everyone equally -- they often frame it as being "open-minded" or "unbiased" -- regardless of the credentials, background, expertise, or (even) sanity of the source.

In an interview in PsyPost, study co-author Roland Imhoff explained the angle they took on this perplexing social issue:
The very idea for the study was born in a joint discussion with my co-author Olivier Klein at a conference of social psychological representations of history.  We were listening to talks about all kinds of construals, biases and narratives about what happened in the ancient or not so ancient past.   Having the public debate about ‘alternative facts’ from after Trump’s inauguration still in the back of our minds, we wondered: how do we even know what we know, how do we know who to trust when it comes to events we all have not experienced in first person? 
While previous research had insisted that this is predominantly a question of trusting ingroup sources (i.e., my government, my national education institutions), we had a lingering suspicion that people who endorse conspiracy theories might have a different system of epistemic trust: not trusting those who are in power (and allegedly corrupt).
Which points out a problem I'd always found baffling -- why, to many people, is "being an intellectual elite" a bad thing?  It was one of the (many) epithets I heard hurled at Barack Obama -- that being Harvard-educated, he couldn't possibly care about, or even be aware of, the problems of ordinary middle-class America.  Conversely, this card was played the other way by George W. Bush.  He was a "regular guy," the type of fellow you could enjoy having a beer with on Saturday night while discussing the latest sports statistics.

And my thought was: don't you want our leaders to be smarter than you are?  I mean, seriously.  I know that I and the guys I have a beer with on Saturday night aren't qualified to run the country.  (And to my bar buddies, no disrespect intended.)  There's no way in hell I'm smart enough to be president.  One of the things I want in the people we elect to office is that they are smart -- smart enough to make good decisions based on actual factual knowledge.

That, apparently, is not the norm, which the election of Donald Trump -- clearly one of the least-qualified people ever to hold the highest office in the land -- illustrated with painful clarity.  But it wasn't only a flip of the middle finger at the Coastal Elites that got him there.  The study by Imhoff et al. suggests that it was because of a pervasive tendency to treat all sources of information as if they were equal.

"[T]he data consistently suggests [people with a conspiracy mentality] just ignore source characteristics," Imhoff said.  "To them a web blog is as trustworthy as an Oxford scholar.  As we have formulated, they have terminated the social contract of epistemic trust, that we should believe official sources more than unofficial ones."

I blame part of this on people like Rush Limbaugh, Sean Hannity, Ann Coulter, and (of course) Alex Jones, who have gone out of their way for years to convince everyone that the powers-that-be are lying to you about everything.  Now, the powers-that-be do lie sometimes.  Also, being an Oxford scholar is no guarantee against being wrong.  But if you cherry-pick your examples, and then act as if those instances of error or dishonesty are not only universal, but are deliberate attempts to hoodwink the public for nefarious purposes -- you've set up a vicious cycle where the more facts and evidence you throw at people, the less they trust you.

As I've pointed out before: if you can teach people to disbelieve the hard data, it's Game Over.  After that, you can convince them of anything.

******************************************

This week's Skeptophilia book recommendation is from one of my favorite thinkers -- Irish science historian James Burke.  Burke has made several documentaries, including Connections, The Day the Universe Changed, and After the Warming -- the last-mentioned an absolutely prescient investigation into climate change that came out in 1991 and predicted damn near everything that would happen, climate-wise, in the twenty-seven years since then.

I'm going to go back to Burke's first really popular book, the one that was the genesis of the TV series of the same name -- Connections.  In this book, he looks at how one invention, one happenstance occurrence, one accidental discovery, leads to another, and finally results in something earthshattering.  (One of my favorites is how the technology of hand-weaving led to the invention of the computer.)  It's simply great fun to watch how Burke's mind works -- each of his little filigrees is only a few pages long, but you'll learn some fascinating ins and outs of history as he takes you on these journeys.  It's an absolutely delightful read.





Thursday, May 31, 2018

In on the secret

In yesterday's post, we considered how a feeling of being in power can dull people's capacity for empathy and compassion.  Today, we're going to look at how a desperation to feel unique, smart, and "in the know" can lead people to believe in baseless conspiracy theories.

Which once again brings us to Donald Trump.

The day before yesterday, Trump launched into new and unexplored vistas of paranoia by claiming that Robert Mueller's investigation of the Trump campaign's alleged collusion with Russian agents is itself going to meddle in the midterm elections this fall -- in order to favor Democrats.

Let's start with the fact that it'd be pretty odd if Mueller did this, because he's a registered Republican.  Not that Trump accepts this, either; every other tweet claims that anyone connected to the Russia investigation must be a Democrat, and apparently, he (through his mouthpieces over at Fox News) has convinced his followers that the whole thing is just a big Democratic conspiracy.  If you don't believe me, here's the exact quote:
The 13 Angry Democrats (plus people who worked 8 years for Obama) working on the rigged Russia Witch Hunt, will be MEDDLING with the mid-term elections, especially now that Republicans (stay tough!) are taking the lead in Polls. There was no Collusion, except by the Democrats!
It's an open question whether Trump himself believes this, or if he's manipulating his fan base in a calculated fashion so that he and his cronies can stay in power.  What's certain, though, is that his supporters believe it.  Never mind that there's no evidence; never mind that the facts themselves argue against its being true.

It's like he's Jesus, you know?  The new bumper sticker should say, "Trump said it, I believe it, and that settles it."

Which I find pretty mystifying.  I know there's some sunk-cost fallacy going on here; these people have already put an inordinate amount of time and energy into getting this guy elected, so to admit now that it's all been a big mistake is a bridge too far.  Easier to believe that Dear Leader is being targeted by a shadowy Deep State cohort of evil doers.

But some recent research has found that there are two other reasons people fall for conspiracies.  And what they suggest is a little frightening.

A study by Daniel Sullivan, Mark J. Landau, and Zachary K. Rothschild in the Journal of Personality and Social Psychology found that people's inclination to believe in conspiracies correlates negatively with their sense of being in control of their circumstances.  People who think that their lives are controlled by unpredictable and chaotic events -- the weather, natural disasters, random crime, arbitrary decisions by leaders -- are more likely to believe that there are evil conspiracies at work.  Which makes sense; if you feel like you're in control of your destiny, it makes less sense that there are Puppet Masters pulling your strings.

But there's more to it.  According to a recent study by Roland Imhoff and Pia Karoline Lamberty of the Johannes Gutenberg Universität Mainz that appeared in the European Journal of Social Psychology, a belief in conspiracy theories also correlates strongly with a need to feel unique.  The authors write:
Adding to the growing literature on the antecedents of conspiracy beliefs, this paper argues that a small part in motivating the endorsement of such seemingly irrational beliefs is the desire to stick out from the crowd, the need for uniqueness.  Across three studies, we establish a modest but robust association between the self‐attributed need for uniqueness and a general conspirational mindset (conspiracy mentality) as well as the endorsement of specific conspiracy beliefs.  Following up on previous findings that people high in need for uniqueness resist majority and yield to minority influence, [our research] experimentally shows that a fictitious conspiracy theory received more support by people high in conspiracy mentality when this theory was said to be supported by only a minority (vs. majority) of survey respondents.  Together, these findings support the notion that conspiracy beliefs can be adopted as a means to attain a sense of uniqueness.
Imhoff, writing about his and Lamberty's research in the online magazine Quartz, says:
Belief in conspiracies can serve to set oneself apart from the ignorant masses—a self-serving boast about one’s exclusive knowledge.  Adherence to conspiracy theory might not always be the result of some perceived lack of control, but rather a deep-seated need for uniqueness...  [Consider] the often vocal, evangelising conduct of actual conspiracy theorists, their claims to superior insight, and their degradation of non-believers as ignorant sheep (German conspiracy theorists label the uninformed masses Schlafschaf, literally ‘sleepsheep’).
So the reason people fall for Donald Trump's wild conspiratorial claims, and those of other big names in the conspiracy theory world (such as Alex Jones and David Icke), is largely (1) that they feel powerless in their own lives, so someone must be causing the bad shit that happens, and (2) that they have a deep desire to be one of the ones who has it all figured out.

[Image licensed under the Creative Commons Christopher DOMBRES, CONSPIRACY THEORIES, CC BY-SA 4.0]

Whether Trump believes his own lunatic tweets, then, turns out to be irrelevant.  In the minds of his followers, it creates a rather horrifying trifecta of irrationality -- revering a figure who has become a stand-in for God himself, feeling like there are powerful forces responsible for all of the negative things in the world (including the attacks on Dear Leader), and desperately wanting not to be duped.  And the irony is, the direct result is that they are being duped, by a guy who was in over his head from day one and has made one blitheringly idiotic move after another, all the while claiming that any negative reaction is "Fake News" and any bad outcomes are because his Grand Plans are being subverted by either the Deep State or the Democrats, depending on what day it is.

The worst part is I don't know what the hell you can do about this.  As Imhoff put it: "Seeing evil plots at play behind virtually any world event is not only an effort to make sense of the world.  It can also be gratifying in and of itself: It grants one the allure of exclusive knowledge that sets one apart from the sleeping sheep."

Which can be coupled with the observation I've made here more than once that you can't logic your way out of a belief you didn't logic your way into.

************************

This week's recommended book is one that blew me away when I first read it, upon the urging of a student.  By groundbreaking neuroscientist David Eagleman, Incognito is a brilliant and often astonishing analysis of how our brains work.  In clear, lucid prose, Eagleman probes the innermost workings of our nervous systems -- and you'll learn not only how sophisticated they are, but how easily they can be fooled.