
Saturday, January 28, 2023

The roots of conspiracy

It's all too easy to dismiss conspiracy theorists as just being dumb, and heaven knows I've fallen into that trap often enough myself.

Part of the problem is that if you know any science, so many conspiracy theories just seem... idiotic.  That 5G cell towers cause COVID.  That eating food heated up in a microwave causes cancer.  As we just saw last week, that Satan's throne is located in Geneva and that's why the physicists at CERN are up to no good.

And sure, there's a measure of ignorance implicit in most conspiracy theories.  To believe that Buffalo Bills player Damar Hamlin's on-field collapse was caused by the COVID vaccine -- as both Charlie Kirk and Tucker Carlson stated -- you have to be profoundly ignorant about how vaccines work.  (This claim led to a rash of people on Twitter who demanded that anything with mRNA in it be officially banned, apparently without realizing that mRNA is in every living cell and is a vital part of your protein-production machinery.  And, therefore, it is not only everywhere in your body, it's present in every meat or vegetable you've ever consumed.)

But simple ignorance by itself doesn't explain it.  After all, we're all ignorant about a lot of stuff; you can't be an expert in everything.  I, for example, know fuck-all about business and economics, which is why it's a subject I never touch here at Skeptophilia (or anywhere else, for that matter).  I'm fully aware of my own lack of knowledge on the topic, and therefore anything I could say about it would carry no weight whatsoever.

Scientists have been trying for years to figure out why some people fall for conspiracies and others don't.  One theory that at least partially explains it is that conspiracy theorists tend to score higher than average on the "dark triad" of personality traits -- narcissism, Machiavellianism, and psychopathy -- but that isn't the whole answer, because there are plenty of people who score high on those assessments who don't espouse crazy ideas.

But now a psychologist at the University of Regina, Gordon Pennycook, thinks he has the right answer.

The defining characteristic of a conspiracy theorist isn't ignorance, narcissism, or sociopathy; it's overconfidence.

Pennycook designed a clever test to suss out people's confidence levels when given little to nothing to go on.  He showed volunteers photographs that were blurred beyond recognition, and asked them to identify what the subject of the photo was.  ("I don't know" wasn't an option; they had to choose.)  Then, afterward, they were asked to estimate the percentage of their guesses they thought they'd gotten right.

That self-assessment correlated strongly with belief in conspiracy theories: the more people overestimated their own accuracy, the more likely they were to endorse conspiratorial claims.
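For the quantitatively inclined, here's a toy sketch (in Python) of what that kind of analysis might look like.  To be clear, this is not Pennycook's code or data -- every number below is invented -- it just illustrates how an "overconfidence score" (estimated accuracy minus actual accuracy) can be defined and then correlated with a belief measure.

    # A toy illustration of the analysis described above -- NOT
    # Pennycook's actual code or data; all numbers are made up.
    import numpy as np
    from scipy.stats import pearsonr

    rng = np.random.default_rng(42)
    n = 200                      # hypothetical number of volunteers

    # Actual accuracy on the blurred-photo task: essentially chance,
    # since the photos were blurred beyond recognition.
    actual = rng.uniform(0.05, 0.15, n)

    # Self-estimated accuracy: actual performance plus a varying
    # amount of wishful thinking.
    estimated = actual + rng.beta(2.0, 5.0, n) * 0.8

    # Overconfidence = how far the self-estimate exceeds reality.
    overconfidence = estimated - actual

    # Invented conspiracy-belief scores, built so that higher
    # overconfidence tends to go with higher belief (the reported
    # finding).
    belief = 0.6 * overconfidence + rng.normal(0.0, 0.1, n)

    r, p = pearsonr(overconfidence, belief)
    print(f"r = {r:.2f}, p = {p:.3g}")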

"Sometimes you're right to be confident," Pennycook said.  "In this case, there was no reason for people to be confident...  This is something that's kind of fundamental.  If you have an actual, underlying, generalized overconfidence, that will impact the way you evaluate things in the world."

The danger, apparently, is not in simple ignorance, but in ignorance coupled with "of course I understand this."  It reminds me of the wonderful study done by Leonid Rozenblit and Frank Keil about a phenomenon called the illusion of explanatory depth -- that many of us have the impression we understand stuff when we actually have no idea.  (Rozenblit and Keil's examples were common things like the mechanisms of a cylinder lock and a flush toilet, how helicopters fly and maneuver, and how a zipper works.)  Most of us could probably venture a guess about those things, but would add, "... I think" or "... but I could be wrong." 

The people predisposed to belief in conspiracy theories, Pennycook says, are the ones who would never think of adding the disclaimer.

That kind of overconfidence, often crossing the line into actual arrogance, seems to be awfully common.  I was just chatting a couple of weeks ago with my athletic trainer about that -- he told me that all too often he runs into people who walk into his gym and proceed to tell him, "Here's what I think I should be doing."  I find that attitude baffling, and so does he.  I said to him, "Dude, I'm hiring you because you are the expert.  Why the hell would I pay you money if I already knew exactly how to get the results I want?"

He said, "No idea.  But you'd be surprised at how often people come in with that attitude."  He shook his head.  "They never last long here."

The open question, of course, is how you inculcate in people a realistic self-assessment of what they do know, and an awareness that there's lots of stuff about which they might not be right -- in other words, a sense of intellectual humility.  To some extent, I think the answer is in somehow getting them to do some actual research (i.e., not just a quick Google search to find Some Guy's Website that confirms what they already believed): reading scientific papers, finding out what the actual experts have discovered.

Failing that -- and admittedly, a lot of scientific papers are tough going for non-specialists -- at least reading a damn Wikipedia page on the topic.  Yeah, Wikipedia isn't perfect, but the quality has improved dramatically since it was founded in 2001.  If you want a quick overview of (for example) the Big Bang theory, just read the first few paragraphs of the Wikipedia page on the subject, wherein you will very quickly find that it does not mean what the creationists are so fond of saying: that "nothing exploded and made everything."

Speaking of being overconfident on a topic about which they clearly know next to nothing.

In any case, I'll just exhort my readers -- and I'm reminding myself of this as well -- always to keep in mind the phrase "I could be wrong."  And yes, that applies even to your most dearly held beliefs.  It doesn't mean actively doubting everything; I'm not trying to turn you into wishy-washy wafflers or, worse, outright cynics.  But periodically holding our own beliefs up to the cold light of evidence is never a bad thing.

As prominent skeptic (and professional stage magician) Penn Jillette so trenchantly put it: "Don't believe everything you think."
