Skeptophilia (skep-to-fil-i-a) (n.) - the love of logical thought, skepticism, and thinking critically. Being an exploration of the applications of skeptical thinking to the world at large, with periodic excursions into linguistics, music, politics, cryptozoology, and why people keep seeing the face of Jesus on grilled cheese sandwiches.

Tuesday, June 17, 2025

The view from the fringe

We've dealt with a lot of conspiracy theories here at Skeptophilia.  Amongst the more notable:

It's easy to assume that all of these are born of a lack of factual knowledge and understanding of the principles of logical induction.  I mean, if you have even the most rudimentary grasp of how weather works, you can see that HAARP (the High-frequency Active Auroral Research Program, located in Alaska) couldn't possibly affect the path of hurricanes in the south Atlantic.

Especially since it was shut down in 2014.

But however ridiculously illogical some conspiracy theories are -- the Earth is flat, the Moon landings were faked, the Sun is a giant mirror reflecting laser light from an alien spaceship -- there are people who fervently believe them, and will hang onto those beliefs like grim death.  Anyone who disagrees must either be a "sheeple" or else in on the conspiracy themselves for their own nefarious reasons.

If I had to rank the people I least like to argue with, conspiracy theorists would beat out even young-Earth creationists.  They take "I believe this even though there's no evidence" and amplify it to "I believe this because there's no evidence."  After all, super-powerful conspirators wouldn't just leave a bunch of evidence lying around, would they?  Of course not.

So q.e.d., as far as I can tell.

[Image is in the Public Domain]

It turns out, though, that it's more complicated than a simple lack of scientific knowledge.  A paper that came out this week in the journal Personality and Social Psychology Bulletin describes a study led by psychologist Gordon Pennycook of Cornell University, which found that -- even controlling for other factors, like intelligence, analytical thinking skills, and emotional stability -- conspiracy theorists were united by two main characteristics: overconfidence and a mistaken assumption that the majority of people agree with them.

The correlation was striking.  Asked whether their conspiratorial beliefs were shared by a majority of Americans, True Believers said "yes" 93% of the time (the actual average for the conspiracies studied was estimated at 12%).  And the overconfidence extended even to tasks unrelated to their particular set of fringe beliefs.  Given an ordinary assessment of logic, knowledge of current events, or mathematical ability, the people who believe conspiracy theories consistently (and drastically) overestimated how well they'd scored.

"The tendency to be overconfident in general may increase the chances that someone falls down the rabbit hole (so to speak) and believes conspiracies," Pennycook said.  "In fact, our results counteract a prevailing narrative about conspiracy theorists: that they know that they hold fringe beliefs and revel in that fact...  Even people who believed very fringe conspiracies, such as that scientists are conspiring to hide the truth about the Earth being flat, thought that their views were in the majority.  Conspiracy believers – particularly overconfident ones – really seem to be miscalibrated in a major way.  Not only are their beliefs on the fringe, but they are very much unaware of how far on the fringe they are."

Which brings up the troubling question of how you counteract this.  My dad used to say, "There's nothing more dangerous than confident ignorance," and there's a lot of truth in that.

So how do you change a belief when it's woven together with the certainty that you're (1) in the right, and (2) in the majority?

It would require a shift not only in seeing the facts and other people more clearly, but in seeing yourself more clearly.  And that, unfortunately, is a tall order.

It reminds me of the pithy words of Robert Burns, which seem like a good place to end:

O, would some power the giftie gi'e us
To see ourselves as others see us;
It would frae many a blunder free us,
An' foolish notion.
****************************************


Saturday, January 28, 2023

The roots of conspiracy

It's all too easy to dismiss conspiracy theorists as just being dumb, and heaven knows I've fallen into that often enough myself.

Part of the problem is that if you know any science, so many conspiracy theories just seem... idiotic.  That 5G cell towers cause COVID.  That eating food heated up in a microwave causes cancer.  As we just saw last week, that Satan's throne is located in Geneva and that's why the physicists at CERN are up to no good.

And sure, there's a measure of ignorance implicit in most conspiracy theories.  To believe that Buffalo Bills player Damar Hamlin's on-field collapse was caused by the COVID vaccine -- as both Charlie Kirk and Tucker Carlson stated -- you have to be profoundly ignorant about how vaccines work.  (This claim led to a rash of people on Twitter who demanded that anything with mRNA in it be officially banned, apparently without realizing that mRNA is in every living cell and is a vital part of your protein-production machinery.  And, therefore, it is not only everywhere in your body, it's present in every meat or vegetable you've ever consumed.)

But simple ignorance by itself doesn't explain it.  After all, we're all ignorant about a lot of stuff; you can't be an expert in everything.  I, for example, know fuck-all about business and economics, which is why it's a subject I never touch here at Skeptophilia (or anywhere else, for that matter).  I'm fully aware of my own lack of knowledge on the topic, and therefore anything I could say about it would have no relevance whatsoever.

Scientists have been trying for years to figure out why some people fall for conspiracies and others don't.  One theory which at least partially explains it is that conspiracy theorists tend to score higher than average in the "dark triad" of personality traits -- narcissism, Machiavellianism, and psychopathy -- but that isn't the whole answer, because there are plenty of people who score high on those assessments who don't espouse crazy ideas.

But now a psychologist at the University of Regina, Gordon Pennycook, thinks he has the right answer.

The defining characteristic of a conspiracy theorist isn't ignorance, narcissism, or sociopathy; it's overconfidence.

Pennycook designed a clever test to suss out people's confidence levels when given little to nothing to go on.  He showed volunteers photographs that were blurred beyond recognition, and asked them to identify what the subject of the photo was.  ("I don't know" wasn't an option; they had to choose.)  Then, afterward, they were asked to estimate the percentage of their guesses they thought they'd gotten right.

That self-assessment correlated beautifully with belief in conspiracy theories.

"Sometimes you're right to be confident," Pennycook said.  "In this case, there was no reason for people to be confident...  This is something that's kind of fundamental.  If you have an actual, underlying, generalized overconfidence, that will impact the way you evaluate things in the world."

The danger, apparently, is not in simple ignorance, but in ignorance coupled with "of course I understand this."  It reminds me of the wonderful study done by Leonid Rozenblit and Frank Keil about a phenomenon called the illusion of explanatory depth -- that many of us have the impression we understand stuff when we actually have no idea.  (Rozenblit and Keil's examples were common things like the mechanisms of a cylinder lock and a flush toilet, how helicopters fly and maneuver, and how a zipper works.)  Most of us could probably venture a guess about those things, but would add, "... I think" or "... but I could be wrong." 

The people predisposed to belief in conspiracy theories, Pennycook says, are the ones who would never think of adding the disclaimer.

That kind of overconfidence, often crossing the line into actual arrogance, seems to be awfully common.  I was just chatting a couple of weeks ago with my athletic trainer about that -- he told me that all too often he runs into people who walk into his gym and proceed to tell him, "Here's what I think I should be doing."  I find that attitude baffling, and so does he.  I said to him, "Dude, I'm hiring you because you are the expert.  Why the hell would I pay you money if I already knew exactly how to get the results I want?"

He said, "No idea.  But you'd be surprised at how often people come in with that attitude."  He shook his head.  "They never last long here."

The open question, of course, is how you inculcate in people a realistic self-assessment of what they do know, and an awareness that there's lots of stuff about which they might not be right.  In other words, a sense of intellectual humility.

To some extent, I think the answer is in somehow getting them to do some actual research (i.e. not just a quick Google search to find Some Guy's Website that confirms what they already believed) -- for example, reading scientific papers and finding out what the actual experts have discovered.  Failing that -- and admittedly, a lot of scientific papers are tough going for non-specialists -- at least reading a damn Wikipedia page on the topic.  Yeah, Wikipedia isn't perfect, but the quality has improved dramatically since it was founded in 2001; if you want a quick overview of (for example) the Big Bang theory, just read the first few paragraphs of the Wikipedia page on the topic, wherein you will very quickly find that it does not mean what the creationists are so fond of saying, that "nothing exploded and made everything."

Speaking of being overconfident on a topic about which they clearly know next to nothing.

In any case, I'll just exhort my readers -- and I'm reminding myself of this as well -- always to keep in mind the phrase "I could be wrong."  And yes, that applies even to your most dearly held beliefs.  It doesn't mean actively doubting everything; I'm not trying to turn you into wishy-washy wafflers or, worse, outright cynics.  But periodically holding our own beliefs up to the cold light of evidence is never a bad thing.

As prominent skeptic (and professional stage magician) Penn Jillette so trenchantly put it: "Don't believe everything you think."

****************************************