Skeptophilia (skep-to-fil-i-a) (n.) - the love of logical thought, skepticism, and thinking critically. Being an exploration of the applications of skeptical thinking to the world at large, with periodic excursions into linguistics, music, politics, cryptozoology, and why people keep seeing the face of Jesus on grilled cheese sandwiches.

Saturday, January 28, 2023

The roots of conspiracy

It's all too easy to dismiss conspiracy theorists as just being dumb, and heaven knows I've fallen into that often enough myself.

Part of the problem is that if you know any science, so many conspiracy theories just seem... idiotic.  That 5G cell towers cause COVID.  That eating food heated up in a microwave causes cancer.  As we just saw last week, that Satan's throne is located in Geneva and that's why the physicists at CERN are up to no good.

And sure, there's a measure of ignorance implicit in most conspiracy theories.  To believe that Buffalo Bills player Damar Hamlin's on-field collapse was caused by the COVID vaccine -- as both Charlie Kirk and Tucker Carlson stated -- you have to be profoundly ignorant about how vaccines work.  (This claim led to a rash of people on Twitter who demanded that anything with mRNA in it be officially banned, apparently without realizing that mRNA is in every living cell and is a vital part of your protein-production machinery.  And, therefore, it is not only everywhere in your body, it's present in every meat or vegetable you've ever consumed.)

But simple ignorance by itself doesn't explain it.  After all, we're all ignorant about a lot of stuff; you can't be an expert in everything.  I, for example, know fuck-all about business and economics, which is why it's a subject I never touch here at Skeptophilia (or anywhere else, for that matter).  I'm fully aware of my own lack of knowledge on the topic, and therefore anything I could say about it would have no relevance whatsoever.

Scientists have been trying for years to figure out why some people fall for conspiracies and others don't.  One theory that at least partially explains it is that conspiracy theorists tend to score higher than average on the "dark triad" of personality traits -- narcissism, Machiavellianism, and psychopathy -- but that isn't the whole answer, because there are plenty of people who score high on those assessments who don't espouse crazy ideas.

But now a psychologist at the University of Regina, Gordon Pennycook, thinks he has the right answer.

The defining characteristic of a conspiracy theorist isn't ignorance, narcissism, or psychopathy; it's overconfidence.

Pennycook designed a clever test to suss out people's confidence levels when given little to nothing to go on.  He showed volunteers photographs that were blurred beyond recognition, and asked them to identify what the subject of the photo was.  ("I don't know" wasn't an option; they had to choose.)  Then, afterward, they were asked to estimate the percentage of their guesses they thought they'd gotten right.

That self-assessment correlated beautifully with belief in conspiracy theories.
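
If you want a concrete feel for what that correlation measures, here's a minimal toy simulation in Python.  To be clear, everything in it -- the numbers, the distributions, the variable names -- is invented for illustration, not taken from the study's actual design or data.  The point is just that the interesting quantity is the gap between how well people think they did and how well they actually did:

import numpy as np

# Toy simulation of an overconfidence measure like the one described above.
# All parameters and distributions here are made up for illustration.
rng = np.random.default_rng(42)
n_participants = 500
n_photos = 20

# Latent overconfidence trait, one value per participant.
overconfidence = rng.normal(0.0, 1.0, n_participants)

# The photos are blurred beyond recognition, so actual accuracy hovers
# near chance for everyone (say, four answer choices -> about 25%).
actual_accuracy = rng.binomial(n_photos, 0.25, n_participants) / n_photos

# Self-estimated accuracy = actual performance plus a bias that grows
# with the latent trait (clipped to the valid 0..1 range).
estimated_accuracy = np.clip(
    actual_accuracy + 0.15 * overconfidence + rng.normal(0, 0.05, n_participants),
    0.0, 1.0,
)

# A conspiracy-belief score that, in this toy world, partly tracks the
# same latent trait.
belief_score = 0.6 * overconfidence + rng.normal(0, 1.0, n_participants)

# The key quantity: how much people overestimate their own performance.
calibration_gap = estimated_accuracy - actual_accuracy

r = np.corrcoef(calibration_gap, belief_score)[0, 1]
print(f"correlation between overconfidence gap and belief score: r = {r:.2f}")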

"Sometimes you're right to be confident," Pennycook said.  "In this case, there was no reason for people to be confident...  This is something that's kind of fundamental.  If you have an actual, underlying, generalized overconfidence, that will impact the way you evaluate things in the world."

The danger, apparently, is not in simple ignorance, but in ignorance coupled with "of course I understand this."  It reminds me of the wonderful study done by Leonid Rozenblit and Frank Keil about a phenomenon called the illusion of explanatory depth -- that many of us have the impression we understand stuff when we actually have no idea.  (Rozenblit and Keil's examples were common things like the mechanisms of a cylinder lock and a flush toilet, how helicopters fly and maneuver, and how a zipper works.)  Most of us could probably venture a guess about those things, but would add, "... I think" or "... but I could be wrong." 

The people predisposed to belief in conspiracy theories, Pennycook says, are the ones who would never think of adding the disclaimer.

That kind of overconfidence, often crossing the line into actual arrogance, seems to be awfully common.  I was just chatting a couple of weeks ago with my athletic trainer about that -- he told me that all too often he runs into people who walk into his gym and proceed to tell him, "Here's what I think I should be doing."  I find that attitude baffling, and so does he.  I said to him, "Dude, I'm hiring you because you are the expert.  Why the hell would I pay you money if I already knew exactly how to get the results I want?"

He said, "No idea.  But you'd be surprised at how often people come in with that attitude."  He shook his head.  "They never last long here."

The open question, of course, is how you inculcate in people a realistic self-assessment of what they do know, and an awareness that there's lots of stuff about which they might not be right.  In other words, a sense of intellectual humility.  To some extent, I think the answer is in somehow getting them to do some actual research (i.e., not just a quick Google search to find Some Guy's Website that confirms what they already believed) -- reading scientific papers, finding out what the actual experts have discovered.  Failing that -- and admittedly, a lot of scientific papers are tough going for non-specialists -- at least reading a damn Wikipedia page on the topic.  Yeah, Wikipedia isn't perfect, but the quality has improved dramatically since it was founded in 2001; if you want a quick overview of (for example) the Big Bang theory, just read the first few paragraphs of the Wikipedia page, wherein you will very quickly find that it does not mean what the creationists are so fond of saying, that "nothing exploded and made everything."

Speaking of being overconfident on a topic about which they clearly know next to nothing.

In any case, I'll just exhort my readers -- and I'm reminding myself of this as well -- always to keep in mind the phrase "I could be wrong."  And yes, that applies even to your most dearly held beliefs.  It doesn't mean actively doubting everything; I'm not trying to turn you into wishy-washy wafflers or, worse, outright cynics.  But periodically holding our own beliefs up to the cold light of evidence is never a bad thing.

As prominent skeptic (and professional stage magician) Penn Jillette so trenchantly put it: "Don't believe everything you think."

****************************************


Tuesday, December 8, 2015

Profound bullshit

Considering what I write about six times a week, it's nice to have some validation on occasion.

The topic comes up because of a paper by Gordon Pennycook, James Allan Cheyne, Nathaniel Barr, Derek J. Koehler, and Jonathan A. Fugelsang that just came out in the journal Judgment and Decision Making, and which has the wonderful title, "On the Reception and Detection of Pseudo-Profound Bullshit."

I want all of you to read the original paper, because it's awesome, so I'll try my hardest not to steal their thunder.  But you all have to see the first line of the abstract before I go any further:
Although bullshit is common in everyday life and has attracted attention from philosophers, its reception (critical or ingenuous) has not, to our knowledge, been subject to empirical investigation.
Just reading that made me want to weep tears of joy.

I have spent so many years fighting the mushy, sort-of-scientificky-or-something verbiage of the purveyors of woo-woo that to see the topic receive attention in a peer-reviewed journal did my poor jaded little heart good.  Especially when I found out that the gist of the paper was that if you take someone who is especially skilled at generating bullshit -- like, say, Deepak Chopra -- and compare his actual writings to phrases like those generated by the Random Deepak Chopra Quote Generator, test subjects couldn't tell them apart.

More specifically, people who ranked high on what Pennycook et al. have christened the "Bullshit Receptivity Scale" (BSR) tended to rate everything as profound, whether or not it made the least bit of sense:
The present study represents an initial investigation of the individual differences in receptivity to pseudo-profound bullshit.  We gave people syntactically coherent sentences that consisted of random vague buzzwords and, across four studies, these statements were judged to be at least somewhat profound.  This tendency was also evident when we presented participants with similar real-world examples of pseudo-profound bullshit.  Most importantly, we have provided evidence that individuals vary in conceptually interpretable ways in their propensity to ascribe profundity to bullshit statements; a tendency we refer to as “bullshit receptivity”.  Those more receptive to bullshit are less reflective, lower in cognitive ability (i.e., verbal and fluid intelligence, numeracy), are more prone to ontological confusions and conspiratorial ideation, are more likely to hold religious and paranormal beliefs, and are more likely to endorse complementary and alternative medicine.
That... just... leaves me kind of choked up.

No, it's okay.  I'll be all right in a moment.  *sniffle*

[image courtesy of the Wikimedia Commons]

Then, there's this passage from the conclusion:
This is a valuable first step toward gaining a better understanding of the psychology of bullshit.  The development of interventions and strategies that help individuals guard against bullshit is an important additional goal that requires considerable attention from cognitive and social psychologists.  That people vary in their receptivity toward bullshit is perhaps less surprising than the fact that psychological scientists have heretofore neglected this issue.  Accordingly, although this manuscript may not be truly profound, it is indeed meaningful.
I don't think I've been this happy about a scholarly paper since I was a graduate student in linguistics and found the paper by John McCarthy in Language called "Prosodic Structure and Expletive Infixation," which explained why you can say "abso-fuckin-lutely" but not "ab-fuckin-solutely."

The paper by Pennycook et al. has filled a void, in that it makes a point that has needed making for years -- that it's not only important to consider what makes someone a bullshitter, but what makes someone an, um, bullshittee.  Because people fall for platitude-spewing gurus like Chopra in droves, as evidenced by the fact that he's still giving talks to sold-out crowds, and making money hand over fist from selling books filled with lines like "The key to the essence of joy co-creates the expansion of creativity."

Which, by the way, was from the Random Deepak Chopra Quote Generator.  Not, apparently, that anyone can tell the difference.
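
If you're wondering how little machinery it takes to produce this stuff, here's a toy generator in Python.  The template and word lists below are my own invention, not the ones any actual online generator uses; but gluing random vague buzzwords into a grammatical sentence really is the whole trick:

import random

# A toy pseudo-profound-bullshit generator, in the spirit of the random
# quote generators mentioned above.  The vocabulary here is invented for
# illustration; real generators presumably build theirs from the source
# author's actual writings.
subjects = ["The key to the essence of joy", "Hidden meaning", "Your consciousness",
            "Perceptual reality", "The unexplainable", "Inner silence"]
verbs = ["co-creates", "transcends", "is entangled with", "gives rise to",
         "unfolds into", "illuminates"]
objects = ["the expansion of creativity", "quantum possibilities",
           "infinite self-knowledge", "a symphony of potentiality",
           "universal awareness", "the flow of intention"]

def profound_bullshit() -> str:
    """Glue random vague buzzword phrases into one syntactically coherent sentence."""
    return f"{random.choice(subjects)} {random.choice(verbs)} {random.choice(objects)}."

if __name__ == "__main__":
    for _ in range(5):
        print(profound_bullshit())

Run it a few times and you'll have an endless supply of statements that, per Pennycook et al., a depressing fraction of readers will rate as at least somewhat profound.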

And it brings me back to the fact that what we really, truly need in public schools is a mandatory course in critical thinking.  Because learning some basic principles of logic is the way you can immunize yourself against this sort of thing.  It may, in fact, be the only way.

Anyhow, I direct you all to the paper linked above.  The Pennycook et al. one, I mean.  Although the paper by John McCarthy is also pretty fan-fuckin-tastic.