Skeptophilia (skep-to-fil-i-a) (n.) - the love of logical thought, skepticism, and thinking critically. Being an exploration of the applications of skeptical thinking to the world at large, with periodic excursions into linguistics, music, politics, cryptozoology, and why people keep seeing the face of Jesus on grilled cheese sandwiches.

Saturday, April 10, 2021

Bullshitometry

Having spent 32 years as a high school teacher, I developed a pretty sensitive bullshit detector.

It was a necessary skill.  Kids who have not taken the time to understand the topic being studied are notorious for bullshitting answers on essay questions, often padding their writing with vague but sciency-sounding words.  An example is the following, which is verbatim (near as I can recall) from an essay on how photosynthesis is, and is not, the reverse of aerobic cellular respiration:
From analyzing photosynthesis and the process of aerobic cellular respiration, you can see that certain features are reversed between the two reactions and certain things are not.  Aerobic respiration has the Krebs Cycle and photosynthesis has the Calvin Cycle, which are also opposites in some senses and not in others.  Therefore, the steps are not the same.  So if you ran them in reverse, those would not be the same, either.
I returned this essay with one comment: "What does this even mean?"  The student in question at least had the gumption to admit he'd gotten caught.  He grinned sheepishly and said, "You figured out that I had no idea what I was talking about, then?"  I said, "Yup."  He said, "Guess I better study next time."

I said, "Yup."

Developing a sensitive nose for bullshit isn't critical only for teachers; there's a lot of it out there, and not just in academic circles.  Writer Scott Berkun addressed this in his wonderful piece, "How to Detect Bullshit," which gives some concrete suggestions about how to figure out what is USDA grade-A prime beef, and what is the cow's other, less pleasant output.  One of the best is simply to ask the questions, "How do you know that?", "Who else has this opinion?", and "What is the counter-argument?"

You say your research will revolutionize the field?

Says who?  Based on what evidence?

He also says to be very careful whenever anyone says, "Studies show," because usually if studies did show what the writer claims, (s)he'd be specific about what those studies were.  Vague statements like "studies show" are often a red flag that the claim doesn't have much in its favor.

Remember Donald Trump's "People are telling me" and "I've heard from reliable sources" and "A person came up to me at my last rally and said"?

Those mean, "I just now pulled this claim out of my ass."

Using ten-dollar buzzwords is also a good way to cover up the fact that you're sailing close to the wind.  Berkun recommends asking, "Can you explain this in simpler terms?"  If the speaker can't give you a good idea of what (s)he's talking about without resorting to jargon, the fancy verbiage is fairly likely to be there to mislead.

This is the idea behind BlaBlaMeter, a website I discovered a while back, into which you can cut-and-paste text and get a score (from 0 to 1.0) for how much bullshit it contains.  I'm not sure what the algorithm does besides detecting vague filler words, but it's a clever idea.  It'd certainly be nice to have a rigorous way to detect when you're being bamboozled with words.
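BlaBlaMeter's actual algorithm isn't public, but the general idea -- scoring text by its density of vague filler words -- is easy to sketch.  Here's a minimal toy version; the filler-word list and the 25% saturation threshold are my own invented assumptions, not anything from the real site:

```python
import re

# Toy BlaBlaMeter-style scorer.  The word list below is a guess for
# illustration; the real site's vocabulary and weighting are unknown.
FILLER = {
    "leverage", "synergy", "paradigm", "holistic", "dynamic", "robust",
    "innovative", "strategic", "proactive", "impactful", "cutting-edge",
    "solutions", "stakeholders", "empower", "transformative",
}

def bullshit_score(text: str) -> float:
    """Return a 0.0-1.0 score: the fraction of words that are vague
    fillers, scaled so that heavy filler use saturates at 1.0."""
    words = re.findall(r"[a-z'-]+", text.lower())
    if not words:
        return 0.0
    hits = sum(1 for w in words if w in FILLER)
    # Assumption: 25% filler words or more counts as maximal bullshit.
    return min(1.0, (hits / len(words)) / 0.25)

plain = "The cat sat on the mat and watched the birds outside."
fluffy = ("We leverage synergy and a holistic, dynamic paradigm to "
          "empower stakeholders with robust, transformative solutions.")
print(f"{bullshit_score(plain):.2f}")   # low score
print(f"{bullshit_score(fluffy):.2f}")  # high score
```

Even something this crude separates the two samples cleanly, which suggests why a filler-word heuristic is a plausible core for the real tool.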



The importance of being able to detect fancy-sounding nonsense was highlighted by the acceptance of a paper for the International Conference on Atomic and Nuclear Physics -- when it turned out that the paper had been created by hitting iOS Autocomplete over and over.  The paper, written (sort of) by Christoph Bartneck, associate professor at the Human Interface Technology laboratory at the University of Canterbury in New Zealand, was titled "Atomic Energy Will Have Been Made Available to a Single Source" (the title was also generated by autocomplete), and contained passages such as:
The atoms of a better universe will have the right for the same as you are the way we shall have to be a great place for a great time to enjoy the day you are a wonderful person to your great time to take the fun and take a great time and enjoy the great day you will be a wonderful time for your parents and kids.
Which, of course, makes no sense at all.  In this case, I wonder if the reviewers simply didn't bother to read the paper -- or read a few sample sentences and found that they (unlike the above) made reasonable sense, and said, "Looks fine to me."

Although I'd like to think that even considering my lack of expert status on atomic and nuclear physics, I'd have figured out that what I was looking at was ridiculous.
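The reason autocomplete prose reads as locally fluent but globally meaningless can be illustrated with a toy bigram chain.  This is emphatically not how a phone keyboard's (proprietary, far more sophisticated) model works, and the training corpus below is invented -- but the core mechanism is the same: each word is chosen only by looking at the word immediately before it.

```python
from collections import Counter, defaultdict

def train_bigrams(text):
    """Count, for each word, which words follow it in the training text."""
    words = text.lower().split()
    model = defaultdict(Counter)
    for prev, nxt in zip(words, words[1:]):
        model[prev][nxt] += 1
    return model

def autocomplete(model, start, length=12):
    """Mimic repeatedly tapping the first suggestion: always take the
    single most likely next word."""
    out = [start]
    for _ in range(length - 1):
        followers = model.get(out[-1])
        if not followers:
            break
        out.append(followers.most_common(1)[0][0])
    return " ".join(out)

corpus = ("you are a wonderful person to have a great time and "
          "enjoy the day you will have a great day")
model = train_bigrams(corpus)
print(autocomplete(model, "you"))
```

Every adjacent word pair in the output is grammatical (it occurred in the training text), yet the sentence as a whole goes nowhere -- exactly the texture of the "Atomic Energy" paper's prose.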

On a more serious note, there's a far more pressing reason we all need to arm ourselves against bullshit: so much of what's on the internet is outright false.  A team of political fact-checkers hired by BuzzFeed News sifted through claims on politically partisan Facebook pages and found that, on average, a third of the claims made by partisan sites were outright false.  And lest you think one side was better than the other, the study found that both right and left were making a great many unsubstantiated, misleading, or wrong claims.  And we're not talking about fringe-y wingnut sites here; these are sites whose reposts you see daily if you're on Facebook -- Occupy Democrats, Breitbart, AlterNet, Fox News, The Blaze, The Other 98%, NewsMax, Addicting Info, Right Wing News, and U.S. Uncut.

What this means is that when you see posts from these sites, there is (overall) about a 2/3 chance that what you're seeing is true.  So if you frequent those pages -- or, more importantly, if you're in the habit of clicking "share" on every story that you find mildly appealing -- you damn well better be able to figure out which third is wrong.

The upshot of it is, we all need better bullshit filters.  Given that we are bombarded daily by hundreds of claims from the well-substantiated to the outrageous, it behooves us to find a way to determine which is which.

And, if you're curious, a 275-word passage from this Skeptophilia post was rated by BlaBlaMeter as having a bullshit rating of 0.13.  Which I find reassuring.  Not bad, considering the topic I was discussing.

**************************************

This week's Skeptophilia book-of-the-week is a bit of a departure from the usual science fare: podcaster and author Rose Eveleth's amazing Flash Forward: An Illustrated Guide to the Possible (and Not-So-Possible) Tomorrows.

Eveleth looks at what might happen if twelve things currently in the realm of science fiction became real -- a pill that obviates the need for sleep, for example, or a robot that can make art.  She then extrapolates from each, looking at how it might change our world and considering the ramifications (good and bad) of our suddenly having access to science or technology we currently only dream about.

Eveleth's book is highly entertaining not only for its content, but because it's in graphic novel format -- a number of extremely talented artists, including Matt Lubchansky, Sophie Goldstein, Ben Passmore, and Julia Gförer, illustrate her twelve new worlds, literally drawing what we might be facing in the future.  Her conclusions, and the artists' illustrations of them, are brilliant, funny, shocking, and most of all, memorable.

I love her visions even if I'm not sure I'd want to live in some of them.  The book certainly brings home the old adage of "Be careful what you wish for, you may get it."  But as long as they're in the realm of speculative fiction, they're great fun... especially in the hands of Eveleth and her wonderful illustrators.

[Note: if you purchase this book from the image/link below, part of the proceeds goes to support Skeptophilia!]



Wednesday, March 10, 2021

Shooting the bull

There's a folk truism that goes, "Don't try to bullshit a bullshitter."

The implication is that people who exaggerate and/or lie routinely, either to get away with things or to create an overblown image of themselves, know the technique so well that they can always spot it in others.  This makes bullshitting a doubly attractive game; not only does it make you slick, impressing the gullible and allowing you to avoid responsibility, it makes you savvy and less likely to be suckered yourself.

Well, a study published this week in The British Journal of Social Psychology, conducted by Shane Littrell, Evan Risko, and Jonathan Fugelsang, has shown that like many folk truisms, this isn't true at all.

In fact, the research supports the opposite conclusion.  At least one variety of regular bullshitting leads to more likelihood of falling for bullshit from others.

[Image licensed under the Creative Commons Inkscape by Anynobody, composing work: Mabdul ., Bullshit, CC BY-SA 3.0]

The researchers identified two main kinds of bullshitting, persuasive and evasive.  Persuasive bullshitters exaggerate or embellish their own accomplishments to impress others or fit in with their social group; evasive ones dance around the truth to avoid damaging their own reputations or the reputations of their friends.

Because of the positive shine bullshitting has for many people, the researchers figured most people who engage in either type wouldn't be shy about admitting it, so they used self-reporting to assess the bullshit levels and styles of the eight hundred participants.  They then gave each a more formal measure of cognitive ability, metacognitive insight, intellectual overconfidence, and reflective thinking, followed by a series of pseudo-profound and pseudoscientific statements mixed in with genuinely profound and truthful statements, to see if they could tell them apart.

The surprising result was that people who self-reported as persuasive bullshitters were significantly worse at detecting pseudo-profundity than the habitually honest; the evasive bullshitters were better than average.

"We found that the more frequently someone engages in persuasive bullshitting, the more likely they are to be duped by various types of misleading information regardless of their cognitive ability, engagement in reflective thinking, or metacognitive skills," said study lead author Shane Littrell, of the University of Waterloo.  "Persuasive BSers seem to mistake superficial profoundness for actual profoundness.  So, if something simply sounds profound, truthful, or accurate to them that means it really is.  But evasive bullshitters were much better at making this distinction."

Which supports a contention I've had for years: if you lie for long enough, you eventually lose touch with what the truth is.  The interesting fact that persuasive and evasive bullshitting aren't alike in this respect might be because evasive bullshitters engage in the behavior out of a heightened sensitivity to people's opinions, both of themselves and of others.  That sensitivity would make them more aware of what others are saying and doing, and better at sussing out people's real motives -- and whether they're being truthful or not.  But persuasive bullshitters are so self-focused that they aren't paying much attention to what others say, so any subtleties that might clue them in to the fact that they're being bullshitted slip right by.

I don't know whether this is encouraging or not.  I'm not sure if the fact that it's easier to lie successfully to a liar is a point to celebrate by those of us who care about the truth.  But it does illustrate the fact that our common sense about our own behavior sometimes isn't very accurate.  As usual, approaching questions from a skeptical scientific angle is the best.

After all, no form of bullshit can withstand that.

****************************************

Last week's Skeptophilia book-of-the-week was about the ethical issues raised by gene modification; this week's is about the person who made CRISPR technology possible -- Nobel laureate Jennifer Doudna.

In The Code Breaker: Jennifer Doudna, Gene Editing, and the Future of the Human Race, author Walter Isaacson describes the discovery of how the bacterial enzyme complex called CRISPR-Cas9 can be used to edit genes of other species with pinpoint precision.  Doudna herself has been fascinated with scientific inquiry in general, and genetics in particular, since her father gave her a copy of The Double Helix and she was caught up in what Richard Feynman called "the joy of finding things out."  The story of how she and fellow laureate Emmanuelle Charpentier developed the technique that promises to revolutionize our ability to treat genetic disorders is a fascinating exploration of the drive to understand -- and a cautionary note about the responsibility of scientists to do their utmost to make certain their research is used ethically and responsibly.

If you like biographies, are interested in genetics, or both, check out The Code Breaker, and find out how far we've come into the science-fiction world of curing genetic disease, altering DNA, and creating "designer children," and keep in mind that whatever happens, this is only the beginning.

[Note: if you purchase this book using the image/link below, part of the proceeds goes to support Skeptophilia!]



Thursday, August 16, 2018

Selflessness, sociality, and bullshit

About two and a half years ago, a team of researchers released a landmark paper entitled "On the Reception and Detection of Pseudo-Profound Bullshit."  The gist of the paper, which I wrote about in Skeptophilia, is that people with lower cognitive ability and vocabulary are more prone to getting taken in by meaningless intellecto-babble -- statements that sound profound but actually don't mean anything.

This is the sort of thing that Deepak Chopra has become famous for, which led to the creation of the Random Deepak Chopra Quote Generator.  (My latest visit to it produced "The unexplainable embraces new life, and our consciousness constructs an expression of balance."  I've read some real Chopra, and I can say with certainty that it's damned hard to tell the difference.)
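Generators like that one are almost embarrassingly simple to build.  Here's a toy imitation -- the real site reportedly draws its vocabulary from Chopra's own tweets, whereas these word lists are entirely my own invention for illustration:

```python
import random

# Invented buzz-phrase lists; any resemblance to actual profundity
# is coincidental.
SUBJECTS = ["the unexplainable", "your consciousness", "perception",
            "the universe", "intuition", "hidden meaning"]
VERBS = ["embraces", "transcends", "constructs", "unfolds as",
         "is the continuity of", "gives rise to"]
OBJECTS = ["new life", "quantum possibilities", "an expression of balance",
           "infinite creativity", "a symphony of belonging"]

def pseudo_profound(rng=random):
    """Glue random buzz-phrases into a grammatical but meaningless sentence."""
    s = f"{rng.choice(SUBJECTS)} {rng.choice(VERBS)} {rng.choice(OBJECTS)}"
    return s[0].upper() + s[1:] + "."

print(pseudo_profound())
```

The output is always syntactically well-formed -- subject, verb, object -- which is precisely what makes pseudo-profundity hard to spot: the grammar machinery checks out, and only the semantics are empty.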

Now, a followup paper, authored by Arvid Erlandsson, Artur Nilsson, Gustav Tinghög, and Daniel Västfjäll of Linköping University (Sweden) -- interestingly, none of whom were involved in the earlier study -- has shown that not only does poor bullshit detection correlate with low cognition (which is hardly surprising), it also correlates with high selfishness and low capacity for compassion.

In "Bullshit-Sensitivity Predicts Prosocial Behavior," which came out in the online journal PLoS One two weeks ago, the authors write:
Although bullshit-sensitivity has been linked to other individual difference measures, it has not yet been shown to predict any actual behavior.  We therefore conducted a survey study with over a thousand participants from a general sample of the Swedish population and assessed participants’ bullshit-receptivity (i.e. their perceived meaningfulness of seven bullshit sentences) and profoundness-receptivity (i.e. their perceived meaningfulness of seven genuinely profound sentences), and used these variables to predict two types of prosocial behavior (self-reported donations and a decision to volunteer for charity)...  [L]ogistic regression analyses showed that... bullshit-receptivity had a negative association with both types of prosocial behavior.  These relations held up for the most part when controlling for potentially intermediating factors such as cognitive ability, time spent completing the survey, sex, age, level of education, and religiosity.  The results suggest that people who are better at distinguishing the pseudo-profound from the actually profound are more prosocial.
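The logistic regression the authors mention can be sketched with synthetic data.  To be clear, this is a minimal toy, not the paper's actual model or data: I simulate a standardized bullshit-receptivity score with a built-in negative effect on volunteering, then recover that effect by fitting a logistic regression with plain gradient ascent.

```python
import math
import random

random.seed(42)

def simulate(n=1000, true_beta=-1.5):
    """Synthetic participants: higher receptivity lowers the log-odds
    of volunteering (mirroring the paper's reported direction only)."""
    data = []
    for _ in range(n):
        bsr = random.gauss(0.0, 1.0)  # standardized receptivity score
        p = 1 / (1 + math.exp(-(0.5 + true_beta * bsr)))
        data.append((bsr, 1 if random.random() < p else 0))
    return data

def fit_logistic(data, lr=1.0, epochs=300):
    """Fit intercept and slope by gradient ascent on the log-likelihood."""
    b0 = b1 = 0.0
    n = len(data)
    for _ in range(epochs):
        g0 = g1 = 0.0
        for x, y in data:
            p = 1 / (1 + math.exp(-(b0 + b1 * x)))
            g0 += (y - p)
            g1 += (y - p) * x
        b0 += lr * g0 / n
        b1 += lr * g1 / n
    return b0, b1

data = simulate()
intercept, slope = fit_logistic(data)
print(f"estimated slope: {slope:.2f}")  # clearly negative
```

A negative fitted slope is what "bullshit-receptivity had a negative association with both types of prosocial behavior" means in model terms: each unit of receptivity multiplies the odds of the prosocial outcome by e^slope, a number less than one.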
"To our knowledge, we are the first study that links reactions to bullshit to an actual behavior rather than to self-reported measures. We also measure prosociality in two different ways, which makes the findings more robust and generalizable," said Arvid Erlandsson, who co-authored the study, in an interview in PsyPost.  "We see this finding as a small but interesting contribution to a fun and quickly emerging field of research rather than something groundbreaking or conclusive.  We are open with the fact that the results were found in exploratory analyses, and we cannot currently say much about the underlying mechanisms...  Future studies could potentially test causality (e.g. see whether courses in critical thinking could make people better at distinguishing the actually profound from the pseudo-profound and whether this also influences their prosociality compared to a control group)."

[Image licensed under the Creative Commons -- This vector image was created with Inkscape by Anynobody, composing work: Mabdul., Bullshit, CC BY-SA 3.0]

As you might expect, I find all of this fascinating.  I'm not sure it's all that encouraging, however; what this implies is that bullshit will tend to sucker stupid mean people (sorry, I'm not a researcher in psychology, and I just can't keep writing "low-cognition, low-prosocial individuals" without rolling my eyes).  And if you add that to the Dunning-Kruger effect -- the well-studied tendency of people with low ability to overestimate how good they are at something -- you've got a perfect storm of unpleasant behavior.

Stupid mean people who think they're better than the rest of us, and who will not only fall for nonsense, but will be unwilling to budge thereafter regardless of the facts.

Sound like some people in red hats we keep seeing on the news?

So my chortles of delight over the Erlandsson et al. paper were tempered by my knowledge that we here in the United States are currently watching the results play out for real, and it's scaring the hell out of a good many of us.

I'm not sure what, if anything, can be done about this.  Promoting critical thinking in schools is a good place to start, but education budgets are being slashed pretty much everywhere, which certainly isn't conducive to adding new and innovative programs.  Other than that, we just have to keep coming back to facts, evidence, and logic, and hope that someone -- anyone -- will listen.

*****************************

I picked this week's Skeptophilia book recommendation because of the devastating, and record-breaking, fires currently sweeping across the American west.  Tim Flannery's The Weather Makers is one of the most cogent arguments I've ever seen for the reality of climate change and what it might ultimately mean for the long-term habitability of planet Earth.  Flannery analyzes all the evidence available, building what would be an airtight case -- if it weren't for the fact that the economic implications have mobilized the corporate world to mount a disinformation campaign that, so far, seems to be working.  It's an eye-opening -- and essential -- read.

[If you purchase the book from Amazon using the image/link below, part of the proceeds goes to supporting Skeptophilia!]






Tuesday, December 8, 2015

Profound bullshit

Considering what I write about six times a week, it's nice to have some validation on occasion.

The topic comes up because of a paper by Gordon Pennycook, James Allan Cheyne, Nathaniel Barr, Derek J. Koehler, and Jonathan A. Fugelsang that just came out in the journal Judgment and Decision Making, and which has the wonderful title, "On the Reception and Detection of Pseudo-Profound Bullshit."

I want all of you to read the original paper, because it's awesome, so I'll try my hardest not to steal their fire.  But you all have to see the first line of the abstract before I go any further:
Although bullshit is common in everyday life and has attracted attention from philosophers, its reception (critical or ingenuous) has not, to our knowledge, been subject to empirical investigation.
Just reading that made me want to weep tears of joy.

I have spent so many years fighting the mushy, sort-of-scientificky-or-something verbiage of the purveyors of woo-woo that to see the topic receive attention in a peer-reviewed journal did my poor jaded little heart good.  Especially when I found out that the gist of the paper was that if you take someone who is especially skilled at generating bullshit -- like say, oh, Deepak Chopra, for example  -- and compare his actual writings to phrases like those generated by the Random Deepak Chopra Quote Generator, test subjects couldn't tell them apart.

More specifically, people who ranked high on what Pennycook et al. have christened the "Bullshit Receptivity Scale" (BSR) tended to rate everything as profound, whether or not it made the least bit of sense:
The present study represents an initial investigation of the individual differences in receptivity to pseudo-profound bullshit.  We gave people syntactically coherent sentences that consisted of random vague buzzwords and, across four studies, these statements were judged to be at least somewhat profound.  This tendency was also evident when we presented participants with similar real-world examples of pseudo-profound bullshit.  Most importantly, we have provided evidence that individuals vary in conceptually interpretable ways in their propensity to ascribe profundity to bullshit statements; a tendency we refer to as “bullshit receptivity”.  Those more receptive to bullshit are less reflective, lower in cognitive ability (i.e., verbal and fluid intelligence, numeracy), are more prone to ontological confusions and conspiratorial ideation, are more likely to hold religious and paranormal beliefs, and are more likely to endorse complementary and alternative medicine.
That... just... leaves me kind of choked up.

No, it's okay.  I'll be all right in a moment.  *sniffle*

[image courtesy of the Wikimedia Commons]

Then, there's this passage from the conclusion:
This is a valuable first step toward gaining a better understanding of the psychology of bullshit.  The development of interventions and strategies that help individuals guard against bullshit is an important additional goal that requires considerable attention from cognitive and social psychologists.  That people vary in their receptivity toward bullshit is perhaps less surprising than the fact that psychological scientists have heretofore neglected this issue.  Accordingly, although this manuscript may not be truly profound, it is indeed meaningful.
I don't think I've been this happy about a scholarly paper since I was a graduate student in linguistics and found the paper by John McCarthy in Language called "Prosodic Structure and Expletive Infixation," which explained why you can say "abso-fuckin-lutely" but not "ab-fuckin-solutely."

The paper by Pennycook et al. has filled a void, in that it makes a point that has needed making for years -- that it's not only important to consider what makes someone a bullshitter, but what makes someone an, um, bullshittee.  Because people fall for platitude-spewing gurus like Chopra in droves, as evidenced by the fact that he's still giving talks to sold-out crowds, and making money hand over fist from selling books filled with lines like "The key to the essence of joy co-creates the expansion of creativity."

Which, by the way, was from the Random Deepak Chopra Quote Generator.  Not, apparently, that anyone can tell the difference.

And it brings me back to the fact that what we really, truly need in public schools is a mandatory course in critical thinking.  Because learning some basic principles of logic is the way you can immunize yourself against this sort of thing.  It may, in fact, be the only way.

Anyhow, I direct you all to the paper linked above.  The Pennycook et al. one, I mean.  Although the paper by John McCarthy is also pretty fan-fuckin-tastic.