Skeptophilia (skep-to-fil-i-a) (n.) - the love of logical thought, skepticism, and thinking critically. Being an exploration of the applications of skeptical thinking to the world at large, with periodic excursions into linguistics, music, politics, cryptozoology, and why people keep seeing the face of Jesus on grilled cheese sandwiches.

Wednesday, March 9, 2022

Should've seen that coming

Self-proclaimed psychics hated James Randi, the venerable debunker of all things paranormal, who died in October 2020 at the honorable age of 92.  On one hand, it's obvious why: he loathed charlatans, especially those who, in plying their trade, rip off the gullible to the tune of thousands of dollars.  But honestly, there's a way in which Randi shouldn't have been so detested by the psychics.  After all, he wasn't saying, "Your claim is false and you're lying"; he was saying, "Show me under controlled conditions that you can do what you say you can do."  Which you'd think is fair enough.  Given how many people out there claim to have paranormal abilities, it seems like at least one or two of them would have made a credible case (especially since the James Randi Educational Foundation was offering a million-dollar prize for the first person who could succeed).

But no.  Not one single person ever met the minimum criteria for scientifically admissible evidence; in fact, very few psychics even took the bait.  A few of them said they wouldn't put themselves in the position of having to demonstrate their abilities where Randi's "atmosphere of suspicion and distrust" would interfere with the psychic resonant energy fields (or whatever), but most of them wisely decided to stay silent on the matter and ignore the challenge completely.

And it worked.  Being a psychic is as lucrative as ever.

[Image licensed under the Creative Commons Gunnshots (Don), Psychic reading, CC BY-SA 2.0]

Of course, since what the psychics do is make predictions, we don't even need Randi's method to check and see if there's anything to their claims; we can merely look back at the yearly predictions, and see what percentage of them were correct -- and if that hit rate exceeds what we'd expect from pure chance.
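That comparison against chance is straightforward to sketch.  Here's a minimal back-of-the-envelope version (the 12-out-of-100 figures are purely hypothetical, not numbers from any actual study): it computes the probability that a psychic would score at least their observed number of hits by pure guessing.

```python
from math import comb

def p_at_least(k, n, p):
    """P(X >= k) for X ~ Binomial(n, p): the chance of k or more hits by pure luck."""
    return sum(comb(n, i) * p**i * (1 - p)**(n - i) for i in range(k, n + 1))

# Hypothetical example: 12 correct calls out of 100 predictions, where
# blind guessing would succeed 10% of the time.  If this probability is
# large, the "psychic" did no better than chance.
print(round(p_at_least(12, 100, 0.10), 3))
```

With numbers like these, the probability of doing at least that well by luck alone is sizable, which is exactly why a hit rate has to clear the chance baseline by a wide margin before it counts as evidence of anything.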

Which is exactly what a group of skeptics in Australia did.  The Great Australian Psychic Prediction Project, which announced its results just last week, analyzed 3,800 predictions made over the past twenty years by 207 self-styled psychics, and put each into one of five categories:
  • Expected (such as Simon Turnbull's prediction in 2000 that "one area that is going to do fantastic stuff is the internet, specifically areas like shopping.")
  • Too vague to call (such as Sarah Yip's statement in October 2020, "Who will win the U.S. election? … the numerology shows that both Mr. Trump and Mr. Biden have a chance of winning the next U.S. presidential election.  It is still up to the people to decide.")
  • Unknown/unverifiable (the smallest category, comprising only a little over two percent of the candidate claims)
  • Correct
  • Flat-out wrong (my favorite of those is Sarah Kulkens's 2007 claim that "Using anti-gravity to lift heavy objects will become a reality instead of a dream.")
The results are interesting, to say the least.  The "flat-out wrong" category amounted to 53% of the total, which doesn't seem too bad until you look only at the claims that were either verifiable and correct, or verifiable and wrong -- at which point the "wrong" category balloons to 83%.
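The jump from 53% to 83% follows directly from the percentages quoted above, and it implies something the report leaves implicit: the verifiably correct predictions amount to only about a tenth of the total.  A quick sketch of the arithmetic (using only the two figures given, so the counts are approximate):

```python
# Back-of-the-envelope check using only the percentages quoted above:
# "flat-out wrong" = 53% of all 3,800 predictions, and
# wrong / (wrong + correct) = 83% among the verifiable, decided claims.

total = 3800
wrong = 0.53 * total        # flat-out wrong predictions
decided = wrong / 0.83      # wrong + correct (verifiable and decided)
correct = decided - wrong   # verifiably correct predictions

print(f"wrong:   {wrong:.0f}")
print(f"correct: {correct:.0f} ({correct / total:.0%} of all predictions)")
```

In other words, roughly eleven percent of everything these psychics predicted turned out to be verifiably right.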

Not a very impressive showing.

This gets even worse when you consider the major world events that every one of the 207 psychics involved in the study missed entirely.  These included:
  • the 9/11 attacks
  • the 2003 burn-up on reentry of the space shuttle Columbia
  • the 2004 earthquake and tsunami in the Indian Ocean that killed over 200,000 people
  • the 2011 Fukushima earthquake, tsunami, and nuclear disaster
  • Notre Dame Cathedral burning down in 2019
  • the outbreak of the COVID-19 pandemic
You'd think that events of this magnitude would have caused at least a small disturbance in The Force, or whatever the hell they claim is happening, but no.  The psychics were as caught off guard as the rest of us.

I'm all for keeping an open mind about things, but at some point you have to conclude that a complete absence of hard evidence means there's nothing there to see.  On one hand, I understand why people want psychic abilities to be real; it gives some kind of plan or pattern to what seems otherwise like a chaos-riddled reality.  But as my grandma used to tell me, "Wishin' don't make it so."  I've never found that the universe is under any obligation to conform to what I'd like to be true.

Or, as science writer and novelist Ann Druyan said, much more eloquently:
[Science] is a never-ending lesson in humility.  The vastness of the universe—and love, the thing that makes the vastness bearable—is out of reach to the arrogant.  This cosmos only fully admits those who listen carefully for the inner voice reminding us to remember we might be wrong.  What’s real must matter more to us than what we wish to believe.  But how do we tell the difference?

I know a way to part the curtains of darkness that prevent us from having a complete experience of nature.  Here it is, the basic rules of the road for science: Test ideas by experiment and observation.  Build on those ideas that pass the test.  Reject the ones that fail.  Follow the evidence wherever it leads.  And question everything, including authority.

**************************************

Wednesday, October 11, 2017

Course correction

I suppose you could say that everything I write here at Skeptophilia has the same overarching theme: how to tell truth from falsehood, how to recognize spurious claims, how to tell if you're being had.  But helping people to do this is an uphill struggle, and just how uphill was highlighted by a meta-analysis published last week in Psychological Science, the journal of the Association for Psychological Science, which had the rather dismal conclusion that we debunkers are kind of fucked no matter what we do.

Of course, being academics, they didn't state it that way.  Here's how the authors phrased it:
This meta-analysis investigated the factors underlying effective messages to counter attitudes and beliefs based on misinformation.  Because misinformation can lead to poor decisions about consequential matters and is persistent and difficult to correct, debunking it is an important scientific and public-policy goal. This meta-analysis revealed large effects for presenting misinformation, debunking, and the persistence of misinformation in the face of debunking.  Persistence was stronger and the debunking effect was weaker when audiences generated reasons in support of the initial misinformation.  A detailed debunking message correlated positively with the debunking effect.  Surprisingly, however, a detailed debunking message also correlated positively with the misinformation-persistence effect.
Put more simply, the authors, Man-pui Sally Chan, Christopher R. Jones, and Kathleen Hall Jamieson of the University of Pennsylvania, and Dolores Albarracín of the University of Illinois at Urbana-Champaign, found that when confronting misinformation, a detailed response generates some degree of correction -- but makes some people double down on their incorrect understanding.

So it's yet another verification of the backfire effect, which makes it a little hard to see how we skeptics are supposed to move forward.  And the problem becomes even worse when people have been taught to distrust sources that could potentially ameliorate the problem; I can't tell you how many times I've seen posts stating that sites like Snopes and FactCheck.org are flawed, hopelessly biased, or themselves have an agenda to pull the wool over people's eyes.

It's like I've said before: once you convince people to doubt the facts, and that everyone is lying, you can convince them of anything.

[image courtesy of photographer John Snape and the Wikimedia Commons]

"The effect of misinformation is very strong," said co-author Dolores Albarracín.  "When you present it, people buy it.  But we also asked whether we are able to correct for misinformation.  Generally, some degree of correction is possible but it’s very difficult to completely correct."

The authors weren't completely doom-and-gloom, however, and made three specific recommendations for people dedicated to skepticism and the truth.  These are:
  • Reduce arguments that support misinformation: the media needs to be more careful about inadvertently repeating or otherwise giving unwarranted credence to the misinformation itself.
  • Engage audiences in scrutiny and counterarguing of information: schools, especially, should promote skepticism and critical thinking.  It is beneficial to have the audience involved in generating counterarguments -- further supporting the general idea of "teach people how to think, not what to think."
  • Introduce new information as part of the debunking message: give evidence and details.  Even though "misinformation persistence" is strong even in the face of detailed debunking, there was a positive correlation between detailed information and correction of misapprehension.  So: don't let the backfire effect stop you from fighting misinformation.
It may be an uphill battle, but it does work, and is certainly better than the alternative, which is giving up.  As Albarracín put it: "What is successful is eliciting ways for the audience to counterargue and think of reasons why the initial information was incorrect."

I think the most frustrating part of all this for me is that there are biased media sources.  Lots of them.  Some of them (so-called "clickbait") post bullshit to drive up ad revenue; others are so ridiculously slanted that anything they publish should be independently verified every single time.  And because people tend to gravitate toward media that agree with what they already believed, sticking with sources that conform to your own biases makes it unlikely that you'll ever see where you're wrong (confirmation bias), and allows you to persist in that error because you're surrounded by people who are saying the same thing (the echo-chamber effect).

And that one, I don't know how to address.  It'd be nice if the fringe media would act more responsibly -- but we all know that's not going to happen any time soon.  So I'll just end with an exhortation for you to broaden the media you do read -- if you're conservative, check out the arguments on MSNBC every once in a while (and give them serious thought; don't just read, scoff, and turn away).  Same if you're a liberal; hit Fox News on occasion.  It may not change your mind, but at least it'll make it more likely that you'll discover the holes in your own thinking.

Wednesday, May 31, 2017

The fact of the matter

A couple of days ago I made the mistake of participating in that most fruitless of endeavors: an online argument with a total stranger.

It started when a friend of mine posted the question of whether the following quote was really in Hillary Clinton's book, It Takes a Village:


It isn't, of course, and a quick search was enough to turn up the page on Snopes that debunks the claim.  I posted the link, and my friend responded with a quick thanks and a comment that she was glad to have the straight scoop so that she wasn't perpetuating a falsehood.  And that should have been that.

And it would have been if some guy hadn't commented, "Don't trust Snopes!!!"  A little voice in the back of my head said, "Don't take the bait...", but a much louder one said, "Oh, for fuck's sake."  So I responded, "Come on.  Snopes is one of the most accurate fact-checking sites around.  It's been cross-checked by independent non-partisan analysts, and it's pretty close to 100% correct."

The guy responded, "No, it's not!"

You'd think at this point I'd have figured out that I was talking to someone who learned his debate skills in Monty Python's Argument Clinic, but I am nothing if not persistent.  I found the analysis I had referred to in my previous comment, and posted a clip from a summary of it on the site Skeptical Science:
Jan Harold Brunvand, a folklorist who has written a number of books on urban legends and modern folklore, considered the site so comprehensive in 2004 as to obviate launching one of his own.[10] 
David Mikkelson, the creator of the site, has said that the site receives more complaints of liberal bias than conservative bias,[23] but insists that the same debunking standards are applied to all political urban legends.  In 2012, FactCheck.org reviewed a sample of Snopes' responses to political rumors regarding George W. Bush, Sarah Palin, and Barack Obama, and found them to be free from bias in all cases.  FactCheck noted that Barbara Mikkelson was a Canadian citizen (and thus unable to vote in US elections) and David Mikkelson was an independent who was once registered as a Republican.  "You'd be hard-pressed to find two more apolitical people," David Mikkelson told them.[23][24]  In 2012, The Florida Times-Union reported that About.com's urban legends researcher found a "consistent effort to provide even-handed analyses" and that Snopes' cited sources and numerous reputable analyses of its content confirm its accuracy.[25]
And he responded, "I disagree with you, but I respect your right to your opinion."

At that point, I gave up.

But I kept thinking about the exchange, particularly his use of the word "opinion."  It's an odd way to define the term, isn't it?  It's an opinion that I think single-malt scotch tastes good with dark chocolate.  It's an opinion that I detest the song "Stayin' Alive."

But whether Snopes is accurate or not is not an opinion.  It is either true, or it is not.  It's a little like the "flat Earth" thing.  If you believe, despite the overwhelming evidence, that the Earth is anything but an oblate spheroid, that is not "your opinion."

You are simply "wrong."

Now, I hasten to add that I don't think all of my own beliefs are necessarily correct.  After all, I haven't cross-checked Snopes myself, so I'm relying on the expertise of Brunvand et al. and trusting that they did their job correctly.  To the best of my knowledge, Snopes is accurate; and if anyone wants me to think otherwise, they need to do more than say "No, it isn't" every time I open my mouth.

But to call something like that an "opinion" implies that we all have our own sets of facts, even though many of them contradict each other, with the result that we all do what writer Kathryn Schulz calls "walking around in our little bubbles of being right about everything."  It's a little frightening how deep this mindset goes -- up to and including Donald Trump's shrieking "Fake news!" every time he hears something about him or his administration that he doesn't like.

I can understand wanting reality to be a different way than it is.  Hell, I'd rather teach Defense Against the Dark Arts at Hogwarts than biology in a public high school.  But wishin' don't make it so, as my grandma used to say, and once you grow up you need to face facts and admit it when you're wrong.  And, most importantly, recognize that the evidence won't always line up with your desires.  As John Adams put it, "Facts are stubborn things; and whatever may be our wishes, our inclinations, or the dictates of our passion, they cannot alter the state of facts and evidence."