Skeptophilia (skep-to-fil-i-a) (n.) - the love of logical thought, skepticism, and thinking critically. Being an exploration of the applications of skeptical thinking to the world at large, with periodic excursions into linguistics, music, politics, cryptozoology, and why people keep seeing the face of Jesus on grilled cheese sandwiches.
Showing posts with label backfire effect.

Tuesday, August 5, 2025

Backfire

One of the many things that baffles me about my fellow humans is how hard it is for people to say, "Well, I guess I was wrong, then."

I mean, I'm sure I've got as many idées fixes as the next guy.  There are parts of my worldview I'm pretty attached to, and models for how the universe works that I would be absolutely astonished to find out were incorrect.  But I'd like to think that if I were to be presented with hard evidence, I'd have no choice but to shrug aside my astonishment and say it.

"Well, I guess I was wrong, then."

This attitude, however, seems to be in the minority.  Many more people will hang onto their preconceived notions like grim death, sometimes even denying evidence that is right in front of their eyes.  I distinctly recall one student who, despite being a young-earth creationist, elected to take my AP Biology class -- a class I was up front about teaching from an evolutionary perspective.  She was quite friendly and not at all antagonistic, and one time I asked her what her basis was for rejecting the evolutionary model.  Did she doubt the evidence?  Did it strike her as an illogical stance?  Did the whole thing simply not make sense to her?

No, she assured me -- she knew the evidence was real (and overwhelming); the whole argument was impeccably logical and made good sense to her.

She simply didn't believe it.  Despite all of the science she knew (and excelled at; she damn near aced my class, which was no mean feat), she simply knew that the Bible was literally true.

I didn't question further -- my aim, after all, was understanding, not conversion -- but I left the conversation feeling nothing but puzzlement.  My only conclusion was that she defined the word knowledge very differently than I did.

To take a less emotionally-charged example, let me tell you a story about a man named John Murray Spear.

Spear was born in Boston, Massachusetts in 1804, and from a young age attended the Universalist Church, eventually studying theology and being ordained.  He became minister of the congregation in Barnstable, and using his position fought for a bunch of righteous causes -- women's rights, labor reform, the abolition of slavery, and the elimination of the death penalty.

Spear, though, had another set of interests that were a little... odder.

He was a thoroughgoing Spiritualist, believing not only in the afterlife -- after all, that belief is shared by most Christian sects -- but that the spirits of the dead could and did communicate with the living.  He delved into the writings of Emanuel Swedenborg and Franz Mesmer, both of whom had attempted to give a scientific basis for spirit survival (and for the related belief in a soul as a substance or energy independent of the body).  Spear's obsession eventually brought him into conflict with the Universalist Church leaders, and in the end he followed his heart, giving up his ministerial position and breaking all ties with the church.

In 1852 he wrote a tract in which he claimed to be in contact with a group called the "Association of Electrizers," which included not only Spear's namesake, the Universalist minister John Murray, but Thomas Jefferson, John Quincy Adams, Benjamin Franklin, and Benjamin Rush.

You have probably already figured out that all of these men were dead at the time.

[Image is in the Public Domain]

This didn't stop Spear.  He produced documents with texts from Murray et al., and which included their signatures.  Asked by skeptics how ghosts could sign their names, Spear claimed that okay, he'd held the pen, but the ghosts had guided his hand.  The texts contained information on how to combine technology and Spiritualism to create a source of energy that would elevate humanity to new levels, so he set about building a machine in a shed on a hill in Lynn, Massachusetts that he claimed would release a "New Motive Power" using a "messianic perpetual motion machine."

Whatever the fuck that means.

So Spear and a few followers got to work building their machine out of copper wire, zinc plates, magnets, and one (1) dining room table.  After months of effort, Spear and an unnamed woman he called "the New Mary" held a ceremony where they "ritualistically birthed" the machine in an attempt to give it life.  Then they turned it on.

Nothing happened.

After a couple more abortive attempts to get it going, Spear's Spiritualist friends got fed up, destroyed the machine, and told Spear he could go to hell.

This is the point where you'd think anyone would have said that magic phrase -- "Well, I guess I was wrong, then."  Not Spear.  Spear became even more determined.  He seemed to follow that famous example of a faulty logical chain, "Many geniuses were laughed at in their time.  I'm being laughed at, so I must be a genius."  He kept at it for another two decades, never achieving success, which you no doubt could have predicted by my use of the phrase "perpetual motion machine."  It was only in 1872 that he said he'd received a message from the Association of Electrizers telling him it was time to retire.

But until his death in Philadelphia in 1887, he handed out business cards to all and sundry saying, "Guided and assisted by beneficent Spirit-Intelligences, Mr. S. will examine and prescribe for disease of body and mind, will delineate the character of persons when present, or by letter, and indicate their future as impressions are given him; will sketch the special capacities of young persons...  Applications to lecture, or hold conversations on Spiritualism, will be welcomed."

On the one hand, you have to admire Spear's tenacity.  On the other... well, how much evidence do you need?  Surely on some level he was aware that he was making it all up, right?  He doesn't seem to have simply been mentally ill; his writings on other topics show tremendous lucidity.

But he had an idea that he wanted to be true so badly that he just couldn't resign himself to its falsehood.

I have to wonder, though, if there might be a strain of that in all of us.  How would I react if I learned something that completely overturned my understanding?  Would I really shift ground as easily as I claim, or would I cling tenaciously to my preconceived notions?  I wonder how big a catastrophe in my thinking it would take to make me rebel, and like my long-ago student, say, "Okay, I see it, I understand it, but I don't believe it"?

It's easy for me to chuckle at Spear with his Association of Electrizers and New Motive Forces and messianic perpetual motion machines, but honestly, it's because I already didn't believe in any of that stuff.  Maybe I'm as locked into my worldview as he was.  As journalist Kathryn Schulz put it, "Okay, we all know we're fallible, but in a purely theoretical sense.  Try to think of one thing, right now, that you're wrong about.  You can't, can you?"

The facile response is, "Of course not, because if I knew I was wrong, I would change my mind," but I think that misses the point.  We all are trapped in our own conceptual frameworks, and fight like mad when anything threatens them.  The result is that most of us can be presented with arguments showing us that we're wrong, and we still don't change our minds.  Sometimes, in fact, being challenged makes us hang on even harder.  It's so common that psychologists have invented a name for the phenomenon -- the backfire effect.

Perhaps Spear is not that much of an aberration after all.  And how much of this desperate clinging to being right is at the heart of the political morass we currently find ourselves in here in the United States?

Once again, how much evidence do you need?

So those are my rather depressing thoughts for the day.  A nineteenth-century Spiritualist, and an attitude that is still all too common today.  At least, for all Spear's unscientific claptrap, he still found time to support some important causes, which is more than I can say for the modern crop of evidence-deniers.

****************************************


Tuesday, December 19, 2023

Apocalypse ongoing

A while back, I wrote about the strange and disheartening research by Leon Festinger, Henry Riecken, and Stanley Schachter, the upshot of which is that frequently when there is powerful evidence against a deeply-held belief, the result is that the belief gets stronger.

It's called the backfire effect.  The Festinger et al. study looked at a cult that centered around a belief that the world was going to end on a very specific date.  When the Big Day arrived, the cult members assembled at the leader's house to await the end.  Many were in severe emotional distress.  At 11:30 P.M., the leader -- perhaps sensing things weren't going the way he thought they would -- secluded himself to pray.  And at five minutes till midnight, he came out of his room with the amazing news that because of their faith and piety, God told him he'd decided to spare the world after all.

The astonishing part is that the followers didn't do what I would have done, which is to tell the leader, "You are either a liar or a complete loon, and I am done with you."  They became even more devoted to him.  Because, after all, without him instructing them to keep the vigil, God would have destroyed the world, right?

Of course right.

The peculiar fact-resistance a lot of people have can reach amazing lengths, as I found out when a loyal reader of Skeptophilia sent me a link a couple of days ago having to do with the fact that people are still blathering on about the 2012 Mayan Apocalypse.  Remember that?  Supposedly the Mayan Long Count Calendar indicated that one of their long time-cycles (b'ak'tuns) was going to end on December 21, 2012, and because of that there was going to be absolute chaos.  Some people thought it would be the literal end of the world; the more hopeful types thought it would be some kind of renewal or Celestial Ascension that would mark the beginning of a new spiritual regime filled with peace, love, and harmony.

The problem was -- well, amongst the many problems was -- the fact that if you talked to actual Mayan scholars, they told you that the interpretation of the Long Count Calendar was dependent not only on translations of uncertain accuracy, but also on an alignment of that calendar with our own that could have been off in either direction by as much as fifty years.  Plus, there was no truth to the claim that the passage into the next b'ak'tun was anything more than a benchmark, same as going from December 31 to January 1.

Mostly what I remember about the Mayan Apocalypse is that evening, my wife and I threw an End-of-the-World-themed costume party.


Although the party was a smashing success, what ended up happening apocalypse-wise was... nothing.  December 22, 2012 dawned, and everyone just kept loping along as usual.  There were no asteroid impacts, nuclear wars, or alien invasions, and the giant tsunami that crested over the Himalayas in the catastrophically bad movie 2012 never showed up.

Which is a shame, because I have to admit that was pretty cool-looking.

So -- huge wind-up, with thousands of people weighing in, and then bupkis.  What's an apocalyptoid to do, in the face of that?

Well, according to the article my friend sent -- their response has been sort of along the lines of Senator George Aiken's solution to the Vietnam War: "Declare victory and go home."  Apparently there is a slice of true believers who think that the answer to the apocalypse not happening back in 2012 is that...

... the apocalypse did too happen.

I find this kind of puzzling.  I mean, if the world ended, you'd think someone would have noticed.  But that, they say, is part of how we know it actually happened.  Otherwise, why would we all be so oblivious?

The parallels to Festinger et al. are a little alarming.

The mechanisms of how all this worked are, unsurprisingly, a little sketchy.  Some think we dropped past the event horizon of a black hole and are now in a separate universe from the one we inhabited pre-2012.  Others think we got folded into a Matrix-style simulation, which supposedly explains the Mandela effect.  A common theme is that it has something to do with the discovery by CERN of the Higgs boson, which also happened in 2012 and therefore can't be a coincidence.

Some say it's significant that ever since then, time seems to be moving faster, so we're hurtling ever more quickly toward... something.  They're a little fuzzy on this part.  My question, though, is if time did speed up, how could we tell?  The only way you'd notice is if time in one place sped up by comparison to time in a different place, which is not what they're claiming.  They say that time everywhere is getting faster, to which I ask: getting faster relative to what, exactly?

In any case, the whole thing makes me want to take Ockham's Razor and slit my wrists with it.

So that's our dive in the deep end for the day.  No need to worry about the world ending, because it already did.  The good news is that we seem to be doing okay despite that, if you discount the possibility that we could be inside a black hole.

Me, I'm not going to fret about it.  I've had enough on my mind lately.  Besides, if the apocalypse happened eleven years ago, there's nothing more to be apprehensive about, right?

Of course right.

****************************************



Tuesday, May 18, 2021

Tweets and backfires

Let me ask you a hypothetical question.

You're over on Twitter, and you post a link making a political claim of some sort.  Shortly thereafter, you get responses demonstrating that the claim your link made is completely false.  Would you...

  1. ... delete the tweet, apologize, and be more careful about what you post in the future?
  2. ... shrug, say "Meh, whatever," and continue posting at the same frequency/with the same degree of care?
  3. ... flip off the computer and afterward be more likely to post inflammatory and/or false claims?

I know this sounds like a setup, and it is, but seriously: why wouldn't everyone select answer #1?  As I discussed in a post just a few days ago, we all make mistakes, and we all hate the feeling of finding out we're in error.  So given that most animal species learn to avoid choices that lead to experiencing pain, why is the answer actually more commonly #3?


I'm not just making up a wild claim myself in order to have a topic to blog about.  The fact that most people increase their rate of promulgating disinformation after they've been caught at it is the subject of a paper that was presented last week at the CHI Conference on Human Factors in Computing Systems called, "Perverse Downstream Consequences of Debunking: Being Corrected by Another User for Posting False Political News Increases Subsequent Sharing of Low Quality, Partisan, and Toxic Content in a Twitter Field Experiment."  The title could pretty much function as the abstract; in an analysis of two thousand Twitter users who post political tweets, the researchers looked at the likelihood of posting false information after having errors pointed out online, and found, amazingly enough, a positive correlation.

"We find causal evidence that being corrected decreases the quality, and increases the partisan slant and language toxicity, of the users’ subsequent retweets," the authors write.  "This suggests that being publicly corrected by another user shifts one's attention away from accuracy -- presenting an important challenge for social correction approaches."

"Challenge" isn't the right word; it's more like "tendency that's so frustrating it makes anyone sensible want to punch a wall."  The researchers, Mohsen Mosleh (of the University of Exeter) and Cameron Martel, Dean Eckles, and David Rand (of the Massachusetts Institute of Technology), have identified the twenty-first century iteration of the backfire effect -- a well-studied phenomenon showing that being proven wrong makes you double down on whatever your claim was.  But here, it apparently makes you not only double down on that claim, but on every other unfounded opinion you have.

In what universe does being proven wrong make you more confident?

I swear, sometimes I don't understand human psychology at all.  Yeah, I guess you could explain it by saying that someone who has a dearly-held belief questioned is more motivated in subsequent behavior by the insecurity they're experiencing than by any commitment to the truth, but it still makes no sense to me.  The times I've been caught out in an error, either here at Skeptophilia or elsewhere, were profoundly humbling and (on occasion) outright humiliating, and the result was (1) I apologized for my error, and (2) I was a hell of a lot more careful what I posted thereafter.

What I didn't do was to say "damn the torpedoes, full speed ahead."

This does pose a quandary.  Faced with a false claim on social media, do we contradict it?  I don't have the energy to go after every piece of fake news I see; I usually limit myself to posts that are explicitly racist, sexist, or homophobic, because I can't in good conscience let that kind of bullshit go unchallenged.  But what if the outcome is said racist, sexist, or homophobe being more likely to post such claims in the future?

Not exactly the result I'm looking for, right there.

So that's our discouraging piece of research for today.  I honestly don't know what to do about a tendency that is so fundamentally irrational.  Despite all of our science and technology, a lot of our behavior still seems to be caveman-level.  "Ogg say bad thing about me.  Me bash Ogg with big rock."

***********************************

Too many people think of chemistry as being arcane and difficult formulas and laws and symbols, and lose sight of the amazing reality it describes.  My younger son, who is the master glassblower for the chemistry department at the University of Houston, was telling me about what he's learned about the chemistry of glass -- why it's transparent, why different formulations have different properties, what causes glass to have the colors it does, or no color at all -- and I was astonished at not only the complexity, but how incredibly cool it is.

The world is filled with such coolness, and it's kind of sad how little we usually notice it.  Colors and shapes and patterns abound, and while some of them are still mysterious, there are others that can be explained in terms of the behavior of the constituent atoms and molecules.  This is the topic of the phenomenal new book The Beauty of Chemistry: Art, Wonder, and Science by Philip Ball and photographers Wenting Zhu and Yan Liang, which looks at the chemistry of the familiar, and illustrates the science with photographs of astonishing beauty.

Whether you're an aficionado of science or simply someone who is curious about the world around you, The Beauty of Chemistry is a book you will find fascinating.  You'll learn a bit about the chemistry of everything from snowflakes to champagne -- and be entranced by the sheer beauty of the ordinary.

[Note: if you purchase this book from the image/link below, part of the proceeds goes to support Skeptophilia!]


Tuesday, August 20, 2019

It's the end of the world, if you notice

I have commented more than once about my incredulity with regards to end-of-the-world predictions.  Despite the fact that to date, they have had a 100% failure rate, people of various stripes (usually of either the ultra-religious persuasion or the woo-woo conspiracy one) continue to say that not only is the world doomed, they know exactly when, how, and why.  (If you don't believe me, take a look at the Wikipedia page for apocalyptic predictions, which have occurred so often they had to break it down by century.)

As far as why this occurs -- why repeated failure doesn't make the true believers say, "Well, I guess that claim was a bunch of bullshit, then" -- there are a variety of reasons.  One is a sort of specialized version of the backfire effect, which occurs when evidence against a claim you believe strongly leaves you believing it even more strongly.  Way back in 1954 psychologists Leon Festinger, Henry Riecken, and Stanley Schachter infiltrated a doomsday cult, and in fact Festinger was with the cult on the day they'd claimed the world was going to end.  When 11:30 PM rolled around and nothing much was happening, the leader of the cult went into seclusion.  A little after midnight she returned with the joyous news that the cult's devotion and prayers had averted the disaster, and god had decided to spare the world, solely because of their fidelity.

Hallelujah!  We better keep praying, then!

(Nota bene: The whole incident, and the analysis of the phenomenon by Festinger et al., is the subject of the fascinating book When Prophecy Fails.)

Despite this, the repeated failure of an apocalyptic prophecy can cause your followers to lose faith eventually, as evangelical preacher Harold Camping found out.  So the people who believe this stuff often have to engage in some fancy footwork after the appointed day and hour arrive, and nothing happens other than the usual nonsense.

Take, for example, the much-publicized "Mayan apocalypse" on December 21, 2012 that allegedly was predicted by ancient Mayan texts (it wasn't) and was going to herald worldwide natural disasters (it didn't).  The True Believers mostly retreated in disarray when December 22 dawned, as well they should have.  My wife and I threw a "Welcoming In The Apocalypse" costume party on the evening of December 21, and I have to admit to some disappointment when the hour of midnight struck and we were all still there.  But it turns out that not all of the Mayan apocalyptoids disappeared after the prediction failed; one of them, one Nick Hinton, says actually the end of the world did happen, as advertised...

... but no one noticed.

Hinton's argument, such as it is, starts with a bit of puzzling over why you never hear people talking about the 2012 apocalypse any more.  (Apparently "it didn't happen" isn't a sufficient reason.)  Hinton finds this highly peculiar, and points out that this was the year CERN fired up the Large Hadron Collider and discovered the Higgs boson, and that this can't possibly be a coincidence.  He wonders if this event destroyed the universe and/or created a black hole, and then "sucked us in" without our being aware of it.

[Image licensed under the Creative Commons Lucas Taylor / CERN, CMS Higgs-event, CC BY-SA 3.0]

Me, I think I'd notice if I got sucked into a black hole.  They're kind of violent places, as I described yesterday in my post about Sagittarius A*.  But Hinton isn't nearly done with his explanation.  He writes:
There's the old cliché argument that "nothing has felt right" since 2012.  I agree with this... [E]ver since then the world seems to descend more and more into chaos each day.  Time even feels faster.  There's some sort of calamity happening almost daily.  Mass shootings only stay in the headlines for like 12 hours now.  Did we all die and go to Hell?...  Like I've said, I think we live in a series of simulations.  Perhaps the universe was destroyed by CERN and our collective consciousness was moved into a parallel universe next door.  It would be *almost* identical.
Of course, this is a brilliant opportunity to bring out the Mandela effect, about which I've written before.  The idea of the Mandela effect is that people remember various stuff differently (such as whether Nelson Mandela died in prison, whether it's "Looney Tunes" or "Loony Tunes" and "The Berenstein Bears" or "The Berenstain Bears," and so forth), and the reason for this is not that people's memories in general suck, but that there are alternate universes where these different versions occur and people slip back and forth between them.

All of which makes me want to take Ockham's Razor and slit my wrists with it.

What I find intriguing about Hinton's explanation is not all the stuff about CERN, though, but his arguing that the prediction didn't fail because he was wrong, but that the world ended and six-billion-plus people didn't even notice.  Having written here at Skeptophilia for almost nine years, I'm under no illusions about the general intelligence level of humanity, but for fuck's sake, we're not that unobservant.  And even if somehow CERN did create an alternate universe, why would it affect almost nothing except for things like the spelling of Saturday morning cartoon titles?

So this is taking the backfire effect and raising it to the level of performance art.  This is saying that it is more likely that the entire population of the Earth was unaware of a universe-ending catastrophe than it is that you're wrong.

Which is so hubristic that it's kind of impressive.

But I better wind this up, because I've got to prepare myself for the next end of the world, which (according to the late psychic Jeane Dixon) was going to occur in January of 2020.  Which only gives me a few months to get ready.  So many apocalypses, so little time.

*****************************

This week's Skeptophilia book recommendation is a must-read for anyone interested in astronomy -- Finding Our Place in the Universe by French astrophysicist Hélène Courtois.  Courtois gives us a thrilling tour of the universe on the largest scales, particularly Laniakea, the galactic supercluster to which the Milky Way belongs, and the vast and completely empty void between Laniakea and the next supercluster.  (These voids are so empty that if the Earth were at the middle of one, there would be no astronomical objects near enough or bright enough to see without a powerful telescope, and the night sky would be completely dark.)

Courtois's book is eye-opening and engaging, and (as it was just published this year) brings the reader up to date with the latest information from astronomy.  And it will give you new appreciation when you look up at night -- and realize how little of the universe you're actually seeing.

[Note: if you purchase this book from the image/link below, part of the proceeds goes to support Skeptophilia!]






Thursday, August 15, 2019

Doubling down on error

Is it just me, or is the defining hallmark of discourse these days a steadfast refusal to admit when you're wrong?

Surprisingly enough I'm not referring here to Donald Trump, who has raised a casual disdain for the truth to near-mythic proportions.  What's even more astonishing, though, is his followers' determination to believe everything he says, even when it contradicts what he just said.  Trump could say, "The sky is green!  It is also purple-and-orange plaid!  And I didn't say either of those things!  Also, I am not here!" and his devotees would just nod and smile and comment on what an honest and godly man he is and how great America is now that we've been abandoned by all our allies and the national debt is a record 22 trillion dollars.

In this case, though, I'm referring to two Republican policy wonks who apparently wouldn't believe climate change was happening if the entire continent spontaneously burst into flame.  The first was Matt Schlapp, head of the American Conservative Union, who was pissed off by Bernie Sanders publicly calling Trump an idiot for not accepting climate change, and responded in a tweet, "They can’t even predict if it will rain on tues but we are certain about the weather 12 yrs from now."

This is such an egregious straw man that it's almost a work of art.  In 21 words, we find the following:
  • Weather ≠ climate.  For fuck's sake.  We've been through this how many times before?
  • Meteorologists are, actually, quite good at predicting when and where it will rain.  Weather is a complex affair, so they don't always get it right, but if the evening weather report says your annual family picnic tomorrow is going to get a drenching, you should probably pay attention.
  • Knowing the climatic trends tells you exactly nothing about "the weather twelve years from now."  Cf. my earlier comment about how weather ≠ climate.
  • Predictions and trends don't imply certainty.  Ever.  But if 99% of working climatologists believe that anthropogenic climate change is happening, and that it's going to have drastic negative effects not only on the environment but ourselves, I'm gonna listen to them rather than to a guy whose main occupation seems to be sneering at people he disagrees with.
Then there was writer and pontificator Dinesh d'Souza, who posted a video of kangaroos hopping about in the snow with the caption, "Global warming comes to Australia.  Unless you want to believe your lying eyes!"

Unsurprisingly, within minutes d'Souza was excoriated by hundreds of people letting him know that (1) the Earth is spherical, implying that (2) there are these things called "hemispheres," which (3) cause the seasons, and (4) since Australia is in the opposite one from North America, they're experiencing winter right now.  Also, he was informed more than once that the largest mountain range in Australia is named "the Snowy Mountains," and that it got that name for a reason analogous to why the Rocky Mountains, being composed largely of rocks, got theirs.

A grove of native trees in New South Wales, Australia.  They're called "snow gums."  Guess why?  [Image licensed under the Creative Commons Thennicke, Snow gums, Dead Horse Gap NSW Australia, CC BY-SA 4.0]

What gets me about this is not that two laypeople made a mistake about science.  That is gonna happen because (let's face it) science can be hard.  What I find astonishing is that when confronted with multitudes of fact-based objections, neither man said, "Wow, that sure was a dumb statement!  What a goober I am."  Both of them took the strategy of "Death Before Backing Down," and I can nearly guarantee that this incident will not change their minds one iota, and that (given the opportunity) they will make equally idiotic statements next time.

Look, I'm not claiming I'm infallible.  Far from it.  But what I will say is that if I'm wrong, I'll admit it -- and if it's in print (as here at Skeptophilia) I'll post a correction or retraction, or (if the error was egregious enough) delete the post entirely.  I've done so more than once over the nine years I've had this blog, and although admitting you're mistaken is never pleasant, it's absolutely critical to honest... everything.

But that seems to be a lost art lately.  The attitude these days is, "If someone proves you're wrong, keep saying the same thing, only be more strident."  Evidently truth these days isn't about who has the stronger evidence, but who yells the loudest.  It's no wonder the American citizenry is, as a whole, so misinformed, especially on scientific matters -- in science the touchstone is not volume but factual support.

And that seems to be the last thing any of these people are looking at.

***********************************

This week's Skeptophilia book recommendation is sheer brilliance -- Jenny Lawson's autobiographical Let's Pretend This Never Happened.  It's an account of her struggles with depression and anxiety, and far from being a downer, it's one of the funniest books I've ever read.  Lawson -- best known for her brilliant blog The Bloggess -- has a brutally honest, rather frenetic style of writing, and her book is sometimes poignant and often hilarious.  She draws a clear picture of what it's like to live with crippling social anxiety, an illness that has landed Lawson (as a professional author) in some pretty awkward situations.  She looks at her own difficulties (and those of her long-suffering husband) through the lens of humor, and you'll come away with a better understanding of those of us who deal day-to-day with mental illness, and also with a bellyache from laughing.

[Note: If you purchase this book using the image/link below, part of the proceeds goes to support Skeptophilia!]





Saturday, March 30, 2019

The outrage machine

I had a puzzling, and frustrating, exchange on social media a couple of days ago.

An acquaintance of mine posted the following, without comment:


I found this annoying enough that I responded something to the effect that I teach in a public school, where (1) I have seen students saying grace before eating lunch and no one has batted an eyelash, (2) we recite the Pledge of Allegiance every single day, as do public school students across the United States, and (3) before December break, I hear both students and staff wishing each other "Merry Christmas."

Now me, if I made a statement and a person who should know demonstrated conclusively I was wrong, I would admit it and retreat in disarray.  Predictably, that's not what happened here.  She responded -- and this is a direct quote -- "I was not saying this was necessarily true, it just feels that like the majority of our schools have gotten so far away from God and Country that one can feel persecuted for the above things."

So let me get this straight.  It's not true, but you feel like it's true, and you're feeling persecuted for being a patriotic Christian despite the fact that 75% of Americans identify as Christian and it's still damn near a prerequisite for winning an election in the United States.

What this implies is that your feelings about something are more important than whether it's actually true.  Which I find more than a little troubling.  Even more troubling is the way the media have capitalized on this tendency, avoiding facts wherever possible in favor of whipping people into a frenzy with emotional appeals.  Lately, it seems like the main church people belong to in the United States is the Church of Our Lady of Perpetual Outrage.

The part of this I find the most baffling is how people seem to want to feel outraged.  I mean, think about it.  Suppose I was fearful that there was a rabies outbreak in my neighborhood.  Maybe there was even the allegation that someone had deliberately introduced rabies into wild animals nearby.  Then, it turns out that it's not true -- there are no cases of rabies, and the rumors of a deliberately-created epidemic are false, and everything is safe.  The whole thing was a hoax.

I don't know about you, but I wouldn't feel more fearful, I'd feel relieved.  I'd probably be angry at the person who started the hoax, but what it wouldn't do is make me double down on how dangerous everything was and how everyone needed to be scared of the horrifying rabies epidemic.

Here, though, my assurance that what this person feared -- that public schools were actively suppressing patriotism and Christianity -- was false had exactly the opposite effect.  "Okay, it's not true," she seemed to be saying, "but we still need to act like it is!"  And the people who are perpetuating the falsehoods aren't looked upon as liars or hoaxers, they're seen as heroic mavericks who are rallying the troops for a desperate last stand defending all that's sacred.

Which I don't understand at all.  I, personally, don't like liars, and I hate feeling outraged.  I much prefer it when my fellow humans turn out to be more kind and caring and tolerant and understanding than I thought they were.  It's hard for me to understand someone who apparently feels the opposite.

All of which highlights the fact that I don't really understand people at all.  Especially, I don't get the appeal of tribalism, but it's clearly a powerful force -- and it relies as much on teaming up against a common enemy (even a perceived one) as it does on finding commonalities within the group.  So all in all, my online exchange was a fruitless exercise, as these things so often are -- but it does explain a lot about the current state of things in the United States.

**************************************

I've been a bit of a geology buff since I was a kid.  My dad was a skilled lapidary artist, and made beautiful jewelry from agates, jaspers, and turquoise, so every summer he and I would go on a two-week trip to southern Arizona to find cool rocks.  It was truly the high point of my year, and ever since I have always given rock outcroppings and road cuts more than just the typical passing glance.

So I absolutely loved John McPhee's four-part look at the geology of the United States -- Basin and Range, Rising From the Plains, In Suspect Terrain, and Assembling California.  Told in his signature lucid style, McPhee doesn't just geek out over the science, but gets to know the people involved -- the scientists, the researchers, the miners, the oil-well drillers -- who are vitally interested in how North America was put together.  In the process, you're taken on a cross-country trip to learn about what's underneath the surface of our country.  And if, like me, you're curious about rocks, it will keep you reading until the last page.

Note: the link below is to the first in the series, Basin and Range.  If you want to purchase it, click on the link, and part of the proceeds will go to support Skeptophilia.  And if you like it, you'll no doubt easily find the others!





Wednesday, January 10, 2018

Reversing the backfire

I suspect a lot of us have been pining for some good news lately.

Between Kim Jong-Un bragging about his capacity for unleashing nuclear destruction on anyone who insults Dear Leader, Donald Trump alternately bragging about the size of his genitals and saying that anyone who calls him childish is a stinky stupid poopy-face, and various natural disasters and human-made conflicts, it's all too easy to decide that the only acceptable response is to curl up into a fetal position and whimper softly.

So I was kind of tickled to run into a post made a couple of days ago by Dr. Steve Novella over at the wonderful blog NeuroLogica, which discusses a study that has found the backfire effect isn't significant -- at least under controlled conditions.

[image courtesy of the Wikimedia Commons]

The paper, "The Elusive Backfire Effect: Mass Attitudes' Steadfast Factual Adherence" by Thomas Wood of Ohio State University and Ethan Porter of George Washington University, appeared over at the Social Science Research Network on January 2.  It suggests that the backfire effect -- the tendency of people to double down on erroneous beliefs when presented with factual evidence they're wrong -- might not be as pervasive as we'd thought.

The authors write:
Can citizens heed factual information, even when such information challenges their partisan and ideological attachments?  The “backfire effect,” described by Nyhan and Reifler (2010), says no: rather than simply ignoring factual information, presenting respondents with facts can compound their ignorance.  In their study, conservatives presented with factual information about the absence of Weapons of Mass Destruction in Iraq became more convinced that such weapons had been found.  The present paper presents results from five experiments in which we enrolled more than 10,100 subjects and tested 52 issues of potential backfire.  Across all experiments, we found no corrections capable of triggering backfire, despite testing precisely the kinds of polarized issues where backfire should be expected.  Evidence of factual backfire is far more tenuous than prior research suggests.  By and large, citizens heed factual information, even when such information challenges their ideological commitments.
The encouraging thing about this is that it suggests ignorance is curable.  The initial studies -- which indicated that strong, but incorrect, beliefs were damn near unfixable -- were nothing short of crushing.  When I first read the research by Nyhan and Reifler, my initial reaction was, "Why the hell am I bothering with this blog, then?"  (Not, I hasten to add, that I think I'm always right, or something; but since this blog's overarching theme is sussing out the truth by evaluating the evidence skeptically and dispassionately, the backfire effect kind of blows a giant hole in its efficacy.)

This recent research, however, gives me a ray of hope.  Novella, in his piece at NeuroLogica, says it with his typical eloquence:
If we passively go with the flow of our identity, we will tend to cocoon ourselves in a comfortable echochamber that will bathe us only in facts that have been curated for maximal ideological ease.  This feedback loop will not only maintain our ideology but polarize it, making us more radical, and less reasonable. 
Ideally, therefore, we should be emotionally aloof to any ideological identity, to any particular narrative or belief system.  Further, we should seek out information based upon how reliable it is, rather than how much it confirms what we already believe or want to believe.  In fact, to correct for this bias we should specifically seek out information that contradicts our current beliefs.
My only caveat about this whole thing is that even if the backfire effect is minor and rare, correcting false beliefs depends on people being exposed to the correct information in the first place.  I was just talking with my wife yesterday about the role Fox News has in insulating Donald Trump's diehard followers from accurate information about what he's saying and doing.  It not only shields them from anti-Trump editorializing and political spin, but also winnows out the actual quotes, video clips, and tweets, giving listeners only the ones that put him in a favorable light.  While the rest of the major networks are buzzing about the disastrous interview with Stephen Miller (who John Fugelsang hilariously called "Gerbil Goebbels") and Steve Bannon's implosion and the Michael Wolff book and Trump's asinine and infantile response to it, Fox News is doing a piece on investigating the Clinton Foundation.

Because that is clearly more relevant than what the President of the United States and his administration are doing.

So if people are being shielded from the facts, they never have the opportunity to self-correct, even if the backfire effect isn't as big a deal as we'd thought.

Anyhow, at least this is a glimmer of encouragement that humanity is potentially salvageable after all.  For which I am very thankful.  For one thing, being in a fetal position on the floor is uncomfortable, and confuses my dog, not that the latter is all that difficult.  For another, I kind of like writing this blog, and it'd be a bummer to throw in the towel.

Wednesday, October 11, 2017

Course correction

I suppose you could say that everything I write here at Skeptophilia has the same overarching theme: how to tell truth from falsehood, how to recognize spurious claims, how to tell if you're being had.  But helping people to do this is an uphill struggle, and just how uphill was highlighted by a meta-analysis published last week in the Journal of the Association for Psychological Science, which had the rather dismal conclusion that we debunkers are kind of fucked no matter what we do.

Of course, being academics, they didn't state it that way.  Here's how the authors phrased it:
This meta-analysis investigated the factors underlying effective messages to counter attitudes and beliefs based on misinformation.  Because misinformation can lead to poor decisions about consequential matters and is persistent and difficult to correct, debunking it is an important scientific and public-policy goal. This meta-analysis revealed large effects for presenting misinformation, debunking, and the persistence of misinformation in the face of debunking.  Persistence was stronger and the debunking effect was weaker when audiences generated reasons in support of the initial misinformation.  A detailed debunking message correlated positively with the debunking effect.  Surprisingly, however, a detailed debunking message also correlated positively with the misinformation-persistence effect.
Put more simply, the authors, Man-pui Sally Chan, Christopher R. Jones, and Kathleen Hall Jamieson of the University of Pennsylvania, and Dolores Albarracín of the University of Illinois at Urbana-Champaign, found that when confronting misinformation, a detailed response generates some degree of correction -- but makes some people double down on their incorrect understanding.

So it's yet another verification of the backfire effect, which makes it a little hard to see how we skeptics are supposed to move forward.  And the problem becomes even worse when people have been taught to distrust sources that could potentially ameliorate the problem; I can't tell you how many times I've seen posts stating that sites like Snopes and FactCheck.org are flawed, hopelessly biased, or themselves have an agenda to pull the wool over people's eyes.

It's like I've said before: once you convince people to doubt the facts, and that everyone is lying, you can convince them of anything.

[image courtesy of photographer John Snape and the Wikimedia Commons]

"The effect of misinformation is very strong," said co-author Dolores Albarracín.  "When you present it, people buy it.  But we also asked whether we are able to correct for misinformation.  Generally, some degree of correction is possible but it’s very difficult to completely correct."

The authors weren't completely doom-and-gloom, however, and made three specific recommendations for people dedicated to skepticism and the truth.  These are:
  • Reduce arguments that support misinformation: the media needs to be more careful about inadvertently repeating or otherwise giving unwarranted credence to the misinformation itself.
  • Engage audiences in scrutiny and counterarguing of information: schools, especially, should promote skepticism and critical thinking.  It is beneficial to have the audience involved in generating counterarguments -- further supporting the general idea of "teach people how to think, not what to think."
  • Introduce new information as part of the debunking message: give evidence and details.  Even though "misinformation persistence" is strong even in the face of detailed debunking, there was a positive correlation between detailed information and correction of misapprehension.  So: don't let the backfire effect stop you from fighting misinformation.
It may be an uphill battle, but it does work, and is certainly better than the alternative, which is giving up.  As Albarracín put it: "What is successful is eliciting ways for the audience to counterargue and think of reasons why the initial information was incorrect."

I think the most frustrating part of all this for me is that there are biased media sources.  Lots of them.  Some of them (so-called "clickbait") post bullshit to drive up ad revenue; others are simply so ridiculously slanted that anything they publish should be independently verified every single time.  And because people tend to gravitate toward media that agree with what they already thought was true, sticking with sources that conform to your own biases makes it unlikely that you'll see where you're wrong (confirmation bias), and allows you to persist in that error because you're surrounding yourself with people who are saying the same thing (the echo-chamber effect).

And that one, I don't know how to address.  It'd be nice if the fringe media would act more responsibly -- but we all know that's not going to happen any time soon.  So I'll just end with an exhortation for you to broaden the media you do read -- if you're conservative, check out the arguments on MSNBC every once in a while (and give them serious thought; don't just read, scoff, and turn away).  Same if you're a liberal; hit Fox News on occasion.  It may not change your mind, but at least it'll make it more likely that you'll discover the holes in your own thinking.

Saturday, September 2, 2017

Political backfires

The good news from yesterday's post, wherein we learned some ways of fighting the backfire effect and convincing people to change their minds, was immediately counterbalanced by a new (and discouraging) study out of Denmark showing that for politicians, the more data they have access to, the worse the backfire effect becomes.

A team at Aarhus University led by Martin Baekgaard was studying motivated reasoning, which is the thought process we engage in when we are presented with information either supporting or refuting our prior beliefs.  In the first part of the experiment, test subjects were given test score data from two schools, A and B, and asked to evaluate which was more successful.  A different set of test subjects was given the same data, but one of the two schools was labeled "Public School A" and the other "Private School B" -- in Denmark, as in the United States, the relative merits of public vs. private schools are a topic of heated debate.

This first bit of research generated results that were unsurprising.  When the two schools were given anonymous tags, the data was evaluated fairly by both people who supported public schools and those who supported private schools.  When they were labeled, however, the backfire effect kicked in, and the test subjects' prior opinions skewed their analysis of the results.

So far, nothing we didn't already know.  But the second part of the experiment not only varied the quantity of data provided, it also compared the results from 1,000 test subjects from a variety of professions with those from 954 career politicians.  And this gave some results that were, to put it mildly, interesting.  Let me give it to you in the authors' own words:
Does evidence help politicians make informed decisions even if it is at odds with their prior beliefs?  And does providing more evidence increase the likelihood that politicians will be enlightened by the information?  Based on the literature on motivated political reasoning and the theory about affective tipping points, this article hypothesizes that politicians tend to reject evidence that contradicts their prior attitudes, but that increasing the amount of evidence will reduce the impact of prior attitudes and strengthen their ability to interpret the information correctly.  These hypotheses are examined using randomized survey experiments with responses from 954 Danish politicians, and results from this sample are compared to responses from similar survey experiments with Danish citizens.  The experimental findings strongly support the hypothesis that politicians are biased by prior attitudes when interpreting information.  However, in contrast to expectations, the findings show that the impact of prior attitudes increases when more evidence is provided.
Yes, you read that right.  Politicians, like other people, are prone to falling into the backfire effect.  But unlike the rest of us, the more data they're given, the worse the backfire effect becomes.  Show a politician additional evidence, and all you're doing is making sure that (s)he stays planted even more firmly.

Baekgaard et al. propose a reason for this result, and I suspect they're correct; most politicians are, by their very nature, partisan, and have been elected because of strongly supporting a particular political agenda.  Since the backfire effect occurs when people double down on their beliefs because of feeling threatened, it stands to reason that politicians -- whose jobs depend on their beliefs being right -- would experience a greater sense of threat when they find they're wrong than the rest of us do.

But that leaves us with the rather alarming result that the people who are directing policy and making decisions for an entire electorate are going to be the ones whose response to the data is worst.

"The Great Presidential Puzzle" by James Albert Wales (1880) [image courtesy of the Wikimedia Commons]

And, of course, this result is borne out by what we see around us.  Here in the United States, it seems like every time new studies are performed and new data generated, the determination of politicians to shout "damn the facts, full speed ahead!" only gets stronger.  Which can explain why any of a number of crazy policies have been implemented, ones that fly in the face of every rational argument there is.

But in the words of Charlie Brown, "Now that I know that, what do I do?"  And my answer is: beats the hell out of me.  As I said in a previous post, I think nothing's going to change until the voters wise up, and that won't happen until we have a more educated citizenry.

And heaven only knows what it'll take for that to come about.

Friday, September 1, 2017

Argue with me

In recent months, I've done several posts that reference the backfire effect -- the tendency of people to double down on their previous beliefs when challenged, even when shown hard evidence that their views are incorrect.  But of course, this brings up the question, if people tend to plant their feet when you offer counterarguments, how do you change someone's mind?

A quartet of researchers at Cornell University, Chenhao Tan, Vlad Niculae, Cristian Danescu-Niculescu-Mizil, and Lillian Lee, have studied this very question, and presented their findings in a paper called, "Winning Arguments: Interaction Dynamics and Persuasion Strategies in Good-faith Online Discussions."  My wife stumbled onto this study a couple of days ago, and knowing this was right down my alley, forwarded it to me.

What the researchers did was to study patterns on r/ChangeMyView, a subreddit where people post opinions and invite argument.  If someone does succeed in changing the original poster's view, the successful arguer is awarded a ∆ (the Greek letter delta, which in science is used to represent change).  By seeing who was awarded deltas, and analyzing their statements, the researchers were able to determine the characteristics of statements that were the most successful, and the ones that were generally unsuccessful.

Argument Irresistible, by Robert Macaire (from the magazine Le Charivari, May 1841) [image courtesy of the Wikimedia Commons]

And the results are a fascinating window into how we form, and hold on to, our opinions.  The authors write:
Changing someone's opinion is arguably one of the most important challenges of social interaction.  The underlying process proves difficult to study: it is hard to know how someone's opinions are formed and whether and how someone's views shift. Fortunately, ChangeMyView, an active community on Reddit, provides a platform where users present their own opinions and reasoning, invite others to contest them, and acknowledge when the ensuing discussions change their original views.  In this work, we study these interactions to understand the mechanisms behind persuasion. 
We find that persuasive arguments are characterized by interesting patterns of interaction dynamics, such as participant entry-order and degree of back-and-forth exchange.  Furthermore, by comparing similar counterarguments to the same opinion, we show that language factors play an essential role.  In particular, the interplay between the language of the opinion holder and that of the counterargument provides highly predictive cues of persuasiveness. Finally, since even in this favorable setting people may not be persuaded, we investigate the problem of determining whether someone's opinion is susceptible to being changed at all.  For this more difficult task, we show that stylistic choices in how the opinion is expressed carry predictive power.
More simply put, Tan et al. found that it wasn't the content of the argument that determined its success, it was how it was worded.  In particular, they found that the use of calmer words, statements that were serious (i.e. not joking or sarcasm), and arguments that were worded differently from the original statement (i.e. were not simply direct responses to what was said) were the most effective.  Quotes from sources were relatively ineffective, but if you can post a link to a corroborating site, it strengthens your argument.

Another thing that was more likely to increase your success at convincing others was appearing flexible yourself.  Starting out with "You're an idiot if you don't see that..." is a poor opening salvo.  Wording such as "It could be that..." or "It looks like the data might support that..." sounds as if it would be a signal of a weak argument, but in fact, such softer phrasing was much more likely to be persuasive than a full frontal attack.

Even more interesting were the characteristics of the original posts that signaled that the person was persuadable.  The people who were most likely to change their minds, the researchers found, wrote longer posts, included more information and data in the form of lists, included sources, and were more likely to use first-person singular pronouns (I, my) rather than first-person plural (we, our) or third-person impersonal (they, their).

Unsurprising, really; if a person is basing his/her opinion on evidence, I'd expect (s)he would be easier to convince using different evidence.  And the "I" vs. "we" vs. "they" thing also makes some sense; as I posted a couple of weeks ago, despite our technological advances, we remain tribal creatures.  If you engage that in-group-identity module in the brain, it's no wonder that we are more likely to hang on to whatever views allow us to keep our sense of belonging to the tribe.

The Tan et al. research, however, does give us some ideas about how to frame arguments in order to give us the greatest likelihood of success.  Stay calm, don't rant or ridicule.  Give your reasoning, and expand on your own views rather than simply responding to what the other person said.  If you have links or sources, post them.  Especially, show your own willingness to be persuaded.  If the person you're arguing with sees you as reasonable yourself, you're much more likely to be listened to.

Most importantly, don't give up debate as a completely fruitless and frustrating endeavor.  Vlad Niculae, who co-authored the study, found their results to be encouraging.  "If you’ve never visited [ChangeMyView]," Niculae writes, "the concept that people can debate hot topics without getting into flame wars on the internet might be hard to believe.  Or, if you read a lot of Youtube comments, you might be inclined to believe the backfire effect, and doubt that graceful concession is even possible online.  But a quick trip to this great subreddit will undoubtedly make you a believer."

Thursday, August 24, 2017

Tribalism vs. the facts

For the diehard skeptic, one of the most frustrating features of human nature is belief in the absence of evidence (or even in the face of evidence to the contrary) -- and the question of how to combat it.

And I'm not talking about religion here, or at least not solely about religion.  The current 30% or so of Americans who still support Donald Trump provide a good example of evidence-free belief that borders on religious fervor; witness a recent poll of Trump supporters in which six out of ten said they couldn't think of anything he could do that would change their approval of his presidency.

The maddening part of all this is that at its heart, skepticism asks only one thing: that you base your understanding on facts.  The idea that people can adhere to their beliefs so strongly that no logic or evidence could shift them is, to me, a little incomprehensible.

But it's even worse than this.  A new study has shown that if a person is predisposed to certain beliefs -- anything from Trump support to climate change denialism to young-Earth creationism -- it doesn't help for them to learn more about the subject.

In fact, learning more about the subject actually increases their certainty that they were right in the first place.

These were the rather dismal findings of Caitlin Drummond and Baruch Fischhoff of Carnegie Mellon University, whose paper "Individuals With Greater Science Literacy and Education Have More Polarized Beliefs on Controversial Science Topics" appeared last week in the Proceedings of the National Academy of Sciences.  The authors write:
Although Americans generally hold science in high regard and respect its findings, for some contested issues, such as the existence of anthropogenic climate change, public opinion is polarized along religious and political lines.  We ask whether individuals with more general education and greater science knowledge, measured in terms of science education and science literacy, display more (or less) polarized beliefs on several such issues...  We find that beliefs are correlated with both political and religious identity for stem cell research, the Big Bang, and human evolution, and with political identity alone on climate change.  Individuals with greater education, science education, and science literacy display more polarized beliefs on these issues.
Put simply, your views on (for example) evolutionary biology have less to do with your understanding of the subject than with your political and religious identification.  Which, of course, implies that if you are trying to convince someone of the correctness of the evolutionary model, teaching them what the scientists are actually saying is unlikely to change their perspective -- and may actually cause them to double down on their original beliefs.

[image courtesy of the Wikimedia Commons]

So it's another example of the insidious backfire effect, and it is profoundly maddening.  It is unsurprising, perhaps, given that for all of our technology and civilization, we're still tribal animals.  Our in-group identification -- with respect to politics, religion, ethnicity, or nationality -- trumps damn near everything else, up to and including the facts and evidence sitting right in front of our faces.  And education, apparently, isn't going to change that.

It remains to be seen what can be done about this.  Baruch Fischhoff, who co-authored the study, said:
These are troubling correlations. We can only speculate about the underlying causes.  One possibility is that people with more education are more likely to know what they are supposed to say, on these polarized issues, in order to express their identity.  Another possibility is that they have more confidence in their ability to argue their case.
"Troubling" is right, especially given that I'm a science teacher.  I've always thought that one of the main jobs of science teachers is to correct students' misapprehensions about how the world works, because let's face it: a great deal of science is counterintuitive.  As Sean Carroll put it, in his wonderful book about the discovery of the Higgs boson, The Particle at the End of the Universe:
It's only because the data force us into corners that we are inspired to create the highly counterintuitive structures that form the basis for modern physics...  Imagine that a person in the ancient world was wondering what made the sun shine.  It's not really credible to imagine that they would think about it for a while and decide, "I bet most of the sun is made up of particles that can bump into one another and stick together, with one of them converting into a different kind of particle by emitting yet a third particle, which would be massless if it wasn't for the existence of a field that fills space and breaks the symmetry that is responsible for the associated force, and that fusion of the original two particles releases energy, which we ultimately see as sunlight."  But that's exactly what happens.  It took many decades to put this story together, and it never would have happened if our hands weren't forced by the demands of observation and experiment at every step.
The same, of course, is true for every discipline of science.  None of it is simple and intuitive; that's why we need the scientists.

But if people don't believe what the scientists are saying, not because of a lack of understanding or a disagreement over the facts, but because of tribal identity and in spite of the facts, there's not a whole hell of a lot you can do.

Which makes me even more depressed about our current situation here in the United States.