Skeptophilia (skep-to-fil-i-a) (n.) - the love of logical thought, skepticism, and thinking critically. Being an exploration of the applications of skeptical thinking to the world at large, with periodic excursions into linguistics, music, politics, cryptozoology, and why people keep seeing the face of Jesus on grilled cheese sandwiches.

Saturday, June 11, 2022

Locked into error

Back in 2011, author Kathryn Schulz did a phenomenal TED Talk called "On Being Wrong."  She looks at how easy it is to slip into error, and how hard it is not only to correct it, but (often) even to recognize that it's happened.  At the end, she urges us to try to find our way out of the "tiny, terrified space of rightness" that virtually all of us live in.

Unfortunately, that's one thing that she herself gets wrong.  Because for a lot of people, their belief in their rightness about everything isn't terrified; it's proudly, often belligerently, defiant.

I'm thinking of one person in particular, here, who regularly posts stuff on social media that is objectively wrong -- I mean, hard evidence, no question about it -- and does so in a combative way that comes across as, "I dare you to contradict me."  I've thus far refrained from saying anything.  One of my faults is that I'm a conflict avoider, but I also try to be cognizant of the cost/benefit ratio.  Maybe I'm misjudging, but I think the likelihood of my eliciting a "Holy smoke, I was wrong" -- about anything -- is as close to zero as you could get.

Now, allow me to say up front that I'm not trying to imply that I'm right about everything, nor that I never come across as cocky or snarky.  Kathryn Schulz's contention (and I think she's spot-on about this one) is that we all fall into the much-too-comfortable trap of believing that our view of the world perfectly reflects reality.  One of the most startling bullseyes she scores in her talk concerns how it feels to be wrong:

So why do we get stuck in this feeling of being right?  One reason, actually, has to do with the feeling of being wrong.  So let me ask you guys something...  How does it feel -- emotionally -- how does it feel to be wrong?  Dreadful.  Thumbs down.  Embarrassing...  Thank you, these are great answers, but they're answers to a different question.  You guys are answering the question: How does it feel to realize you're wrong?  Realizing you're wrong can feel like all of that and a lot of other things, right?  I mean, it can be devastating, it can be revelatory, it can actually be quite funny...  But just being wrong doesn't feel like anything.

I'll give you an analogy.  Do you remember that Looney Tunes cartoon where there's this pathetic coyote who's always chasing and never catching a roadrunner?  In pretty much every episode of this cartoon, there's a moment where the coyote is chasing the roadrunner and the roadrunner runs off a cliff, which is fine -- he's a bird, he can fly.  But the thing is, the coyote runs off the cliff right after him.  And what's funny -- at least if you're six years old -- is that the coyote's totally fine too.  He just keeps running -- right up until the moment that he looks down and realizes that he's in mid-air.  That's when he falls.  When we're wrong about something -- not when we realize it, but before that -- we're like that coyote after he's gone off the cliff and before he looks down.  You know, we're already wrong, we're already in trouble, but we feel like we're on solid ground.  So I should actually correct something I said a moment ago.  It does feel like something to be wrong; it feels like being right.
What brought this talk to mind -- and you should take fifteen minutes and watch the whole thing, because it's just that good -- is some research out of the University of California, Los Angeles, published a couple of weeks ago in Psychological Review, that looked at the neuroscience of these quick -- and, once made, almost impossible to undo -- judgments about the world.


The study used a technique called electrocorticography to see what was going on in a part of the brain called the gestalt cortex, which is known to be involved in sensory interpretation.  In particular, the team analyzed the activity of the gestalt cortex as test subjects were presented with the views of other people -- some they agreed with, some they disagreed with, and some about which they had yet to form an opinion.

The most interesting result had to do with the strength of the response.  The reaction of the gestalt cortex is most pronounced when we're confronted with views opposing our own, and with statements about which we've not yet decided.  In the former case, the gestalt cortex suppresses the evaluative parts of the brain -- i.e., it immediately dismisses what we've read because it disagrees with what we already thought.  In the latter case, it amplifies evaluation, allowing us to make a quick judgment about what's going on; but once that's happened, any subsequent evidence to the contrary elicits an immediate dismissal.  Once we've made up our minds -- and it happens fast -- we're pretty much locked in.

"We tend to have irrational confidence in our own experiences of the world, and to see others as misinformed, lazy, unreasonable or biased when they fail to see the world the way we do," said study lead author Matthew Lieberman, in an interview with Science Daily.  "We believe we have merely witnessed things as they are, which makes it more difficult to appreciate, or even consider, other perspectives.  The mind accentuates its best answer and discards the rival solutions.  The mind may initially process the world like a democracy where every alternative interpretation gets a vote, but it quickly ends up like an authoritarian regime where one interpretation rules with an iron fist and dissent is crushed.  In selecting one interpretation, the gestalt cortex literally inhibits others."

Evolutionarily, you can see how this makes perfect sense.  For a proto-hominid out on the African savanna, it was pretty critical to look at and listen to what was around you and make a quick judgment about its safety.  Stopping to ponder could be a good way to become a lion's breakfast.  The cost of making a wrong snap judgment and overestimating the danger was far lower than that of blithely going on your way and assuming everything was fine.  But now?  This hardwired tendency to squelch opposing ideas without consideration means we're unlikely to correct -- or even recognize -- the mistakes we've made.

I'm not sure what's to be done about this.  If anything can be done.  Perhaps it's enough to remind people -- including myself -- that our worldviews aren't flawless mirrors of reality; they're the result of our quick evaluation of what we see and hear.  And, most importantly, that we never lose by reconsidering our opinions and beliefs, weighing them against the evidence, and always keeping in mind the possibility that we might be wrong.  I'll end with another quote from Kathryn Schulz:
This attachment to our own rightness keeps us from preventing mistakes when we absolutely need to, and causes us to treat each other terribly.  But to me, what's most baffling and most tragic about this is that it misses the whole point of being human.  It's like we want to imagine that our minds are these perfectly translucent windows, and we just gaze out of them and describe the world as it unfolds.  And we want everybody else to gaze out of the same window and see the exact same thing.  That is not true, and if it were, life would be incredibly boring.  The miracle of your mind isn't that you can see the world as it is, it's that you can see the world as it isn't.  We can remember the past, and we can think about the future, and we can imagine what it's like to be some other person in some other place.  And we all do this a little differently...  And yeah, it is also why we get things wrong.

Twelve hundred years before Descartes said his famous thing about "I think therefore I am," this guy, St. Augustine, sat down and wrote "Fallor ergo sum" -- "I err, therefore I am."  Augustine understood that our capacity to screw up, it's not some kind of embarrassing defect in the human system, something we can eradicate or overcome.  It's totally fundamental to who we are.  Because, unlike God, we don't really know what's going on out there.  And unlike all of the other animals, we are obsessed with trying to figure it out.  To me, this obsession is the source and root of all of our productivity and creativity.

**************************************

Thursday, August 15, 2019

Doubling down on error

Is it just me, or is the defining hallmark of discourse these days a steadfast refusal to admit when you're wrong?

Surprisingly enough, I'm not referring here to Donald Trump, who has raised a casual disdain for the truth to near-mythic proportions.  What's even more astonishing, though, is his followers' determination to believe everything he says, even when it contradicts what he just said.  Trump could say, "The sky is green!  It is also purple-and-orange plaid!  And I didn't say either of those things!  Also, I am not here!" and his devotees would just nod and smile and comment on what an honest and godly man he is and how great America is now that we've been abandoned by all our allies and the national debt is at a record 22 trillion dollars.

In this case, though, I'm referring to two Republican policy wonks who apparently wouldn't believe climate change was happening if the entire continent spontaneously burst into flame.  The first was Matt Schlapp, head of the American Conservative Union, who was pissed off by Bernie Sanders publicly calling Trump an idiot for not accepting climate change, and responded in a tweet, "They can’t even predict if it will rain on tues but we are certain about the weather 12 yrs from now."

This is such an egregious straw man that it's almost a work of art.  In 21 words, we find the following:
  • Weather ≠ climate.  For fuck's sake.  We've been through this how many times before?
  • Meteorologists are, actually, quite good at predicting when and where it will rain.  Weather is a complex affair, so they don't always get it right, but if the evening weather report says your annual family picnic tomorrow is going to get a drenching, you should probably pay attention.
  • Knowing the climatic trends tells you exactly nothing about "the weather twelve years from now."  Cf. my earlier comment about how weather ≠ climate.
  • Predictions and trends don't imply certainty.  Ever.  But if 99% of working climatologists believe that anthropogenic climate change is happening, and that it's going to have drastic negative effects not only on the environment but ourselves, I'm gonna listen to them rather than to a guy whose main occupation seems to be sneering at people he disagrees with.
Then there was writer and pontificator Dinesh D'Souza, who posted a video of kangaroos hopping about in the snow with the caption, "Global warming comes to Australia.  Unless you want to believe your lying eyes!"

Unsurprisingly, within minutes D'Souza was excoriated by hundreds of people letting him know that (1) the Earth is spherical, meaning that (2) there are these things called "hemispheres," which (3) experience opposite seasons, and (4) since Australia is in the opposite hemisphere from North America, it's in the middle of winter right now.  Also, he was informed more than once that the largest mountain range in Australia is named "the Snowy Mountains" -- a name that's apt for the same sort of reason the Rocky Mountains, being composed largely of rocks, got theirs.

A grove of native trees in New South Wales, Australia.  They're called "snow gums."  Guess why?  [Image licensed under the Creative Commons Thennicke, Snow gums, Dead Horse Gap NSW Australia, CC BY-SA 4.0]

What gets me about this is not that two laypeople made a mistake about science.  That is gonna happen because (let's face it) science can be hard.  What I find astonishing is that when confronted with multitudes of fact-based objections, neither man said, "Wow, that sure was a dumb statement!  What a goober I am."  Both of them took the strategy of "Death Before Backing Down," and I can nearly guarantee that this incident will not change their minds one iota, and that (given the opportunity) they will make equally idiotic statements next time.

Look, I'm not claiming I'm infallible.  Far from it.  But what I will say is that if I'm wrong, I'll admit it -- and if it's in print (as here at Skeptophilia) I'll post a correction or retraction, or (if the error was egregious enough) delete the post entirely.  I've done so more than once over the nine years I've had this blog, and although admitting you're mistaken is never pleasant, it's absolutely critical to honest... everything.

But that seems to be a lost art lately.  The attitude these days is, "If someone proves you're wrong, keep saying the same thing, only be more strident."  Evidently, truth is no longer about who has the stronger evidence, but about who yells the loudest.  It's no wonder the American citizenry is, as a whole, so misinformed, especially on scientific matters -- in science, the touchstone is not volume but factual support.

And that seems to be the last thing any of these people are looking at.

***********************************

This week's Skeptophilia book recommendation is sheer brilliance -- Jenny Lawson's autobiographical Let's Pretend This Never Happened.  It's an account of her struggles with depression and anxiety, and far from being a downer, it's one of the funniest books I've ever read.  Lawson -- best known from her brilliant blog The Bloggess -- has a brutally honest, rather frenetic style of writing, and her book is sometimes poignant and often hilarious.  She draws a clear picture of what it's like to live with crippling social anxiety, an illness that has landed Lawson (as a professional author) in some pretty awkward situations.  She looks at her own difficulties (and those of her long-suffering husband) through the lens of humor, and you'll come away with a better understanding of those of us who deal day-to-day with mental illness, and also with a bellyache from laughing.

Saturday, January 28, 2017

Locking yourself into error

I got in a rather interesting -- well, I suppose you could call it a "discussion" -- with a Trump supporter yesterday.

It came about because of recent posts here at Skeptophilia that have been pretty critical of the president, his appointees, and their decisions.  After a few minutes of the usual greetings and pleasantries ("You're a liberal lackey who sucks up what the lying mainstream media says without question!", stuff like that), I asked her what to me is the only pertinent question in such situations:

"What would it take to convince you that you are wrong?"

"I'm not wrong," she said.

"That's not what I asked," I responded.  "I asked what would it take to convince you that you are wrong.  About Donald Trump.  Or about anything."

"What would it take to convince you?" she shot back.

"Facts and evidence that my opinion was in error.  Or at least a good logical argument."

"People like you would never believe it anyway.  You're swallowing the lies from the media.  Thank God Donald Trump was elected despite people like you and your friends in the MSM."

"And you still haven't answered my question."

At that point, she terminated the conversation and blocked me.

Couple that with a second comment from a different person -- one I elected not to respond to, because eventually I do learn not to take the bait -- saying that of course I have a liberal bias "since I get my information from CNN," and you can see that the fan mail just keeps rolling in.

Of course, the question I asked the first individual isn't original to me; it was the single most pivotal moment in the never-to-be-forgotten debate between Ken Ham and Bill Nye over the theory of evolution in February of 2014, in which the moderator asked each man what, if anything, would change his mind.  Nye said:
We would need just one piece of evidence.  We would need the fossil that swam from one layer to another.  We would need evidence that the universe is not expanding.  We would need evidence that the stars appear to be far away but are not.  We would need evidence that rock layers could somehow form in just 4,000 years…  We would need evidence that somehow you can reset atomic clocks and keep neutrons from becoming protons.  Bring on any of those things and you would change me immediately.
Ham, on the other hand, gave a long, rambling response that can be summed up as "Nothing would change my mind.  No evidence, no logic, nothing."

The whole thing dovetails perfectly with a paper released just two days ago in the journal Political Psychology.  Entitled "Science Curiosity and Political Information Processing," by Dan M. Kahan, Asheley Landrum, Katie Carpenter, Laura Helft, and Kathleen Hall Jamieson, the paper looks at the connection between scientific curiosity and a willingness to consider information that runs counter to one's own political biases and preconceived notions.  The authors write:
[S]ubjects high in science curiosity display a marked preference for surprising information—that is, information contrary to their expectations about the current state of the best available evidence—even when that evidence disappoints rather than gratifies their political predispositions.  This is in marked contrast, too, to the usual style of information-search associated with [politically-motivated reasoning], in which partisans avoid predisposition-threatening in favor of predisposition-affirming evidence. 
Together these two forms of evidence paint a picture—a flattering one indeed—of individuals of high science curiosity. In this view, individuals who have an appetite to be surprised by scientific information—who find it pleasurable to discover that the world does not work as they expected—do not turn this feature of their personality off when they engage political information but rather indulge it in that setting as well, exposing themselves more readily to information that defies their expectations about facts on contested issues.  The result is that these citizens, unlike their less curious counterparts, react more open-mindedly and respond more uniformly across the political spectrum to the best available evidence.
And maybe that's what's at the heart of all this.  I've always thought that the opposite of curiosity is fear -- those of us who are scientifically curious (and I will engage in a bit of self-congratulation and include myself in this group) tend to be less afraid of being found to be wrong, and more concerned with making sure we have all our facts straight.

[image courtesy of the Wikimedia Commons]

So I'll reiterate my question, aimed not only at Trump supporters but at everyone: what would it take to convince you that you are wrong?  About your political beliefs, religious beliefs, moral stances, anything?  It's a question we should keep in the forefront of our minds all the time.

Because once you answer that question with a defiant "nothing could convince me," you have effectively locked yourself into whatever errors you may have made, and insulated yourself from facts, logic, evidence -- and the truth.