Skeptophilia (skep-to-fil-i-a) (n.) - the love of logical thought, skepticism, and thinking critically. Being an exploration of the applications of skeptical thinking to the world at large, with periodic excursions into linguistics, music, politics, cryptozoology, and why people keep seeing the face of Jesus on grilled cheese sandwiches.

Friday, November 1, 2024

Wrongness

I get a lot of negative comments.

It comes with the territory, I suppose, and I knew when I started writing this blog fourteen years ago that I would have to develop a thick skin.  Given the subject matter, there's hardly a post I do that won't piss someone off.  Here's a sampling of comments, and a brief description of the topic that elicited them:
  • You are either ignorant or just stupid.  I'm putting my bet on the latter.  (after a post on machines that are supposed to "alkalinize" water to make it more healthful)
  • Narrow-minded people like you are the worst problem this society faces.  (after a post on "crystal healing")
  • I am honestly offended by what you wrote.  (after a post on alternative medicine)
  • I can't say I warm to your tone.  (after a post on ghost hunting)
  • That is the most ignorant thing I have ever read.  I could feel my IQ dropping as I read it.  (after a post in which I made a statement indicating that I think recent climate change is anthropogenic in origin)
  • I hate smug dilettantes like you.  (after a post on mysticism vs. rationalism)
  • You are a worthless wanker, and I hope you rot in hell.  (from a young-earth creationist)
My skin isn't thick enough that some of these don't sting.  For example, the one that called me a "smug dilettante" has a grain of truth to it; I'm not a scientist, just a retired science teacher, and if my educational background has a flaw it's that it's a light year across and an inch deep.  Notwithstanding that in a previous century people like me were called "polymaths," not "dabblers" or "dilettantes," the commenter scored a point, whether he knew it or not.  I'm well-read, and have a decent background in a lot of things, but I'm not truly an expert in anything.

Other disagreements on this list have been resolved by discussion, which is honestly what I prefer to do.  The comments that came from the posts on alternative medicine and ghost hunting generated fruitful discussion, and understanding (if not necessarily agreement) on both sides.

Most of the time, though, I just don't engage with people who choose to use the "Comments" section (or email) as a venue for snark.  You're not going to get very far by calling me ignorant, for example.  I make a practice of not writing about subjects on which I am ignorant, so even if I make an offhand comment about something, I try to make sure that I could back it up with facts if I needed to.  (Cf. this site, apropos of the individual who thinks I am ignorant for accepting the anthropogenic nature of recent climate change.  Plus, I once had the amazing Bill McKibben give me a thumbs-up for one of my climate change posts, which counts for a great deal.)

That said, what a lot of people don't seem to recognize about me is the extent to which my understanding of the world is up for grabs.  Like anyone, I do have my biases, and my baseline assumptions -- the latter including the idea that the universe is best understood through the dual lenses of logic and evidence.


But everything else?  My attitude is, if you want to try to convince me about Bigfoot or chakras or crystals or astrology or your particular take on religion or anything else, knock yourself out.  But you'd better have the evidence on your side, because even if I am a dilettante, I have read up on the topics on which I write.

I am as prone as the next guy, though, to getting it wrong sometimes.  And I am well aware of the fact that we can slide into error without realizing it.  As journalist Kathryn Schulz said, in her phenomenal lecture "On Being Wrong" (which you should all take fifteen minutes and watch as soon as you're done reading this):
How does it feel to be wrong?  Dreadful, thumbs down, embarrassing.  Those are great answers.  But they're answers to a different question.  (Those are) the answers to the question, "How does it feel to realize you're wrong?"  Realizing you're wrong can feel like all of that, and a lot of other things.  It can be devastating.  It can be revelatory.  It can actually be quite funny...  But just being wrong?  It doesn't feel like anything...  We're already wrong, we're already in trouble, but we still feel like we're on solid ground.  So I should actually correct something I said a moment ago: it does feel like something to be wrong.  It feels like being right.
To those who are provoked, even pissed off by what I write: good.  We never discover our errors -- and I'm very much including myself in this assessment -- without being knocked askew once in a while.  Let yourself be challenged without having a knee-jerk kick in response, and you have my word that I'll do the same.  And while I don't like having my erroneous thinking uncovered any more than anyone else, I will take a deep breath and admit it when I screw up.  I've published retractions in Skeptophilia more than once, which has been a profoundly humbling but entirely necessary experience.

So keep those cards and letters coming.  Even the negative ones.  I'm not going to promise you I'll change my mind on every topic I'm challenged on, but I do promise that I'll consider what you've said.

On the other hand, calling me a "worthless wanker" didn't accomplish much beyond making me choke-snort a mouthful of coffee all over my computer.  So I suppose the commenter got his revenge there, if only in a small way.

****************************************


Saturday, June 11, 2022

Locked into error

Back in 2011, author Kathryn Schulz did a phenomenal TED Talk called "On Being Wrong."  She looks at how easy it is to slip into error, and how hard it is not only to correct it, but (often) even to recognize that it's happened.  At the end, she urges us to try to find our way out of the "tiny, terrified space of rightness" that virtually all of us live in.

Unfortunately, that's one thing that she herself gets wrong.  Because for a lot of people, their belief in their rightness about everything isn't terrified; it's proudly, often belligerently, defiant.

I'm thinking of one person in particular, here, who regularly posts stuff on social media that is objectively wrong -- I mean, hard evidence, no question about it -- and does so in a combative way that comes across as, "I dare you to contradict me."  I've thus far refrained from saying anything.  One of my faults is that I'm a conflict avoider, but I also try to be cognizant of the cost/benefit ratio.  Maybe I'm misjudging, but I think the likelihood of my eliciting a "Holy smoke, I was wrong" -- about anything -- is as close to zero as you could get.

Now, allow me to say up front that I'm not trying to imply here that I'm right about everything, nor that I don't come across as cocky or snarky at times.  Kathryn Schulz's contention (and I think she's spot-on about this one) is that we all fall into the much-too-comfortable trap of believing that our view of the world perfectly reflects reality.  One of the most startling bullseyes Schulz scores in her talk is about how it feels to be wrong:

So why do we get stuck in this feeling of being right?  One reason, actually, has to do with the feeling of being wrong.  So let me ask you guys something...  How does it feel -- emotionally -- how does it feel to be wrong?  Dreadful.  Thumbs down.  Embarrassing...  Thank you, these are great answers, but they're answers to a different question.  You guys are answering the question: How does it feel to realize you're wrong?  Realizing you're wrong can feel like all of that and a lot of other things, right?  I mean, it can be devastating, it can be revelatory, it can actually be quite funny...  But just being wrong doesn't feel like anything.

I'll give you an analogy.  Do you remember that Looney Tunes cartoon where there's this pathetic coyote who's always chasing and never catching a roadrunner?  In pretty much every episode of this cartoon, there's a moment where the coyote is chasing the roadrunner and the roadrunner runs off a cliff, which is fine -- he's a bird, he can fly.  But the thing is, the coyote runs off the cliff right after him.  And what's funny -- at least if you're six years old -- is that the coyote's totally fine too.  He just keeps running -- right up until the moment that he looks down and realizes that he's in mid-air.  That's when he falls.  When we're wrong about something -- not when we realize it, but before that -- we're like that coyote after he's gone off the cliff and before he looks down.  You know, we're already wrong, we're already in trouble, but we feel like we're on solid ground.  So I should actually correct something I said a moment ago.  It does feel like something to be wrong; it feels like being right.
What brought this talk to mind -- and you should take fifteen minutes and watch the whole thing, because it's just that good -- is some research out of the University of California - Los Angeles published a couple of weeks ago in Psychological Review that looked at the neuroscience of these quick -- and once made, almost impossible to undo -- judgments about the world.


The study used a technique called electrocorticography to see what was going on in a part of the brain called the gestalt cortex, which is known to be involved in sensory interpretation.  In particular, the team analyzed the activity of the gestalt cortex when presented with the views of other people, some of which the test subjects agreed with, some with which they disagreed, and others about which they had yet to form an opinion.

The most interesting result had to do with the strength of the response.  The reaction of the gestalt cortex is most pronounced when we're confronted with views opposing our own, and with statements about which we've not yet decided.  In the former case, the response is to suppress the evaluative parts of the brain -- i.e., to dismiss immediately what we've read because it disagrees with what we already thought.  In the latter case, it amplifies evaluation, allowing us to make a quick judgment about what's going on, but once that's happened any subsequent evidence to the contrary elicits an immediate dismissal.  Once we've made our minds up -- and it happens fast -- we're pretty much locked in.

"We tend to have irrational confidence in our own experiences of the world, and to see others as misinformed, lazy, unreasonable or biased when they fail to see the world the way we do," said study lead author Matthew Lieberman, in an interview with Science Daily.  "We believe we have merely witnessed things as they are, which makes it more difficult to appreciate, or even consider, other perspectives.  The mind accentuates its best answer and discards the rival solutions.  The mind may initially process the world like a democracy where every alternative interpretation gets a vote, but it quickly ends up like an authoritarian regime where one interpretation rules with an iron fist and dissent is crushed.  In selecting one interpretation, the gestalt cortex literally inhibits others."

Evolutionarily, you can see how this makes perfect sense.  For a proto-hominid out on the African savanna, it was pretty critical to look at and listen to what was around you and make a quick judgment about its safety.  Stopping to ponder could be a good way to become a lion's breakfast.  The cost of making a wrong snap judgment and overestimating the danger was far lower than the cost of blithely going on your way and assuming everything was fine.  But now?  This hardwired tendency to squelch opposing ideas without consideration means we're unlikely to correct -- or even recognize -- that we've made a mistake.

I'm not sure what's to be done about this.  If anything can be done.  Perhaps it's enough to remind people -- including myself -- that our worldviews aren't flawless mirrors of reality, they're the result of our quick evaluation of what we see and hear.  And, most importantly, that we never lose by reconsidering our opinions and beliefs, weighing them against the evidence, and always keeping in mind the possibility that we might be wrong.  I'll end with another quote from Kathryn Schulz:
This attachment to our own rightness keeps us from preventing mistakes when we absolutely need to, and causes us to treat each other terribly.  But to me, what's most baffling and most tragic about this is that it misses the whole point of being human.  It's like we want to imagine that our minds are these perfectly translucent windows, and we just gaze out of them and describe the world as it unfolds.  And we want everybody else to gaze out of the same window and see the exact same thing.  That is not true, and if it were, life would be incredibly boring.  The miracle of your mind isn't that you can see the world as it is, it's that you can see the world as it isn't.  We can remember the past, and we can think about the future, and we can imagine what it's like to be some other person in some other place.  And we all do this a little differently...  And yeah, it is also why we get things wrong.

Twelve hundred years before Descartes said his famous thing about "I think therefore I am," this guy, St. Augustine, sat down and wrote "Fallor ergo sum" -- "I err, therefore I am."  Augustine understood that our capacity to screw up, it's not some kind of embarrassing defect in the human system, something we can eradicate or overcome.  It's totally fundamental to who we are.  Because, unlike God, we don't really know what's going on out there.  And unlike all of the other animals, we are obsessed with trying to figure it out.  To me, this obsession is the source and root of all of our productivity and creativity.

**************************************

Saturday, May 15, 2021

Thin ice

In her phenomenal TED talk "On Being Wrong," journalist Kathryn Schulz says, "[W]e all kind of wind up traveling through life, trapped in this little bubble of feeling very right about everything...  [and] I want to convince you that it is possible to step outside of that feeling and that if you can do so, it is the single greatest moral, intellectual and creative leap you can make."

I've often thought that the willingness to entertain the possibility that your knowledge is incomplete -- that you may not have all the answers, and (more critically) that some of the answers you've arrived at might be false -- is the cornerstone of developing a real understanding of how things actually are.  Put a different way, certainty can be a blindfold.

[Image licensed under the Creative Commons Dale Schoonover, Kim Schoonover, Blindfold hat, CC BY 3.0]

I'm not saying I like finding out I'm wrong about something.  As Schulz points out, finding out you've made a mistake can be revelatory, enlightening, or hilarious -- but it can also be humiliating, frustrating, or devastating.  I'm reminded of one of the funniest scenes from The Big Bang Theory -- when Sheldon meets Stephen Hawking:


While most of us have never had the experience of embarrassing the hell out of ourselves in front of one of the smartest people in the world, I think we can all relate.  And part of what makes it funny -- and relatable -- is that until it's pointed out, Sheldon can't fathom that he actually made a mistake.  Maybe there are few people as colossally arrogant as he is, but the truth is we are more like him than we want to admit.  We cling to the things we believe and what we think we understand with a fervor that would do the Spanish Inquisition proud.


The reason all this comes up is a paper this week in Proceedings of the National Academy of Sciences by Jeroen van Baar and Oriel Feldman-Hall (of Brown University) and David Halpern (of the University of Pennsylvania) called "Intolerance of Uncertainty Modulates Brain-to-Brain Synchrony During Politically Polarized Perception."  In this study, the researchers gave a group of test subjects videos to watch -- strongly liberal, strongly conservative, and politically neutral -- and looked at the brain's response to the content.  What they found was that (unsurprisingly) some test subjects had strongly aversive reactions to the videos, but the strongest correlation with the strength of the response wasn't whether the watcher was him/herself conservative or liberal (putting to rest the idea that one side is intolerant and the other isn't), nor was it the perceived distance between the content of the video and the test subject's own beliefs; it was how intolerant the person was of uncertainty.

In other words, how angry you get over hearing political commentary you don't agree with depends largely on how unwilling you are to admit that your own understanding might be flawed.

It's kind of a devastating result, isn't it?  The polarization we're currently experiencing here in the United States (and undoubtedly elsewhere) is being driven by the fact that a great many people on both sides are absolutely and completely convinced they're right.  About everything.  Again to quote Kathryn Schulz, "This attachment to our own rightness keeps us from preventing mistakes when we absolutely need to and causes us to treat each other terribly.  But to me, what's most baffling and most tragic about this is that it misses the whole point of being human.  It's like we want to imagine that our minds are just these perfectly translucent windows and we just gaze out of them and describe the world as it unfolds.  And we want everybody else to gaze out of the same window and see the exact same thing."

A lot of it, I think, boils down to fear.  To admit that we might be wrong -- fundamentally, deeply wrong, perhaps about something we've believed our entire lives -- is profoundly destabilizing.  What we thought was solid and everlasting turns out to be thin ice, but instead of taking steps to rectify our misjudgment and skate to safety, we just close our eyes and keep going.  There's a part of us that can't quite believe we might not have everything figured out.

Like I said, it's not that I enjoy being wrong myself; I find it just as mortifying as everyone else does.  So part of me hopes that I do have the big things figured out, that my most dearly-held assumptions about how the world works won't turn out to be completely in error.  But it behooves us all to keep in the back of our minds that human minds are fallible -- not just in the theoretical, "yeah, people make mistakes" sense, but that some of the things we're surest about may be incorrect.

Let's all work to become a little humbler, a little more uncomfortable with uncertainty -- as Schulz puts it, to be able to "step outside of that tiny, terrified space of rightness and look around at each other and look out at the vastness and complexity and mystery of the universe and be able to say, 'Wow, I don't know.  Maybe I'm wrong.'"

********************************

I have often been amazed and appalled at how the same evidence, the same occurrences, or the same situation can lead two equally-intelligent people to entirely different conclusions.  How often have you heard about people committing similar crimes and getting wildly different sentences, or identical symptoms in two different patients resulting in completely different diagnoses or treatments?

In Noise: A Flaw in Human Judgment, authors Daniel Kahneman (whose wonderful book Thinking, Fast and Slow was a previous Skeptophilia book-of-the-week), Olivier Sibony, and Cass Sunstein analyze the cause of this "noise" in human decision-making, and -- more importantly -- discuss how we can avoid its pitfalls.  Anything we can do to detect and expunge biases is a step in the right direction; even if the majority of us aren't judges or doctors, most of us are voters, and our decisions can make an enormous difference.  Those choices are critical, and it's incumbent upon us all to make them in the most clear-headed, evidence-based fashion we can manage.

Kahneman, Sibony, and Sunstein have written a book that should be required reading for anyone entering a voting booth -- and should also be a part of every high school curriculum in the world.  Read it.  It'll open your eyes to the obstacles we have to logical clarity, and show you the path to avoiding them.




Wednesday, March 8, 2017

Contradicting the narrative

In her marvelous TED Talk "On Being Wrong," writer and journalist Kathryn Schulz describes a "series of unfortunate assumptions" that we tend to make when we find out that there are people who disagree with us.

First, we tend to assume that the dissenters are simply ignorant -- that they don't have access to the same facts as we do, and that if we graciously enlighten them, they'll say, "Oh, of course!" and join our side.  If that doesn't work, if the people who disagree with us turn out to have access to (and understand) the same facts as we have, then we turn to a second assumption -- that they're stupid.  They're taking all of the evidence, putting it together, and are too dumb to do it right.

If that doesn't work -- if our intellectual opponents have the same facts as we do, and turn out to be smart enough, but they still disagree with us -- we move on to a third, and worse, assumption: that they're actually malevolent.  They have the facts, know how to put them together, and are suppressing the right conclusion (i.e. ours) for their own evil purposes.

This, Schulz says, is a catastrophe.  "This attachment to our own sense of rightness keeps us from preventing mistakes when we absolutely need to," she says, "and it causes us to treat each other terribly."

I got a nice, and scientific, object lesson in support of Schulz's claim yesterday, when I stumbled across a paper in PLoS One by Clinton Sanchez, Brian Sundermeier, Kenneth Gray, and Robert J. Calin-Jagemann called "Direct Replication of Gervais & Norenzayan (2012): No Evidence That Analytic Thinking Decreases Religious Belief."  Apparently five years ago, a pair of psychological researchers, Will Gervais of the University of Kentucky and Ara Norenzayan of the University of British Columbia, had published a study showing that there was an inverse correlation between religious belief and analytical thinking, and further, that stimulating analytical thinking in the religious has the effect of weakening their beliefs.

[image courtesy of Lucien leGray and the Wikimedia Commons]

Well, all of that fits nicely into the narrative we atheists would like to believe, doesn't it?  Oh, those religious folks -- if we could just teach 'em how to think, the scales would fall from their eyes, and (in Schulz's words) they'd "come on over to our team."  The problem is, when Sanchez et al. tried to replicate Gervais and Norenzayan's findings, they were unable to do so -- despite (or perhaps because of) the fact that Sanchez et al. used a much larger sample size and tighter controls.  The authors write:
What might explain the notable difference between our results and those reported by G&N?  We can rule out substantive differences in materials and procedures, as these were essentially identical.  We can also rule out idiosyncrasies in participant pools, as we collected diverse samples and used extensive quality controls.  Finally, we can also rule out researcher incompetence, as we were able to detect an expected effect of similar size using a positive control. 
One possibility is that Study 2 of G&N substantially over-estimated the effect of the manipulation on religious belief.  This seems likely, not only because of the data presented here but also because evidence published while this project was in progress suggests that the experimental manipulation may not actually influence analytic thinking...
Based on our results and the notable issues of construct validity that have emerged we conclude that the experiments reported by G&N do not provide strong evidence that analytic thinking causes a reduction in religious belief.  This conclusion is further supported by results from an independent set of conceptual replications that was recently published which also found little to no effect of analytic thinking manipulations on religious belief.
To their credit, Gervais and Norenzayan not only cooperated with the research of Sanchez et al., they admitted afterwards that their original experiments had led to a faulty conclusion.  In fact, in his blog, Gervais gives a wryly humorous take on their comeuppance, by presenting the criticisms of Sanchez et al. and only at the end revealing that it was his own paper that had been, more or less, cut to ribbons.  He says of Sanchez's team, "I congratulate them on their fine work."

He also included the following in his postscript:
FFFFFFFFFFUUUUUUUUUUUUUCCCCCCCCCKKKKKKKKKKKK!!!!
Understandably.  As Schulz points out, it's often devastating and embarrassing to find out that we screwed up.  Doubly so when you're a scientist, since your reputation and your livelihood depend on getting things right.  So kudos to Gervais and Norenzayan for admitting their paper hadn't shown what they said it did.

So not only is this a great example of science done right, in the larger analysis, it tells us atheists that we can't get away with dismissing religious folks as simply not being as smart and analytical as we are.  Which, honestly, is just as well, because it would leave me trying to explain friends of mine who are honest, smart, well-read, logical... and highly religious.  It supports the kinder (and more accurate) conclusion that we're all trying to figure things out as best we can with what information we have at hand, and the fact that we come to radically different answers is testimony to the difficulty of understanding a complex and fascinatingly weird universe with our limited perceptions and fallible minds.

Or, as Schulz concludes, "I want to convince you that it is possible to step outside [the feeling of being right about everything], and that if you do so it is the greatest moral, intellectual, and creative leap you can take...  If you really want to rediscover wonder, you need to step out of that tiny, terrified space of rightness, and look around at each other, and look out at the vastness and complexity and mystery of the universe, and be able to say, 'Wow.  I don't know.  Maybe I'm wrong.'"

Saturday, February 11, 2017

The devil made me do it

One of the human tendencies I find the hardest to comprehend is the bafflement some people feel when they find out that there are people who disagree with them.

Being a center-left atheist from conservative, Christian southern Louisiana, I have never been under the illusion that everyone agrees with me.  Further, I am convinced that the people who do disagree with me are, by and large, good, kind, honest people who believe what they do for their own heartfelt reasons.  While we've come to differing conclusions about the way the universe works and how governance should happen down here on Earth, mostly we respect each other despite our differences, and mostly we get along pretty well.

But there's a contingent on both sides of the spectrum who seem entirely incredulous that people who disagree with them actually exist.  And I ran into several interesting examples of this just yesterday, revolving around leaders of the Religious Right who are so befuddled by the fact that there are folks who don't support Donald Trump that they can only explain it by proposing that said dissenters are motivated by Satan.

Starting with Pastor Lance Wallnau, who was asked on The Jim Bakker Show what he thought about Donald Trump's inauguration.  Wallnau replied:
What I believe is happening is there was a deliverance of the nation from the spirit of witchcraft in the Oval Office.  The spirit of witchcraft was in the Oval Office, it was about to intensify to a higher level demon principality, and God came along with a wrecking ball -- Trump -- and shocked everyone, the church cried out for mercy and bam—God knocked that spirit out, and what you’re looking at is the manifestation of an enraged demon through the spirit.
So, of course, only people under the influence of the devil himself would object to all of this.  About the Women's March on Washington, he said that the people who showed up to celebrate Trump were motivated by god, and the people who protested... weren't:
[The crowd that cheered at the inauguration] was, in a great measure, the Christian community showing up in Washington to celebrate God’s intervention...  The people attending Trump’s inauguration represented the people of God that went to Washington to celebrate the mercy of God... those who went to the following day’s Women’s March on Washington were the people of the devil that came in order to fight it.
Wallnau isn't the only one who ascribes criticisms of Trump to a demonic source.  Rick Wiles, conspiracy theorist par excellence and purveyor of End Times nonsense, said that Satan was involved -- but so was Satan's right-hand man here on Earth, none other than Barack Obama:
We are witnessing a full-blown Marxist/communist resistance movement, a revolution in America.  The chief banker funding the Purple Revolution is billionaire George Soros and the chief community organizer directing the insurrection in the streets is none other than Barack Hussein Obama …  My gut feeling says Barack Obama is on the phone day and night and he is directing the protests, he is organizing, he is giving clear instructions to the people what to do and how to carry it out.

This is outright sedition, and we have laws in the United States against sedition….  What the Democrats are doing, and the news media and the Obamanista bureaucrats inside the government agencies, what they are doing is, these are acts of sedition. 
You wanna get God worked up?  You know what sedition reminds Him of?  Lucifer.  It all goes back to Lucifer because what Lucifer did in heaven was commit sedition …  So all acts of sedition are inspired by Lucifer. 
Those who are opposing Trump are not only breaking the laws against sedition, but are also breaking God’s laws.
Not to be outdone, Pat Robertson had to join in the fray, and said this week on his show The 700 Club that not only are the protests motivated by Satan, they're not even real:
They’re paid for, many of them, and George Soros and those like him are paying the bill to make all these demonstrations look like the nation is rising up against this ban; it’s not. The people of America want to be safe from terrorists.
Okay, it's not that I expect these three guys and others like them to do anything but celebrate Trump from the rooftops, although I am still a little mystified at how the family-values, Ten-Commandments-touting, live-like-Jesus Christian Right ever embraced someone like Donald Trump in the first place.  Given that now Trump is their Golden Boy, I suppose they have their reasons.  But what I completely fail to understand is how you can be so wedded to your worldview that the only way you can conceive of people disagreeing with you is by postulating that they must be motivated by Satan.

Or, at the very least, Barack "Antichrist" Obama.


I've recommended more than once Kathryn Schulz's amazing TED Talk "On Being Wrong," in which she makes a powerful case that we not only need to be aware that others can disagree with us without their being stupid, evil, deluded, or immoral, but that considering the possibility that we ourselves might be wrong about our views is one of the most mind-altering, liberating steps we can take.  In any case, being so invested in our theories that we have to ascribe our own views to god and our opponents' views to the devil seems to me to be so arrogant as to be entirely incomprehensible.

So maybe there are people whose existence baffles me, after all.

Tuesday, March 1, 2016

The origins of moral outrage

Here in the United States, we're in the middle of an increasingly nasty presidential race, which means that besides political posturing, we're seeing a lot of another facet of human behavior:

Moral outrage.

We all tend to feel some level of disbelief that there are people who don't believe in the same standards of morality and ethics that we do.  As Kathryn Schulz points out, in her wonderful TED talk "On Being Wrong," "We walk around in a little bubble of feeling right about everything...  We all accept that we can be wrong in the abstract.  Of course we could be wrong.  But when we try to think of one single thing we're wrong about, here and now, we can't do it."

So what this does is to drive us to some really ugly assumptions about our fellow humans.  If they disagree with us, they must be (check all that apply): deluded, misguided, uninformed, ignorant, immoral, or plain old stupid.

[image courtesy of photographer Joost J. Bakker and the Wikimedia Commons]

But a recent paper in Nature shows that we have another, and darker, driver for moral outrage than our inability to conceive of the existence of people who disagree with us.  Jillian J. Jordan, Moshe Hoffman, Paul Bloom, and David G. Rand, in a collaboration between the Departments of Psychology at Harvard and Yale, released the results of a fairly grim study in "Third-Party Punishment as a Costly Signal of Trustworthiness," in which we find out that those who call out (or otherwise punish) bad behavior or negative actions do so in part because afterwards, they are perceived as more trustworthy themselves.

In the words of the researchers:
Third-party punishment (TPP), in which unaffected observers punish selfishness, promotes cooperation by deterring defection.  But why should individuals choose to bear the costs of punishing?  We present a game theoretic model of TPP as a costly signal of trustworthiness.  Our model is based on individual differences in the costs and/or benefits of being trustworthy.  We argue that individuals for whom trustworthiness is payoff-maximizing will find TPP to be less net costly (for example, because mechanisms that incentivize some individuals to be trustworthy also create benefits for deterring selfishness via TPP).  We show that because of this relationship, it can be advantageous for individuals to punish selfishness in order to signal that they are not selfish themselves... 
We show that TPP is indeed a signal of trustworthiness: third-party punishers are trusted more, and actually behave in a more trustworthy way, than non-punishers.  Furthermore, as predicted by our model, introducing a more informative signal—the opportunity to help directly—attenuates these signalling effects.  When potential punishers have the chance to help, they are less likely to punish, and punishment is perceived as, and actually is, a weaker signal of trustworthiness.  Costly helping, in contrast, is a strong and highly used signal even when TPP is also possible.  Together, our model and experiments provide a formal reputational account of TPP, and demonstrate how the costs of punishing may be recouped by the long-run benefits of signalling one’s trustworthiness.
Calling out people who transgress not only makes the transgression less likely to happen again; it also strengthens the position of the one who called out the transgressor.  It's unlikely that people do this consciously, but Jordan et al. have shown that punishing selfishness isn't necessarily selfless itself.
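
To make the signaling logic concrete, here's a minimal toy sketch in Python -- my own illustration, not the model Jordan et al. actually built, and with every probability invented for the sake of the example.  The only assumption it encodes is the one from the paper's abstract: trustworthy types find punishing less net costly, so they punish more often, which means that watching someone punish a cheater is (statistically) evidence that the punisher is trustworthy.

import random

random.seed(42)

N = 100_000                      # simulated individuals
P_TRUSTWORTHY = 0.5              # assumed share of trustworthy types (made up)
P_PUNISH_IF_TRUSTWORTHY = 0.7    # trustworthy types punish selfishness more often (made up)
P_PUNISH_IF_SELFISH = 0.2        # selfish types rarely bother to punish (made up)

punishers, non_punishers = [], []
for _ in range(N):
    trustworthy = random.random() < P_TRUSTWORTHY
    p_punish = P_PUNISH_IF_TRUSTWORTHY if trustworthy else P_PUNISH_IF_SELFISH
    punished = random.random() < p_punish
    (punishers if punished else non_punishers).append(trustworthy)

# An observer who trusts punishers more is, on these assumptions, right to do so:
print("P(trustworthy | punished)      =", round(sum(punishers) / len(punishers), 2))
print("P(trustworthy | didn't punish) =", round(sum(non_punishers) / len(non_punishers), 2))

With those invented numbers, the first figure comes out around 0.78 and the second around 0.27 -- which is the whole trick of a costly signal: the punisher pays a real cost up front, and recoups it in the reputation the act buys.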

All of which makes the whole group dynamics thing a little scary.  As social primates, we have a strong innate vested interest in remaining part of the in-group, and this sometimes casts a veneer of high morality over actions that are actually far more complex.  As Philip Zimbardo showed in his infamous "Stanford Prison Experiment," we will do a great deal both to conform to the expectations of the group we belong to, and to exclude and vilify those in an opposing group.  And now the study by Jordan et al. has shown that we do this not only to eradicate behaviors we consider immoral, but to appear more moral to our fellow group members.

Which leaves me wondering how we can tease apart morality from the sketchier side of human behavior.  Probably we can't.  It will, however, make me a great deal more careful to be sure I'm on solid ground before I call someone else out on matters of belief.  I'm nowhere near sure enough of the purity of my own motives most of the time to be at all confident, much less self-righteous, about proclaiming to the world what I think is right and wrong.

Thursday, January 9, 2014

Faith, belief, and agnosticism: a guest post by author Cly Boehs


My dear friend, the author and artist Cly Boehs, was inspired by one of my posts from two weeks ago to write an essay of her own responding to the points I brought up, and I have invited her to present it here.  You can (and should!) read Cly's short stories, posted on her wonderful blog Mind at Play, and I encourage you all to buy her brilliant collection of four novellas, The Most Intangible Thing, available at Amazon here.  I know you'll be as entertained and intrigued by Cly's writing as I am.

************************************
On Skeptophilia on December 24th, 2013, in an article entitled, "Elf highway blockade," you ask the question, “…how do specific counterfactual beliefs become so entrenched, despite a complete lack of evidence that entire cultures begin to buy in?” You state that you get how individuals can become superstitious but are perplexed by how cultures can do this—supposedly because more heads should be better than one? The underlying question seems to be—why wouldn’t there be enough dissenting voices in such groups to stop such ridiculous claims? How can so many be so wrong about something so outlandish? 

Since I’ve spent quite a bit of time researching, thinking, and writing about why people (as individuals and groups) believe what they do, I’d like to take this question on, at least offering an opinion in brief form. Over time, I’ve come to two major conclusions: (1) groups don’t use factual evidence any more than individuals do to come to their beliefs; and (2) once individuals’ beliefs are strengthened by numbers, the believers take on a superior hue such that they disavow any claims other than their own—they are right and that’s that. You seem to be asking how this can happen when there is contrary factual evidence readily available for them to see. To a rationalist, a term like “factual evidence” is redundant because both “facts” and “evidence” imply objectivity. To a person basing evidence on faith or will-to-believe, “evidence” lies in subjective truth; and since that truth’s validity is based on personal experience, the more of those will-to-believers you can gather together, the greater the validation of truth (see below).

I’d like to draw attention to two points about both individuals and groups that allow any belief (superstition or not) to become foundational to them. First, culture, the state, the church, all organizations and institutions are made up of individuals, and studies have shown that what the individuals in the group believe becomes strengthened by numbers. Which brings me to my second notion: that the belief(s) of the group are held together because they believe they are right, often the only right. The strength of belief gained in numbers produces a feeling of superiority such that the group forms an “us vs. them” mentality and most often (depending on how significant the belief is to the group) takes it to battle against other beliefs. Lord knows we have enough examples of this throughout human history and in foreign affairs today.

It is tremendously important that we remember that groups are made up of individuals—that in the most important way, groups do not exist in and of themselves. When we begin thinking that they do, we end up in extreme situations such as the Nazi mentality of World War II, the mass execution of 1862 in Mankato, Minnesota, and the Guantanamo Bay detention camp. If groups really did exist in and of themselves, the defendants at the Nuremberg trials (or any others, for that matter) could claim immunity on the grounds of mass-think. Which leads me to this business of feeling superior and safe because of being right.

I realize in your article you are talking specifically about superstitious group beliefs—beliefs way, way out there like believing that elves not only exist but have rights. But I ask you, how far off is this from the now institutionalized belief that our supreme court recently translated into the law of the land, that corporations are individuals with rights? And how far away from beliefs such as transmigration of souls and transubstantiation is the belief in elves? Or that Buddha’s mother, before his birth, was struck on her right side by a white bull elephant that held a lotus in its trunk, an elephant that then vanished into her— or as another story goes—the elephant entered her womb and shortly after disappeared? This is after the elephant walked around her three times—well, of course it was three times. Three is the magic number of fairy tales and religious triads, right?




We all know stories from sacred texts that defy objective evidence—water into wine, an ass speaking on the road to Damascus, parted water to expose dry land (Buddha performed this one before Moses), a Hindu holy man changing jackals into horses and back again at will, and the list goes on and on. Miracles or superstitions as foundational beliefs arise out of “faith,” that is, out of belief not based on facts. In fact, such beliefs as miracles and superstition are a demonstration of faith. Actually, it’s why they are there. Faith of this kind produces massive power in individuals, especially when socialized and politicized, which makes faith-based belief(s) concrete in ritual and activism. Superstition works in this way because when individuals, strengthened by groups, believe against “the odds,” e.g., the Anabaptists against the Catholics and Protestants, or the American Revolutionaries against English and French militaries—such polarities only strengthen each side in what it believes is right. [A study demonstrating this clearly showed up recently on the site politico.com, in which two psychologists demonstrated that the more extreme a person’s views are, the more they think they are right.] The most important component in any cultural or polarized situation is belief, not reasonableness or facts. And this is because facts change by necessity, while belief can remain consistent and constant if founded in faith, the will-to-believe. The validity of a belief is often expressed in this constancy down through time, e.g., the Vedas are 3,500 years old, Christian time is counted since Christ, and so forth. There is comfort in not only being right but being so with such consistency and constancy.

So how does a group of people (note that a singular verb is used for the term “group”) end up believing that elves should be given rights to stop a highway through their sacred territory? By individuals comprising the group believing elves-are-individuals-with-rights. And these individuals find strength in this belief by numbers in the group and by the outlandishness of the belief as proof of faith in that belief. According to this logic, then, the crazier the idea, the greater the faith needed to believe it; therefore, the greater the proof of its validity. Trust of this sort and the will-to-believe—a trust to act in faith before any supporting evidence (one of William James’s terms for this is “confidence”)—can be constant in a way reason based on facts cannot. And the stronger the faith of an individual or group is, the stronger the belief. And because of this, psychologists and philosophers of all stripes have opted for head over heart, reason over emotion-and-feeling, reason over will. And it’s easy to see why. When people fear that their faith or will-to-believe is taken from them, they know that what they regard as constant, permanent, is gone; living in a constantly ambiguous state-of-affairs leaves one too vulnerable. ‘Tis unsafe.

But polarity is not the answer. An alternative to rightness-in-belief lies in the willingness (note the word) to believe conditionally—not to give up belief. We will believe (again, note the word). We have to believe in order to function, actually to literally live at all. We don’t have all the facts for what we believe, never will. But in order to develop more fully as individuals, our beliefs have to be founded with an open heart and mind. And in order for this to happen, individuals have to believe in the self over groups, have to understand how culture can be an obstacle to self-identity, and have to be willing to die because believing this is deeply threatening to those caught in the whole system of belief that states “being right” is all there can be, especially when it comes wrapped in the outrageous intentionality of religious fervor. 

And what about imagination and creativity in all of this? The human capacity for ambiguity is at the heart of creativity, true self-identity, justice, and all human endeavors with inherent freedom for the individual in them. Ambiguity is not waffling between one held belief and another—it is remaining open to the possibility that states-of-affairs can be other than they are perceived to be. In other words, we can be wrong. And the human ability to be wrong is inherent in so much of what we learn. [Gordon recently pointed out to me Kathryn Schulz’s TED talk, “On Being Wrong,” which is a beautiful declaration on why we seek solace in being right.]

My view is not that we should evolve to a state of rationalism over beliefs based on faith or will, but that we should become individuals with an ability to suspend belief just as we suspend facts. By “suspended belief or facts,” I mean lifting the total-rightness-of-belief out of its foundational bedrock, i.e., holding what is known either as fact or belief-through-faith in regard, even respect, even acting on it until something else replaces it. But when suspended belief of this kind is presented to individuals as a possibility, they usually become more radicalized in their position than before. They can’t give up the constancy of faith over the inconstancy of facts, when in order for all of us to live as fully as possible, we have to give up both so that we can embrace both. What’s interesting is that when suspended belief is presented to a rationalist, meaning they have to take the agnostic position over the atheistic one, there is just as much fluttering of feathers and great resistance. Since rationalism is based on evidence (observable facts), and without evidence there can be no belief, they are just as adamant in their position as Faith-and-Will believers are in theirs. Suspended belief doesn’t mean no belief. It means not knowing or even not knowable—which is what agnosticism is.

So what to do with the Friends of Lava and the project detrimental to elf culture?

What would “suspended belief” look like for both sides in this dispute? Why not negotiate and do it while applying Gordon’s suggestion of critical thinking thrown into the mix…mess—the discussion being along the lines of how far traditions in culture should/could be allowed to influence progress of a practical nature. But also, discussion with some open-mindedness has to be there as well, lifting that desire to be right and listening to the other side, working together to meet some middle way in the situation. We have these negotiations going on all the time—the ten commandments monument in the Alabama Judicial Building and another one in Oklahoma on its state capitol grounds as examples. Is this really too terribly different from the Icelandic version of respecting the Little People and their territory? Unfortunately we haven’t found a way yet to discuss such disputes without the rush to polarities and the superiority of our views—so until such time, it is left to the settlements in the courts.


Monday, September 23, 2013

Proof, souls, skepticism, and being wrong

I think one of the problems with scientists and non-scientists not understanding each other revolves around the meaning of the word "proof."

I ran into two interesting instances of this in the last couple of days.  One of them was a response to my post last week about the conspiracy theorist conference that's being held next Saturday in my home town, in which I wrote (amongst other comments) a rather snarky paragraph about people who believe in chemtrails, anti-vaxx propaganda, and so on.

Well, that sort of thing always upsets some readers.  "I hate these damn skeptics," wrote one commenter, "who think they have everything proven!  The world always has to be how they see it!"

First off, in my own defense, I've never claimed that I was infallible; only that the evidence very much supports the contention that (1) chemtrails don't exist, and (2) vaccinations are safe and effective. And just because I'm pretty certain to be right about these two things doesn't mean that I think I'm right about everything.

But the more interesting thing is the use of the word "proof."  Because in science, disproof is usually far easier than proof.  If you have a model of how you think the world works, you design a test of that model, and see if the results are consistent with what the model predicts.  If they are not -- assuming that nothing was wrong with the research protocol -- then your model is disproven (although scientists generally prefer the word "unsupported").

Of course, the problem is that in this context, you never really "prove" your model; you simply add to the support for it.  Nothing is ever proven, because additional experiments could show that your model hadn't predicted correctly in all cases, and needs revision.

But still the sense persists out there amongst your average layperson that scientists "prove their theories," and that all you need is some hand-waving argument and a few fancy-looking diagrams to accomplish this.

As an example of the latter, consider the site that is making the rounds of social media with the headline "Scientists prove the existence of the soul!"  Of course, when I clicked on the link, I was already primed to view the whole thing with a jaundiced eye, because it's not like I don't have my own biases on this particular topic.  But I'm happy that in this case, I wasn't off base in my skepticism, because this link turned out to be a wild woo-woo claim par excellence.

The whole thing is based upon the "research" of a Russian scientist who claims to have photographed the soul leaving the body as someone dies.  Here's a pair of his photographs:


And here is the accompanying explanation:
The timing of astral disembodiment in which the spirit leaves the body has been captured by Russian scientist Konstantin Korotkov, who photographed a person at the moment of his death with a bioelectrographic camera.

The image taken using the gas discharge visualization method, an advanced technique of Kirlian photography shows in blue the life force of the person leaving the body gradually.

According to Korotkov, navel and head are the parties who first lose their life force (which would be the soul) and the groin and the heart are the last areas where the spirit before surfing the phantasmagoria of the infinite.

In other cases according to Korotkov has noted that "the soul" of people who suffer a violent and unexpected death usually manifests a state of confusion in your power settings and return to the body in the days following death.  This could be due to a surplus of unused energy.
Well, first, those don't look to me like Kirlian photographs.  Kirlian photography is a way of capturing an image of the static electrical discharge from an object, and shows distinctive bright "flame" marks around the object being photographed.  Here, for example, is a Kirlian photograph of a leaf:


What Korotkov's photograph looks like to me is a false-color photograph taken with an infrared camera, which colorizes the regions of a human body (or anything) based upon its temperature.  So naturally the heart (positioned, as it is, in mid-torso) and the groin would tend to be warmer.  I don't think it has anything to do with your soul sticking around because it's especially attached to your heart and your naughty bits.

I also have to wonder how Korotkov was able to study people who experienced "violent and unexpected deaths."  It's not as if you can plan to have a scientist around for those, especially the unexpected ones.

But in the parlance of the infomercial -- "Wait!  There's more!"
The technique developed by Korotkov, who is director of the Research Institute of Physical Culture, St. Petersburg, is endorsed as a medical technology by the Ministry of Health of Russia and is used by more than 300 doctors in the world for stress and monitoring progress of patients treated for diseases such as cancer.  Korotkov says his energy imaging technique could be used to watch all kinds of imbalances biophysical and diagnose in real time and also to show if a person does have psychic powers or is a fraud.
This technique, which measures real-time and stimulated radiation is amplified by the electromagnetic field is a more advanced version of the technology developed for measuring Semyon Kirlian aura.

Korotkov observations confirm, as proposed by Kirlian, that "stimulated electro-photonic light around the tips of the fingers of the human being contains coherent and comprehensive statement of a person, both physically and psychologically."

In this video interview Korotkov speaks of the effect in the bioenergy field with food, water and even cosmetics. And emphasizes one umbrella drink water and organic food, particularly noting that the aura of the people in the Undies [sic] suffers the negative effects of nutrients as technologization distributed in this society.

Korotkov also speaks of their measurements in supposedly loaded with power and influence that people have in the bioenergy fields of others. Checking Rupert Sheldrake's experiment of the feeling of being watched : Because a person's bioenergy field changes when someone else directs his attention, even though it is backwards and not consciously perceived. Also a place fields are altered when there is a concentration of tourists.
Well then.  We have "electro-photonic light" (is there another kind?), "bioenergy fields" (sorry, Sheldrake, but there's no evidence they exist), a reference to "real" versus "fraud" psychic powers, and a contention that tourists affect a person's soul.  Not to mention the thing about "undies," which I sincerely hope was a typo or mistranslation, because I would hate to think that my boxers are somehow creating negative effects in my spiritual nutrients.

And this is what people read, and say that it "proves the existence of a soul?"

Of course, what we have going on here is confirmation bias -- when you already believed something, so a tiny piece of sketchy evidence is all you need to shore up that belief.  I think I can state without fear of contradiction that no one who didn't already believe that souls exist would be convinced by this article.

So that's the problem, isn't it?  And not just in this admittedly ridiculous claim that equates dead bodies cooling off with their souls escaping.  Think of people who listen, uncritically, to "news" about their favorite controversial story -- evolution vs. creationism, the safety of vaccinations, the role of human activities in climate change, whether the public school system is headed for disaster.  If you uncritically accept what you're hearing as proof, just because it supports the contentions you already had, you'll never find out where you've got things wrong.  And that, to me, is the heart of science -- and the only way to lift yourself above your biases.

If you have fifteen minutes, and want to listen to someone who demonstrates this point brilliantly, take a look at the TED talk by Kathryn Schulz called "On Being Wrong."


I can honestly say that watching this short video was to me an eye-opener to the point of being life-changing.  She asks us to shift our viewpoint from trying to "prove" what we already believed to be true, to thinking seriously about the possibility of our being wrong -- and frames it in a way I had honestly never considered.  I think that the first time I watched it, I spent the last half of it listening with my mouth hanging open in sheer astonishment.

You have to wonder how much pain and suffering could be averted in the world if more people would entertain the possibility of their being wrong.  Right now, there are hostages being held in a mall in Kenya (and 68 known dead in the incident) because of men who are so convinced that their worldview is right that they are willing to slaughter innocent people in its name.

Maybe we have been, as a species, looking at things the wrong way round.  Maybe we shouldn't constantly be looking for proof for what we already believed.  Science (at its best) approaches the world tentatively, testing, probing, and wondering -- and constantly asking the question, "what if this model is wrong?"  I know we can't all be scientists, and that not all problems are scientific in nature, but the general approach -- always keeping in mind our own fallibility -- has a lot to recommend it.