Skeptophilia (skep-to-fil-i-a) (n.) - the love of logical thought, skepticism, and thinking critically. Being an exploration of the applications of skeptical thinking to the world at large, with periodic excursions into linguistics, music, politics, cryptozoology, and why people keep seeing the face of Jesus on grilled cheese sandwiches.

Saturday, May 15, 2021

Thin ice

In her phenomenal TED talk "On Being Wrong," journalist Kathryn Schulz says, "[W]e all kind of wind up traveling through life, trapped in this little bubble of feeling very right about everything...  [and] I want to convince you that it is possible to step outside of that feeling and that if you can do so, it is the single greatest moral, intellectual and creative leap you can make."

I've often thought that the willingness to entertain the possibility that your knowledge is incomplete -- that you may not have all the answers, and (more critically) that some of the answers you've arrived at might be false -- is the cornerstone of developing a real understanding of how things actually are.  Put another way, certainty can be a blindfold.

[Image licensed under the Creative Commons Dale Schoonover, Kim Schoonover, Blindfold hat, CC BY 3.0]

I'm not saying I like finding out I'm wrong about something.  As Schulz points out, finding out you've made a mistake can be revelatory, enlightening, or hilarious -- but it can also be humiliating, frustrating, or devastating.  I'm reminded of one of the funniest scenes from The Big Bang Theory -- when Sheldon meets Stephen Hawking:

[Video: Sheldon meets Stephen Hawking in The Big Bang Theory]
While most of us have never had the experience of embarrassing the hell out of ourselves in front of one of the smartest people in the world, I think we can all relate.  And part of what makes it funny -- and relatable -- is that, until it's pointed out, Sheldon can't fathom that he actually made a mistake.  Maybe there are few people as colossally arrogant as he is, but the truth is we are more like him than we want to admit.  We cling to the things we believe and what we think we understand with a fervor that would do the Spanish Inquisition proud.

The reason all this comes up is a paper published this week in Proceedings of the National Academy of Sciences by Jeroen van Baar and Oriel FeldmanHall (of Brown University) and David Halpern (of the University of Pennsylvania) called "Intolerance of Uncertainty Modulates Brain-to-Brain Synchrony During Politically Polarized Perception."  In this study, the researchers had a group of test subjects watch videos -- strongly liberal, strongly conservative, and politically neutral -- and looked at the brain's response to the content.  What they found was that (unsurprisingly) some test subjects had strongly aversive reactions to the videos, but what correlated most strongly with the intensity of the response wasn't whether the watcher was him/herself conservative or liberal (putting to rest the idea that one side is intolerant and the other isn't), nor was it the perceived distance between the content of the video and the test subject's own beliefs; it was how intolerant the person was of uncertainty.

In other words, how angry you get over hearing political commentary you don't agree with depends largely on how unwilling you are to admit that your own understanding might be flawed.

It's kind of a devastating result, isn't it?  The polarization we're currently experiencing here in the United States (and undoubtedly elsewhere) is being driven by the fact that a great many people on both sides are absolutely and completely convinced they're right.  About everything.  Again to quote Kathryn Schulz, "This attachment to our own rightness keeps us from preventing mistakes when we absolutely need to and causes us to treat each other terribly.  But to me, what's most baffling and most tragic about this is that it misses the whole point of being human.  It's like we want to imagine that our minds are just these perfectly translucent windows and we just gaze out of them and describe the world as it unfolds.  And we want everybody else to gaze out of the same window and see the exact same thing."

A lot of it, I think, boils down to fear.  To admit that we might be wrong -- fundamentally, deeply wrong, perhaps about something we've believed our entire lives -- is profoundly destabilizing.  What we thought was solid and everlasting turns out to be thin ice, but instead of taking steps to rectify our misjudgment and skate to safety, we just close our eyes and keep going.  There's a part of us that can't quite believe we might not have everything figured out.

Like I said, it's not that I enjoy being wrong myself; I find it just as mortifying as everyone else does.  So part of me hopes that I do have the big things figured out, that my most dearly-held assumptions about how the world works won't turn out to be completely in error.  But it behooves us all to keep in the back of our minds that human cognition is fallible -- not just in the theoretical, "yeah, people make mistakes" sense, but in the sense that some of the things we're surest about may be incorrect.

Let's all work to become a little humbler, a little more uncomfortable with uncertainty -- as Schulz puts it, to be able to "step outside of that tiny, terrified space of rightness and look around at each other and look out at the vastness and complexity and mystery of the universe and be able to say, 'Wow, I don't know.  Maybe I'm wrong.'"

********************************

I have often been amazed and appalled at how the same evidence, the same occurrences, or the same situation can lead two equally intelligent people to entirely different conclusions.  How often have you heard about people committing similar crimes and getting wildly different sentences, or identical symptoms in two different patients resulting in completely different diagnoses or treatments?

In Noise: A Flaw in Human Judgment, authors Daniel Kahneman (whose wonderful book Thinking, Fast and Slow was a previous Skeptophilia book-of-the-week), Olivier Sibony, and Cass Sunstein analyze the causes of this "noise" in human decision-making, and -- more importantly -- discuss how we can avoid its pitfalls.  Anything we can do to detect and expunge biases is a step in the right direction; even if the majority of us aren't judges or doctors, most of us are voters, and our decisions can make an enormous difference.  Those choices are critical, and it's incumbent upon us all to make them in the most clear-headed, evidence-based fashion we can manage.

Kahneman, Sibony, and Sunstein have written a book that should be required reading for anyone entering a voting booth -- and should also be a part of every high school curriculum in the world.  Read it.  It'll open your eyes to the obstacles we have to logical clarity, and show you the path to avoiding them.

[Note: if you purchase this book using the image/link below, part of the proceeds goes to support Skeptophilia!]