Skeptophilia (skep-to-fil-i-a) (n.) - the love of logical thought, skepticism, and thinking critically. Being an exploration of the applications of skeptical thinking to the world at large, with periodic excursions into linguistics, music, politics, cryptozoology, and why people keep seeing the face of Jesus on grilled cheese sandwiches.

Monday, November 6, 2023

Lateral thinking

One of the biggest impediments to clear thinking is the fact that it's so hard for us to keep in mind that we could be wrong.

As journalist Kathryn Schulz put it:

I asked you how it felt to be wrong, and you had answers like humiliating, frustrating, embarrassing, devastating.  And those are great answers.  But they're answers to a different question.  Those are answers to the question, "How does it feel to find out you're wrong?"  But being wrong?  Being wrong doesn't feel like anything...  You remember those characters on Saturday morning cartoons, the Coyote and the Roadrunner?  The Coyote was always doing things like running off a cliff, and when he'd do that, he'd run along for a while, not seeing that he was already over the edge.  It was only when he noticed it that he'd start to fall.  That's what being wrong is like before you've realized it.  You're already wrong, you're already in trouble...  So I should amend what I said earlier.  Being wrong does feel like something.

It feels like being right.

We cling desperately to the sense that we have it all figured out, that we're right about everything.  Oh, in theoretical terms we realize we're fallible; all of us can remember times we've been wrong.  But right here, right now?  It's like my college friend's quip, "I used to be conceited, but now I'm perfect."

The trouble with all this is that it blinds us to the errors that we do make, because if you don't keep at least trying to question your own answers, you won't see your own blunders.  It's why lateral thinking puzzles are so difficult, but so important; they force you to set aside the usual conventions of how puzzles are solved, and to question your own methods and intuitions at every step.  This was the subject of a study by Andrew Meyer (of the Chinese University of Hong Kong) and Shane Frederick (of Yale University) that appeared in the journal Cognition last week.  They looked at a standard lateral thinking puzzle, and tried to figure out how to get people to avoid falling into thinking their (usually incorrect) first intuition was right.

The puzzle was a simple computation problem:

A bat and a ball together cost $1.10.  The bat costs $1.00 more than the ball.  How much does the ball cost?

The most common error is simply to subtract the two, and to come up with ten cents as the cost of the ball.  But a quick check of the answer should show this can't be right.  If the bat costs a dollar and the ball costs ten cents, then the bat costs ninety cents more than the ball, not a dollar more (as the problem states).  The correct answer is that the ball costs $0.05 and the bat costs $1.05 -- the sum is $1.10, and the difference is an even dollar.
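
If you want to convince yourself that five cents is the only price that works, here's a quick brute-force check -- a minimal sketch in Python written purely for illustration (it's not from Meyer and Frederick's study) that tries every possible ball price in cents and keeps the one consistent with both conditions:

    # Work in cents to avoid floating-point rounding surprises.
    for ball in range(0, 111):       # every candidate ball price from $0.00 to $1.10
        bat = ball + 100             # the bat costs exactly $1.00 more than the ball
        if ball + bat == 110:        # together they must cost $1.10
            print(f"ball = ${ball/100:.2f}, bat = ${bat/100:.2f}")

The only line it prints is "ball = $0.05, bat = $1.05" -- which is exactly the check the intuitive "ten cents" answer fails.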

Meyer and Frederick tried different strategies for improving people's success.  Bolding the words "more than the ball" in the problem, to call attention to the salient point, had almost no effect at all.  Then they tried three different levels of warnings:

  1. Be careful!  Many people miss this problem.
  2. Be careful!  Many people miss the following problem because they do not take the time to check their answer.
  3. Be careful!  Many people miss the following problem because they read it too quickly and actually answer a different question than the one that was asked.

All of these improved success, but not by as much as you might think.  The number of people who got the correct answer went up by only about ten percent, no matter which warning was used.

Then the researchers decided to be about as blatant as you can get, and put in a bolded statement, "HINT: The answer is NOT ten cents!"  This had the best improvement rate of all, but amazingly, still didn't eliminate all of the wrong answers.  Some people were so certain their intuition was right that they stuck to their guns -- apparently assuming that the researchers were deliberately trying to mislead them!

[Image licensed under the Creative Commons © Nevit Dilmen, Question mark 1, CC BY-SA 3.0]

If you find this tendency a little unsettling... well, you should.  It's one thing to stick to a demonstrably wrong answer in some silly hypothetical bat-and-ball problem; it's another thing entirely to cling to incorrect intuition or erroneous understanding when it affects how you live, how you act, how you vote.

It's why learning how to suspend judgment is so critical.  Being able to hold a question in your mind and not immediately jump to what seems like the "obvious answer" is one of the most important skills there is.  I used to assign lateral thinking puzzles to my Critical Thinking students every so often -- I told them, "Think of these as mental calisthenics.  They're a way to exercise your problem-solving ability and look at problems from angles you might not think of right away.  Don't rush to find an answer; keep considering them until you're sure you're on the right track."

So I thought I'd throw a few of the more entertaining puzzles at you.  None of them involve much in the way of math (nothing past adding, subtracting, multiplying, and dividing), but all of them take an insight that requires pushing aside your first impression of how problems are solved.  Enjoy!  (I'll include the answers at the end of tomorrow's post, if any of them stump you.)

1.  The census taker problem

A census taker goes to a man's house, and asks for the ages of the man's three daughters.

"The product of their ages is 36," the man says.

The census taker replies, "That's not enough information to figure it out."

The man says, "Okay, well, the sum of their ages is equal to the house number across the street."

The census taker looks out of the window at the house across the street, and says, "I'm sorry, that's still not enough information to figure it out."

The man says, "Okay... my oldest daughter has red hair."

The census taker says, "Thank you," and writes down the ages.

How old are the three daughters?

2. The St. Ives riddle

The St. Ives riddle is a famous puzzle that goes back to (at least) the seventeenth century:

As I was going to St. Ives,
I met a man with seven wives.
Each wife had seven kids,
Each kid had seven cats,
Each cat had seven kits.
Kits, cats, kids, and wives, how many were going to St. Ives?

3.  The bear

A man goes for a walk.  He walks a mile south, a mile east, and a mile north, and after that is back where he started.  At that point, he sees a large bear rambling around.  What color is the bear?

4.  A curious sequence

What is the next number in this sequence: 8, 5, 4, 9, 1, 7, 6...

5.  Classifying the letters

You can classify the letters in the English alphabet as follows:

Group 1: A, M, T, U, V, W, Y

Group 2: B, C, D, E, K

Group 3: H, I, O, X

Group 4: N, S, Z

Group 5: F, G, J, L, P, Q, R

What's the reason for grouping them this way?

6.  The light bulb puzzle

At the top of a ten-story building are three ordinary incandescent light bulbs screwed into electrical sockets.  On the first floor are three switches, one for each bulb, but you don't know which switch turns on which bulb, and you can't see the bulbs (or their light) from the place where the switches are located.  How can you determine which switch operates which bulb... and only take a single trip from the first floor up to the tenth?

Have fun!

****************************************



Saturday, June 11, 2022

Locked into error

Back in 2011, author Kathryn Schulz did a phenomenal TED Talk called "On Being Wrong."  She looks at how easy it is to slip into error, and how hard it is not only to correct it, but (often) even to recognize that it's happened.  At the end, she urges us to try to find our way out of the "tiny, terrified space of rightness" that virtually all of us live in.

Unfortunately, that's one thing that she herself gets wrong.  Because for a lot of people, their belief in their rightness about everything isn't terrified; it's proudly, often belligerently, defiant.

I'm thinking of one person in particular, here, who regularly posts stuff on social media that is objectively wrong -- I mean, hard evidence, no question about it -- and does so in a combative way that comes across as, "I dare you to contradict me."  I've thus far refrained from saying anything.  One of my faults is that I'm a conflict avoider, but I also try to be cognizant of the cost/benefit ratio.  Maybe I'm misjudging, but I think the likelihood of my eliciting a "Holy smoke, I was wrong" -- about anything -- is as close to zero as you could get.

Now, allow me to say up front that I'm not trying to imply here that I'm right about everything, nor that I don't come across as cocky or snarky at times.  Kathryn Schulz's contention (and I think she's spot-on about this one) is that we all fall into the much-too-comfortable trap of believing that our view of the world perfectly reflects reality.  One of the most startling bullseyes Schulz scores in her talk is about how it feels to be wrong:

So why do we get stuck in this feeling of being right?  One reason, actually, has to do with the feeling of being wrong.  So let me ask you guys something...  How does it feel -- emotionally -- how does it feel to be wrong?  Dreadful.  Thumbs down.  Embarrassing...  Thank you, these are great answers, but they're answers to a different question.  You guys are answering the question: How does it feel to realize you're wrong?  Realizing you're wrong can feel like all of that and a lot of other things, right?  I mean, it can be devastating, it can be revelatory, it can actually be quite funny...  But just being wrong doesn't feel like anything.

I'll give you an analogy.  Do you remember that Looney Tunes cartoon where there's this pathetic coyote who's always chasing and never catching a roadrunner?  In pretty much every episode of this cartoon, there's a moment where the coyote is chasing the roadrunner and the roadrunner runs off a cliff, which is fine -- he's a bird, he can fly.  But the thing is, the coyote runs off the cliff right after him.  And what's funny -- at least if you're six years old -- is that the coyote's totally fine too.  He just keeps running -- right up until the moment that he looks down and realizes that he's in mid-air.  That's when he falls.  When we're wrong about something -- not when we realize it, but before that -- we're like that coyote after he's gone off the cliff and before he looks down.  You know, we're already wrong, we're already in trouble, but we feel like we're on solid ground.  So I should actually correct something I said a moment ago.  It does feel like something to be wrong; it feels like being right.

What brought this talk to mind -- and you should take fifteen minutes and watch the whole thing, because it's just that good -- is some research out of the University of California - Los Angeles published a couple of weeks ago in Psychological Review that looked at the neuroscience of these quick -- and once made, almost impossible to undo -- judgments about the world.


The study used a technique called electrocorticography to see what was going on in a part of the brain called the gestalt cortex, which is known to be involved in sensory interpretation.  In particular, the team analyzed the activity of the gestalt cortex when test subjects were presented with the views of other people -- some they agreed with, some they disagreed with, and others about which they had yet to form an opinion.

The most interesting result had to do with the strength of the response.  The reaction of the gestalt cortex is most pronounced when we're confronted with views opposing our own, and with statements about which we've not yet decided.  In the former case, the response is to suppress the evaluative parts of the brain -- i.e., to dismiss immediately what we've read because it disagrees with what we already thought.  In the latter case, it amplifies evaluation, allowing us to make a quick judgment about what's going on, but once that's happened any subsequent evidence to the contrary elicits an immediate dismissal.  Once we've made our minds up -- and it happens fast -- we're pretty much locked in.

"We tend to have irrational confidence in our own experiences of the world, and to see others as misinformed, lazy, unreasonable or biased when they fail to see the world the way we do," said study lead author Matthew Lieberman, in an interview with Science Daily.  "We believe we have merely witnessed things as they are, which makes it more difficult to appreciate, or even consider, other perspectives.  The mind accentuates its best answer and discards the rival solutions.  The mind may initially process the world like a democracy where every alternative interpretation gets a vote, but it quickly ends up like an authoritarian regime where one interpretation rules with an iron fist and dissent is crushed.  In selecting one interpretation, the gestalt cortex literally inhibits others."

Evolutionarily, you can see how this makes perfect sense.  For a proto-hominid out on the African savanna, it was pretty critical to look at and listen to what was around you and make a quick judgment about its safety.  Stopping to ponder could be a good way to become a lion's breakfast.  The cost of making a wrong snap judgment and overestimating the danger is far lower than that of blithely going on your way and assuming everything is fine.  But now?  This hardwired tendency to squelch opposing ideas without consideration means we're unlikely to correct -- or even recognize -- that we've made a mistake.

I'm not sure what's to be done about this.  If anything can be done.  Perhaps it's enough to remind people -- including myself -- that our worldviews aren't flawless mirrors of reality; they're the result of our quick evaluation of what we see and hear.  And, most importantly, that we never lose by reconsidering our opinions and beliefs, weighing them against the evidence, and always keeping in mind the possibility that we might be wrong.  I'll end with another quote from Kathryn Schulz:

This attachment to our own rightness keeps us from preventing mistakes when we absolutely need to, and causes us to treat each other terribly.  But to me, what's most baffling and most tragic about this is that it misses the whole point of being human.  It's like we want to imagine that our minds are these perfectly translucent windows, and we just gaze out of them and describe the world as it unfolds.  And we want everybody else to gaze out of the same window and see the exact same thing.  That is not true, and if it were, life would be incredibly boring.  The miracle of your mind isn't that you can see the world as it is, it's that you can see the world as it isn't.  We can remember the past, and we can think about the future, and we can imagine what it's like to be some other person in some other place.  And we all do this a little differently...  And yeah, it is also why we get things wrong.

Twelve hundred years before Descartes said his famous thing about "I think therefore I am," this guy, St. Augustine, sat down and wrote "Fallor ergo sum" -- "I err, therefore I am."  Augustine understood that our capacity to screw up, it's not some kind of embarrassing defect in the human system, something we can eradicate or overcome.  It's totally fundamental to who we are.  Because, unlike God, we don't really know what's going on out there.  And unlike all of the other animals, we are obsessed with trying to figure it out.  To me, this obsession is the source and root of all of our productivity and creativity.

**************************************

Saturday, May 15, 2021

Thin ice

In her phenomenal TED talk "On Being Wrong," journalist Kathryn Schulz says, "[W]e all kind of wind up traveling through life, trapped in this little bubble of feeling very right about everything...  [and] I want to convince you that it is possible to step outside of that feeling and that if you can do so, it is the single greatest moral, intellectual and creative leap you can make."

I've often thought that the willingness to entertain the possibility that your knowledge is incomplete -- that you may not have all the answers, and (more critically) that some of the answers you've arrived at might be false -- is the cornerstone of developing a real understanding of how things actually are.  Put a different way, certainty can be a blindfold.

[Image licensed under the Creative Commons Dale Schoonover, Kim Schoonover, Blindfold hat, CC BY 3.0]

I'm not saying I like finding out I'm wrong about something.  As Schulz points out, finding out you've made a mistake can be revelatory, enlightening, or hilarious -- but it can also be humiliating, frustrating, or devastating.  I'm reminded of one of the funniest scenes from The Big Bang Theory -- the one where Sheldon meets Stephen Hawking.


While most of us have never had the experience of embarrassing the hell out of ourselves in front of one of the smartest people in the world, I think we can all relate.  And part of what makes it funny -- and relatable -- is that, until it's pointed out, Sheldon can't fathom that he's actually made a mistake.  Maybe there are few people as colossally arrogant as he is, but the truth is we're more like him than we want to admit.  We cling to the things we believe and what we think we understand with a fervor that would do the Spanish Inquisition proud.


The reason all this comes up is a paper published this week in Proceedings of the National Academy of Sciences by Jeroen van Baar and Oriel Feldman-Hall (of Brown University) and David Halpern (of the University of Pennsylvania), called "Intolerance of Uncertainty Modulates Brain-to-Brain Synchrony During Politically Polarized Perception."  In this study, the researchers gave a group of test subjects videos to watch -- strongly liberal, strongly conservative, and politically neutral -- and looked at the brain's response to the content.  Unsurprisingly, some test subjects had strongly aversive reactions to the videos.  But the strongest predictor of the strength of that response wasn't whether the watcher was conservative or liberal (putting to rest the idea that one side is intolerant and the other isn't), nor was it the perceived distance between the content of the video and the viewer's own beliefs; it was how intolerant the person was of uncertainty.

In other words, how angry you get over hearing political commentary you don't agree with depends largely on how unwilling you are to admit that your own understanding might be flawed.

It's kind of a devastating result, isn't it?  The polarization we're currently experiencing here in the United States (and undoubtedly elsewhere) is being driven by the fact that a great many people on both sides are absolutely and completely convinced they're right.  About everything.  Again to quote Kathryn Schulz, "This attachment to our own rightness keeps us from preventing mistakes when we absolutely need to and causes us to treat each other terribly.  But to me, what's most baffling and most tragic about this is that it misses the whole point of being human.  It's like we want to imagine that our minds are just these perfectly translucent windows and we just gaze out of them and describe the world as it unfolds.  And we want everybody else to gaze out of the same window and see the exact same thing."

A lot of it, I think, boils down to fear.  To admit that we might be wrong -- fundamentally, deeply wrong, perhaps about something we've believed our entire lives -- is profoundly destabilizing.  What we thought was solid and everlasting turns out to be thin ice, but instead of taking steps to rectify our misjudgment and skate to safety, we just close our eyes and keep going.  There's a part of us that can't quite believe we might not have everything figured out.

Like I said, it's not that I enjoy being wrong myself; I find it just as mortifying as everyone else does.  So part of me hopes that I do have the big things figured out, that my most dearly-held assumptions about how the world works won't turn out to be completely in error.  But it behooves us all to keep in the back of our minds that human minds are fallible -- not just in the theoretical, "yeah, people make mistakes" sense, but in the sense that some of the things we're surest about may be incorrect.

Let's all work to become a little humbler, a little more uncomfortable with uncertainty -- as Schulz puts it, to be able to "step outside of that tiny, terrified space of rightness and look around at each other and look out at the vastness and complexity and mystery of the universe and be able to say, 'Wow, I don't know.  Maybe I'm wrong.'"

********************************

I have often been amazed and appalled at how the same evidence, the same occurrences, or the same situation can lead two equally-intelligent people to entirely different conclusions.  How often have you heard about people committing similar crimes and getting wildly different sentences, or identical symptoms in two different patients resulting in completely different diagnoses or treatments?

In Noise: A Flaw in Human Judgment, authors Daniel Kahneman (whose wonderful book Thinking, Fast and Slow was a previous Skeptophilia book-of-the-week), Olivier Sibony, and Cass Sunstein analyze the cause of this "noise" in human decision-making, and -- more importantly -- discuss how we can avoid its pitfalls.  Anything we can do to detect and expunge biases is a step in the right direction; even if the majority of us aren't judges or doctors, most of us are voters, and our decisions can make an enormous difference.  Those choices are critical, and it's incumbent upon us all to make them in the most clear-headed, evidence-based fashion we can manage.

Kahneman, Sibony, and Sunstein have written a book that should be required reading for anyone entering a voting booth -- and should also be a part of every high school curriculum in the world.  Read it.  It'll open your eyes to the obstacles we have to logical clarity, and show you the path to avoiding them.

[Note: if you purchase this book using the image/link below, part of the proceeds goes to support Skeptophilia!]