Skeptophilia (skep-to-fil-i-a) (n.) - the love of logical thought, skepticism, and thinking critically. Being an exploration of the applications of skeptical thinking to the world at large, with periodic excursions into linguistics, music, politics, cryptozoology, and why people keep seeing the face of Jesus on grilled cheese sandwiches.


Monday, June 13, 2022

The google trap

The eminent physicist Stephen Hawking is often credited with saying, "The greatest enemy of knowledge is not ignorance; it is the illusion of knowledge."

Somewhat more prosaically, my dad once said, "Ignorance can be cured.  We're all ignorant about some things.  Stupid, on the other hand, goes all the way to the bone."

Both of these sayings capture an unsettling idea: that often it's more dangerous to think you understand something than it is to admit you don't.  This idea was illustrated -- albeit with an innocuous example -- in a 2002 paper by Leonid Rozenblit and Frank Keil of Yale University on what they dubbed "the illusion of explanatory depth."  What they did was ask people to rate their level of understanding of a simple, everyday object (for example, how a zipper works) on a scale of zero to ten.  Then they asked each participant to write down an explanation of how zippers work in as much detail as they could.  Afterward, they asked the volunteers to re-rate their level of understanding.

Across the board, people rated themselves lower the second time, after a single question -- "Okay, then explain it to me" -- shone a spotlight on how little they actually knew.

The problem is, unless you're in school, usually no one asks the question.  You can claim you understand something, you can even have a firmly-held opinion about it, and there's no guarantee that your stance is even within hailing distance of reality.

And very rarely does anyone challenge you to explain yourself in detail.

[Image is in the Public Domain]

If that's not bad enough, a recent paper by Adrian Ward (of the University of Texas at Austin) showed that not only do we understand way less than we think we do, we also fold what we learn from other sources into our own experiential knowledge.  Worse still, that incorporation is so rapid and smooth that afterward, we aren't even aware of where our information (right or wrong) came from.

Ward writes:

People frequently search the internet for information.  Eight experiments provide evidence that when people “Google” for online information, they fail to accurately distinguish between knowledge stored internally—in their own memories—and knowledge stored externally—on the internet.  Relative to those using only their own knowledge, people who use Google to answer general knowledge questions are not only more confident in their ability to access external information; they are also more confident in their own ability to think and remember.  Moreover, those who use Google predict that they will know more in the future without the help of the internet, an erroneous belief that both indicates misattribution of prior knowledge and highlights a practically important consequence of this misattribution: overconfidence when the internet is no longer available.  Although humans have long relied on external knowledge, the misattribution of online knowledge to the self may be facilitated by the swift and seamless interface between internal thought and external information that characterizes online search.  Online search is often faster than internal memory search, preventing people from fully recognizing the limitations of their own knowledge.  The internet delivers information seamlessly, dovetailing with internal cognitive processes and offering minimal physical cues that might draw attention to its contributions.  As a result, people may lose sight of where their own knowledge ends and where the internet’s knowledge begins.  Thinking with Google may cause people to mistake the internet’s knowledge for their own.

I recall vividly trying, with minimal success, to fight this in the classroom.  Presented with a question, many students don't stop to try to work it out themselves; they immediately jump to looking it up on their phones.  (One of many reasons I had a rule against having phones out during class -- another exercise in frustration, given how clever teenagers are at hiding what they're doing.)  I tried to make the point over and over that there's a huge difference between looking up a fact (such as the average number of cells in the human body) and looking up an explanation (such as how RNA works).  I use Google and/or Wikipedia for the former all the time.  The latter, on the other hand, makes it all too easy simply to copy down what you find online, allowing you to fill in the blank with an answer irrespective of whether you have the least idea what any of it means.

Even Albert Einstein, pre-internet though he was, saw the difference, and the potential problem therein.  Once asked how many feet were in a mile, the great physicist replied, "I don't know.  Why should I fill my brain with facts I can find in two minutes in any standard reference book?"

In the decades since Einstein said this, that two minutes has shrunk to about ten seconds, as long as you have internet access.  And unlike with the standard reference books he mentioned, you have little assurance that the information you find online is even close to right.

Don't get me wrong; I think that our rapid, and virtually unlimited, access to human knowledge is a good thing.  But like most good things, it comes at a cost, and that cost is that we have to be doubly cautious to keep our brains engaged.  Not only is there information out there that is simply wrong, there are people who are (for various reasons) very eager to convince you they're telling the truth when they're not.  This has always been true, of course; it's just that now, there are few barriers to having that erroneous information bombard us all day long -- and Ward's paper shows just how quickly we can fall for it.

The cure is to keep our rational faculties online.  Find out if the information is coming from somewhere reputable and reliable.  Compare what you're being told with what you know to be true from your own experience.  Listen to or read multiple sources of information -- not only the ones you're inclined to agree with automatically.  It might be reassuring to live in the echo chamber of people and media which always concur with our own preconceived notions, but it also means that if something is wrong, you probably won't realize it.

Like I said in Saturday's post, finding out you're wrong is no fun.  More than once I've posted stuff here at Skeptophilia and gotten pulled up by the short hairs when someone who knows better tells me I've gotten it dead wrong.  Embarrassing as it is, I've always posted retractions, and often taken the original post down.  (There's enough bullshit out on the internet without my adding to it.)

So we all need to be on our guard whenever we're surfing the web or listening to the news or reading a magazine.  Our tendency to absorb information without question, regardless of its provenance -- especially when it seems to confirm what we want to believe -- is a trap we can all fall into, and Ward's paper shows that once inside, it can be remarkably difficult to extricate ourselves.

**************************************

Tuesday, December 10, 2019

Misremembering the truth

There are two distinct, but similar-sounding, cognitive biases that I've written about many times here at Skeptophilia because they are such tenacious barriers to rational thinking.

The first, confirmation bias, is our tendency to uncritically accept claims when they fit with our preconceived notions.  It's why a lot of conservative viewers of Fox News and liberal viewers of MSNBC sit there watching and nodding enthusiastically without ever stopping and saying, "... wait a moment."

The other, dart-thrower's bias, is more built-in.  It's our tendency to notice outliers (because of their obvious evolutionary significance as danger signals) and ignore, or at least underestimate, the ordinary as background noise.  The name comes from the thought experiment of being in a bar while there's a darts game going on across the room.  You'll tend to notice the game only when there's an unusual throw -- a bullseye, or perhaps impaling the bartender in the forehead -- and not even be aware of it otherwise.

Well, we thought dart-thrower's bias was more built into our cognitive processing system and confirmation bias more "on the surface" -- and the latter therefore more culpable, conscious, and/or controllable.  Now, it appears that confirmation bias might be just as hard-wired into our brains as dart-thrower's bias is.

A paper appeared this week in Human Communication Research, describing research conducted by a team led by Jason Coronel of Ohio State University.  In "Investigating the Generation and Spread of Numerical Misinformation: A Combined Eye Movement Monitoring and Social Transmission Approach," Coronel, along with Shannon Poulsen and Matthew D. Sweitzer, did a fascinating series of experiments that showed we not only tend to accept information that agrees with our previous beliefs without question, we honestly misremember information that disagrees -- and we misremember it in such a way that in our memories, it further confirms our beliefs!

The location of memories (from Memory and Intellectual Improvement Applied to Self-Education and Juvenile Instruction, by Orson Squire Fowler, 1850) [Image is in the Public Domain]

What Coronel and his team did was to present 110 volunteers with passages containing true numerical information on social issues (such as support for same-sex marriage and rates of illegal immigration).  In some cases, the passages agreed with what (according to polls) most people believe to be true, such as that the majority of Americans support same-sex marriage.  In other cases, the passages contained information that (while true) is widely thought to be untrue -- such as the fact that illegal immigration across the Mexican border has been dropping for years and is now at its lowest rates since the mid-1990s.

Across the board, people tended to recall the information that aligned with the conventional wisdom correctly, and the information that didn't incorrectly.  Further -- and what makes this experiment even more fascinating -- when people read the unexpected information, data that contradicted the general opinion, eye-tracking monitors recorded that they hesitated while reading, as if they recognized that something was strange.  In the immigration passage, for example, they read that the figure had dropped from 12.8 million in 2007 to 11.7 million in 2014, and the readers' eyes bounced back and forth between the two numbers as if their brains were saying, "Wait, am I reading that right?"

So they spent longer on the passage that conflicted with what most people think -- and still tended to remember it incorrectly.  In fact, most of the people who misremembered recalled the two values themselves correctly -- 12.8 million and 11.7 million -- showing that they'd paid attention and hadn't simply scoffed and glossed over something they thought was wrong.  But when questioned afterward, they attached the numbers to the wrong years, reversing the trend, as if the passage had actually supported what they'd believed prior to the experiment!

If that's not bad enough, Coronel's team then ran a second experiment, in which the test subjects read the passage, then had to repeat the gist to another person, who then passed it to another, and so on.  (Remember the elementary school game of "Telephone"?)  Not only did the data get flipped -- usually in the first transfer -- but as the story moved down the chain, the difference between the two numbers got greater and greater (thus bolstering the false, but popular, opinion even more strongly).  In the case of the immigration statistics, the gap between 2007 and 2014 not only changed direction, but by the end of the game it had widened from 1.1 million to 4.7 million.
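
Just to get a feel for how fast a chain like that can run away from the truth, here's a toy simulation -- my own illustrative sketch in Python, not the authors' model, with made-up "pull" and "noise" values -- in which each reteller drags the difference between the two figures toward the trend they already believe (that the 2014 number ought to be higher), plus a little random error.  A small, consistent, belief-driven distortion is enough to flip the direction within a few retellings and then keep widening the gap.

import random

def retell(value_2007, value_2014, pull=0.5, noise=0.15):
    # One link in the chain: the reteller remembers the difference between the
    # two figures, but drags it toward the trend they already believe (2014
    # higher than 2007) and adds a little random error.  The pull and noise
    # values are arbitrary and purely illustrative.
    gap = value_2014 - value_2007                    # negative in the true passage
    remembered_gap = gap + pull + random.gauss(0, noise)
    midpoint = (value_2007 + value_2014) / 2         # keep the overall scale fixed
    return midpoint - remembered_gap / 2, midpoint + remembered_gap / 2

random.seed(1)
v2007, v2014 = 12.8, 11.7                            # the true figures, in millions
print(f"original: 2007 = {v2007:.1f}M, 2014 = {v2014:.1f}M")
for person in range(1, 9):
    v2007, v2014 = retell(v2007, v2014)
    print(f"person {person}: 2007 = {v2007:.1f}M, 2014 = {v2014:.1f}M")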

This gives you an idea what we're up against in trying to counter disinformation campaigns.  And it also illustrates that I was wrong in one of my preconceived notions; that people falling for confirmation bias are somehow guilty of locking themselves deliberately into an echo chamber.  Apparently, both dart-thrower's bias and confirmation bias are somehow built into the way we process information.  We become so certain we're right that our brain subconsciously rejects any evidence to the contrary.

Why our brains are built this way is a matter of conjecture.  I wonder if perhaps it might be our tribal heritage at work; that conforming to the norm, and therefore remaining a member of the tribe, has a greater survival value than being the maverick who sticks to his/her guns about a true but unpopular belief.  That's pure speculation, of course.  But what it illustrates is that once again, our very brains are working against us in fighting Fake News -- which these days is positively frightening, given how many powerful individuals and groups are, in a cold and calculated fashion, disseminating false information in an attempt to mislead us, frighten us, or anger us, and so maintain their positions of power.

***********************

This week's Skeptophilia book of the week is brand new: Brian Clegg's wonderful Dark Matter and Dark Energy: The Hidden 95% of the Universe.  In this book, Clegg outlines "the biggest puzzle science has ever faced" -- the evidence for the substances that provide the majority of the gravitational force holding the nearby universe together while simultaneously making the universe as a whole fly apart -- and which have (thus far) completely resisted all attempts to ascertain their nature.

Clegg also gives us some of the cutting-edge explanations physicists are now proposing, and the experiments that are being done to test them.  The science is sure to change quickly -- every week we seem to hear about new data providing information on the dark 95% of what's around us -- but if you want the most recently-crafted lens on the subject, this is it.

[Note: if you purchase this book from the image/link below, part of the proceeds goes to support Skeptophilia!]





Tuesday, September 10, 2019

Invasion of the randonauts

Today the following happened:
  • In the last few months I've been watching episodes of Agatha Christie's Poirot, and last night I watched the amazing "And Then There Were None."  Today on Facebook, one of my friends was participating in the thing that's going around to post your seven favorite book covers, and she posted the cover of the book by the same name.
  • When I went outside ten minutes ago, the cows in the field across the street were all staring in my direction.
  • An acquaintance who moved away five years ago emailed me last night saying he was in town for a couple of days and asking if I wanted to get together for coffee.  Today I went to the grocery store, and who should be there but him.
  • I looked out of my office window a few minutes ago, exactly at the right time to see a hawk zoom by, only about ten feet from the window.
  • I noticed that my angel's trumpet plant has nine new flowers on it.  The scientific name of the plant, Brugmansia, has ten letters, and today is September 10 (9-10).

The reason all this stuff comes up is a link sent to me yesterday by a loyal reader of Skeptophilia, about a new hobby some people have -- they call themselves the "Randonauts."  The gist is that these folks are trying to prove that we're either in some kind of computer simulation, or else there's some Weird Shit going on, or both.

The way it works is that you log into a site with a random number generator (the full instructions are in the link), and it uses those random numbers to generate latitudes and longitudes of places near you.  After doing this a bunch of times, it spits out the set of coordinates that got the most hits.  You go there, and...

... stuff is supposed to happen.  People who try it report encountering all manner of Weird Shit when they arrive at the chosen spot.
All of this is supposed to signify that our lives are being controlled, either by some super-intelligent power or by a simulation, and this is making the random number generator not so random -- and directing us to where there are leaks in the matrix, or something.
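
For the curious, the general mechanics are simple enough to sketch.  What follows is my own rough approximation in Python of the idea as described -- it is not the actual Randonaut tool, and the radius, point count, and grid size are arbitrary: scatter a pile of random points around your starting location, bin them on a grid, and head for the cell that collected the most hits.

import math
import random
from collections import Counter

def random_point_near(lat, lon, radius_km):
    # A uniformly distributed random point within radius_km of (lat, lon).
    r = radius_km * math.sqrt(random.random())       # sqrt keeps the disc uniform
    theta = random.uniform(0, 2 * math.pi)
    dlat = (r * math.cos(theta)) / 111.32            # ~111.32 km per degree of latitude
    dlon = (r * math.sin(theta)) / (111.32 * math.cos(math.radians(lat)))
    return lat + dlat, lon + dlon

def densest_spot(lat, lon, radius_km=5.0, n_points=1000, cell_deg=0.005):
    # Generate many random points, bin them on a coarse grid, and return the
    # center of the most-visited cell -- the "go here and see what happens" spot.
    counts = Counter()
    for _ in range(n_points):
        p_lat, p_lon = random_point_near(lat, lon, radius_km)
        counts[(round(p_lat / cell_deg), round(p_lon / cell_deg))] += 1
    (ci, cj), _ = counts.most_common(1)[0]
    return ci * cell_deg, cj * cell_deg

print(densest_spot(42.44, -76.50))                   # arbitrary example coordinates

The square root in the radius is just there to spread the points evenly over the disc instead of bunching them near the center; however you generate the randomness, what comes out is an essentially arbitrary nearby spot.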

Tamlin Magee, who wrote the article for The Outline I linked above, was only mildly impressed by her experiences, which I encourage you to read about.  Here's her conclusion:
Whatever you think of the validity of hacking reality or the nature of our possibly deterministic universe, my time randonauting pushed me to pay closer attention to my environment, to stop and notice things, like artwork, signs, symbols, nature, and objects, that I might have otherwise filtered out by default. 
Do I understand the theories behind it all?  Absolutely not.  Do I think I’m challenging a demiurgical Great Programmer, jumping into alternate dimensions or tearing apart the space-time continuum?  Probably also not.  But my trips, nonetheless, felt imbued by a strangely comforting, esoteric mindfulness.  And if only for that reason, I will be randonauting again.
Now, far be it from me to criticize weird and semi-pointless hobbies.  I'm a geocacher, after all, not to mention a birdwatcher (a hobby a former student aptly described as "Pokémon for adults").  So I'm glad Magee had fun, and in that spirit, I'd like to participate myself.

But I don't buy the conclusion any more than she did -- that you're more likely to see weird stuff when you do this than you are at any other time.  I think, as Magee points out, what happens is you're more likely to notice it.

I mean, think about it.  You go anywhere, and your instructions are: notice anything weird.  No restrictions.  Not even any definition of what qualifies as "weird."

My guess is that this would work in every single location in the world you might consider going to.  Because, face it, Weird Shit is everywhere.

So what we have here is a bad case of dart-thrower's bias -- our naturally-evolved tendency to notice outliers.  That, and the desire -- also natural -- that there be some meaning in what happens around us, that it isn't all just chaos.  (We took a look at the darker side of this drive yesterday.)

Anyhow, I think this sounds like it could be entertaining, as long as you don't put too much stock in your results showing that we're in a simulation.  Although I have to admit, given how bizarre the news has been lately, it's crossed my mind more than once that maybe we are in a computer simulation, and the aliens running the simulation have gotten bored, and now they're just fucking with us.

Certainly would explain a lot of what comes out of Donald Trump's mouth.

********************************************

This week's Skeptophilia book recommendation is pure fun: science historian James Burke's Circles: Fifty Round Trips Through History, Technology, Science, and Culture.  Burke made a name for himself with his brilliant show Connections, where he showed how one thing leads to another in discoveries, and sometimes two seemingly unconnected events can have a causal link (my favorite one is his episode about how the invention of the loom led to the invention of the computer).

In Circles, he takes us through fifty examples of connections that run in a loop -- jumping from one person or event to the next in his signature whimsical fashion, and somehow ending up in the end right back where he started.  His writing (and his films) always have an air of magic to me.  They're like watching a master conjuror create an illusion, and seeing what he's done with only the vaguest sense of how he pulled it off.

So if you're an aficionado of curiosities of the history of science, get Circles.  You won't be disappointed.

[Note: if you purchase this book using the image/link below, part of the proceeds goes to support Skeptophilia!]





Saturday, October 13, 2018

The danger of myside bias

Fighting bad thinking is an uphill battle some days.

I'm very much including myself in this assessment.  I have biases and preconceived notions and places where I stumble just like everyone else.  Fixing these errors would be nice -- can you imagine the world if all of us were able to think clearly and make our decisions based on evidence?

A pipe dream, I know, and all the more so after I read a new paper in the Journal of Cognitive Psychology called "My Point is Valid, Yours is Not: Myside Bias in Reasoning About Abortion," by Vladimíra Čavojová, Jakub Šrol, and Magdalena Adamus of the Institute of Experimental Psychology at the Slovak Academy of Sciences in Bratislava, Slovakia.

In an elegant piece of research, Čavojová et al. gave a series of logic puzzles to volunteers who had been asked in a prior questionnaire what their attitudes toward abortion were, and whether they had prior experience with formal logic.  They were then asked to determine whether various syllogisms were valid or not.  Some were neutral:
All mastiffs are dogs.
Some mastiffs are black.
Therefore, some of the things that are black are dogs. 
(Valid)
Some had to do with abortion:
All fetuses are human beings.
Some human beings should be protected.
Therefore, some of those who should be protected are fetuses. 
(Invalid)
To solve each of the syllogisms, it should be irrelevant what your opinion on abortion is; the rules are that if the premises (the first two statements) are true, and the argument is valid, then the conclusion is true.  The participants were told from the outset to treat the premises as true regardless of their views.
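
(Why is the second one invalid?  Because nothing in the premises guarantees that the human beings who should be protected include any fetuses; a situation in which the only protected humans are non-fetuses makes both premises true and the conclusion false.)  For readers who like seeing that mechanically, here's a short sketch -- my own illustration in Python, nothing from Čavojová's paper -- that brute-forces the check.  With three one-place predicates, each individual has one of eight possible combinations of properties, and for sentences of the form "All A are B" and "Some A are B" only the set of combinations actually inhabited matters, so trying every non-empty set of combinations tries every possible situation.

from itertools import combinations, product

# Each individual has one of 2^3 = 8 possible "types": which of the three
# predicates (positions 0, 1, 2) are true of it.
TYPES = list(product([False, True], repeat=3))

def all_models():
    # Every non-empty set of inhabited types stands in for every possible model.
    for size in range(1, len(TYPES) + 1):
        for world in combinations(TYPES, size):
            yield world

def all_are(a, b):    # "All A are B": no inhabited type is A without also being B
    return lambda world: all(not t[a] or t[b] for t in world)

def some_are(a, b):   # "Some A are B": some inhabited type is both A and B
    return lambda world: any(t[a] and t[b] for t in world)

def is_valid(premises, conclusion):
    # Valid iff no model makes every premise true and the conclusion false.
    return not any(all(p(w) for p in premises) and not conclusion(w)
                   for w in all_models())

# Positions: 0 = mastiff/fetus, 1 = dog/human being, 2 = black/should be protected.
print(is_valid([all_are(0, 1), some_are(0, 2)], some_are(2, 1)))  # mastiff syllogism
print(is_valid([all_are(0, 1), some_are(1, 2)], some_are(2, 0)))  # abortion syllogism

Run it and the first call prints True (no counterexample exists, so the mastiff syllogism is valid), while the second prints False (it finds a situation with a protected human who isn't a fetus).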

Gregor Reisch, Logic Presents Its Main Themes (ca. 1505) [Image is in the Public Domain]

What is fascinating is that both pro-choice and pro-life participants had a hard time rejecting invalid syllogisms that gave them a conclusion they agreed with, and accepting valid syllogisms that gave them a conclusion they disagreed with.  This pattern held equally for people who had training in formal logic and those who did not.  It's as if once we're considering a strongly-held opinion, our ability to use logic goes out the window.

The authors write:
The study explores whether people are more inclined to accept a conclusion that confirms their prior beliefs and reject one they personally object to even when both follow the same logic.  Most of the prior research in this area has relied on the informal reasoning paradigm; in this study, however, we applied a formal reasoning paradigm to distinguish between cognitive and motivational mechanisms leading to myside bias in reasoning on value-laden topics (in this case abortions).  Slovak and Polish (N = 387) participants indicated their attitudes toward abortion and then evaluated logical syllogisms with neutral, pro-choice, or pro-life content.  We analysed whether participants’ prior attitudes influenced their ability to solve these logically identical reasoning tasks and found that prior attitudes were the strongest predictor of myside bias in evaluating both valid and invalid syllogisms, even after controlling for logical validity (the ability to solve neutral syllogisms) and previous experience of logic.

Which reinforces my not very optimistic notion that however good our brains are, humans remain primarily emotional creatures.  When something elicits a strong emotional response, we're perfectly willing to abandon reasoning -- and sometimes aren't even aware we're doing it.

All of which bodes ill for any attempt to correct these errors.  As we've discussed before, even trying to combat bad thinking can trigger the backfire effect, wherein people tend to double down on their beliefs when challenged, even when they're given concrete evidence that they're wrong.

Makes you wonder what I think I'm accomplishing by writing this blog, doesn't it?

It's not futile, however; even though this emotional bent may be impossible to eradicate, you can adjust for it as long as you know it's there.  So I suppose this research should give us hope that even if we can't think with perfect clarity all the time, we can at least move in the right direction.
 **************************************

This week's Skeptophilia book recommendation is from the brilliant essayist and polymath John McPhee, frequent contributor to The New Yorker.  I swear, he can make anything interesting; he did a book on citrus growers in Florida that's absolutely fascinating.  But even by his standards, his book The Control of Nature is fantastic.  He looks at times that humans have attempted to hold back the forces of nature -- the attempts to keep the Mississippi River from abandoning its current channel and diverting down the Atchafalaya River, efforts in California to stop wildfires and mudslides, and a crazy -- and ultimately successful -- plan to save a harbor in Iceland from a volcanic eruption by using ice-cold seawater to freeze the lava in place.

Anyone who has interest in the natural world should read this book -- but it's not just about the events themselves, it's about the people who participated in them.  McPhee is phenomenal at presenting the human side of his investigations, and their stories will stick with you a long time after you close the last page.

[If you purchase the book from Amazon using the image/link below, part of the proceeds goes to supporting Skeptophilia!]