The first, confirmation bias, is our tendency to uncritically accept claims when they fit with our preconceived notions. It's why a lot of conservative viewers of Fox News and liberal viewers of MSNBC sit there watching and nodding enthusiastically without ever stopping and saying, "... wait a moment."
The other, dart-thrower's bias, is more built-in. It's our tendency to notice outliers (because of their obvious evolutionary significance as danger signals) and ignore, or at least underestimate, the ordinary as background noise. The name comes from the thought experiment of being in a bar while there's a darts game going on across the room. You'll tend to notice the game only when there's an unusual throw -- a bullseye, or perhaps impaling the bartender in the forehead -- and not even be aware of it otherwise.
Well, we thought dart-thrower's bias was more built into our cognitive processing system and confirmation bias more "on the surface" -- and the latter therefore more culpable, conscious, and/or controllable. Now, it appears that confirmation bias might be just as hard-wired into our brains as dart-thrower's bias is.
A paper appeared this week in Human Communication Research, describing research conducted by a team led by Jason Coronel of Ohio State University. In "Investigating the Generation and Spread of Numerical Misinformation: A Combined Eye Movement Monitoring and Social Transmission Approach," Coronel, along with Shannon Poulsen and Matthew D. Sweitzer, did a fascinating series of experiments showing that not only do we tend to accept information that agrees with our previous beliefs without question, we honestly misremember information that disagrees -- and we misremember it in such a way that, in our memories, it further confirms our beliefs!
[Image: The location of memories (from Memory and Intellectual Improvement Applied to Self-Education and Juvenile Instruction, by Orson Squire Fowler, 1850). Image is in the Public Domain]
Across the board, people tended to recall the information that aligned with the conventional wisdom correctly, and the information that didn't, incorrectly. Further -- and this is what makes the experiment even more fascinating -- when people read the unexpected information, the data that contradicted the general opinion, eye-tracking monitors recorded that they hesitated while reading, as if they recognized that something was strange. In the immigration passage, for example, they read that the number of immigrants had decreased from 12.8 million in 2007 to 11.7 million in 2014, and the readers' eyes bounced back and forth between the two numbers as if their brains were saying, "Wait, am I reading that right?"
So they spent longer on the passage that conflicted with what most people think -- and still tended to remember it incorrectly. In fact, the majority of those who misremembered recalled the figures themselves correctly -- 12.8 million and 11.7 million -- showing that they'd paid attention and hadn't simply scoffed and glossed over something they thought was incorrect. But when questioned afterward, they attached the numbers to the wrong years, as if the passage had actually supported what they'd believed prior to the experiment!
As if that weren't bad enough, Coronel's team then ran a second experiment, in which the test subjects read the passage, then had to repeat the gist to another person, who passed it along to another, and so on. (Remember the elementary school game of "Telephone"?) Not only did the data get flipped -- usually in the first transfer -- but the difference between the two numbers grew with each retelling (bolstering the false, but popular, opinion even more strongly). In the case of the immigration statistics, the gap between the 2007 and 2014 figures not only changed direction, but by the end of the game it had widened from 1.1 million to 4.7 million.
This gives you an idea what we're up against in trying to counter disinformation campaigns. It also shows that I was wrong in one of my own preconceived notions: that people falling for confirmation bias are somehow guilty of deliberately locking themselves into an echo chamber. Apparently, both dart-thrower's bias and confirmation bias are built into the way we process information. We become so certain we're right that our brains subconsciously reject any evidence to the contrary.
Why our brains are built this way is a matter of conjecture. I wonder if it might be our tribal heritage at work: conforming to the norm, and therefore remaining a member of the tribe, may have greater survival value than being the maverick who sticks to his/her guns about a true but unpopular belief. That's pure speculation, of course. But what it illustrates is that once again, our very brains are working against us in fighting Fake News -- which these days is positively frightening, given how many powerful individuals and groups are, in a cold and calculated fashion, disseminating false information in an attempt to mislead us, frighten us, or anger us, and so maintain their positions of power.
This week's Skeptophilia book of the week is brand new: Brian Clegg's wonderful Dark Matter and Dark Energy: The Hidden 95% of the Universe. In this book, Clegg outlines "the biggest puzzle science has ever faced" -- the evidence for the substances that provide the majority of the gravitational force holding the nearby universe together, while simultaneously making the universe as a whole fly apart -- substances that have (thus far) completely resisted all attempts to ascertain their nature.
Clegg also gives us some of the cutting-edge explanations physicists are now proposing, and the experiments being done to test them. The science is sure to change quickly -- every week we seem to hear about new data shedding light on the dark 95% of what's around us -- but if you want the most recently crafted lens on the subject, this is it.