Skeptophilia (skep-to-fil-i-a) (n.) - the love of logical thought, skepticism, and thinking critically. Being an exploration of the applications of skeptical thinking to the world at large, with periodic excursions into linguistics, music, politics, cryptozoology, and why people keep seeing the face of Jesus on grilled cheese sandwiches.

Saturday, September 14, 2019

The illusion of truth

Because we apparently need one more cognitive bias to challenge our confidence in what we hear on the news on a daily basis, today I'm going to tell you about the illusory truth effect.

The idea here is that if you hear a falsehood repeated often enough, in your mind it becomes a fact.  This is the "big lie" principle Hitler describes in Mein Kampf:
All this was inspired by the principle—which is quite true within itself—that in the big lie there is always a certain force of credibility; because the broad masses of a nation are always more easily corrupted in the deeper strata of their emotional nature than consciously or voluntarily; and thus in the primitive simplicity of their minds they more readily fall victims to the big lie than the small lie, since they themselves often tell small lies in little matters but would be ashamed to resort to large-scale falsehoods. 
It would never come into their heads to fabricate colossal untruths, and they would not believe that others could have the impudence to distort the truth so infamously.  Even though the facts which prove this to be so may be brought clearly to their minds, they will still doubt and waver and will continue to think that there may be some other explanation.  For the grossly impudent lie always leaves traces behind it, even after it has been nailed down, a fact which is known to all expert liars in this world and to all who conspire together in the art of lying.
But the most referenced quote framing this idea comes from Nazi Propaganda Minister Joseph Goebbels: "If you tell a lie big enough and keep repeating it, people will eventually come to believe it."

Which is more than a little ironic, because there's no evidence Goebbels ever said (or wrote) that -- although he certainly did embody the spirit of it.

The topic comes up because of a study that appeared in Cognition this week, called "An Initial Accuracy Focus Prevents Illusory Truth," by psychologists Nadia M. Brashier (of Harvard University) and Emmaline Drew Eliseev and Elizabeth J. Marsh (of Duke University).  What they found was simultaneously dismaying and heartening: it is very easy to get people to fall for illusory truth through repetition, but having them read the source material with a critical eye the first time -- striking out erroneous information as they go -- inoculates them against falling for the lie later, even after repeated exposure.

[Image licensed under the Creative Commons RyanMinkoff, Academic dishonesty, CC BY-SA 4.0]

What's especially frightening about the dismaying part of this study is that repeated falsehoods can take people in even when the information is purely factual and easily checkable.  One of the statements they used was "The fastest land mammal is the leopard," which most people recognize as false (the fastest land mammal is the cheetah).  The surmise is that if you keep seeing the same incorrect statement, you begin to doubt your own understanding or your own memory.

I know this happens to me.  There are few topics I'm so completely confident about that I could hear someone make a contradicting statement and think, "No, that's definitely wrong."  I'm much more likely to think, "Wait... am I remembering incorrectly?"  Part of the problem is that I'm a raging generalist; I know a little bit about a great many things, so if an expert comes along and says I've got it wrong, I'm putting my money on the expert.  (I've also been called a "dilettante" or a "dabbler" or "a light year across and an inch deep," but on the whole I like "generalist" better.)

The problem is, it's easy to mistake someone who simply speaks with a lot of confidence for an expert.  Take, for example, Donald Trump.  (Please.  No, really, please.  Take him.)  He's lied so many times there's a whole Wikipedia page devoted to "Veracity of Statements by Donald Trump."  As only one example of the illusory truth effect, take his many-times-repeated claims that he would have won the popular vote if it hadn't been for millions of votes cast fraudulently for Hillary Clinton, and that his electoral college win was "the biggest landslide in history" (it wasn't even close; of the 58 presidential elections the United States has had, Donald Trump's electoral college win comes in at #46).

The trouble is, Trump makes these statements with so much confidence, and with such frequency, that it's raised the question of whether he actually believes them to be true.  Even if he's lying, the technique is remarkably effective -- a sort of Gish gallop of falsehood (a term named after creationist Duane Gish, who was known for swamping his debate opponents with rapid-fire arguments of dubious veracity, wearing them down by sheer volume).  A lot of his supporters believe that he won by a landslide, that Clinton only did as well as she did because of rampant fraud, and a host of other demonstrably false claims (about the size of Trump's inauguration crowd, attendance at his rallies, how well the economy is doing, and whether the air and water in the United States are the highest quality in the world).

So to put the research by Brashier et al. to work, people would somehow have to be willing and able to fact-check these statements as they're happening, the first time they hear them -- not very likely, especially given the role of confirmation bias in shaping how much people believe these statements at the outset (someone who already supports Trump would be more likely to believe him when he states, for example, that the number of illegal immigrants is the highest it's ever been, when in fact it peaked in 2007 and has been falling steadily ever since).

In any case, it's hard to see how all this helps us.  The traction of "alternative facts" has simply become too great, as has the vested interest of partisan and sensationalized media.  Not for nothing do Brashier et al. call our current situation "the post-truth world."

********************************************

This week's Skeptophilia book recommendation is pure fun: science historian James Burke's Circles: Fifty Round Trips Through History, Technology, Science, and Culture.  Burke made a name for himself with his brilliant show Connections, where he showed how one thing leads to another in the history of discovery, and how sometimes two seemingly unconnected events can have a causal link (my favorite is his episode about how the invention of the loom led to the invention of the computer).

In Circles, he takes us through fifty examples of connections that run in a loop -- jumping from one person or event to the next in his signature whimsical fashion, and somehow ending up right back where he started.  His writing and his films always have an air of magic to me.  They're like watching a master conjuror create an illusion, and seeing what he's done with only the vaguest sense of how he pulled it off.

So if you're an aficionado of curiosities of the history of science, get Circles.  You won't be disappointed.

[Note: if you purchase this book using the image/link below, part of the proceeds goes to support Skeptophilia!]




