Skeptophilia (skep-to-fil-i-a) (n.) - the love of logical thought, skepticism, and thinking critically. Being an exploration of the applications of skeptical thinking to the world at large, with periodic excursions into linguistics, music, politics, cryptozoology, and why people keep seeing the face of Jesus on grilled cheese sandwiches.

Friday, March 25, 2022

Truth by repetition

You probably have heard the quote attributed to Nazi propaganda minister Joseph Goebbels: "If you tell a lie big enough and continue to repeat it, eventually people will come to believe it."  This has become a staple tactic in political rhetoric -- an obvious recent example being Donald Trump's oft-repeated declaration that he won the 2020 presidential election, despite bipartisan analysis across the United States demonstrating unequivocally that this is false.  (The tactic works; a huge number of Trump supporters still think the election was stolen.)

It turns out that the "illusory truth effect" or "truth-by-repetition effect," as the phenomenon is called, still works even if the claim is entirely implausible.  A recent study led by psychologist Doris Lacassagne at the Université Catholique de Louvain (in Belgium) presented 232 test subjects with a variety of ridiculous statements, including "the Earth is a perfect cube," "smoking is good for the lungs," "elephants weigh less than ants," and "rugby is the sport associated with Wimbledon."  In the first phase of the experiment, the subjects were asked to rate the statements not for plausibility, but for how "interesting" they were.  After this, the volunteers were given lists of statements to evaluate for plausibility, and were told ahead of time that the second list would include statements repeated from the first along with completely new claims.

The results were a little alarming, and support Goebbels's approach to lying.  The false statements -- even some of the entirely ridiculous ones -- gained plausibility from repetition.  (To be fair, the average ratings still fell on the "false" side of the rating spectrum; but they did shift toward the "true" end.)

The ones that showed the greatest shift were the ones that required at least a vague familiarity with science or technical matters, such as "monsoons are caused by earthquakes."  It only took a few repetitions to generate movement toward the "true" end of the rating scale, which is scary.  Not all the news was bad, though; although 53% of the participants showed a positive illusory truth effect, 28% showed a negative effect -- repetition actually caused their plausibility ratings to decrease.  (I wonder if this was because people who actually know what they're talking about become increasingly pissed off by seeing the same idiotic statement over and over.  I suspect that's how I would react.)

Of course, recognizing that statements are false requires some background knowledge.  I'd be much more likely to believe a false statement about (for example) economics, because I don't know much about the subject; presumably I'd be much harder to fool about biology.  It's very easy for us to see some claim about a subject we're not that familiar with and say, "Huh!  I didn't know that!" rather than checking its veracity -- especially if we see the same claim made over and over.

[Image licensed under the Creative Commons Zabou, Politics, CC BY 3.0]

I honestly have no idea what we could do about this.  The downside of the First Amendment's guarantee of freedom of speech is that, with a limited number of exceptions -- slander, threats of violence, vulgarity, and hate speech come to mind -- people can pretty much say what they want on television.  The revocation of the FCC's Fairness Doctrine in 1987 meant that news media were no longer required to give a balanced presentation of all sides of the issues, and set us up for the morass of partisan editorializing that the nightly news has become in the last few years.  (And, as I've pointed out more than once, it's not just outright lying that is the problem; partisan media does as much damage by what they don't tell you as by what they do.  If a particular news channel's favorite political figure does something godawful, and the powers-that-be at the channel simply decide not to mention it, the listeners will never find out about it -- especially given that another very successful media tactic has been convincing consumers that "everyone is lying to you except us.")

It's a quandary.  There's currently no way to compel news commentators to tell the truth, or to force them to tell their listeners parts of the news that won't sit well with them.  Unless what the commentator says causes demonstrable harm, the FCC pretty much has its hands tied.

So the Lacassagne study seems to suggest that as bad as partisan lies have gotten, we haven't nearly reached the bottom of the barrel yet.

**************************************

Saturday, September 14, 2019

The illusion of truth

Because we apparently need one more cognitive bias to challenge our confidence in what we hear on the news on a daily basis, today I'm going to tell you about the illusory truth effect.

The idea here is that if you hear a falsehood repeated often enough, in your mind, it becomes a fact.  This is the "big lie" approach that Hitler recommends in Mein Kampf:
All this was inspired by the principle—which is quite true within itself—that in the big lie there is always a certain force of credibility; because the broad masses of a nation are always more easily corrupted in the deeper strata of their emotional nature than consciously or voluntarily; and thus in the primitive simplicity of their minds they more readily fall victims to the big lie than the small lie, since they themselves often tell small lies in little matters but would be ashamed to resort to large-scale falsehoods. 
It would never come into their heads to fabricate colossal untruths, and they would not believe that others could have the impudence to distort the truth so infamously.  Even though the facts which prove this to be so may be brought clearly to their minds, they will still doubt and waver and will continue to think that there may be some other explanation.  For the grossly impudent lie always leaves traces behind it, even after it has been nailed down, a fact which is known to all expert liars in this world and to all who conspire together in the art of lying.
But the most referenced quote framing this idea comes from Nazi Propaganda Minister Joseph Goebbels: "If you tell a lie big enough and keep repeating it, people will eventually come to believe it."

Which is more than a little ironic, because there's no evidence Goebbels ever said (or wrote) that -- although he certainly did embody the spirit of it.

The topic comes up because of a study that appeared in Cognition this week, called "An Initial Accuracy Focus Prevents Illusory Truth," by psychologists Nadia M. Brashier (of Harvard University) and Emmaline Drew Eliseev and Elizabeth J. Marsh (of Duke University).  What they found was simultaneously dismaying and heartening: it is very easy to get people to fall for illusory truth through repetition, but having them read the source material with a critical eye the first time -- striking out erroneous information as they go -- inoculates them against falling for the lie later, even after repeated exposure.

[Image licensed under the Creative Commons RyanMinkoff, Academic dishonesty, CC BY-SA 4.0]

What's especially frightening about the dismaying part of this study is that the effect works even for purely factual, easily checkable information.  One of the statements they used was "The fastest land mammal is the leopard," which most people recognize as false (the fastest land mammal is the cheetah).  The surmise is that if you keep seeing the same incorrect statement, you begin to doubt your own understanding or your own memory.

I know this happens to me.  There are few topics I'm so completely confident about that I could hear someone make a contradicting statement and think, "No, that's definitely wrong."  I'm much more likely to think, "Wait... am I remembering incorrectly?"  Part of the problem is that I'm a raging generalist; I know a little bit about a great many things, so if an expert comes along and says I've got it wrong, I'm putting my money on the expert.  (I've also been called a "dilettante" or a "dabbler" or "a light year across and an inch deep," but on the whole I like "generalist" better.)

The problem is, it's easy to mistake someone who simply speaks with a lot of confidence as being an expert.  Take, for example, Donald Trump.  (Please.  No, really, please.  Take him.)  He's lied so many times there's a whole Wikipedia page devoted to "Veracity of Statements by Donald Trump."  As only one example of the illusory truth effect, take his many-times-repeated statement that he would have won the popular vote if it hadn't been for millions of votes cast fraudulently for Hillary Clinton, and also that his electoral college win was "the biggest landslide in history" (it wasn't even close; of the 58 presidential elections the United States has had, Donald Trump's electoral college win comes in at #46).

The problem is, Trump makes these statements with so much confidence, and with such frequency, that it raises the question of whether he actually believes them to be true.  Even if he's lying, the technique is remarkably effective -- a sort of Gish gallop of falsehood (a term named after creationist Duane Gish, who was known for swamping his debate opponents with rapid-fire arguments of dubious veracity, wearing them down simply by the overall volume).  A lot of his supporters believe that he won by a landslide, that Clinton only did as well as she did because of rampant fraud, and a host of other demonstrably false beliefs (such as the size of Trump's inauguration crowd, attendance at his rallies, how well the economy is doing, and that the air and water in the United States are the highest quality in the world).

So to put the research by Brashier et al. to work, people would somehow have to be willing and able to fact-check these statements as they're happening, the first time they hear them -- not very likely, especially given the role of confirmation bias in affecting how much people believe these statements at the outset.  (Someone who already supports Trump would be more likely to believe him when, for example, he's stated that the number of illegal immigrants is the highest it's ever been, when in fact it peaked in 2007 and has been falling steadily ever since.)

In any case, it's hard to see how all this helps us.  The traction of "alternative facts" has simply become too great, as has the vested interest of partisan and sensationalized media.  Not for nothing do Brashier et al. call our current situation "the post-truth world."

********************************************

This week's Skeptophilia book recommendation is pure fun: science historian James Burke's Circles: Fifty Round Trips Through History, Technology, Science, and Culture.  Burke made a name for himself with his brilliant show Connections, in which he showed how one discovery leads to another, and how two seemingly unconnected events can have a causal link (my favorite example is his episode on how the invention of the loom led to the invention of the computer).

In Circles, he takes us through fifty examples of connections that run in a loop -- jumping from one person or event to the next in his signature whimsical fashion, and somehow ending up in the end right back where he started.  His writing (and his films) always have an air of magic to me.  They're like watching a master conjuror create an illusion, and seeing what he's done with only the vaguest sense of how he pulled it off.

So if you're an aficionado of curiosities of the history of science, get Circles.  You won't be disappointed.

[Note: if you purchase this book using the image/link below, part of the proceeds goes to support Skeptophilia!]