Of course, being academics, they didn't state it that way. Here's how the authors phrased it:
This meta-analysis investigated the factors underlying effective messages to counter attitudes and beliefs based on misinformation. Because misinformation can lead to poor decisions about consequential matters and is persistent and difficult to correct, debunking it is an important scientific and public-policy goal. This meta-analysis revealed large effects for presenting misinformation, debunking, and the persistence of misinformation in the face of debunking. Persistence was stronger and the debunking effect was weaker when audiences generated reasons in support of the initial misinformation. A detailed debunking message correlated positively with the debunking effect. Surprisingly, however, a detailed debunking message also correlated positively with the misinformation-persistence effect.

Put more simply, the authors, Man-pui Sally Chan, Christopher R. Jones, and Kathleen Hall Jamieson of the University of Pennsylvania, and Dolores Albarracín of the University of Illinois at Urbana-Champaign, found that when confronting misinformation, a detailed response generates some degree of correction -- but makes some people double down on their incorrect understanding.
So it's yet another verification of the backfire effect, which makes it a little hard to see how we skeptics are supposed to move forward. And the problem becomes even worse when people have been taught to distrust sources that could potentially ameliorate the problem; I can't tell you how many times I've seen posts stating that sites like Snopes and FactCheck.org are flawed, hopelessly biased, or themselves have an agenda to pull the wool over people's eyes.
It's like I've said before: once you convince people to doubt the facts, and that everyone is lying, you can convince them of anything.
[image courtesy of photographer John Snape and the Wikimedia Commons]
"The effect of misinformation is very strong," said co-author Dolores Albarracín. "When you present it, people buy it. But we also asked whether we are able to correct for misinformation. Generally, some degree of correction is possible but it’s very difficult to completely correct."
The authors weren't completely doom-and-gloom, however, and made three specific recommendations for people dedicated to skepticism and the truth. These are:
- Reduce arguments that support misinformation: the media needs to be more careful about inadvertently repeating or otherwise giving unwarranted credence to the misinformation itself.
- Engage audiences in scrutiny and counterarguing of information: schools, especially, should promote skepticism and critical thinking. It is beneficial to have the audience involved in generating counterarguments -- further supporting the general idea of "teach people how to think, not what to think."
- Introduce new information as part of the debunking message: give evidence and details. Even though "misinformation persistence" is strong even in the face of detailed debunking, there was a positive correlation between detailed information and correction of misapprehension. So: don't let the backfire effect stop you from fighting misinformation.
It may be an uphill battle, but it does work, and is certainly better than the alternative, which is giving up. As Albarracín put it: "What is successful is eliciting ways for the audience to counterargue and think of reasons why the initial information was incorrect."
I think the most frustrating part of all this for me is that there are biased media sources. Lots of them. Some of them (so-called "clickbait") post bullshit to drive up ad revenue; others are simply so ridiculously slanted that anything they publish should be independently verified every single time. And because people tend to gravitate toward media that agree with what they already thought was true, sticking with sources that conform to your own biases makes it unlikely that you'll see where you're wrong (confirmation bias) -- and it allows you to persist in that error, because you're surrounding yourself with people who are all saying the same thing (the echo-chamber effect).
And that one, I don't know how to address. It'd be nice if the fringe media would act more responsibly -- but we all know that's not going to happen any time soon. So I'll just end with an exhortation for you to broaden the media you do read -- if you're conservative, check out the arguments on MSNBC every once in a while (and give them serious thought; don't just read, scoff, and turn away). Same if you're a liberal; hit Fox News on occasion. It may not change your mind, but at least it'll make it more likely that you'll discover the holes in your own thinking.