As part of our ongoing inquiry into why people believe in irrational, counterfactual nonsense, last week we looked at a study showing that reading nasty comments on an online opinion piece caused people to hold onto their preexisting opinions more strongly. Today, we'll consider a study showing that not only does an obnoxious screed fail to change someone's mind -- facts don't, either.
R. Kelly Garrett, an assistant professor of communication at Ohio State University, recently released the results of an investigation into how people react when they are told that something they'd just read was wrong. He and his team gave test subjects a story about who has access to private health records, but the story had several false statements inserted into it -- for example, that hospital administrators, health insurers, and government officials had unrestricted access to your medical information.
The group was then split in three. One-third was given, immediately after reading the article, a second article from FactCheck.org that showed that the inserted statements were wrong. A second group was given the correction after spending three minutes doing an unrelated task. The third group was not given the correction at all.
Unsurprisingly, the three-minute waiting period had little effect on whether or not the reader ended up believing the false information, and the people who received no correction at all showed the strongest residual belief in the incorrect statements. What was interesting, though, was how the data shifted when you looked at the individuals who received the correction and split them into two groups -- those who at the beginning of the study identified themselves as supportive of electronic health records, and those who were against them. The ones who thought that electronic health records were a good idea were quick to accept the correction and to learn that the scary statements about unrestricted access were false; those who already believed that electronic medical recordkeeping was a bad idea did not budge, even when shown evidence that what they'd been told was false. Instead, Garrett said, those test subjects doubted the source of the correction itself.
"Real-time corrections do have some positive effect, but it is mostly with people who were predisposed to reject the false claim anyway," Garrett said. "The problem with trying to correct false information is that some people want to believe it, and simply telling them it is false won't change their minds."
That doesn't mean we should give up, Garrett said. "Correcting misperceptions is really a persuasion task. You have to convince people that, while there are competing claims, one claim is clearly more accurate." He also said the findings sound a cautionary note about rumors in the political arena. "We would anticipate that systems like Dispute Finder would do little to change the beliefs of the roughly one in six Americans who, despite exhaustive news coverage and fact checking, continue to question whether President Obama was born in the U.S."
He summed up his study as showing that "Humans aren’t vessels into which you can just pour accurate information."
While this is a perfectly natural result -- it's understandable that it would take a lot of convincing to change someone's mind on an issue they feel strongly about -- it's a little disheartening. It's no wonder, then, that the conspiracy theorists seem so deaf to reason, that the anti-vaxxers and the anti-GMO crowd don't budge even in the face of scientific study after scientific study, and that the woo-woos respond to rational argument with the equivalent of "la-la-la-la-la, not listening." It makes the job of the people at sites like FactCheck and Snopes that much harder.
Not to mention mine. And it also explains a good bit of the hate mail I get.