Skeptophilia (skep-to-fil-i-a) (n.) - the love of logical thought, skepticism, and thinking critically. Being an exploration of the applications of skeptical thinking to the world at large, with periodic excursions into linguistics, music, politics, cryptozoology, and why people keep seeing the face of Jesus on grilled cheese sandwiches.

Saturday, January 26, 2013

The straight scoop

As part of our ongoing inquiry into why people believe in irrational, counterfactual nonsense, last week we looked at a study showing that when people read nasty comments on an online opinion piece, they cling to their preexisting opinions more strongly.  Today, we'll consider a study showing that not only does an obnoxious screed fail to change someone's mind; facts don't, either.

R. Kelly Garrett, an assistant professor of communication at Ohio State University, recently released the results of an investigation into how people react when they are told that something they'd just read was wrong.  He and his team gave test subjects a story about who has access to private health records, but the story had several false statements inserted into it -- for example, that hospital administrators, health insurers, and government officials had unrestricted access to your medical information.

The group was then split into thirds.  One-third was given, immediately after reading the article, a second article showing that the inserted statements were wrong.  A second third was given the correction after spending three minutes on an unrelated task.  The final third was not given the correction at all.

Unsurprisingly, the three-minute waiting period had little effect on whether the readers ended up believing the false information, and the people who received no correction showed the strongest residual belief in the incorrect statements.  What was interesting, though, was how the data shifted when you looked only at the individuals who received the correction, and split them into two groups -- those who at the beginning of the study identified themselves as supportive of electronic health records, and those who were against them.  The ones who thought electronic health records were a good idea were quick to accept the correction and to learn that the scary statements about unrestricted access were false.  Those who already believed that electronic medical recordkeeping was a bad idea did not budge, even when shown evidence that what they'd been told was false.  Instead, Garrett said, those test subjects doubted the source of the correction itself.

"Real-time corrections do have some positive effect, but it is mostly with people who were predisposed to reject the false claim anyway," Garrett said.  "The problem with trying to correct false information is that some people want to believe it, and simply telling them it is false won’t convince them."

That doesn't mean we should give up, Garrett said.  "Correcting misperceptions is really a persuasion task.  You have to convince people that, while there are competing claims, one claim is clearly more accurate."  He also said that it provides a cautionary note about rumors in the political arena.  "We would anticipate that systems like Dispute Finder would do little to change the beliefs of the roughly one in six Americans who, despite exhaustive news coverage and fact checking, continue to question whether President Obama was born in the U.S."

He summed up his study as showing that "Humans aren’t vessels into which you can just pour accurate information."

While this is a perfectly natural result -- it's understandable that it would take a lot of convincing to change someone's mind on an issue (s)he felt strongly about -- it's a little disheartening.  It's no wonder, then, that the conspiracy theorists seem so deaf to reason, that the anti-vaxers and anti-GMO crowd don't budge even in the face of scientific study after scientific study, and that the woo-woos respond to rational argument with the equivalent of "la-la-la-la-la, not listening."  It makes the job of the people at sites like FactCheck and Snopes that much harder.

Not to mention mine.  And it also explains a good bit of the hate mail I get.


  1. A trifle discouraging about the state of human rationality . . . but hey, after all, it's only some scientist who says so! Why let that influence me?

  2. The fact that people will believe what they want to believe is only part of the problem. My first response to this type of information is to immediately fact-check it. I'm predisposed not to believe anything that anyone says unless I already know the person to be a trusted source, and even then, I may want more proof. I don't know if that's good or bad, but I'm guessing a predisposition to either belief or disbelief will produce similar results when someone is presented with a story and a correction. I, of course, as a reader of this blog, am prone to search for the rational solution through facts, but those conspiracy theories always sound so cool!

  3. A correlation study should be done with test subjects who are religious. If you already spend a swath of time researching and believing something you cannot prove, it stands to reason that you would be predisposed to believe what people tell you.