A particularly difficult one to overcome is our tendency to fall for the post hoc, ergo propter hoc fallacy -- "after this, therefore because of this." We assume that if two events occur close together in time and space, the first must have caused the second. Dr. Paul Offit, director of the Vaccine Education Center at Children's Hospital of Philadelphia, likes to tell a story about his wife, a pediatrician, preparing to give a child a vaccination. The child had a seizure as she was drawing the vaccine into the syringe. If the seizure had occurred only a minute later, right after the vaccine was administered, the parents would undoubtedly have concluded that the vaccination caused the seizure -- and after that, no power on earth would likely have convinced them otherwise.
Which is bad enough. But now three researchers -- Ion Yarritu and Helena Matute (both of Deusto University in Bilbao) and David Luque (University of New South Wales) -- have published research showing that what they call the "causal illusion" is so powerful that even evidence to the contrary often can't fix the error.
In a paper called "The dark side of cognitive illusions: When an illusory belief interferes with the acquisition of evidence-based knowledge," published earlier this year in the British Journal of Psychology, Yarritu et al. have demonstrated that once we've decided on an explanation for something, it becomes damn near impossible to change.
Their experimental protocol was simple and elegant. The authors write:
During the first phase of the experiment, one group of participants was induced to develop a strong illusion that a placebo medicine was effective to treat a fictitious disease, whereas another group was induced to develop a weak illusion. Then, in Phase 2, both groups observed fictitious patients who always took the bogus treatment simultaneously with a second treatment which was effective. Our results showed that the group who developed the strong illusion about the effectiveness of the bogus treatment during Phase 1 had more difficulties in learning during Phase 2 that the added treatment was effective.

The strength of this illusion explains why bogus "alternative medicine" therapies gain such traction. All it takes is a handful of cases where people use "deer antler spray" and find they have more energy (and no, I'm not making this up) to get the ball rolling. Homeopathy owes a lot to this flaw in our reasoning ability; any symptom abatement that occurs after taking a homeopathic "remedy" clearly would have happened even if the patient had taken nothing -- which is, after all, what (s)he did.
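That "it would have happened anyway" structure is exactly what contingency-learning experiments like this one exploit. Here's a minimal sketch of the idea in Python -- my own illustration with made-up probabilities, not the authors' materials or data -- showing how a treatment that does literally nothing can still look impressive when recovery is common anyway:

```python
import random

def simulate_patients(n, p_treated, p_recovery):
    """Simulate n fictitious patients. The bogus treatment has NO effect:
    recovery occurs with probability p_recovery whether or not the
    patient took it."""
    return [(random.random() < p_treated, random.random() < p_recovery)
            for _ in range(n)]

def delta_p(records):
    """Contingency index: P(recovery | treated) - P(recovery | untreated).
    A value near zero means the treatment does nothing."""
    treated = [rec for took, rec in records if took]
    untreated = [rec for took, rec in records if not took]
    return sum(treated) / len(treated) - sum(untreated) / len(untreated)

random.seed(1)
# High outcome density: 80% of patients recover on their own, and 80%
# happen to take the bogus remedy. Anyone who only counts the
# "took it and got better" cases will be very impressed.
records = simulate_patients(1000, p_treated=0.8, p_recovery=0.8)
treated = [rec for took, rec in records if took]
print(f"patients who took the remedy and recovered: {sum(treated)}")
print(f"P(recovery | treated) = {sum(treated) / len(treated):.2f}")
print(f"delta-P               = {delta_p(records):+.2f}")
```

The delta-P statistic hovers around zero -- the recovery rate is the same with or without the remedy -- and that near-zero contingency is precisely the evidence participants in these studies fail to register. Matute and her colleagues have found the illusion is strongest when both the outcome and the "cause" are frequent, which is exactly the situation above.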
And that's not even considering the placebo effect as a further complicating factor.
Helena Matute, one of the researchers in the recent study, has written extensively about the difficulty of battling causal illusions. In an article for the online journal Mapping Ignorance, she writes:
Alternative medicine is often promoted on the argument that it can do no harm. Even though its advocates are aware that its effectiveness has not been scientifically demonstrated, they do believe that it is harmless and therefore it should be used. "If not alone, you should at least use it in combination with evidence-based treatments," they say, "just in case."
But this strategy is not without risk... even treatments which are physically innocuous may have serious consequences in our belief system, sometimes with fatal consequences. When people believe that a bogus treatment works, they may not be able to learn that another treatment, which is really effective, is the cause of their recovery. This finding is important because it shows one of the mechanisms by which people might decide to quit an efficient treatment in favor of a bogus one.

I think this same effect contributes to errors in thinking in a great many other areas. Consider, for instance, the fact that belief in anthropogenic climate change rises in the summer and falls in the winter. After being told that human activity is causing the global average temperature to rise, our brains are primed to look out of the window at the snow falling and say, "Nah. Can't be."
Post hoc, ergo propter hoc. To quote Stephen Colbert, "Global warming isn't real, because I was cold today. Also great news: world hunger is over because I just ate."
The study by Yarritu et al. highlights not only the difficulty of fighting incorrect causal connections, but why it is so essential that we do so. Once made, the judgment that two things are causally connected is powerful and difficult to reverse, so it's critical that we be aware of this bias and watch our own tendency to leap to conclusions. Even more critical is that we seek out reliable evidence to correct our errors about causality -- and that we actually listen to it. Like any cognitive bias, this one can be combated -- but only if we're willing to admit that we might get it wrong sometimes.
Or as Michael Shermer put it, "Don't believe everything you think."