Skeptophilia (skep-to-fil-i-a) (n.) - the love of logical thought, skepticism, and thinking critically. Being an exploration of the applications of skeptical thinking to the world at large, with periodic excursions into linguistics, music, politics, cryptozoology, and why people keep seeing the face of Jesus on grilled cheese sandwiches.

Wednesday, January 20, 2021

The illusion of causality

Fighting bad thinking is an uphill battle, sometimes.  Not only, or even primarily, because there's so much of it out there; the real problem is that our brains are hard-wired to make poor connections, and once those connections are made, to hang on to them like grim death.

A particularly difficult one to overcome is our tendency to fall for the post hoc, ergo propter hoc fallacy -- "after this, therefore because of this."  We assume that if two events are in close proximity in time and space, the first one must have caused the second one.  Dr. Paul Offit, director of the Vaccine Education Center at Children's Hospital of Philadelphia, likes to tell a story about his wife, a pediatrician, preparing to give a child a vaccination.  The child had a seizure as she was drawing the vaccine into the syringe.  If the seizure had occurred only a minute later, right after the vaccine was administered, the parents would undoubtedly have concluded that the vaccination caused the seizure -- and after that, no power on Earth would likely have convinced them otherwise.

[Image is in the Public Domain courtesy of the NIH]

Why do we do this?  The most reasonable explanation is that in our evolutionary history, forming such connections had significant survival value.  Since it's usual that causes and effects are close together in time and space, wiring in a tendency to decide that all such correspondences are causal is still going to be right more often than not.  But it does lead us onto some thin ice, logic-wise.

Which is bad enough, but consider the study by three researchers -- Ion Yarritu and Helena Matute (both of Deusto University, in Bilbao) and David Luque (University of New South Wales) -- showing that what they call the "causal illusion" is so powerful that even evidence to the contrary can't fix the error.

In a paper called "The dark side of cognitive illusions: When an illusory belief interferes with the acquisition of evidence-based knowledge," published in the British Journal of Psychology, Yarritu et al. have demonstrated that once we've decided on an explanation for something, it becomes damn near impossible to change.

Their experimental protocol was simple and elegant.  The authors write:
During the first phase of the experiment, one group of participants was induced to develop a strong illusion that a placebo medicine was effective to treat a fictitious disease, whereas another group was induced to develop a weak illusion.  Then, in Phase 2, both groups observed fictitious patients who always took the bogus treatment simultaneously with a second treatment which was effective.  Our results showed that the group who developed the strong illusion about the effectiveness of the bogus treatment during Phase 1 had more difficulties in learning during Phase 2 that the added treatment was effective.
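To make the design concrete, here's a minimal sketch of the kind of contingency computation that underlies experiments like this one.  The standard index in this literature is delta-P: the probability of recovery given the treatment minus the probability of recovery without it.  To be clear, none of this code comes from the paper -- the trial counts and probabilities below are my own illustrative assumptions -- but it shows how a worthless drug can look effective when the outcome is common and the drug is taken often.

```python
import random

random.seed(1)

def simulate_condition(n_trials, p_medicine, p_recovery):
    """Generate fictitious patient records for a zero-contingency drug:
    recovery is equally likely whether or not the medicine is taken."""
    return [(random.random() < p_medicine,   # did this patient take the drug?
             random.random() < p_recovery)   # did this patient recover? (independent)
            for _ in range(n_trials)]

def delta_p(records):
    """Standard contingency index: P(recovery | drug) - P(recovery | no drug)."""
    with_drug = [recovered for took, recovered in records if took]
    without_drug = [recovered for took, recovered in records if not took]
    return sum(with_drug) / len(with_drug) - sum(without_drug) / len(without_drug)

# A "strong illusion" setup (illustrative numbers): the drug is taken often,
# and patients usually recover anyway, so recoveries pile up right after doses.
records = simulate_condition(n_trials=100, p_medicine=0.8, p_recovery=0.7)
n_took = sum(took for took, _ in records)
n_took_and_recovered = sum(recovered for took, recovered in records if took)
print(f"P(recovery | drug) = {n_took_and_recovered / n_took:.2f}")   # looks impressive
print(f"delta-P (true contingency) = {delta_p(records):.2f}")        # hovers near zero
```

The gap between those two printed numbers is the whole trick: participants (and patients) see the first one, while only the second one actually measures causation.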
The strength of this illusion explains why bogus "alternative medicine" therapies gain such traction.  All it takes is a handful of cases where people use "deer antler spray" and find they have more energy (and no, I'm not making this up) to get the ball rolling.  A friend just told me about someone she knows who has stage-four breast cancer.  Asked how her chemo treatment was going, the woman said cheerfully, "Oh, I'm not doing chemo.  I'm treating it with juicing and coffee enemas!  And I feel fine!"

Sadly, she'll "feel fine" until she doesn't anymore, and at that point it'll probably be too late for chemo to help her.

Homeopathy owes a lot to this flaw in our reasoning ability; any symptom abatement that occurs after taking a homeopathic "remedy" clearly would have happened even if the patient had taken nothing -- which, given that typical homeopathic dilutions leave not a single molecule of the original active ingredient, is after all exactly what the patient did.
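To put a number on "nothing," here's a back-of-the-envelope calculation.  The 30C potency and the generous one-mole starting dose are my own illustrative assumptions, though 30C is a commonly sold dilution:

```python
# Back-of-the-envelope: expected molecules of active ingredient surviving
# a 30C homeopathic dilution (thirty successive 1:100 dilutions).
AVOGADRO = 6.022e23            # molecules per mole
starting_dose = AVOGADRO       # assume a full mole of active ingredient (generous)
dilution_factor = 100 ** 30    # 30C = (1:100) thirty times over = 10^60

expected_survivors = starting_dose / dilution_factor
print(f"expected molecules remaining: {expected_survivors:.1e}")
# prints ~6.0e-37: effectively zero chance that even one molecule is left.
```

At that dilution, swallowing the remedy and swallowing plain water are the same act.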

And that's not even considering the placebo effect as a further complicating factor.

Helena Matute, one of the researchers in the recent study, has written extensively about the difficulty of battling causal illusions.  In an article for the online journal Mapping Ignorance, she writes:
Alternative medicine is often promoted on the argument that it can do no harm.  Even though its advocates are aware that its effectiveness has not been scientifically demonstrated, they do believe that it is harmless and therefore it should be used.  "If not alone, you should at least use it in combination with evidence-based treatments," they say, "just in case."  
But this strategy is not without risk... even treatments which are physically innocuous may have serious consequences in our belief system, sometimes with fatal consequences.  When people believe that a bogus treatment works, they may not be able to learn that another treatment, which is really effective, is the cause of their recovery. This finding is important because it shows one of the mechanisms by which people might decide to quit an efficient treatment in favor of a bogus one.
I think this same effect contributes to errors in thinking in a great many other areas.  Consider, for instance, the fact that belief in anthropogenic climate change rises in the summer and falls in the winter.  After being told that human activity is causing the global average temperature to rise, our brains are primed to look out the window at the snow falling and say, "Nah.  Can't be."

Post hoc, ergo propter hoc.  To quote Stephen Colbert, "Global warming isn't real, because I was cold today.  Also great news: world hunger is over because I just ate."

The study by Yarritu et al. highlights not only the difficulty of fighting incorrect causal connections, but why it is so essential that we do so.  The decision that two things are causally connected is powerful and difficult to reverse, so it's critical that we be aware of this bias and watch our own tendency to leap to conclusions.  Even more critical is that we seek out reliable evidence to correct our errors about causality -- and that we actually listen to it.  Like any cognitive bias, this one can be combated -- but only if we're willing to admit that we might get it wrong sometimes.

Or as James Randi was fond of saying, "Don't believe everything you think."

***********************************

I'm always amazed by the resilience we humans can sometimes show.  Knocked down again and again, in circumstances that "adverse" doesn't even begin to describe, we rise above and move beyond, sometimes accomplishing great things despite catastrophic setbacks.

In Why Fish Don't Exist: A Story of Love, Loss, and the Hidden Order of Life, journalist Lulu Miller looks at the life of David Starr Jordan, a taxonomist whose fascination with aquatic life led him to the discovery of a fifth of the species of fish known in his day.  But to say the man had bad luck is a ridiculous understatement.  He lost his collections, drawings, and notes repeatedly, first to lightning, then to fire, and finally and catastrophically to the 1906 San Francisco Earthquake, which shattered just about every specimen bottle he had.

But Jordan refused to give up.  After the earthquake he set about rebuilding one more time.  He had been the founding president of Stanford University since 1891, and he went on living and working until his death in 1931 at the age of eighty.  Miller's biography of Jordan looks at his scientific achievements and incredible tenacity -- but doesn't shy away from his darker side as an early proponent of eugenics, or from the allegations that he might have been complicit in the coverup of a murder.

She paints a picture of a complex, fascinating man, and her vivid writing style brings him and the world he lived in to life.  If you are looking for a wonderful biography, give Why Fish Don't Exist a read.  You won't be able to put it down.




1 comment:

  1. Of course many people illegitimately infer a causal relation when y shortly follows x. However, if one has never before encountered either x or y, and one experiences x then shortly afterwards experiences y, and x and y are contiguous, then often we can judge that x was highly likely to be a crucial ingredient for the manifestation of y. This is so even if no causal mechanism can be discerned.

    Attacking the weakest cases isn't interesting. Of course, if any symptom abatement would have occurred with or without a homeopathic "remedy", then this isn't interesting. But what if throughout history and across differing cultures homeopathy seems to have an effect greater than simply doing nothing? How many reports of the efficacy of homeopathy would it take before a skeptic might think there's something to it? (Admittedly, disentangling it from the placebo effect is challenging.) No amount, I would guess! A skeptic would ALWAYS assign it to people's propensity to assign causality where we only have correlation. (Incidentally, I know nothing about homeopathy and I would imagine there's nothing to it, but I'm only guessing!)

    One of the problems with skeptics is that they have this unshakable conviction that reality can only operate in certain ways. Hence, if no conceivable causal mechanism can be dreamed up, then no amount of contrary evidence will ever be sufficient to overturn this conviction. They will always prefer conventional explanations *no matter how convoluted*.




