Skeptophilia (skep-to-fil-i-a) (n.) - the love of logical thought, skepticism, and thinking critically. Being an exploration of the applications of skeptical thinking to the world at large, with periodic excursions into linguistics, music, politics, cryptozoology, and why people keep seeing the face of Jesus on grilled cheese sandwiches.
Showing posts with label nocebo effect. Show all posts

Monday, May 31, 2021

Wishing wells and chicken curses

Combatting magical thinking can be an uphill battle, sometimes.

Even as a diehard skeptic, I get where it comes from.  It can sometimes be an amazingly short trip from "I wish the world worked this way" to believing the world does work that way.  Besides wishful thinking, superstitions can sometimes arise from correlation/causation errors; the classic example is going to watch your favorite sports team while wearing a particular shirt, and your team wins, so you decide the shirt's a lucky charm and proceed to wear it to subsequent games.

This reminds me of one of my college philosophy teachers, who recounted to us something that happened the previous evening.  There'd been a big thunderstorm, and the power went out, and his three-year-old daughter got scared and said, "Daddy, make the lights come back on!"

So he stood up and said, in a thunderous voice, "LET THERE BE... LIGHT."

And the power came back on.

His daughter really respected him after that.  But I bet she started getting suspicious the next time there was a power outage, and his magical ability suddenly didn't work so well any more.

Once a superstitious belief is in place, it can be remarkably hard to eradicate.  You'd think that, like my professor's daughter, once you had some experience disconfirming your belief, you'd go, "Oh, okay, I guess my lucky shirt doesn't work after all."  But we've got a number of things going against us.  Confirmation bias -- we tend to give more weight to evidence that supports what we already believed to be true.  The sunk-cost fallacy -- when we've already put a lot of energy and time into supporting a claim, we're very reluctant to admit we were wrong and it was all a waste.

Another, and weirder, reason superstitions can get cemented into place is the peculiar (but substantiated) nocebo effect.  As you might guess, the nocebo effect is kind of an anti-placebo effect; nocebo is Latin for "I will harm" (placebo means "I will please").  When somebody believes that some magical action will cause them injury, they can sometimes sustain real harm -- apparently the expectation of harm manifests as actual, measurable symptoms.  (This has sometimes been used to explain cases where "voodoo curses" have resulted in the targets becoming ill.)

The reason this comes up is because of two recent discoveries of artifacts for delivering curses in ancient Greece.  The ancient Greeks were a fascinating mix of science and superstition -- but, as I mentioned above, that seems to be part of the human condition.  When we think of them, it's usually either in the context of all the scientific inquiries and deep thought by people like Aristarchus, Pythagoras, Archimedes, and Aristotle, or because they had gods and sub-gods and sub-sub-gods in charge of damn near everything.  

This latter tendency probably explains the 2,500-year-old tablets that were recovered from a well in Athens, each one containing a detailed curse targeting a specific person.  The people who wrote them didn't sign them; apparently, cursing someone and then signing it, "cordially yours, Kenokephalos," was considered a stupid move that was just asking for retribution.

The tablets, which were made of lead, were found by a team led by Jutta Stroszeck, director of the Kerameikos excavation on behalf of the German Archaeological Institute in Athens.  They were found in a well supplying a bath-house near the Dipylon -- the city gate near the classical Athenian Academy.

One of the curse tablets discovered by Stroszeck et al.

Apparently throwing the curses into the well started happening because the previous technique was to put them in the coffins of recently-deceased persons, with the intent that the dead guy's spirit would bring the curse-tablet down to the Underworld and say to Hades, "Hey, bro, get a load of this," and Hades would obligingly smite the recipient.  But around that time Athens tried to put the kibosh on people practicing the Black Arts, and made it illegal to put curses in coffins, so the would-be hexers started to throw them into wells instead.

You have to wonder if any ill effects their targets suffered upon drinking the well water came not from the curse, but from lead poisoning.

The second discovery, which was described in the journal Hesperia last week, is even more gruesome: the remains of a dead chicken that had been chopped up, its beak tied shut, then put in a clay vessel pierced with a nail.  It also contained a coin, presumably to pay whatever evil spirit found the cursed Chicken-o-Gram for carrying out the intent of the spell, which was probably to render the target unable to talk.  The paper describes a similar spell launched against one Libanos, a fourth-century C.E. Greek orator:

To his despair, Libanos had lost the ability to speak before an audience.  He could neither read nor write; he was plagued by severe headaches, bodily pain, and gout.  Libanos's condition improved upon the discovery and removal of a mutilated, dismembered chameleon, which had been hidden in his classroom -- a place where he spent much time.  The animal's head was bent between its hind legs, one of its front limbs cut off, and the other was stuffed into its mouth.

The weird mutilations and twisted pose had an obvious aim: to visit upon Libanos painful symptoms and an inability to speak.  What I suspect, though, is that the problems he had were purely natural in origin, and the discovery and removal of the curse acted as a placebo -- he thought, "Okay, now I should get better!", and did.

Why exactly the nocebo and placebo effects work isn't known; it may have to do with the production or inhibition (respectively) of stress hormones like cortisol and adrenaline, which are known to have long-term bad effects if levels stay high.  But honestly, that's just a guess.

Although I still think it's more likely than damage delivered directly by cursed chickens.

In any case, the discoveries are fascinating, and illustrate that the magical thinking we're still fighting today has a long genealogy.  Wouldn't it be nice if logic and science came as readily?

You have to wonder what the human race would have accomplished by now if we had an inborn tendency toward evidence-based thinking rather than believing in evil curses and wishing wells.

*************************************

Physicist Michio Kaku has a new book out, and he's tackled a doozy of a topic.

One of the thorniest problems in physics over the last hundred years, one which has stymied some of the greatest minds humanity has ever produced, is the quest for a Grand Unified Theory.  There are four fundamental forces in nature that we know about: the strong and weak nuclear forces, electromagnetism, and gravity.  The first three can now be modeled within a single quantum framework -- electromagnetism and the weak force are unified in the electroweak theory, which together with the theory of the strong force makes up the Standard Model -- but gravity has staunchly resisted incorporation.

The problem is, the other three forces can be explained by quantum effects, while gravity seems to have little to no effect on the realm of the very small -- and likewise, quantum effects have virtually no impact on the large scales where gravity rules.  Trying to combine the two results in self-contradictions and impossibilities, and even models that seem to eliminate some of the problems -- such as the highly-publicized string theory -- face their own set of deep issues, such as generating so many possible solutions that an experimental test is practically impossible.

Kaku's new book, The God Equation: The Quest for a Theory of Everything, describes the history and current status of this seemingly intractable problem, and does so with his characteristic flair and humor.  If you're interested in finding out where the cutting edge of physics lies, in terms that an intelligent layperson can understand, you'll really enjoy Kaku's book -- and come away with a deeper appreciation for how weird the universe actually is.

[Note: if you purchase this book using the image/link below, part of the proceeds goes to support Skeptophilia!]



Saturday, October 7, 2017

Voodoo in the brain

I'm sure you've heard about the placebo effect, but have you heard of the nocebo effect?

If you know a little Latin, you can guess what it means.  Placebo is Latin for "I will please;" nocebo for "I will harm."  The nocebo effect occurs when you expect something to cause you unpleasant symptoms, and even though what you've consumed is harmless, you experience the symptoms anyhow.

We've known about the nocebo effect for some time.  It gained prominence through investigations of "voodoo curses," where someone was cursed through a voodoo ritual, and lo and behold, the cursed individual sickens and dies.  Skeptical researchers don't attribute this to voodoo actually working; they have come to realize that when a person thinks they're going to become ill, perhaps even die, the expected outcome manifests in the body.

[image courtesy of photographer Marie-Lan Nguyen and the Wikimedia Commons]

A recent study gives us an even better lens into the nocebo effect, and how the brain influences health.  Any medical researcher will tell you that people in clinical trials of medications will often stop taking the pills they were given, usually citing unacceptable side effects.  What is less well known is that a substantial fraction of the people who end up dropping out of the trial actually were receiving an inert substance.

So the control group, in other words.  They were taking a sugar pill, but because they expected to have side effects from the medication, they went ahead and had side effects anyhow.

The most recent study, published in Science last week by four researchers at the University Medical Center of Hamburg, the University of Colorado, and Cambridge University, bears the unwieldy title "Interactions Between Brain and Spinal Cord Mediate Value Effects in Nocebo Hyperalgesia" -- and it had a fascinating result:

People in the control group of pharmaceutical clinical trials are more likely to have spurious unpleasant side effects if they're told the medication is expensive than if they're told it's cheap.

Furthermore, they have pinpointed the areas in the brain that are responsible for the foul-up.  The authors write:
Value information about a drug, such as the price tag, can strongly affect its therapeutic effect.  We discovered that value information influences adverse treatment outcomes in humans even in the absence of an active substance.  Labeling an inert treatment as expensive medication led to stronger nocebo hyperalgesia [negative side effects] than labeling it as cheap medication.  This effect was mediated by neural interactions between cortex, brainstem, and spinal cord.  In particular, activity in the prefrontal cortex mediated the effect of value on nocebo hyperalgesia.  Value furthermore modulated coupling between prefrontal areas, brainstem, and spinal cord, which might represent a flexible mechanism through which higher-cognitive representations, such as value, can modulate early pain processing.
Which is kind of amazing.  People who experience unexpected side effects are often labeled as hypochondriacs -- accused of knowing perfectly well that they feel fine, and of making up or exaggerating their symptoms out of fear or a desire for attention.  What's really happening appears to be far subtler.  Because of an expectation of harm, the brain actually manifests the symptoms the person feels they're likely to have.  Labeling the medication as expensive increases the subject's sense of having put something unusual into their bodies, resulting in more anxiety and worse side effects.

For me, the most interesting thing about this is the interaction of the brainstem and spinal cord, two parts of the central nervous system that are usually regarded as controlling completely involuntary responses, with the prefrontal cortex, often considered the most advanced part of the human brain -- the part that is associated with reasoning, decision making, and logic.  The fact that a freakout (to use the scientific terminology) in the prefrontal cortex activates a response in the brainstem is astonishing -- and also explains why people who experience the nocebo effect can manifest actual measurable medical symptoms.

And why some of them die.

All of which brings home once again how incredibly complex the brain is.  We're living at an exciting time -- the point where we're finally beginning to understand the thing in our heads that artificial intelligence pioneer Marvin Minsky called a "three-pound meat machine."  And, apparently, how easy it is for the machine to get fooled.  Kind of humbling, that.

Monday, June 23, 2014

Your days are numbered

Most people have heard of the placebo effect.  The name comes from the Latin word meaning "I will please," and refers to the phenomenon that people who are given an ineffective medication after being told that it will ameliorate their symptoms often find that the symptoms do, indeed, abate.  The mechanism is still not well elucidated -- it has been suggested that some of the effect might be caused by the brain producing "endogenous opioids" when a placebo is administered, causing decreased sensations of pain, feelings of well-being, and sounder sleep.  But the fact is, we still don't fully understand it.

Less well-known, but equally well-documented, is the nocebo effect.  "Nocebo" means "I will harm" in Latin, and it is more or less the placebo effect turned on its head.  If a person is told that something will cause pain, or bring him/her to harm, it sometimes does -- even if there's no rational reason why it would.  Individuals who believe in voodoo curses, for example, sometimes show actual medically detectable symptoms, even though such curses are merely empty superstition.  Nevertheless, if you believe in them, you might feel their effects.

Naturally, this further bolsters the superstition itself, which ramps up the anxiety and fear, which makes the nocebo more likely to happen the next time, and so round and round it goes.  And this seems to be what is happening right now in Uganda -- a bizarre phenomenon called "numbers disease."

In "numbers disease," an affected individual suddenly notices a raised pattern on his/her skin that looks like a number.  The number that appears, it is said, represents the number of days the person has left.  Once the number shows up, the individual begins to sicken, and when the allotted time is up, the person dies.

[image courtesy of the Wikimedia Commons]

Dr. Thomas Lutalo, of the Ugandan Ministry of Health, says that he is seeing a rapid increase in the incidence of the "disease," and has suggested that much of the hysteria might be due to relatively harmless skin infections like ringworm that worsen because of improper skin care.  Ringworm rashes are often irregular, meaning that if you're looking for a pattern (e.g. a number) you're likely to find one, especially given that any number will do.  Then, the superstition that gave rise to the "disease" lends itself to superstitious "cures" that often make some easily-treatable disease more serious.

The worst part is that this one-superstition-leads-to-another thing is generating an upswing in the belief in witchcraft, and is giving local religious leaders another tool for converting the fearful.  "Unfortunately, some Pentecostal pastors are already using the fear of the strange disease as a beacon for luring more followers to their worship centres with promises of a 'cure,'" said Dr. Harriet Birabwa, a psychiatrist at a hospital in the city of Butabika.  "It is a myth that needs to be dispelled immediately as very many people are dying because of harboring such baseless beliefs."

Which is all well and good to say, but as we've seen over and over, superstitions are awfully difficult to combat.  In my Critical Thinking class, I ask, "How many of you are superstitious?", and usually about half the class will cheerfully raise their hands -- despite the fact that it's hard to see how self-identifying as "superstitious" could be a good thing.  This generates a discussion about what they're superstitious about and why, and how we come to such conclusions despite there being little evidence for their veracity.  Fortunately, most of the superstitions I hear about in class are minor silliness -- on the level of a lucky keychain, a special pen to take tests with, or making sure that they put their left shoe on first because otherwise it'd be "bad luck."

But the whole superstitious mindset is counterfactual and irrational, and that in and of itself makes it worth fighting.  Why subscribe to a worldview within which sinister forces, over which you have no control, are capriciously doling out good and bad fortune, and for which (more importantly) there is no evidence whatsoever?  As we're seeing in Uganda, superstition is sometimes not as harmless as it seems, and can lead to fear, anxiety, physical harm, and allowing yourself to be manipulated by the unscrupulous.

So call to mind any superstitions you might fall prey to, and think about whether it might not be time to reconsider them.  Maybe it's time that irrationality's days are numbered... not yours.

Tuesday, July 17, 2012

Bias, self-awareness, and evil spirits

If there's anything that is a sign of true intelligence, it's caution about accepting ideas at face value.  The tendency of many, unfortunately, is to accept whatever they hear or read without question, especially if the claim comes from a reputable-looking source.

The issue becomes further complicated when we're biased ahead of time to accept (or reject) the source itself.  A study (here) by Charles Lord and Cheryl Taylor, of Texas Christian University, indicates that people are more likely to accept false statements as correct if they're told the statements came from someone whose political or religious stance they share, and conversely, to judge true statements false if they're told those statements came from a source in the opposite ideological camp.  Another study (here), by Emily Pronin, Daniel Yin, and Lee Ross of Stanford University, further indicates that just about everyone believes him/herself to be unbiased as compared to others; and worse still, a study by David Dunning (here) suggests that we are likely to rate ourselves as "above average" in knowledge, even in realms in which we score in the bottom quartile.

In other words, none of us is aware of how unperceptive, biased, and ignorant we actually are.

So, the salient question becomes: given that this is the case, how do we know what is true or false?

Well, in the absolute sense, we can't.  We're trapped inside our own skulls, and certainty about anything is probably unrealistic.  Science helps, because it establishes a baseline for validity, along with a reliance on hard data.  But even science doesn't solve the problem entirely; as James Burke, one of the finest thinkers I know of, said, in his wonderful documentary series The Day the Universe Changed, "Even when you get the raw data, the situation doesn't improve.  Because it isn't raw data.  It's what you expected to find.  You'd designed your equipment based on what you already thought was going to happen, so what your equipment is good at doing is finding the kind of data you reckoned you were going to find."

Still, the situation isn't as dire as all that, or we'd be in doubt about everything.  There are ways we can detect specious thinking, and an assortment of red flags that will alert us to bias, slant, and outright lies.  Let's look at one fairly simple example, which appeared in the rather goofy online magazine Who Forted? (although let's not dismiss it just because of the source; see paragraph 2).

Entitled "Bad Vibes: Can Dealing With Evil Spirits Kill You?", this article makes the claim that delving too deeply into the occult puts you in touch with "forces" that can have negative effects on your health.  "(W)hat about those few people who make it a career to deliver the mortal souls of sinners from the grip of evil?" the author, Greg Newkirk, asks.  "What of exorcists, demonologists, and ghost hunters with a flair for the dramatic and a reality show audience?  Is there a risk in placing yourself between a negative spirit and it’s [sic] prey?  Surely the religious will believe that it’s your own soul at stake, but do the scars of spiritual warfare have a physical manifestation?  What I’m asking essentially amounts to one question: Can the pursuit of evil spirits affect your heath?"

Newkirk then goes on to describe the various ways in which evil spirits could cause you harm, including (to his credit) the practitioner simply experiencing continuous stress, fear, and negative emotions -- i.e., the effect could be real even if the spirits themselves aren't.  (This, then, might qualify as a sort of nocebo effect -- a documented phenomenon in which a person who believes himself to be in harm's way from supernatural causes actually experiences negative health effects.)

The most interesting part, to me, is when Newkirk begins to list off various psychic researchers, exorcists, black magicians, and so on, gives a brief curriculum vitae for each, and describes how and at what age each died.  If you want the complete stories, check out the link, but here's a list of names, ages, and causes of death:
  • Malachi Martin, 78, brain hemorrhage
  • Ed Warren, 79, cause not listed (but was chronically ill during the last five years of his life)
  • Lou Gentile, early 40s, cancer
  • George Lutz, 59, cancer
  • Tom Robertson, still alive (from his photograph, he appears to be 60-ish), has prostate cancer
  • Ryan Buell, still alive (age 30), has pancreatic cancer
Several things jump out at me about this list:

1) It's short.  Beware of small sample sizes.  Given a small enough sample size, you can find just about any sort of statistically unlikely pattern you'd like.  (Sort of like if you rolled a die four times in a row, and got four sixes -- and decided that the chance of rolling sixes on a fair die was 100%.)
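The die-rolling analogy is easy to put numbers on.  A quick sketch (the fractions here are exact probabilities, not data from the article):

```python
# Back-of-the-envelope check of how misleading tiny samples can be.

# Probability that four rolls of a fair die are all sixes:
p_four_sixes = (1 / 6) ** 4  # 1/1296
print(f"P(four sixes in four rolls) = {p_four_sixes:.5f}")  # ~0.00077

# But the probability that four rolls all show the *same* face (any face)
# is six times larger -- one reason "surprising" streaks in small samples
# turn up far more often than intuition suggests:
p_any_streak = 6 * p_four_sixes  # 1/216
print(f"P(four identical rolls)     = {p_any_streak:.5f}")  # ~0.00463
```

Rare-looking patterns, in other words, aren't that rare once you let "any pattern will do" -- which is exactly the flexibility a short, hand-picked list of deaths gives you.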

2)  Given that the writer already had decided that working with evil spirits is dangerous, it's pretty likely he'd have selected examples that supported the conclusion he already had, and ignored ones that didn't.  This kind of cherry-picking of data isn't always this obvious -- unfortunately.

3)  Even despite #2, this was the best he could do?  The first two men listed actually lived longer than the average American (US male average life expectancy currently stands at 75.6 years).  A third, Tom Robertson, is still alive, and has a form of cancer that is often treatable.  A fourth, George Lutz, died young of cancer -- but one of two photographs of Lutz in the article shows him sitting with a cigarette in his hand, in front of a full ashtray!

My point here is that there's a middle ground between accepting a source whole-cloth or rejecting it out of hand.  There's no substitute for taking a cautious look at the argument presented, asking yourself some pointed questions about bias and slant (especially, given the Lord and Taylor study, if the source is one you habitually agree or disagree with!), and engaging your brain, before deciding one way or the other.  And, if there isn't enough information to decide, there's nothing wrong with simply holding a judgment in abeyance for a while -- indefinitely, if need be.

A wonderful take on the whole idea of how to analyze claims is the chapter entitled "The Fine Art of Baloney Detection" in Carl Sagan's wonderful book The Demon-Haunted World: Science as a Candle in the Dark (which, in my opinion, should be required reading in every high school science curriculum in the world).  Check it out, while you're taking a break from expelling evil spirits.  It'll be good for your health.