- Philippa Foot's famous "Trolley Problem" thought experiment (1967), in which a person is presented with two scenarios, both of which result in one death to save five people -- but in one, the death is caused by an action with a mechanical intermediary (flipping a switch), while in the second, the death is caused by the person shoving someone off a bridge with their own hands. The interesting result is that humans don't view these as equivalent -- having a mechanical intermediary greatly reduces the emotional charge of the situation, and makes people much more likely to act, even though the outcomes are identical.
- The "Milgram experiment," conducted by Stanley Milgram in the early 1960s, which looked at the likelihood of someone hurting another person if commanded to do so by an authority figure. Turns out, most of us will...
- The Zurich tribalism experiment, conducted in Switzerland in 2015, which found that test subjects were willing to inflict painful shocks on others without activating their own empathy centers -- as long as the person being shocked was wearing the soccer jersey of a team the test subject disliked.
- Karen Wynn's "baby lab" experiment (2014), which found that even very young babies have an innate perception of fairness and morality, and want helpful individuals rewarded and unhelpful individuals punished.
Sing Sing Prison, 1915 [Image is in the Public Domain]
There are two problems with this.
First, in 2001, psychologists Alex Haslam and Stephen Reicher tried to replicate Zimbardo's results, and failed. What they suggested was that the outcome of the Stanford prison experiment wasn't due to the "guards" seeing the "prisoners" as enemies, but to the guards identifying with the experimenters -- in other words, their activities were being directed by an authority figure. So the experiment boils down to a rehash of what Milgram had done eight years earlier.
But there's a darker side to this, which I just found out about in an article in Medium by Ben Blum called "The Lifespan of a Lie." In it, Blum makes a disturbing claim: that Zimbardo hadn't done what he claimed, which was to break the students into groups randomly and give them no instructions other than "guards control prisoners, prisoners obey guards." He had actually coached the guards to behave cruelly -- and may have even encouraged one of the prisoners to go into hysterics.
The most famous breakdown, that of "prisoner" Doug Korpi, was dramatic -- he was locked in a closet by a guard, and proceeded to have a complete meltdown, screaming and crying and kicking the door. The problem, Korpi says, is that it was all an act, and both he and Zimbardo knew it. "Anybody who is a clinician would know that I was faking,” Korpi told Blum. "If you listen to the tape, it’s not subtle. I’m not that good at acting. I mean, I think I do a fairly good job, but I’m more hysterical than psychotic."
At least some of the guards were acting as well. One of the guards who had (according to Zimbardo) exhibited true cruelty toward the prisoners, Dave Eshelman, said his whole persona was a put-on. "I took it as a kind of an improv exercise,” Eshelman told Blum. "I believed that I was doing what the researchers wanted me to do, and I thought I’d do it better than anybody else by creating this despicable guard persona. I’d never been to the South, but I used a southern accent, which I got from Cool Hand Luke."
Zimbardo, of course, denies all of this, and spoke to Blum briefly -- mostly to say that the experiment was fine, and the claims of fraud all nonsense. Instead, he said that Haslam and Reicher's failed attempt at replication was "fraudulent," and the experiment itself valid. "It’s the most famous study in the history of psychology at this point," Zimbardo told Blum. "There’s no study that people talk about fifty years later. Ordinary people know about it. They say, ‘What do you do?’ ‘I’m a psychologist.’ It could be a cab driver in Budapest. It could be a restaurant owner in Poland. I mention I’m a psychologist, and they say, ‘Did you hear about the study?’ It’s got a life of its own now. If he wants to say it was all a hoax, that’s up to him. I’m not going to defend it anymore. The defense is its longevity."
Which, of course, is not much of a defense. Some really stupid ideas (I'm lookin' at you, homeopathy) have been around for ages. I do find it rather upsetting, though, and not just because I've been teaching an experiment for years that turns out not to have gone down the way the researchers claimed. It's a stain on science as a whole -- that we accepted the results of an experiment that failed replication, mostly because its outcome seemed so comforting. People aren't inherently immoral, the experiment suggested; they act immorally when they're placed in situations where it's expected. Alter the situation, it implied, and people will rise to higher motives.
Well, maybe. There are still a lot of questions about morality, and the other four experiments I teach have borne up to scrutiny. We do harm more easily when we're one step removed from the person being harmed, when an authority figure tells us to, when the harmed person doesn't belong to our "tribe," and when the recipient of punishment is perceived to have deserved it. But simply banding together, Lord of the Flies-style, to visit harm upon the helpless -- the evidence for that is far slimmer.
And I suppose the Zimbardo experiment will have to be transferred to a different lecture next year -- the one I do on examples of scientific fraud and researcher malfeasance.
This week's Skeptophilia book recommendation is a classic: the late Oliver Sacks's The Man Who Mistook His Wife for a Hat. It's required reading for anyone who is interested in the inner workings of the human mind, and highlights how fragile our perceptual apparatus is -- and how even minor changes in our nervous systems can result in our interacting with the world in what appear from the outside to be completely bizarre ways. Broken up into short vignettes about actual patients Sacks worked with, it's a quick and completely fascinating read.