It may seem self-evident, but it's still kind of disappointing. And the piece of research that showed this -- by Jack Cao, Max Kleiman-Weiner, and Mahzarin R. Banaji of Harvard University's Department of Psychology -- is as elegant as it is incontrovertible.
In "People Make the Same Bayesian Judgments They Criticize in Others," which appeared in November's issue of Psychological Science, we find out that people are quick to use dispassionate evidence and logic to make their own decisions, but don't like it when other people do the same thing.
What Cao et al. did was to present test subjects with a simple scenario. For example: a surgeon walks into the operating room to perform a procedure. Is the surgeon more likely to be male or female? In another scenario, you're being attended by a doctor and a nurse. One is male and one is female. Which is which?
Clearly, just by statistics -- regardless of what you think of issues of gender equality -- doctors and/or surgeons are more likely to be male and nurses more likely to be female. And, in fact, almost everyone applied that logic to their own choices. But then the researchers turned the tables. Instead of asking the subjects what they thought about the question, they presented the answers given by a fictional stranger. Jim answered that the surgeon and the doctor were more likely to be male and the nurse more likely to be female. How does Jim rank on scales of morality, intelligence, and respect for equal rights?
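The base-rate logic the subjects (and Jim) applied can be sketched in a few lines. The numbers below are purely illustrative assumptions for the sake of the arithmetic -- they are not figures from the study or from any real workforce survey:

```python
# Hypothetical base rates, chosen only for illustration:
p_doctor_male = 0.6   # assumed probability a randomly chosen doctor is male
p_nurse_male = 0.1    # assumed probability a randomly chosen nurse is male

# The scenario stipulates one male and one female, so there are only
# two possible assignments of gender to the (doctor, nurse) pair:
p_doc_male_nurse_female = p_doctor_male * (1 - p_nurse_male)
p_doc_female_nurse_male = (1 - p_doctor_male) * p_nurse_male

# Normalizing over the two possibilities gives the probability
# that the doctor is the man and the nurse is the woman:
total = p_doc_male_nurse_female + p_doc_female_nurse_male
print(round(p_doc_male_nurse_female / total, 3))  # 0.931
```

The point is that even a modest asymmetry in the base rates makes one answer much more probable than the other -- which is exactly the dispassionate calculation respondents made for themselves and then condemned in Jim.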
Based on that one piece of information, respondents were harsh. Almost across the board, people criticized Jim, saying he was less moral, less intelligent, and less likely to support equal rights than someone who had answered the other way. "People don't like it when someone uses group averages to make judgments about individuals from different social groups who are otherwise identical. They perceive that person as not only lacking in goodness, but also lacking in intelligence," Cao said in a press release on EurekAlert. "But when it comes to making judgments themselves, these people make the same type of judgment that they had so harshly criticized in others... This is important because it suggests that the distance between our values and the people we are is greater than we might think. Otherwise, people would not have made judgments in a way that they found to be morally bankrupt and incompetent in others."
[Image: Deval Kulshrestha, Statua Iustitiae, licensed under the Creative Commons CC BY-SA 4.0 license]
That gap between our stated values and our actual judgments is disturbing enough. The other takeaway, though, is even worse: how willing we are to be severely critical of other people based upon virtually nothing in the way of evidence. How often do we find out one thing about someone -- he's a Catholic, she's a Republican, he's a lawyer, she's a teenager -- and decide we know a great many other things about them without any further information? Worse still, once those decisions are made, we base our moral judgments on what we think we know, and they become very resistant to change.
As a high school teacher, I can't tell you the number of times I've been asked questions like, "How do you handle being disrespected by surly teenagers every day?" Well, the truth is, the vast majority of the kids in my classes aren't surly at all, and the last time I was seriously disrespected by a student was a very long time ago. But that knee-jerk judgment that if a person is a teenager, (s)he must be a pain in the ass is automatic and pervasive -- and remarkably difficult to challenge.
I think what this demands is a little bit of humility about our own fallibility. We can't help making judgments, but we need to step back and examine them for what they are before we simply accept them. Eradicating this kind of on-the-fly evaluation is the key to eliminating racism, sexism, and various other forms of bigotry that are based not on any kind of empirical evidence, but on our tendency to use one or two facts to infer complex understanding.
As Oliver Wendell Holmes put it, "No generalization is worth a damn, including this one." Or, to quote skeptic and writer Michael Shermer, "Don't believe everything you think."
****************************************
This week's Skeptophilia book recommendation is one of personal significance to me -- Michael Pollan's latest book, How to Change Your Mind. Pollan's phenomenal writing in tours de force like The Omnivore's Dilemma and The Botany of Desire shines through here, where he takes on a controversial topic -- the use of psychedelic drugs to treat depression and anxiety.
Hallucinogens like DMT, LSD, and psilocybin have long been classified as Schedule I drugs -- chemicals that are off limits even for research except through a rigorous and time-consuming approval process that seldom results in a thumbs-up. As a result, most researchers in mood disorders haven't even considered them, looking instead at more conventional antidepressants and anxiolytics. It's only recently that there's been renewed interest, when it was found that a single administration of drugs like ketamine, under controlled conditions, was enough to alleviate intractable depression, not just for hours or days but for months.
Pollan looks at the subject from all angles -- the history of psychedelics and why they've been taboo for so long, the psychopharmacology of the substances themselves, and the people whose lives have been changed by them. It's a fascinating read -- and I hope it generates a sea change in our attitudes toward chemicals that could help literally millions of people deal with disorders that can rob their lives of pleasure, satisfaction, and motivation.
[If you purchase the book from Amazon using the image/link below, part of the proceeds goes to supporting Skeptophilia!]