It was that, more than its in-your-face religiosity, that always bothered me the most about the bumper sticker I used to see that said, "Jesus said it, I believe it, and that settles it." And, to be fair, I've met people who were just as closed-minded about other things -- the controversy over the alleged dangers of GMOs and vaccines, the ethics of everything from capital punishment to abortion to eating meat, and a hundred different stances on political issues.
And then there's the person I know who once said, "If you don't believe that other species should have exactly the same rights as humans, I'm sorry -- you're wrong."
I believe it, and that settles it.
Highlighting the dangers of this attitude, and the advantages of adopting some sense of proportion about our own worldviews, was a study from Duke University published this week in the Personality and Social Psychology Bulletin. Called "Cognitive and Interpersonal Features of Intellectual Humility," this paper makes a strong case that we can go a long way toward improving communication, reaching consensus, getting along with our friends, neighbors, coworkers, and relatives, and (most importantly) recognizing when we ourselves are wrong by ditching our arrogance.
"If you think about what’s been wrong in Washington for a long time, it’s a whole lot of people who are very intellectually arrogant about the positions they have, on both sides of the aisle," said study lead author Mark Leary. "But even in interpersonal relationships, the minor squabbles we have with our friends, lovers and coworkers are often about relatively trivial things where we are convinced that our view of the world is correct and their view is wrong."
[image courtesy of photographer David Shankbone and the Wikimedia Commons]
I think this may be the feature of the current administration here in the United States that bothers me the most -- the steadfast determination never, ever to admit error. Confronted by incontrovertible fact, the reaction is not to say, "Okay, I was wrong," or even to retreat in disarray; they attack, deflect, distract, try to discredit, screech about "fake news" and "alternative facts," and promise reprisal against anyone who says different.
"Death before reconsideration" seems to be the motto these days.
So the study's results were illuminating, but hardly surprising. Intellectually humble people, they found, are more likely to applaud someone who changes his/her mind based on new evidence; the arrogant tend to label this as a "flip-flop," and consider it a sign of weak-mindedness. The arrogant, when reading an article with which they disagree, are more willing to label the author with pejorative adjectives -- immoral, incompetent, dishonest, cold.
Most interestingly, intellectually humble people are far better at discerning strong arguments from weak ones -- leading one to the conclusion that the arrogant tend to make snap judgments based on what they already believe, while the humble wait to see what the evidence says.
A fascinating bit of the study was that they found no correlation between intellectual humility and political leanings, which is another blow to the "my side is right about everything" attitude of the intellectually arrogant. "There are stereotypes about conservatives and religiously conservative people being less intellectually humble about their beliefs," Leary said. "We didn’t find a shred of evidence to support that."
A result I find fairly heartening. It's important for us to realize that our own team doesn't have the market cornered on truth, and to keep our minds open to the fact that we might, in fact, be seeing a biased view of the world ourselves.
Which, now that I come to think of it, is pretty much the definition of "intellectual humility."
"Not being afraid of being wrong – that’s a value, and I think it is a value we could promote," Leary added. "I think if everyone was a bit more intellectually humble we’d all get along better, we’d be less frustrated with each other."
To which I can only add: amen.
Of course, the problem is that people who are sure of themselves are more likely to act on their mistaken beliefs -- sometimes violently -- while people who weigh evidence tend to hold back.