Skeptophilia (skep-to-fil-i-a) (n.) - the love of logical thought, skepticism, and thinking critically. Being an exploration of the applications of skeptical thinking to the world at large, with periodic excursions into linguistics, music, politics, cryptozoology, and why people keep seeing the face of Jesus on grilled cheese sandwiches.

Tuesday, February 9, 2021

Fooling the experts

I was bummed to hear about the death of the inimitable Cloris Leachman a week and a half ago at the venerable age of 94.  She was probably most famous for her role as Frau Blücher *wild neighing horse noises* in the movie Young Frankenstein, but I was first introduced to her unsurpassed sense of comic timing in the classic 1970s sitcom The Mary Tyler Moore Show, where she played the tightly-wound self-styled intellectual Phyllis Lindstrom.

One of my favorite moments in that show occurred when Phyllis was playing a game of Scrabble against Mary's neighbor Rhoda Morgenstern (played with equal panache by Valerie Harper).  Rhoda puts down the word oxmersis, and Phyllis challenges it.

"There's no such thing as 'oxmersis,'" Phyllis says.

Rhoda looks at her, aghast.  "Really, Phyllis?  I cannot believe that someone who knows as much about psychology as you do has never heard of oxmersis."

Long pause, during which you can almost see the gears turning in Phyllis's head.  "Oh," she finally says.  "That oxmersis."

I was immediately reminded of that scene when I ran into a paper while doing some background investigation for yesterday's post, which was about psychologist David Dunning's research with Robert Proctor regarding the deliberate cultivation of stupidity.  This paper looked at a different aspect of ignorance -- what happens when you combine the Dunning-Kruger effect (people's tendency to overestimate their own intelligence and abilities) with a bias called Appeal to Authority.

Appeal to Authority, you probably know, is when someone uses credentials, titles, or educational background -- and no other evidence -- to support a claim.  Put simply, it is the idea that if Richard Dawkins said it, it must be true, regardless of whether the claim has anything to do with Dawkins's particular area of expertise, evolutionary biology.  (I pick Dawkins deliberately, because he's fairly notorious for having opinions about everything, and seems to relish being the center of controversy regardless of the topic.)  

Dunning teamed up with Cornell University researchers Stav Atir and Emily Rosenzweig, and came up with what could be described as the love child of Dunning-Kruger and Appeal to Authority.  And what this new phenomenon -- dubbed, predictably, the Atir-Rosenzweig-Dunning Effect -- shows us is that people who are experts in a particular field tend to think their expertise holds true even for disciplines far outside their chosen area of study, and because of that are more likely to fall for plausible-sounding falsehoods -- like Phyllis's getting suckered by Rhoda's "oxmersis" bluff.

[Image is in the Public Domain]

In one experiment, the three researchers asked people to rate their own knowledge in various academic areas, then asked them to rank their level of understanding of various finance-related terms, such as "pre-rated stocks, fixed-rate deduction and annualized credit."  The problem is, those three finance-related terms don't actually exist -- they were made up by the researchers to sound plausible.

The test subjects who had the highest confidence level in their own fields were most likely to fall for the ruse.  Simon Oxenham, who described the experiments in Big Think, says it's only natural.  "A possible explanation for this finding," Oxenham writes, "is that the participants with a greater vocabulary in a particular domain were more prone to falsely feeling familiar with nonsense terms in that domain because of the fact that they had simply come across more similar-sounding terms in their lives, providing more material for potential confusion."

Interestingly, subsequent experiments showed that the correlation holds true even if you take away the factor of self-ranking.  Presumably, someone who is cocky and arrogant and ranks his/her ability higher than is justified in one area would be likely to do it in others.  But when they tested the subjects' knowledge of terms from their own field -- i.e., actually measured their expertise -- high scores still correlated with overestimating their knowledge in other areas.

And telling the subjects ahead of time that some of the terms might be made up didn't change the results. "[E]ven when participants were warned that some of the statements were false, the 'experts' were just as likely as before to claim to know the nonsense statements, while most of the other participants became more likely in this scenario to admit they’d never heard of them," Oxenham writes.

I have a bit of anecdotal evidence supporting this result from my experience in the classroom.  On multiple-choice tests, I had to concoct plausible-sounding wrong answers as distractors.  Every once in a while, I ran out of good wrong answers, and just made something up.  (On one AP Biology quiz on plant biochemistry, I threw in the term "photoglycolysis," which sounds pretty fancy until you realize that there's no such thing.)   What I found was that it was the average to upper-average students who were the most likely to be taken in.  The top students didn't get fooled because they knew what the correct answer was; the lowest students were equally likely to pick any of the wrong answers, because they didn't understand the material well.  The mid-range students saw something that sounded technical and vaguely familiar -- and figured that if they weren't sure, it must be that they'd missed learning that particular term.

It was also the mid-range students who were most likely to miss questions where the actual answer seemed too simple.  Another botanical question I liked to throw at them was, "What do all non-vascular land plants have in common?"  I always provided three wrong answers with appropriately technical-sounding jargon.

The actual answer is, "They're small."

Interestingly, the reason for the small size of non-vascular land plants (the most familiar example is moss) isn't simple at all.  But the answer itself just looked too easy to merit being the correct choice on an AP Biology quiz.

So Atir, Rosenzweig, and Dunning have given us yet another mental pitfall to watch out for -- our tendency to use our knowledge in one field to overestimate our knowledge in others.  But I really should run along, and make sure that the annualized credit on my pre-rated stocks exceeds the recommended fixed-rate deduction.  I worry a lot about that kind of thing, but I suppose my anxiety is really just another case of excessive oxmersis.

*********************************

Science writer Elizabeth Kolbert established her reputation as a cutting-edge observer of the human global impact in her wonderful book The Sixth Extinction (which was a Skeptophilia Book of the Week a while back).  This week's book recommendation is her latest, which looks forward to where humanity might be going.

Under a White Sky: The Nature of the Future is an analysis of what Kolbert calls "our ten-thousand-year-long exercise in defying nature," something that immediately made me think of another book I've recommended -- the amazing The Control of Nature by John McPhee, the message of which was generally "when humans pit themselves against nature, nature always wins."  Kolbert takes a more nuanced view, and considers some of the efforts scientists are making to reverse the damage we've done, from conservation of severely endangered species to dealing with anthropogenic climate change.

It's a book that's always engaging and occasionally alarming, but overall, deeply optimistic about humanity's potential for making good choices.  Whether we turn that potential into reality is largely a function of educating ourselves regarding the precarious position into which we've placed ourselves -- and Kolbert's latest book is an excellent place to start.

[Note: if you purchase this book using the image/link below, part of the proceeds goes to support Skeptophilia!]
