Skeptophilia (skep-to-fil-i-a) (n.) - the love of logical thought, skepticism, and thinking critically. Being an exploration of the applications of skeptical thinking to the world at large, with periodic excursions into linguistics, music, politics, cryptozoology, and why people keep seeing the face of Jesus on grilled cheese sandwiches.

Tuesday, February 9, 2021

Fooling the experts

I was bummed to hear about the death of the inimitable Cloris Leachman a week and a half ago at the venerable age of 94.  She's probably most famous for her role as Frau Blücher *wild neighing horse noises* in the movie Young Frankenstein, but I was first introduced to her unsurpassed sense of comic timing in the classic 1970s sitcom The Mary Tyler Moore Show, where she played the tightly-wound, self-styled intellectual Phyllis Lindstrom.

One of my favorite moments in that show occurred when Phyllis was playing a game of Scrabble against Mary's neighbor Rhoda Morgenstern (played with equal panache by Valerie Harper).  Rhoda puts down the word oxmersis, and Phyllis challenges it.

"There's no such thing as 'oxmersis,'" Phyllis says.

Rhoda looks at her, aghast.  "Really, Phyllis?  I can not believe that someone who knows as much about psychology as you do has never heard of oxmersis."

Long pause, during which you can almost see the gears turning in Phyllis's head.  "Oh," she finally says.  "That oxmersis."

I was immediately reminded of that scene when I ran into a paper while doing some background investigation for yesterday's post, which was about psychologist David Dunning's research with Robert Proctor regarding the deliberate cultivation of stupidity.  This paper looked at a different aspect of ignorance -- what happens when you combine the Dunning-Kruger effect (people's tendency to overestimate their own intelligence and abilities) with the logical fallacy called Appeal to Authority.

Appeal to Authority, you probably know, is when someone uses credentials, titles, or educational background -- and no other evidence -- to support a claim.  Put simply, it is the idea that if Richard Dawkins said it, it must be true, regardless of whether the claim has anything to do with Dawkins's particular area of expertise, evolutionary biology.  (I pick Dawkins deliberately, because he's fairly notorious for having opinions about everything, and seems to relish being the center of controversy regardless of the topic.)  

Dunning teamed up with Cornell University researchers Stav Atir and Emily Rosenzweig, and came up with what could be described as the love child of Dunning-Kruger and Appeal to Authority.  And what this new phenomenon -- dubbed, predictably, the Atir-Rosenzweig-Dunning Effect -- shows us is that people who are experts in a particular field tend to think their expertise holds true even for disciplines far outside their chosen area of study, and because of that are more likely to fall for plausible-sounding falsehoods -- like Phyllis's getting suckered by Rhoda's "oxmersis" bluff.

[Image is in the Public Domain]

In one experiment, the three researchers asked people to rate their own knowledge in various academic areas, then asked them to rank their level of understanding of various finance-related terms, such as "pre-rated stocks, fixed-rate deduction and annualized credit."  The problem is, those three finance-related terms actually don't exist -- i.e., they were made up by the researchers to sound plausible.

The test subjects who had the highest confidence level in their own fields were most likely to fall for the ruse.  Simon Oxenham, who described the experiments in Big Think, says it's only natural.  "A possible explanation for this finding," Oxenham writes, "is that the participants with a greater vocabulary in a particular domain were more prone to falsely feeling familiar with nonsense terms in that domain because of the fact that they had simply come across more similar-sounding terms in their lives, providing more material for potential confusion."

Interestingly, subsequent experiments showed that the correlation holds true even if you take away the factor of self-ranking.  Presumably, someone who is cocky and arrogant and ranks his/her ability higher than is justified in one area would be likely to do it in others.  But when they tested the subjects' knowledge of terms from their own field -- i.e., actually measured their expertise -- high scores still correlated with overestimating their knowledge in other areas.

And telling the subjects ahead of time that some of the terms might be made up didn't change the results. "[E]ven when participants were warned that some of the statements were false, the 'experts' were just as likely as before to claim to know the nonsense statements, while most of the other participants became more likely in this scenario to admit they’d never heard of them," Oxenham writes.
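
As an aside on method, results like this are often scored with what's called an "overclaiming" measure: mix real terms with nonexistent foils, and count how many foils a person claims to recognize.  Here's a minimal sketch in Python of that kind of scoring -- the term lists, the 1-7 rating scale, and the cutoff are my own illustrative assumptions, not the actual materials from the study.

    # Illustrative "overclaiming" scoring.  The terms, scale, and cutoff below are
    # made up for the sake of the example; they are not the study's actual materials.
    REAL_TERMS = ["fixed-rate mortgage", "Roth IRA", "inflation"]
    FOIL_TERMS = ["pre-rated stocks", "fixed-rate deduction", "annualized credit"]  # don't exist

    def familiarity_fraction(ratings, terms, cutoff=4):
        """Fraction of the given terms rated as familiar (>= cutoff on a 1-7 scale)."""
        familiar = [t for t in terms if ratings.get(t, 1) >= cutoff]
        return len(familiar) / len(terms)

    # One hypothetical participant: genuinely knowledgeable about the real terms,
    # but also confident about two of the three terms that don't exist.
    participant = {
        "fixed-rate mortgage": 7, "Roth IRA": 6, "inflation": 7,
        "pre-rated stocks": 5, "fixed-rate deduction": 6, "annualized credit": 2,
    }

    knowledge = familiarity_fraction(participant, REAL_TERMS)     # 1.0
    overclaiming = familiarity_fraction(participant, FOIL_TERMS)  # ~0.67
    print(knowledge, overclaiming)

The central finding amounts to a positive correlation between those two numbers across participants: the better people score on the real terms, the more of the fake ones they claim to know -- and the warning that some terms are made up doesn't flatten the relationship.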

I have a bit of anecdotal evidence supporting this result from my experience in the classroom.  On multiple-choice tests, I had to concoct plausible-sounding wrong answers as distractors.  Every once in a while, I ran out of good wrong answers, and just made something up.  (On one AP Biology quiz on plant biochemistry, I threw in the term "photoglycolysis," which sounds pretty fancy until you realize that there's no such thing.)   What I found was that it was the average to upper-average students who were the most likely to be taken in.  The top students didn't get fooled because they knew what the correct answer was; the lowest students were equally likely to pick any of the wrong answers, because they didn't understand the material well.  The mid-range students saw something that sounded technical and vaguely familiar -- and figured that if they weren't sure, it must be that they'd missed learning that particular term.

It was also the mid-range students who were most likely to miss questions where the actual answer seemed too simple.  Another botanical question I liked to throw at them was, "What do all non-vascular land plants have in common?"  I always provided three wrong answers with appropriately technical-sounding jargon.

The actual answer is, "They're small."

Interestingly, the reason for the small size of non-vascular land plants (the most familiar example is moss) isn't simple at all.  But the answer itself just looked too easy to merit being the correct choice on an AP Biology quiz.

So Atir, Rosenzweig, and Dunning have given us yet another mental pitfall to watch out for -- our tendency to use our knowledge in one field to overestimate our knowledge in others.  But I really should run along, and make sure that the annualized credit on my pre-rated stocks exceeds the recommended fixed-rate deduction.  I worry a lot about that kind of thing, but I suppose my anxiety is really just another case of excessive oxmersis.

*********************************

Science writer Elizabeth Kolbert established her reputation as a cutting-edge observer of the human global impact in her wonderful book The Sixth Extinction (which was a Skeptophilia Book of the Week a while back).  This week's book recommendation is her latest, which looks forward to where humanity might be going.

Under a White Sky: The Nature of the Future is an analysis of what Kolbert calls "our ten-thousand-year-long exercise in defying nature," something that immediately made me think of another book I've recommended -- the amazing The Control of Nature by John McPhee, the message of which was generally "when humans pit themselves against nature, nature always wins."  Kolbert takes a more nuanced view, and considers some of the efforts scientists are making to reverse the damage we've done, from conservation of severely endangered species to dealing with anthropogenic climate change.

It's a book that's always engaging and occasionally alarming, but overall, deeply optimistic about humanity's potential for making good choices.  Whether we turn that potential into reality is largely a function of educating ourselves regarding the precarious position into which we've placed ourselves -- and Kolbert's latest book is an excellent place to start.

[Note: if you purchase this book using the image/link below, part of the proceeds goes to support Skeptophilia!]



Monday, February 8, 2021

Viral stupidity

My dad used to say that ignorance was only skin deep, but stupid goes all the way to the bone.

There's a lot to that.  Ignorance can be cured; after all, the word literally means "not knowing" -- a negative prefix attached to the root gno- (to know).  There are plenty of things I'm ignorant about, but I try my best to be willing to cure that ignorance by working at understanding.

Stupidity, on the other hand, is a different matter.  There's something willful about stupidity.  There's a stubborn sense of "I don't know and I don't care," leading to my dad's wise assessment that on some level stupidity is a choice.  Stupidity is not simply ignorance; it's ignorance plus the decision that ignorance is good enough.

What my dad may not have realized, though, is that there's a third circle of hell, one step down even from stupidity.  Science historian Robert Proctor of Stanford University has made this his area of study, a field he has christened agnotology -- the "study of culturally-constructed ignorance."

Proctor is interested in something that makes stupidity look positively innocent: the deliberate cultivation of stupidity by people who are actually intelligent.  This happens when special interest groups foster confusion among laypeople for their own malign purposes, and see to it that such misinformation goes viral.  For example, this is clearly what has been going on for years with respect to anthropogenic climate change.  There are plenty of people in the petroleum industry who are smart enough to read and understand scientific papers, who can evaluate data and evidence, who can follow a rational argument.  That they do so, and still claim to be unconvinced, is stupidity.

That they then lie and misrepresent the science to cast doubt in the minds of less well-informed people, all in order to push a corporate agenda, is one step worse.

"People always assume that if someone doesn't know something, it's because they haven't paid attention or haven't yet figured it out," Proctor says.  "But ignorance also comes from people literally suppressing truth—or drowning it out—or trying to make it so confusing that people stop caring about what's true and what's not."

Anyone else immediately think of Fox News and OAN?  Deliberately cultivating stupidity is their stock in trade.

[Image is licensed under the Creative Commons Betacommand at en.wikipedia, Stupidity is contagious, CC BY 3.0]

The same sort of thing accounts for the claim that COVID was deliberately created by China as a biological weapon, that the illness and death rates are being manipulated to make Donald Trump look bad, and that masks are completely ineffective.  It's behind claims that there was widespread anti-Trump voter fraud in the last election, that every single mass shooting in the United States is an anti-Second-Amendment "false flag," and just about every claim ever made by Sean Hannity.  Proctor says the phenomenon is even responsible for the spread of creationism -- although I would argue that this isn't quite the same thing.  Most of the people pushing creationism are, I think, true believers, not cynical hucksters who know perfectly well that what they're saying isn't true and are only spreading the message to bamboozle the masses.  (Although I have to admit that the "why are there still monkeys?" and "the Big Bang means that nothing exploded and made everything" arguments are beginning to seem themselves like they're one step lower than stupidity, given how many times these objections have been answered.)

"Ignorance is not just the not-yet-known, it’s also a political ploy, a deliberate creation by powerful agents who want you 'not to know'," Proctor says.  "We live in a world of radical ignorance, and the marvel is that any kind of truth cuts through the noise.  Even though knowledge is accessible, it does not mean it is accessed."

David Dunning of Cornell University, who gave his name to the Dunning-Kruger effect (the idea that people systematically overestimate their own knowledge), agrees with Proctor.  "While some smart people will profit from all the information now just a click away, many will be misled into a false sense of expertise," Dunning says.  "My worry is not that we are losing the ability to make up our own minds, but that it’s becoming too easy to do so.  We should consult with others much more than we imagine.  Other people may be imperfect as well, but often their opinions go a long way toward correcting our own imperfections, as our own imperfect expertise helps to correct their errors."

All of which, it must be said, is fairly depressing.  That we can have more information at our fingertips than ever before in history, and still be making the same damned misjudgments, is a dismal conclusion.  It is worse still that there are people who are taking advantage of this willful ignorance to push popular opinion around for their own gain.

So my dad was right; ignorance is curable, stupidity reaches the bone.  And the deliberate cultivation of stupidity studied by Proctor and Dunning, I think, goes past the bone, all the way to the heart.




Wednesday, January 27, 2021

Overcoming the snap

One of the most frustrating things about conspiracy theorists is how resistant they are to changing their minds, even when presented with incontrovertible evidence.

Look, for example, at the whole "Stop the Steal" thing.  There are a significant number of Republicans who still won't acknowledge that Biden won the election fair and square, despite the fact that the opposite claim -- that there was widespread voter fraud that favored the Democrats, and an organized effort by the Left to make it seem like Trump lost an election he actually "won in a landslide" -- has gone to court in one form or another over sixty times, and in all but one case the lawsuit was thrown out because of a complete lack of evidence.  The judges who made these decisions include both Republicans and Democrats; the legal response to "Stop the Steal" has been remarkably bipartisan.

Which, you'd think, would be enough, but apparently it isn't.  An amazingly small number of Republicans have said publicly that they were wrong, that there was little to no fraud, certainly not enough to sway the election, and that Biden clearly was the victor.  Mostly, the lack of evidence and the losses in court have caused the True Believers to double down, making them even surer that a vast conspiracy robbed Trump of his win -- and that the lack of any kind of factual credibility just means there's an even vaster conspiracy to cover it all up.

Essentially, people have gone from "believe this because there's evidence" to "believe this despite the fact there's no evidence" to "believe this because there's no evidence."

[Image licensed under the Creative Commons SkepticalScience, Conspiracy Theories Fallacy Icon, CC BY-SA 4.0]

Once you've landed in the last-mentioned category, it's hard to see what possible way there'd be to reach you.  But there may be hope, to judge by a study that came out last week in The Journal of Personality and Social Psychology.

In "Jumping to Conclusions: Implications for Reasoning Errors, False Belief, Knowledge Corruption, and Impeded Learning," by Carmen Sanchez of the University of Illinois - Urbana/Champaign and David Dunning of the University of Michigan (of Dunning-Kruger fame), we find out that there is a strong (and fascinating) correlation between four features of the human psyche:

  • Jumping to conclusions -- participants were given a task in which a computerized character was fishing in a lake.  The lake had mostly red fish and a few gray fish, and the researchers looked at how quickly the test subject was confident about predicting the color of the next fish pulled from the lake.  (A toy simulation of this kind of task appears just after this list.)
  • Certainty about false beliefs -- volunteers were given a test of their knowledge of American history, and for each four-answer multiple choice question they were asked how confident they were in their answer.  The researchers looked at people who got things wrong -- while simultaneously being certain they were right.
  • Understanding of basic logic -- participants were given a variety of logic puzzles, such as simple syllogisms (All fish can swim; sharks are fish; therefore sharks can swim), and asked to pick out which ones were sound logic and which were faulty.
  • Belief in conspiracy theories -- test subjects were given a variety of common conspiracy theories, such as the belief that cellphones cause cancer but it's being covered up by big corporations, and asked to rank how likely they thought the beliefs were to be true.
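
To make the fishing task a bit more concrete, here's a toy simulation in Python.  The 80/20 red-to-gray mix, the uniform prior, and the confidence cutoffs are my own illustrative assumptions, not the actual parameters from Sanchez and Dunning's task; the point is simply that the measure of interest is how few fish someone watches before committing to a prediction.

    # Toy "draws to decision" simulation for a fishing task like the one described above.
    # The 80/20 fish mix and the confidence cutoffs are illustrative assumptions only.
    import random

    def draws_before_commitment(true_red_fraction=0.8, confidence_cutoff=0.9,
                                max_draws=20, seed=None):
        """Count how many fish are observed before the estimated probability that
        the next fish is red reaches the cutoff (simple Beta(1,1) updating)."""
        rng = random.Random(seed)
        reds, grays = 1, 1  # uniform prior: pretend one fish of each color was already seen
        for draw in range(1, max_draws + 1):
            if rng.random() < true_red_fraction:
                reds += 1
            else:
                grays += 1
            if reds / (reds + grays) >= confidence_cutoff:
                return draw
        return max_draws  # never got confident enough within the allowed draws

    # A cautious criterion takes many draws; a lax one "jumps" after the first fish or two.
    print(draws_before_commitment(confidence_cutoff=0.9, seed=1))
    print(draws_before_commitment(confidence_cutoff=0.6, seed=1))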

They found that the faster you are to jump to conclusions on the fish test, the worse you are at logic, and the more certain you are about your beliefs even if they are wrong -- and, most critically, the more likely you are to believe spurious, zero-evidence claims.

So far, nothing too earth-shattering, and I think most of us could have predicted the outcome.  But what makes this study fascinating is that Sanchez and Dunning looked at interventions that could slow people down and make them less likely to jump to false conclusions -- and therefore, less likely to feel certain about their own false or counterfactual beliefs.

The intervention had four parts:

  • An explanation of the "jumping to conclusions" phenomenon, including an explanation of why it happens in the brain and the fact that we are all prone to this kind of thing.
  • An acknowledgement of the difficulty of making a correct decision based on incomplete information.  Test subjects were shown a zoomed-in photo that was then zoomed out a little at a time; they had to decide when they were sure of what they were looking at.
  • An exercise in studying optical illusions.  Here, the point was to illustrate the inherent flaws of our own sensory-integrative mechanisms, and how focusing on one thing can make you miss details elsewhere that might give you more useful information.
  • A short video of a male jogger who compliments a female street artist, and gets no response.  He repeats himself, finally becoming agitated and shouting at her, but when she reacts with alarm he turns and runs away.  Later, he finds she has left him a picture she drew, along with a note explaining that she's deaf -- leaving the guy feeling pretty idiotic and ashamed of himself.  This was followed up by asking participants to write down snap judgments they'd made that later proved incorrect, and what additional information they'd have needed in order to get it right.

This is where I got a surprise, because I've always thought of believers in the counterfactual as being essentially unreachable.  And the intervention seems like pretty rudimentary stuff, something that wouldn't affect you unless you were already primed to question your own beliefs.  But what Sanchez and Dunning found is that the individuals who received the intervention did much better on subsequent tasks than the control group did -- they were more accurate in assessing their own knowledge, slower to make snap judgments, and less confident about crediting conspiracy theories.

I don't know about you, but I find this pretty hopeful.  It once again reinforces my contention that one of the most important things we can do in public schools is to teach basic critical thinking.  (And in case you didn't know -- I have an online critical thinking course through Udemy that is available for purchase, and which has gotten pretty good reviews.)

So taking the time to reason with people who believe in conspiracies can actually be productive, and not the exercise in frustration and futility I thought it was.  Maybe we can reach the "Stop the Steal" people -- with an intervention that is remarkably simple.  It's not going to fix them all, nor eradicate such beliefs entirely, but you have to admit that at this point, any movement in the direction of rationality is worth pursuing.

****************************************

Just last week, I wrote about the internal voice most of us live with, babbling at us constantly -- sometimes with novel or creative ideas, but most of the time (at least in my experience) with inane nonsense.  The fact that this internal voice is nearly ubiquitous, and what purpose it may serve, is the subject of psychologist Ethan Kross's wonderful book Chatter: The Voice in our Head, Why it Matters, and How to Harness It, released this month and already winning accolades from all over.

Chatter not only analyzes the inner voice in general terms, but looks at specific case studies where the internal chatter brought spectacular insight -- or short-circuited the individual's ability to function entirely.  It's a brilliant analysis of something we all experience, and gives some guidance not only into how to quiet it when it gets out of hand, but to harness it for boosting our creativity and mental agility.

If you're a student of your own inner mental workings, Chatter is a must-read!

[Note: if you purchase this book using the image/link below, part of the proceeds goes to support Skeptophilia!]



Monday, July 25, 2016

Fooling the experts

Today we consider what happens when you blend Appeal to Authority with the Dunning-Kruger Effect.

Appeal to Authority, you probably know, is when someone uses credentials, titles, or educational background -- and no other evidence -- to support a claim.  Put simply, it is the idea that if Stephen Hawking said it, it must be true, regardless of whether the claim has anything to do with Hawking's particular area of expertise.  The Dunning-Kruger Effect, on the other hand, is the idea that people tend to wildly overestimate their abilities, even in the face of evidence to the contrary, which is why we all think we're above average drivers.

Well, David Dunning (of the aforementioned Dunning-Kruger Effect) has teamed up with Cornell University researchers Stav Atir and Emily Rosenzweig, and come up with the love child of Dunning-Kruger and Appeal to Authority.  And what this new phenomenon -- dubbed, predictably, the Atir-Rosenzweig-Dunning Effect -- shows us is that people who are experts in a particular field tend to think that expertise holds true even for disciplines far outside their chosen area of study.

[image courtesy of the Wikimedia Commons]

In one experiment, the three researchers asked people to rate their own knowledge in various academic areas, then asked them to rank their level of understanding of various finance-related terms, such as "pre-rated stocks, fixed-rate deduction and annualized credit."  The problem is, those three finance-related terms actually don't exist -- i.e., they were made up by the researchers to sound plausible.

The test subjects who had the highest confidence level in their own fields were most likely to get suckered.  Simon Oxenham, who described the experiments in Big Think, says it's only natural.  "A possible explanation for this finding," Oxenham writes, "is that the participants with a greater vocabulary in a particular domain were more prone to falsely feeling familiar with nonsense terms in that domain because of the fact that they had simply come across more similar-sounding terms in their lives, providing more material for potential confusion."

Interestingly, subsequent experiments showed that the correlation holds true even if you take away the factor of self-ranking.  Presumably, someone who is cocky and arrogant and ranks his/her ability higher than is justified in one area would be likely to do it in others.  But when they tested the subjects' knowledge of terms from their own field -- i.e., actually measured their expertise -- high scores still correlated with overestimating their knowledge in other areas.

And telling the subjects ahead of time that some of the terms might be made up didn't change the results.  "[E]ven when participants were warned that some of the statements were false, the 'experts' were just as likely as before to claim to know the nonsense statements, while most of the other participants became more likely in this scenario to admit they’d never heard of them," Oxenham writes.

I have a bit of anecdotal evidence supporting this result from my experience in the classroom.  On multiple-choice tests, I have to concoct plausible-sounding wrong answers as distractors.  Every once in a while, I run out of good wrong answers, and just make something up.  (On one AP Biology quiz on plant biochemistry, I threw in the term "photoglycolysis," which sounds pretty fancy until you realize that it doesn't exist.)  What I find is that it's the average to upper-average students who are the most likely to be taken in by the ruse.  The top students don't get fooled because they know what the correct answer is; the lowest students are equally likely to pick any of the wrong answers, because they don't understand the material well.  The mid-range students see something that sounds technical and vaguely familiar -- and figure that if they aren't sure, it must be that they missed learning that particular term.

It's also the mid-range students who are most likely to miss questions where the actual answer seems too simple.  Another botanical question I like to throw at them is "What do all non-vascular land plants have in common?"  There are three wrong answers with appropriately technical-sounding jargon.

The actual answer is, "They're small."

Interestingly, the reason non-vascular land plants are small isn't simple at all.  But the answer itself just looks too easy to merit being the correct choice on an AP Biology quiz.

So Atir, Rosenzweig, and Dunning have given us yet another mental pitfall to watch out for -- our tendency to use our knowledge in one field to overestimate our knowledge in others.  But I really should run along, and make sure that the annualized credit on my pre-rated stocks exceeds the recommended fixed-rate deduction.  I'm sure you can appreciate how important that is.

Thursday, June 30, 2016

Viral stupidity

My dad used to say that ignorance was only skin deep, but stupid goes all the way to the bone.

There's a lot to that.  Ignorance can be cured; after all, the word literally means "not knowing" -- a negative prefix attached to the root gno- (to know).  There are plenty of things I'm ignorant about, but I'm always willing to cure that ignorance by working at understanding.

Stupidity, on the other hand, is a different matter.  There's something willful about stupidity.  There's a stubborn sense of "I don't know and I don't care," leading to my dad's wise assessment that on some level stupidity is a choice.  Stupidity is not simply ignorance; it's ignorance plus the decision that ignorance is good enough.

What my dad may not have realized, though, is that there's a third circle of hell, one step down even from stupidity.  Science historian Robert Proctor of Stanford University has made this his area of study, a field he has christened agnotology -- the "study of culturally constructed ignorance."

Proctor is interested in something that makes stupidity look positively innocent: the deliberate cultivation of stupidity by people who are actually intelligent.  This happens when special interest groups foster confusion among laypeople for their own malign purposes, and see to it that such misinformation goes viral.  For example, this is clearly what is happening with respect to anthropogenic climate change.  There are plenty of people in the petroleum industry who are smart enough to read and understand scientific papers, who can evaluate data and evidence, who can follow a rational argument.  That they do so, and still claim to be unconvinced, is stupidity.

That they then lie and misrepresent the science to cast doubt in the minds of less well-informed people, all in order to push a corporate agenda, is one step worse.

"People always assume that if someone doesn't know something, it's because they haven't paid attention or haven't yet figured it out," Proctor says.  "But ignorance also comes from people literally suppressing truth—or drowning it out—or trying to make it so confusing that people stop caring about what's true and what's not."

[image courtesy of Nevit Dilman and the Wikimedia Commons]

The same sort of thing accounts for the continuing claims that President Obama is a secret Muslim, that Hillary Clinton was personally responsible for the Benghazi attacks, that jet impacts were insufficient to bring down the Twin Towers on 9/11 so it must have been an "inside job."  Proctor says the phenomenon is even responsible for the spread of creationism -- although I would argue that this isn't quite the same thing.  Most of the people pushing creationism are, I think, true believers, not cynical hucksters who know perfectly well that what they're saying isn't true and are only spreading the message to bamboozle the masses.  (Although I have to admit that the "why are there still monkeys?" and "the Big Bang means that nothing exploded and made everything" arguments are beginning to seem themselves like they're one step lower than stupidity, given how many times these objections have been answered.)

"Ignorance is not just the not-yet-known, it’s also a political ploy, a deliberate creation by powerful agents who want you 'not to know'," Proctor says.  "We live in a world of radical ignorance, and the marvel is that any kind of truth cuts through the noise.  Even though knowledge is accessible, it does not mean it is accessed."

David Dunning of Cornell University, who gave his name to the Dunning-Kruger effect (the idea that people systematically overestimate their own knowledge), agrees with Proctor.  "While some smart people will profit from all the information now just a click away, many will be misled into a false sense of expertise," Dunning says.  "My worry is not that we are losing the ability to make up our own minds, but that it’s becoming too easy to do so.  We should consult with others much more than we imagine.  Other people may be imperfect as well, but often their opinions go a long way toward correcting our own imperfections, as our own imperfect expertise helps to correct their errors."

All of which, it must be said, is fairly depressing.  That we can have more information at our fingertips than ever before in history, and still be making the same damned misjudgments, is a dismal conclusion.  It is worse still that there are people who are taking advantage of this willful ignorance to push popular opinion around for their own gain.

So my dad is right; ignorance is curable, stupidity reaches the bone.  And what Proctor and Dunning study, I think, goes past the bone, all the way to the heart.

Wednesday, July 22, 2015

Color me blue

There's been a recent surge in popularity of questionnaire-based tests that supposedly tell you which of four "personality colors" you belong to.  (Here's a typical example.)  You're given questions like:
When facing a big project, you are...
  • deadline-driven
  • worrying
  • researching
  • making it a group effort
And after twenty or so questions of this sort, you're sorted into one of four "color groups," a little like what the Sorting Hat does at Hogwarts, only less reliable.


I throw in the "less reliable" part not only because we are being given a schema that puts every human on the Earth into one of four categories (hell, even the astrologers admit there are twelve), but because the whole thing relies on self-assessment.  When you take these tests, you're not finding out what you're really like, you're finding out what you think you're like.

Which is clearly not the same thing.  We're notoriously bad judges of our own personalities.  In their 2008 paper "Faulty Self-Assessment: Why Evaluating One’s Own Competence Is an Intrinsically Difficult Task," Cornell University psychologists Travis J. Carter and David Dunning had the following to say:
(A)lthough the exhortation to ‘know oneself’ has a long and venerable history, recent investigations in behavioral science paint a vexing and troubling portrait about people’s success at self-insight. Such research increasingly shows that people are not very good at assessing their competence and character accurately.  They often hold self-perceptions that wander a good deal away from the reality of themselves...  (T)he extant psychological literature suggests that people have some, albeit only a meager, amount of self-insight.
And they quote Ann Landers's trenchant quip, "Know yourself.  Don’t accept your dog’s admiration as conclusive evidence that you are wonderful."

So trying to reach self-discovery from a series of restricted-choice questions you answer about yourself has about as much likelihood of revealing some hitherto unguessed truth as those Facebook quizzes that claim to tell you what character from Game of Thrones you are.

What is more vexing is that despite the fact that these tests are only telling you what you think about yourself, the whole "color group" thing is gaining a lot of ground in the business world as a way of improving relational dynamics in the workplace.  Don't believe me?  Check out this article over at Knoji by M. J. Grueso, who tells us the following:
Most companies use a color personality test in order to better understand these personality differences and how to make it work for everyone.  Understanding the different personalities is important not just for big companies but for us as individuals as this will make it easier for us to learn how to better deal with colleagues and clients...  Experts have determined that there are four basic personality types. Yellow, Red, Blue and Green.  And it doesn't have anything to do with a person's favorite color.  As an individual, learning our color personality is also important.  First, because it helps us to better understand ourselves and why we react to certain situations a certain way.  Second, when we understand who we are, it allows us to open ourselves to at least try to understand others as well.
Which all sounds pretty nifty.  But then I started wondering, "Who are these experts?"  And I found out that the whole color-personality thing was the brainchild of one Carol Ritberger, who is the "renowned psychologist" mentioned in the link in the first paragraph of this post...

... but who actually isn't a psychologist at all.  She describes herself as an "innovative leader in the fields of personality behavioral psychology and behavioral medicine," but later goes on to say that her credentials are "a doctorate in Theology and a doctorate in Esoteric Philosophy and Hermetic Science."

Which are about as related to the science of behavioral psychology as alchemy is to chemistry.

But despite having no apparent training in medical science, she claims to have the ability to do what she calls "intuitive healing:"
Our mission is to provide programs that train participants in the science and art of intuitive diagnostics, qualified to work in concert with medical practitioners in the process of healing. 
We stand at the threshold of a time of compelling change-a positive major shift is taking place, and that shift is having a dramatic impact on our lives.  We are compelled to talk about it and to seek to understand it.  It is awakening a new energy force within each of us that is causing dynamic change to occur within the physical body and the human energy system.  We are changing to forms of light that are not as we have previously known them, and are becoming more vibrant, more radiant, and more empowered.  This new energy force is changing our way of thinking and is illuminating a whole new dimension of our persona.  It is creating the need for intense self-exploration and we are being nudged, pushed, and driven to learn more about who we really are.  It is fueling the desire to better understand ourselves-its energy is assisting us in seeking to get in touch with our very souls.  We are being guided to look beyond the obvious and that which our five senses understand.  This new energy force is sensitizing us to the need to develop our thinking while our mental processing remains the same, and the way we perceive our lives is going through a radical change.  Consciousness, as we have known it, is expanding... 
Medical intuition is both an art and a science.  It is a learnable diagnostic skill that provides insight into how the body, mind, and spirit connection interrelates with one's health and well being.
I don't know about you, but if I've got some sort of medical condition, psychological or otherwise, I'd prefer to be treated by an individual with the proper training and credentials, rather than by someone who diagnoses me through "intuition" and babbles about undefined "energy forces" that are "changing our physical bodies" and "expanding our consciousness."

So the whole what-color-are-you thing (1) doesn't tell you anything you didn't already believe, (2) is only as accurate as your own ability to self-assess, and (3) was developed by someone whose grasp of science sounds tenuous at best.

Be that as it may, you'll probably want to know that I'm a "Blue."  "Blues" are tightly-wound, orderly people with good attention to detail, but who tend to be fretful, quiet, pessimistic, and sensitive to criticism. We need to be "more open about our feelings" and "more willing to try new things."

All of which would be immediately apparent to anyone who's known me more than five minutes.  So as a step toward Socrates's ideal of "Know thyself," it doesn't really get me very far, not that I expected it to.

Wednesday, May 20, 2015

Font of knowledge

Punching yet another hole in our sense that humans are at their core a logical, rational species, filmmaker Errol Morris has done an elegant experiment that shows that our perceptions of the truth value of a statement are influenced...

... by what typeface it's set in.

Morris conspired with The New York Times to run a part of David Deutsch's The Beginning of Infinity that looks at the Earth's likelihood of having its habitability destroyed by an asteroid collision.  (Deutsch's conclusion: not very likely.)  Afterwards, the readers were invited to take a survey entitled, "Are You an Optimist or a Pessimist?" which ostensibly measured the degree of pollyanna-ism in the reader, but was really set up to see to what extent readers bought Deutsch's argument that humanity has no real cause to worry.

The variable was the font that the passage was written in.  He used six: Baskerville, Helvetica, Georgia, Comic Sans, Trebuchet, and Computer Modern.


40,000 people responded.  And the results were as fascinating as they were puzzling; Baskerville had a 5-person-out-of-a-thousand edge over the next highest (Helvetica), which may not seem like much, but which statisticians analyzing the experiment have declared is statistically significant.  Cornell University psychologist David Dunning, who helped design the experiment, said:
(The score spread is) small, but it’s about a 1% to 2% difference — 1.5% to be exact, which may seem small but to me is rather large.  You are collecting these data in an uncontrolled environment (who knows, for example just how each person’s computer is rendering each font, how large the font is, is it on an iPad or iPhone, laptop or desktop), are their kids breaking furniture in the background, etc.  So to see any difference is impressive.  Many online marketers would kill for a 2% advantage either in more clicks or more clicks leading to sales.
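
To get a rough feel for why a one-to-two-percent spread can matter at this scale, here's a back-of-the-envelope sketch in Python.  The even six-way split of roughly 40,000 respondents and the 50% baseline agreement rate are my own illustrative assumptions -- this is not the weighted-score analysis the statisticians actually ran.

    # Back-of-the-envelope: how precisely a per-font agreement rate can be pinned down.
    # The even split across six fonts and the 50% baseline are illustrative assumptions.
    import math

    def ci95_half_width(p, n):
        """Half-width of an approximate 95% confidence interval for a proportion."""
        return 1.96 * math.sqrt(p * (1 - p) / n)

    n_per_font = 40_000 // 6   # ~6,666 respondents per font, if split evenly
    baseline = 0.50            # assumed agreement rate (worst case for precision)

    print(f"{ci95_half_width(baseline, n_per_font):.2%}")  # ~1.2% uncertainty per font
    print(f"{ci95_half_width(baseline, 100):.2%}")         # ~9.8% with only 100 respondents

With tens of thousands of respondents, the per-font noise shrinks to roughly a percentage point, which is why a 1.5% spread is even worth arguing about; the same spread in a hundred-person survey would be pure noise.
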
The whole thing is a little disturbing, frankly -- that our perceptions of the truth are so malleable that they could be influenced by a little thing like what font it's set in.  Morris writes:
Truth is not typeface dependent, but a typeface can subtly influence us to believe that a sentence is true.  Could it swing an election?  Induce us to buy a new dinette set?  Change some of our most deeply held and cherished beliefs?  Indeed, we may be at the mercy of typefaces in ways that we are only dimly beginning to recognize.  An effect — subtle, almost indiscernible, but irrefutably there.
 Morris was interviewed about his experiment, and was asked a particularly trenchant question: When people read this for the first time, how do you hope that will change their own perception of the world?  Morris responded:
I'm not really sure. I'm not even sure what exactly to make of the results, in truth.  Everything I do—everything I write about and everything I make movies about—is about the distance between the world and us.  We think the world is just given to us, that there's no slack in the system, but there is.  Everything I do is about the slack of the system: the difference between reality and our perception of reality.  So in the sense that this essay lets us further reflect on the world around us, and even makes us paranoid about the slack in the system, then I think it's a good and valuable thing.
And I certainly agree.  Anything that makes us aware of our own biases, and the faults in our logic and perceptual systems, is all to the good.  We need to realize how inaccurate our minds are, if for no other reason, to reinforce how important science is as a tool for improving our knowledge.  Science, relying as it does on human minds for data analysis and interpretation, is far from perfect itself; but as a protocol for understanding, it's the best thing we've got.  Without it, we have no way to winnow out the truth from our own flawed assumptions.

So watch for more and more articles and advertisements appearing in Baskerville, because you just know that the media is going to jump on this study.  And I hope that this once and for all stops people from using Comic Sans, because I can't see that font without thinking of Garfield, and heaven knows that's not a good thing.


Monday, March 16, 2015

Science-friendly illogic

I usually don't blog about what other people put in their blogs.  This kind of thing can rapidly devolve into a bunch of shouted opinions, rather than a reasoned set of arguments that are actually based upon evidence.

But just yesterday I ran into a blog that (1) cited real research, and (2) drew conclusions from that research that were so off the rails that I had to comment.  I'm referring to the piece over at Religion News Service by Cathy Lynn Grossman entitled, "God Knows, Evangelicals Are More Science-Friendly Than You Think."  Grossman was part of a panel at the American Association for the Advancement of Science's yearly Dialogue on Science, Ethics, and Religion, and commented upon research presented at that event by Elaine Howard Ecklund, sociologist at Rice University.

Ecklund's research concerned evangelicals' attitudes toward science.  She described the following data from her study:
  • 48% of the evangelicals in her study viewed science and religion as complementary.
  • 21% saw the two worldviews as entirely independent of one another (which I am interpreting to be a version of Stephen Jay Gould's "non-overlapping magisteria" idea).
  • A little over 30% saw the two views as in opposition to each other.
84% of evangelicals, Grossman said, "say modern science is going good [sic] in the world."  And she interprets this as meaning that evangelicals are actually, contrary to appearances, "science friendly."  Grossman writes:
Now, the myth that bites the data dust, is one that proclaims evangelicals are a monolithic group of young-earth creationists opposed to theories of human evolution... 
(M)edia... sometimes incorrectly conflate the conservative evangelical view with all Christians’ views under the general “religion” terminology. 
I said this may allow a small subset to dictate the terms of the national science-and-religion conversation although they are not representative in numbers -– or point of view. This could lead to a great deal of energy devoted to winning the approval of the shrinking group and aging group that believes the Bible trumps science on critical issues.
Well, here's the problem with all of this.

This seems to me to be the inherent bias that makes everyone think they're an above-average driver.  Called the Dunning-Kruger effect, it is characterized thusly by psychologist David Dunning, whose team first identified the phenomenon:
Incompetent people do not recognize—scratch that, cannot recognize—just how incompetent they are...  What’s curious is that, in many cases, incompetence does not leave people disoriented, perplexed, or cautious. Instead, the incompetent are often blessed with an inappropriate confidence, buoyed by something that feels to them like knowledge. 
An ignorant mind is precisely not a spotless, empty vessel, but one that’s filled with the clutter of irrelevant or misleading life experiences, theories, facts, intuitions, strategies, algorithms, heuristics, metaphors, and hunches that regrettably have the look and feel of useful and accurate knowledge.
Now, allow me to say right away that I'm not calling evangelicals incompetent and/or ignorant as a group.  I have a friend who is a diehard evangelical, and he's one of the best-read, most thoughtful (in both senses of the word) people I know.  But what I am pointing out is that people are poor judges of their own understanding and attitudes -- and on that level, Dunning's second paragraph is referring to all of us.

So Ecklund's data, and Grossman's conclusions from them, are not so much wrong as irrelevant.  It doesn't matter if evangelicals think they're supportive of science, just as my opinion of my own driving ability isn't necessarily reflective of reality.  I'm much more likely to take the evangelicals' wholesale rejection of evolution and climate science as an indication of their lack of support for and/or understanding of science than I am to trust their opinions about their own attitudes toward it.

And, of course, there's that troubling 30% of evangelicals who do see religion and science as opposed, a group that Grossman glides right past.  She does, however, admit that scientists would probably find it "troubling" that 60% of evangelicals say that "scientists should be open to considering miracles in their theories."

Troubling doesn't begin to describe it, lady.


That doesn't stop Grossman from painting the Religious Right as one big happy science-loving family, and she can't resist ending by giving us secular rationalists a little cautionary kick in the ass:
[S]cientists who want to write off evangelical views as inconsequential may not want to celebrate those trends [that young people are leaving the church in record numbers]. The trend to emphasize personal experience and individualized spirituality over the authority of Scripture or religious denominational theology is part of a larger cultural trend toward rejecting authority. 
The next group to fall victim to that trend could well be the voices of science.
Which may be the most obvious evidence of all that Grossman herself doesn't understand science.  Science doesn't proceed by authority; it proceeds by hard evidence.  Stephen Hawking, one of the most widely respected authorities in physics, altered his position on information loss in black holes -- conceding a long-standing bet with physicist John Preskill -- when theoretical arguments convinced him he had been wrong.  Significantly, no one -- including Hawking himself -- said, "you have to listen to me, I'm an authority."

If anything, the trend toward rejecting authority works entirely in science's favor.  The less personal bias a scientist has, and the less dependence on the word of authority, the more critically (s)he can think about how the world works.

So all in all, I'd like to thank Grossman and Ecklund for the good news, even if it came in odd packaging.  Given my own set of biases, I'm not likely to see the data they so lauded in anything but an optimistic light.

Just like I do my own ability to drive.  Because whatever else you might say about me, I have mad driving skills.