Skeptophilia (skep-to-fil-i-a) (n.) - the love of logical thought, skepticism, and thinking critically. Being an exploration of the applications of skeptical thinking to the world at large, with periodic excursions into linguistics, music, politics, cryptozoology, and why people keep seeing the face of Jesus on grilled cheese sandwiches.

Monday, September 12, 2022

Confidence boost

New from the "Well, I Coulda Told You That" department, we have: a study out of MIT showing that confident kids do better in mathematics -- and that confidence instilled in childhood persists into adulthood, with positive outcomes in higher education, employment, and income.

The study appeared in the Journal of Human Resources, and tracked children from eighth grade onward.  It looked at measures of their confidence in their own knowledge and ability, correlated those assessments against their performance in math, and then studied their paths later on in education and eventual employment.  Controlled for a variety of factors, confidence was the best predictor of success.

What's interesting is that their confidence didn't even have to be that accurate to generate positive outcomes.  Overconfident kids had a much better track record than kids who were underconfident by the same amount.  Put a different way, it's better to think you're pretty good at something that you're not than to think you're pretty bad at something that you're not.

I can speak to this from my own experience.  I've had confidence issues all my life, largely stemming from a naturally risk-averse personality together with a mom who (for reasons I have yet to understand) discouraged me, over and over, from trying things.  I wanted to try martial arts as a teenager; her comment was "you'd quit after three weeks."  I had natural talent at music -- one of the talents I can truly say I was born with -- and asked to take piano lessons.  My mom said, "Why put all that money and time into something for no practical reason?"  I loved (and love) plants and the outdoors, and wanted to apply for a job at a local nursery run by some friends of my dad's.  She said, "That's way more hard, heavy, sweaty work than you'll want to do."

So in the end I did none of those things, at least not until (a lot) later in life.

A great deal of attention has been given to "helicopter parents," who monitor their kids' every move, and heaven knows as a teacher I saw enough of that as well.  I remember one parent in particular: if I entered a low grade for his son into my online gradebook (which the parents had access to), I could almost set a timer for how long it'd take to get an email asking why he'd gotten a low score.  (It was usually under thirty minutes.)  To me, this is just another way of telling kids you have no confidence in them.  It says -- perhaps not as explicitly as my mom did, but says it just the same -- "I don't think you can do this on your own.  Here, let me hold your hand."

Humans are social primates, and we are really sensitive to what others think and say.  Coincidentally, just yesterday I saw the following post, about encouragement in the realm of writing:

Now, let me put out there that this doesn't mean telling people that bad work is good or that incorrect answers are correct.  It is most definitely not the "Everyone Gets A Prize" mentality.  What it amounts to is giving people feedback that encourages, not destroys.  It's saying that anyone can succeed -- while being honest that success might entail a great deal more hard work for some than for others.  And for the person him/herself, it's not saying "I'm better than all of you" -- it's saying, "I know I've got what it takes to achieve my dreams."

Confidence is empowering, energizing, and sexy.  And I say that as someone who is still hesitant, overcautious, self-effacing, and plagued with doubt.  I all too often go into an endeavor -- starting a new book, entering a race, trying a new style of sculpture -- and immediately my mind goes into overdrive with self-sabotage.  "This'll be the time I fail completely.  Probably better not to try."

So it's a work in progress.  But let's all commit to helping each other, okay?  Support your friends and family in achieving what they're passionate about.  Find ways to help them succeed -- not only with honest feedback, but by boosting their confidence that whatever difficulties they're currently facing, they can overcome them.

After all, isn't it more enjoyable to say "see, I toldja so" to someone when they succeed brilliantly than when they fail?

****************************************


Tuesday, February 9, 2021

Fooling the experts

I was bummed to hear about the death of the inimitable Cloris Leachman a week and a half ago at the venerable age of 94.  Though she's probably most famous for her role as Frau Blücher *wild neighing horse noises* in the movie Young Frankenstein, I was first introduced to her unsurpassed sense of comic timing by the classic 1970s sitcom The Mary Tyler Moore Show, where she played the tightly-wound, self-styled intellectual Phyllis Lindstrom.

One of my favorite moments in that show occurred when Phyllis was playing a game of Scrabble against Mary's neighbor Rhoda Morgenstern (played with equal panache by Valerie Harper).  Rhoda puts down the word oxmersis, and Phyllis challenges it.

"There's no such thing as 'oxmersis,'" Phyllis says.

Rhoda looks at her, aghast.  "Really, Phyllis?  I can not believe that someone who knows as much about psychology as you do has never heard of oxmersis."

Long pause, during which you can almost see the gears turning in Phyllis's head.  "Oh," she finally says.  "That oxmersis."

I was immediately reminded of that scene when I ran into a paper while doing some background investigation for yesterday's post, which was about psychologist David Dunning's research with Robert Proctor regarding the deliberate cultivation of stupidity.  This paper looked at a different aspect of ignorance -- what happens when you combine the Dunning-Kruger effect (people's tendency to overestimate their own intelligence and abilities) with a bias called Appeal to Authority.

Appeal to Authority, you probably know, is when someone uses credentials, titles, or educational background -- and no other evidence -- to support a claim.  Put simply, it is the idea that if Richard Dawkins said it, it must be true, regardless of whether the claim has anything to do with Dawkins's particular area of expertise, evolutionary biology.  (I pick Dawkins deliberately, because he's fairly notorious for having opinions about everything, and seems to relish being the center of controversy regardless of the topic.)  

Dunning teamed up with Cornell University researchers Stav Atir and Emily Rosenzweig, and came up with what could be described as the love child of Dunning-Kruger and Appeal to Authority.  And what this new phenomenon -- dubbed, predictably, the Atir-Rosenzweig-Dunning Effect -- shows us is that people who are experts in a particular field tend to think their expertise holds true even for disciplines far outside their chosen area of study, and because of that are more likely to fall for plausible-sounding falsehoods -- like Phyllis's getting suckered by Rhoda's "oxmersis" bluff.

[Image is in the Public Domain]

In one experiment, the three researchers asked people to rate their own knowledge in various academic areas, then asked them to rank their level of understanding of various finance-related terms, such as "pre-rated stocks, fixed-rate deduction and annualized credit."  The problem is, those three finance-related terms actually don't exist -- i.e., they were made up by the researchers to sound plausible.

The test subjects who had the highest confidence level in their own fields were most likely to fall for the ruse.  Simon Oxenham, who described the experiments in Big Think, says it's only natural.  "A possible explanation for this finding," Oxenham writes, "is that the participants with a greater vocabulary in a particular domain were more prone to falsely feeling familiar with nonsense terms in that domain because of the fact that they had simply come across more similar-sounding terms in their lives, providing more material for potential confusion."

Interestingly, subsequent experiments showed that the correlation holds true even if you take away the factor of self-ranking.  Presumably, someone who is cocky and arrogant and ranks his/her ability higher than is justified in one area would be likely to do it in others.  But when they tested the subjects' knowledge of terms from their own field -- i.e., actually measured their expertise -- high scores still correlated with overestimating their knowledge in other areas.

And telling the subjects ahead of time that some of the terms might be made up didn't change the results. "[E]ven when participants were warned that some of the statements were false, the 'experts' were just as likely as before to claim to know the nonsense statements, while most of the other participants became more likely in this scenario to admit they’d never heard of them," Oxenham writes.

I have a bit of anecdotal evidence supporting this result from my experience in the classroom.  On multiple-choice tests, I had to concoct plausible-sounding wrong answers as distractors.  Every once in a while, I ran out of good wrong answers, and just made something up.  (On one AP Biology quiz on plant biochemistry, I threw in the term "photoglycolysis," which sounds pretty fancy until you realize that there's no such thing.)   What I found was that it was the average to upper-average students who were the most likely to be taken in.  The top students didn't get fooled because they knew what the correct answer was; the lowest students were equally likely to pick any of the wrong answers, because they didn't understand the material well.  The mid-range students saw something that sounded technical and vaguely familiar -- and figured that if they weren't sure, it must be that they'd missed learning that particular term.

It was also the mid-range students who were most likely to miss questions where the actual answer seemed too simple.  Another botanical question I liked to throw at them was, "What do all non-vascular land plants have in common?"  I always provided three wrong answers with appropriately technical-sounding jargon.

The actual answer is, "They're small."

Interestingly, the reason for the small size of non-vascular land plants (the most familiar example is moss) isn't simple at all.  But the answer itself just looked too easy to merit being the correct choice on an AP Biology quiz.

So Atir, Rosenzweig, and Dunning have given us yet another mental pitfall to watch out for -- our tendency to use our knowledge in one field to overestimate our knowledge in others.  But I really should run along, and make sure that the annualized credit on my pre-rated stocks exceeds the recommended fixed-rate deduction.  I worry a lot about that kind of thing, but I suppose my anxiety is really just another case of excessive oxmersis.

*********************************

Science writer Elizabeth Kolbert established her reputation as a cutting-edge observer of the human global impact in her wonderful book The Sixth Extinction (which was a Skeptophilia Book of the Week a while back).  This week's book recommendation is her latest, which looks forward to where humanity might be going.

Under a White Sky: The Nature of the Future is an analysis of what Kolbert calls "our ten-thousand-year-long exercise in defying nature," something that immediately made me think of another book I've recommended -- the amazing The Control of Nature by John McPhee, the message of which was generally "when humans pit themselves against nature, nature always wins."  Kolbert takes a more nuanced view, and considers some of the efforts scientists are making to reverse the damage we've done, from conservation of severely endangered species to dealing with anthropogenic climate change.

It's a book that's always engaging and occasionally alarming, but overall, deeply optimistic about humanity's potential for making good choices.  Whether we turn that potential into reality is largely a function of educating ourselves regarding the precarious position into which we've placed ourselves -- and Kolbert's latest book is an excellent place to start.

[Note: if you purchase this book using the image/link below, part of the proceeds goes to support Skeptophilia!]



Wednesday, January 27, 2021

Overcoming the snap

One of the most frustrating things about conspiracy theorists is how resistant they are to changing their minds, even when presented with incontrovertible evidence.

Look, for example, at the whole "Stop the Steal" thing.  There are a significant number of Republicans who still won't acknowledge that Biden won the election fair and square, despite the fact that the opposite claim -- that there was widespread voter fraud that favored the Democrats, and an organized effort by the Left to make it seem like Trump lost an election he actually "won in a landslide" -- has gone to court in one form or another over sixty times, and in all but one case the lawsuit was thrown out because of a complete lack of evidence.  The judges who made these decisions include both Republicans and Democrats; the legal response to "Stop the Steal" has been remarkably bipartisan.

Which, you'd think, would be enough, but apparently it isn't.  An amazingly small number of Republicans have said publicly that they were wrong, that there was little to no fraud -- certainly not enough to sway the election -- and that Biden clearly was the victor.  Mostly, the lack of evidence and the losses in court have caused the True Believers to double down, making them even surer that a vast conspiracy robbed Trump of his win -- and that the lack of any kind of factual credibility just shows there's an even vaster conspiracy to cover it all up.

Essentially, people have gone from "believe this because there's evidence" to "believe this despite the fact there's no evidence" to "believe this because there's no evidence."

[Image licensed under the Creative Commons SkepticalScience, Conspiracy Theories Fallacy Icon, CC BY-SA 4.0]

Once you've landed in the last-mentioned category, it's hard to see what possible way there'd be to reach you.  But there may be hope, to judge by a study that came out last week in The Journal of Personality and Social Psychology.

In "Jumping to Conclusions: Implications for Reasoning Errors, False Belief, Knowledge Corruption, and Impeded Learning," by Carmen Sanchez of the University of Illinois - Urbana/Champaign and David Dunning of the University of Michigan (of Dunning-Kruger fame), we find out that there is a strong (and fascinating) correlation between four features of the human psyche:

  • Jumping to conclusions -- participants were given a task in which a computerized character was fishing in a lake.  The lake had mostly red fish and a few gray fish, and the researchers looked at how quickly the test subject was confident about predicting the color of the next fish pulled from the lake.
  • Certainty about false beliefs -- volunteers were given a test of their knowledge of American history, and for each four-answer multiple choice question they were asked how confident they were in their answer.  The researchers looked at people who got things wrong -- while simultaneously being certain they were right.
  • Understanding of basic logic -- participants were given a variety of logic puzzles, such as simple syllogisms (All fish can swim; sharks are fish; therefore sharks can swim), and asked to pick out which ones were sound logic and which were faulty.
  • Belief in conspiracy theories -- test subjects were given a variety of common conspiracy theories, such as the belief that cellphones cause cancer but it's being covered up by big corporations, and asked to rank how likely they thought the beliefs were to be true.

They found that the faster you are to jump to conclusions on the fish test, the worse you are at logic, and the more certain you are about your beliefs even if they are wrong -- and, most critically, the more likely you are to believe spurious, zero-evidence claims.
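
To make the "jumping to conclusions" measure a bit more concrete, here's a toy sketch of how an idealized Bayesian observer might behave on that kind of fish-drawing task.  The lake compositions (80/20 versus 20/80), the equal priors, and the confidence-threshold stopping rule are my own illustrative assumptions, not details from the Sanchez and Dunning paper; the point is just that a hasty observer with a low threshold commits after only a fish or two, while a cautious one keeps sampling.

```python
import random

# Probability of drawing a red fish under each hypothetical lake composition
# (illustrative numbers, not from the study).
P_RED = {"mostly_red": 0.8, "mostly_gray": 0.2}

def draws_to_decision(true_lake="mostly_red", threshold=0.9, seed=None):
    """Count how many fish a Bayesian observer draws before committing to a lake."""
    rng = random.Random(seed)
    posterior = {"mostly_red": 0.5, "mostly_gray": 0.5}  # equal priors
    draws = 0
    while max(posterior.values()) < threshold:
        draws += 1
        fish_is_red = rng.random() < P_RED[true_lake]
        # Bayes update: weight each hypothesis by the likelihood of this draw.
        for lake in posterior:
            likelihood = P_RED[lake] if fish_is_red else 1 - P_RED[lake]
            posterior[lake] *= likelihood
        total = sum(posterior.values())
        posterior = {lake: p / total for lake, p in posterior.items()}
    return draws, max(posterior, key=posterior.get)

# A hasty observer (low threshold) commits after fewer fish -- and is more
# likely to commit to the wrong lake -- than a cautious one.
print(draws_to_decision(threshold=0.60, seed=42))
print(draws_to_decision(threshold=0.95, seed=42))
```

In the actual experiment, of course, the researchers simply measured how quickly real people became confident; the sketch just illustrates why committing after very few draws is the signature of jumping to conclusions.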

So far, nothing too earth-shattering, and I think most of us could have predicted the outcome.  But what makes this study fascinating is that Sanchez and Dunning looked at interventions that could slow people down and make them less likely to jump to false conclusions -- and therefore, less likely to feel certain about their own false or counterfactual beliefs.

The intervention had four parts:

  • An explanation of the "jumping to conclusions" phenomenon, including an explanation of why it happens in the brain and the fact that we are all prone to this kind of thing.
  • An acknowledgement of the difficulty of making a correct decision based on incomplete information.  Test subjects were shown a zoomed-in photo that was zoomed out a little at a time, and they had to decide at what point they were sure of what they were looking at.
  • An exercise in studying optical illusions.  Here, the point was to illustrate the inherent flaws of our own sensory-integrative mechanisms, and how focusing on one thing can make you miss details elsewhere that might give you more useful information.
  • A short video of a male jogger who compliments a female street artist, and gets no response.  He repeats himself, finally becoming agitated and shouting at her, but when she reacts with alarm he turns and runs away.  Later, he finds she has left him a picture she drew, along with a note explaining that she's deaf -- leaving the guy feeling pretty idiotic and ashamed of himself.  This was followed up by asking participants to write down snap judgments they'd made that later proved incorrect, and what additional information they'd have needed in order to get it right.

This is where I got a surprise, because I've always thought of believers in the counterfactual as being essentially unreachable.  And the intervention seems like pretty rudimentary stuff, something that wouldn't affect you unless you were already primed to question your own beliefs.  But what Sanchez and Dunning found is that the individuals who received the intervention did much better on subsequent tasks than the control group did -- they were more accurate in assessing their own knowledge, slower to make snap judgments, and less confident about crediting conspiracy theories.

I don't know about you, but I find this pretty hopeful.  It once again reinforces my contention that one of the most important things we can do in public schools is to teach basic critical thinking.  (And in case you didn't know -- I have an online critical thinking course through Udemy that is available for purchase, and which has gotten pretty good reviews.)

So taking the time to reason with people who believe in conspiracies can actually be productive, and not the exercise in frustration and futility I thought it was.  Maybe we can reach the "Stop the Steal" people -- with an intervention that is remarkably simple.  It's not going to fix them all, nor eradicate such beliefs entirely, but you have to admit that at this point, any movement in the direction of rationality is worth pursuing.

****************************************

Just last week, I wrote about the internal voice most of us live with, babbling at us constantly -- sometimes with novel or creative ideas, but most of the time (at least in my experience) with inane nonsense.  The fact that this internal voice is nearly ubiquitous, and what purpose it may serve, is the subject of psychologist Ethan Kross's wonderful book Chatter: The Voice in Our Head, Why It Matters, and How to Harness It, released this month and already winning accolades from all over.

Chatter not only analyzes the inner voice in general terms, but looks at specific case studies where the internal chatter brought spectacular insight -- or short-circuited the individual's ability to function entirely.  It's a brilliant analysis of something we all experience, and gives some guidance not only into how to quiet it when it gets out of hand, but to harness it for boosting our creativity and mental agility.

If you're a student of your own inner mental workings, Chatter is a must-read!

[Note: if you purchase this book using the image/link below, part of the proceeds goes to support Skeptophilia!]