Skeptophilia (skep-to-fil-i-a) (n.) - the love of logical thought, skepticism, and thinking critically. Being an exploration of the applications of skeptical thinking to the world at large, with periodic excursions into linguistics, music, politics, cryptozoology, and why people keep seeing the face of Jesus on grilled cheese sandwiches.
Showing posts with label critical thinking. Show all posts

Monday, November 6, 2023

Lateral thinking

One of the biggest impediments to clear thinking is the fact that it's so hard for us to keep in mind that we could be wrong.

As journalist Kathryn Schulz put it:

I asked you how it felt to be wrong, and you had answers like humiliating, frustrating, embarrassing, devastating.  And those are great answers.  But they're answers to a different question.  Those are answers to the question, "How does it feel to find out you're wrong?"  But being wrong?  Being wrong doesn't feel like anything...  You remember those characters on Saturday morning cartoons, the Coyote and the Roadrunner?  The Coyote was always doing things like running off a cliff, and when he'd do that, he'd run along for a while, not seeing that he was already over the edge.  It was only when he noticed it that he'd start to fall.  That's what being wrong is like before you've realized it.  You're already wrong, you're already in trouble...  So I should amend what I said earlier.  Being wrong does feel like something.

It feels like being right.

We cling desperately to the sense that we have it all figured out, that we're right about everything.  Oh, in theoretical terms we realize we're fallible; all of us can remember times we've been wrong.  But right here, right now?  It's like my college friend's quip, "I used to be conceited, but now I'm perfect."

The trouble with all this is that it blinds us to the errors that we do make, because if you don't keep at least trying to question your own answers, you won't see your own blunders.  It's why lateral thinking puzzles are so difficult, but so important; they force you to set aside the usual conventions of how puzzles are solved, and to question your own methods and intuitions at every step.  This was the subject of a study by Andrew Meyer (of the Chinese University of Hong Kong) and Shane Frederick (of Yale University) that appeared in the journal Cognition last week.  They looked at a standard lateral thinking puzzle, and tried to figure out how to get people to avoid falling into thinking their (usually incorrect) first intuition was right.

The puzzle was a simple computation problem:

A bat and a ball together cost $1.10.  The bat costs $1.00 more than the ball.  How much does the ball cost?

The most common error is simply to subtract the two, and to come up with ten cents as the cost of the ball.  But a quick check of the answer should show this can't be right.  If the bat costs a dollar and the ball costs ten cents, then the bat costs ninety cents more than the ball, not a dollar more (as the problem states).  The correct answer is that the ball costs $0.05 and the bat costs $1.05 -- the sum is $1.10, and the difference is an even dollar.
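If you want to confirm there's only one answer, the check can be brute-forced in a few lines.  Here's a minimal sketch in Python (the variable names and the one-cent search step are my own framing, not part of the original problem):

```python
# Brute-force the bat-and-ball problem, working in cents to avoid
# floating-point trouble: ball + bat == 110 and bat == ball + 100.
solutions = [(ball, ball + 100)
             for ball in range(0, 111)
             if ball + (ball + 100) == 110]
print(solutions)  # [(5, 105)]: the ball costs 5 cents, the bat $1.05
```

Note that the intuitive answer fails the same test: a ten-cent ball forces a $1.10 bat, for a total of $1.20.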

Meyer and Frederick tried different strategies for improving people's success.  Bolding the words "more than the ball" in the problem, to call attention to the salient point, had almost no effect at all.  Then they tried three different levels of warnings:

  1. Be careful!  Many people miss this problem.
  2. Be careful!  Many people miss the following problem because they do not take the time to check their answer.
  3. Be careful!  Many people miss the following problem because they read it too quickly and actually answer a different question than the one that was asked.

All of these improved success, but not by as much as you might think.  The number of people who got the correct answer went up by only about ten percent, no matter which warning was used.

Then the researchers decided to be about as blatant as you can get, and put in a bolded statement, "HINT: The answer is NOT ten cents!"  This had the best improvement rate of all, but amazingly, still didn't eliminate all of the wrong answers.  Some people were so certain their intuition was right that they stuck to their guns -- apparently assuming that the researchers were deliberately trying to mislead them!

[Image licensed under the Creative Commons © Nevit Dilmen, Question mark 1, CC BY-SA 3.0]

If you find this tendency a little unsettling... well, you should.  It's one thing to stick to a demonstrably wrong answer in some silly hypothetical bat-and-ball problem; it's another thing entirely to cling to incorrect intuition or erroneous understanding when it affects how you live, how you act, how you vote.

It's why learning how to suspend judgment is so critical.  To be able to hold a question in your mind and not immediately jump to what seems like the "obvious answer" is one of the most important things there is.  I used to assign lateral thinking puzzles to my Critical Thinking students every so often -- I told them, "Think of these as mental calisthenics.  They're a way to exercise your problem-solving ability and look at problems from angles you might not think of right away.  Don't rush to find an answer; keep considering them until you're sure you're on the right track."

So I thought I'd throw a few of the more entertaining puzzles at you.  None of them involve much in the way of math (nothing past adding, subtracting, multiplying, and dividing), but all of them take an insight that requires pushing aside your first impression of how problems are solved.  Enjoy!  (I'll include the answers at the end of tomorrow's post, if any of them stump you.)

1.  The census taker problem

A census taker goes to a man's house, and asks for the ages of the man's three daughters.

"The product of their ages is 36," the man says.

The census taker replies, "That's not enough information to figure it out."

The man says, "Okay, well, the sum of their ages is equal to the house number across the street."

The census taker looks out of the window at the house across the street, and says, "I'm sorry, that's still not enough information to figure it out."

The man says, "Okay... my oldest daughter has red hair."

The census taker says, "Thank you," and writes down the ages.

How old are the three daughters?

2. The St. Ives riddle

The St. Ives riddle is a famous puzzle that goes back to (at least) the seventeenth century:

As I was going to St. Ives,
I met a man with seven wives.
Each wife had seven kids,
Each kid had seven cats,
Each cat had seven kits.
Kits, cats, kids, and wives, how many were going to St. Ives?

3.  The bear

A man goes for a walk.  He walks a mile south, a mile east, and a mile north, and after that is back where he started.  At that point, he sees a large bear rambling around.  What color is the bear?

4.  A curious sequence

What is the next number in this sequence: 8, 5, 4, 9, 1, 7, 6...

5.  Classifying the letters

You can classify the letters in the English alphabet as follows:

Group 1: A, M, T, U, V, W, Y

Group 2: B, C, D, E, K

Group 3: H, I, O, X

Group 4: N, S, Z

Group 5: F, G, J, L, P, Q, R

What's the reason for grouping them this way?

6.  The light bulb puzzle

At the top of a ten-story building are three ordinary incandescent light bulbs screwed into electrical sockets.  On the first floor are three switches, one for each bulb, but you don't know which switch turns on which bulb, and you can't see the bulbs (or their light) from the place where the switches are located.  How can you determine which switch operates which bulb... and only take a single trip from the first floor up to the tenth?

Have fun!

****************************************



Monday, May 2, 2022

The illusion of cynicism

"All politicians are liars."

"I don't trust anyone."

"You have to watch your back constantly."

"Nothing you read in media is true."

When I taught Critical Thinking -- one of my favorite classes to teach -- I found that it was much harder to counteract cynicism than it was gullibility.  Just about everyone knows that gullibility is a mistake; if you "fall for anything," or "believe whatever's told to you," you are automatically considered to be less smart or less sophisticated (at least by people who aren't gullible themselves).  Many of my students thought that the primary reason to learn critical thinking strategies was to make themselves less likely to get suckered by lies and half-truths.

This is itself half true.  As I told my classes, cynicism is exactly as lazy as gullibility.  Disbelieving everything without consideration is no wiser than believing everything without consideration.  It's why I hate the use of the word "skeptic" to mean doubter.  A true skeptic believes what the evidence supports.  The people who disbelieve in anthropogenic climate change, for example, aren't skeptics; they're rejecting the evidence collected over decades, and the theories that have passed the rigors of peer review to become accepted by 97% of the scientific establishment.

But somehow, cynicism has gained a veneer of respectability, as if there's something brave or smart or noble about having the sour attitude that no one and nothing can be trusted.  This was the subject of a paper that appeared in the journal Personality and Social Psychology Bulletin last week, called "The Cynical Genius Illusion: Exploring and Debunking Lay Beliefs About Cynicism and Competence."  The authors, Olga Stavrova of Tilburg University and Daniel Ehlebracht of the University of Cologne, studied a huge amount of data, and found that the public tends to think cynics and scoffers are smarter than average -- but on actual tests of intelligence, people identified as cynics tend to perform more poorly.  The authors write:
Cynicism refers to a negative appraisal of human nature—a belief that self-interest is the ultimate motive guiding human behavior.  We explored laypersons’ beliefs about cynicism and competence and to what extent these beliefs correspond to reality.  Four studies showed that laypeople tend to believe in cynical individuals’ cognitive superiority.  A further three studies based on the data of about 200,000 individuals from 30 countries debunked these lay beliefs as illusionary by revealing that cynical (vs. less cynical) individuals generally do worse on cognitive ability and academic competency tasks.  Cross-cultural analyses showed that competent individuals held contingent attitudes and endorsed cynicism only if it was warranted in a given sociocultural environment.  Less competent individuals embraced cynicism unconditionally, suggesting that—at low levels of competence—holding a cynical worldview might represent an adaptive default strategy to avoid the potential costs of falling prey to others’ cunning.

So a strategy that might have come about because of a desire to avoid being hoodwinked morphs into the conviction that everyone is trying to hoodwink you.  While I understand why someone would want to avoid being deceived, especially if they've fallen prey in the past, assuming everyone is out to get you is not only the lazy way out, it's factually wrong.

[Image licensed under the Creative Commons Wetsun, Cynicism graffiti, CC BY 2.0]

You know, I think that's one of the most important things I've learned from all the traveling I've done; that everywhere you go, there are good people and bad, kind people and unkind, and that regardless of differences of culture the vast majority of us want the same things -- food, shelter, security, love, safety for our families and friends, the freedom to voice our opinions without fear of repercussions.  The people I've run into who really, honestly had ill intent toward me (or toward anyone) have been extremely few.

I'll admit, though, that maintaining a healthy, balanced skepticism is hard at times, especially given the polarization of the media lately.  We are very seldom presented with a fair assessment of what's happening, especially insofar as what the opposite side is doing.  Much of the media is devoted to whipping up hatred and distrust of the "other" -- convincing listeners/readers that the opposite party, the other religion(s), the other races or ethnic groups, are unequivocally bad.  Presenting the more complex, nuanced view that there are a few horrible people in every group but that most people are on balance pretty okay, takes a lot more work -- and doesn't attract sponsorship from the corporations who are profiting off the fear, panic, and anger.

It's nice that the Stavrova and Ehlebracht paper supports what I've been claiming for years.  And I'd like to ask you to make a practice of this -- setting aside your preconceived notions and what you've heard from the media, simply looking at the facts and evidence rather than the spin.  I think you'll find that the world is neither the Pollyanna paradise that the gullible believe nor the horrid hellscape in the cynics' minds, but somewhere in that wide middle ground.

And that honestly, it's a much better place to live than either extreme.

**************************************

Wednesday, January 27, 2021

Overcoming the snap

One of the most frustrating things about conspiracy theorists is how resistant they are to changing their minds, even when presented with incontrovertible evidence.

Look, for example, at the whole "Stop the Steal" thing.  There are a significant number of Republicans who still won't acknowledge that Biden won the election fair and square, despite the fact that the opposite claim -- that there was widespread voter fraud that favored the Democrats, and an organized effort by the Left to make it seem like Trump lost an election he actually "won in a landslide" -- has gone to court in one form or another over sixty times, and in all but one case the lawsuit was thrown out because of a complete lack of evidence.  The judges who made these decisions include both Republicans and Democrats; the legal response to "Stop the Steal" has been remarkably bipartisan.

Which, you'd think, would be enough, but apparently it isn't.  An amazingly small number of Republicans have said publicly that they were wrong, that there was little to no fraud, certainly not enough to sway the election, and that Biden clearly was the victor.  Mostly, the lack of evidence and the losses in court have caused the True Believers to double down, making them even surer that a vast conspiracy robbed Trump of his win -- and that the absence of any factual credibility just proves there's an even vaster conspiracy covering it all up.

Essentially, people have gone from "believe this because there's evidence" to "believe this despite the fact there's no evidence" to "believe this because there's no evidence."

[Image licensed under the Creative Commons SkepticalScience, Conspiracy Theories Fallacy Icon, CC BY-SA 4.0]

Once you've landed in the last-mentioned category, it's hard to see what possible way there'd be to reach you.  But there may be hope, to judge by a study that came out last week in The Journal of Personality and Social Psychology.

In "Jumping to Conclusions: Implications for Reasoning Errors, False Belief, Knowledge Corruption, and Impeded Learning," by Carmen Sanchez of the University of Illinois Urbana-Champaign and David Dunning of the University of Michigan (of Dunning-Kruger fame), we find out that there is a strong (and fascinating) correlation between four features of the human psyche:

  • Jumping to conclusions -- participants were given a task in which a computerized character was fishing in a lake.  The lake had mostly red fish and a few gray fish, and the researchers looked at how quickly the test subject was confident about predicting the color of the next fish pulled from the lake.
  • Certainty about false beliefs -- volunteers were given a test of their knowledge of American history, and for each four-answer multiple choice question they were asked how confident they were in their answer.  The researchers looked at people who got things wrong -- while simultaneously being certain they were right.
  • Understanding of basic logic -- participants were given a variety of logic puzzles, such as simple syllogisms (All fish can swim; sharks are fish; therefore sharks can swim), and asked to pick out which ones were sound logic and which were faulty.
  • Belief in conspiracy theories -- test subjects were given a variety of common conspiracy theories, such as the belief that cellphones cause cancer but it's being covered up by big corporations, and asked to rank how likely they thought the beliefs were to be true.

They found that the faster you are to jump to conclusions on the fish test, the worse you are at logic, and the more certain you are about your beliefs even if they are wrong -- and, most critically, the more likely you are to believe spurious, zero-evidence claims.

So far, nothing too earth-shattering, and I think most of us could have predicted the outcome.  But what makes this study fascinating is that Sanchez and Dunning looked at interventions that could slow people down and make them less likely to jump to false conclusions -- and therefore, less likely to feel certain about their own false or counterfactual beliefs.

The intervention had four parts:

  • An explanation of the "jumping to conclusions" phenomenon, including an explanation of why it happens in the brain and the fact that we are all prone to this kind of thing.
  • An acknowledgement of the difficulty of making a correct decision based on incomplete information.  Test subjects were shown a zoomed-in photo, and then it was zoomed out a little bit at a time, and the test subjects had to decide when they were sure of what they were looking at. 
  • An exercise in studying optical illusions.  Here, the point was to illustrate the inherent flaws of our own sensory-integrative mechanisms, and how focusing on one thing can make you miss details elsewhere that might give you more useful information.
  • A short video of a male jogger who compliments a female street artist, and gets no response.  He repeats himself, finally becoming agitated and shouting at her, but when she reacts with alarm he turns and runs away.  Later, he finds she has left him a picture she drew, along with a note explaining that she's deaf -- leaving the guy feeling pretty idiotic and ashamed of himself.  This was followed up by asking participants to write down snap judgments they'd made that later proved incorrect, and what additional information they'd have needed in order to get it right.

This is where I got a surprise, because I've always thought of believers in the counterfactual as being essentially unreachable.  And the intervention seems like pretty rudimentary stuff, something that wouldn't affect you unless you were already primed to question your own beliefs.  But what Sanchez and Dunning found is that the individuals who received the intervention did much better on subsequent tasks than the control group did -- they were more accurate in assessing their own knowledge, slower to make snap judgments, and less confident about crediting conspiracy theories.

I don't know about you, but I find this pretty hopeful.  It once again reinforces my contention that one of the most important things we can do in public schools is to teach basic critical thinking.  (And in case you didn't know -- I have an online critical thinking course through Udemy that is available for purchase, and which has gotten pretty good reviews.)

So taking the time to reason with people who believe in conspiracies can actually be productive, and not the exercise in frustration and futility I thought it was.  Maybe we can reach the "Stop the Steal" people -- with an intervention that is remarkably simple.  It's not going to fix them all, nor eradicate such beliefs entirely, but you have to admit that at this point, any movement in the direction of rationality is worth pursuing.

****************************************

Just last week, I wrote about the internal voice most of us live with, babbling at us constantly -- sometimes with novel or creative ideas, but most of the time (at least in my experience) with inane nonsense.  The fact that this internal voice is nearly ubiquitous, and what purpose it may serve, is the subject of psychologist Ethan Kross's wonderful book Chatter: The Voice in Our Head, Why It Matters, and How to Harness It, released this month and already winning accolades from all over.

Chatter not only analyzes the inner voice in general terms, but looks at specific case studies where the internal chatter brought spectacular insight -- or short-circuited the individual's ability to function entirely.  It's a brilliant analysis of something we all experience, and gives some guidance not only on how to quiet it when it gets out of hand, but on how to harness it to boost our creativity and mental agility.

If you're a student of your own inner mental workings, Chatter is a must-read!




Wednesday, April 11, 2018

Fact avoidance

I've learned through the years that my feelings are an unreliable guide to evaluating reality.

Part of this, I suppose, comes from having fought depression for forty years.  I know that what I'm thinking is influenced by my neurotransmitters, and given the fact that they spend a lot of the time out of whack, my sense that five different mutually-exclusive worst-case scenarios can all happen simultaneously is probably not accurate.  It could be that this was in part what drove me to skepticism, and to my understanding that my best bet for making good decisions is to rely not on feelings, but on evidence.

It surprises me how many people don't get that.  I saw two really good examples of this in the news last week, both of them centered around embattled President Donald Trump.  In the first, he was questioned about why he was putting so much emphasis on securing the border with Mexico -- to the extent of sending in the National Guard -- when in fact, illegal border crossings are at a 46-year low.  (You could argue that current levels are still too high; but the fact is, attempted border crossings have steadily dropped from a high of 1.8 million all the way back in 2000; the level now is about a quarter of that.)

I'm not here to discuss immigration policy per se.  It's a complex issue and one on which I am hardly qualified to weigh in.  What strikes me about this is that the powers-that-be are saying, "I don't care about the data, facts, and figures, the number of illegal migrants is increasing because I feel like it is."

An even more blatant example of trust-your-feelings-not-the-facts came from presidential spokesperson Sarah Huckabee Sanders, who has the unenviable and overwhelming job of doing damage control every time Trump lies about something.  This time, it was at a roundtable discussion on taxes in West Virginia, where he veered off script and started railing about voter fraud.  "In many places, like California, the same person votes many times — you've probably heard about that," he said.  "They always like to say 'oh, that's a conspiracy theory' — not a conspiracy theory, folks. Millions and millions of people."

Of course, the states he likes to claim were sites of rampant voter fraud are always states in which he lost, because the fact that Hillary Clinton won the popular vote still keeps him up at night.  But the fact is, he's simply wrong.  In a fourteen-year study covering a billion votes, Loyola law professor Justin Levitt found only 31 instances in which a "specific, credible allegation existed that someone pretended to be someone else at the polls."

To make it clear: 31 does not equal "millions and millions."  And a fraud rate of 0.0000031% does not constitute "many times."
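For the arithmetic-minded, the rate is easy to verify.  Here's a quick sanity check in Python (a sketch, using only the figures quoted above):

```python
# 31 credible allegations of in-person voter impersonation,
# out of one billion votes analyzed in Levitt's study.
allegations = 31
votes = 1_000_000_000
rate_percent = allegations / votes * 100   # as a percentage of votes cast
print(f"{rate_percent:.7f}%")              # prints "0.0000031%"
```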

So, Trump lied.  At this point, that's hardly news.  It'd be more surprising if you turned on the news and found out Trump had told the truth about something.  But when asked about this actual data, in juxtaposition to what Trump said, Sarah Sanders said, "The president still feels there was a large amount of voter fraud."

Wait, what?

What Trump or Sanders, or (for that matter) you or I, "feel" about something is completely irrelevant.  If there's hard data available -- which there is, both on the border crossings and on allegations of voter fraud -- that is what should be listened to.  And when you say something, and are confronted by someone who has facts demonstrating the opposite, the appropriate response is, "Whoa, okay.  I guess I was wrong."

But that's if you're not Donald Trump.  Trump never admits to being wrong.  He doesn't have to, because he's surrounded himself with a Greek chorus of people like Sanders (and his sounding boards over at Fox News) who, no matter what Trump says or does, respond, "Exactly right, sir.  You're amazing.  A genius.  Your brain is YUUUGE."

Hell, he said a couple of years ago that he could kill someone in full view on 5th Avenue and not lose a single supporter, and we had a rather alarming proof of that this week when a fire broke out at Trump Tower on, actually, 5th Avenue -- which, contrary to the law, had no fire alarms or sprinkler system installed -- killing one man and injuring six.

The response?  One Trump supporter said that the man who died had deliberately set the fire to make Trump look bad, and then didn't get out in time.


Facts don't matter.  "I feel like Trump is a great leader and a staunch Christian" wins over "take a look at the hard data" every time.

I'd like to say I have a solution to this, but this kind of fact-resistance is so self-insulating that there's no way in.  It's like living inside a circular argument.  "Trump is brilliant because I feel like he's brilliant, so anything to the contrary must be a lie."  And when you have Fox News pushing this attitude hard -- ignoring any information to the contrary -- you can't escape.

If you doubt that, take a look at what Tucker Carlson was talking about while every other news agency in the world was covering the raid on Trump lawyer Michael Cohen's office: a piece on how "pandas are aggressive and sex-crazed."  (No, I'm not making this up.  An actual quote: "You know the official story about pandas — they’re cute but adorably helpless, which is why they are almost extinct.  But like a lot of what we hear, that is a lie...  The real panda is a secret stud with a thirst for flesh and a fearsome bite.")

That's some cutting-edge reporting, right there.  No wonder Fox News viewers were found in a 2012 study to be the worst-informed of all thirty media sources studied -- scoring lower, in fact, than people who didn't watch the news at all.

So sorry to end on a rather dismal note, but it seems like until people decide to start valuing facts above feelings, we're kind of stuck.  Honestly, the only answer I can come up with is educating children to be critical thinkers, but in the current environment of attacking teachers and public schools, I'm not sure that's feasible either.

In the interim, though, I'm gonna avoid pandas.  Because they sound a lot sketchier than I'd realized.

Friday, November 17, 2017

Motivated reasoning

Last week a paper was released in the journal Personality and Individual Differences called "Epistemic Rationality: Skepticism Toward Unfounded Beliefs Requires Sufficient Cognitive Ability and Motivation to be Rational."  Understandably enough, the title made me sit up and take notice, as this topic has been my bread and butter for years.  The authors, Tomas Ståhl (of the University of Illinois) and Jan-Willem van Prooijen (of the Vrije Universiteit Amsterdam), describe their work thus:
Why does belief in the paranormal, conspiracy theories, and various other phenomena that are not backed up by evidence remain widespread in modern society?  In the present research we adopt an individual difference approach, as we seek to identify psychological precursors of skepticism toward unfounded beliefs.  We propose that part of the reason why unfounded beliefs are so widespread is because skepticism requires both sufficient analytic skills, and the motivation to form beliefs on rational grounds...  [W]e show that analytic thinking is associated with a lower inclination to believe various conspiracy theories, and paranormal phenomena, but only among individuals who strongly value epistemic rationality...  We also provide evidence suggesting that general cognitive ability, rather than analytic cognitive style, is the underlying facet of analytic thinking that is responsible for these effects.
The first bit is hardly a surprise, and is the entire raison d'être of my Critical Thinking class.  Skepticism is not only a way of looking at the world, it's a skill; and like any skill, it takes practice.  Adopting a rational approach to understanding the universe means learning some of the ways in which irrationality occurs, and figuring out how to avoid them.

The second part, though, is more interesting, but also more insidious: in order to be a skeptic, you have to be motivated toward rational thought -- and value it.

Aristotle Teaching Alexander the Great (Charles Laplante, 1866) [image courtesy of the Wikimedia Commons]

This explains the interaction I had with one of my AP Biology students many years ago.  Young-Earth creationists don't, by and large, take my AP class.  My background is in evolutionary genetics, so most of them steer clear, sensing that they're in hostile territory.  (I will say in my own defense that I never treat students in a hostile manner; and the few times I have had a creationist take my class, it was a positive experience, and kept me on my toes to present my arguments as cogently as possible.)

This young lady, however, stood out.  She was absolutely brilliant, acing damn near every quiz I gave.  She had a knack for understanding science that was nothing short of extraordinary.  So we went through the unit on genetics, and I presented the introduction to the unit on evolution, in which I laid out the argument supporting the theory of evolution, explaining how it fits every bit of hard evidence we've got.

That day, she asked if she could talk to me after class.  I said, "Sure," and had no guess about what she might have wanted to talk to me about.

I was absolutely flabbergasted when she said, "I just want you to know that I'm a creationist."

I must have goggled at her for a moment -- after (at that point) two decades as a teacher, I had pretty good control over my facial expressions, but not that good.  She hastily added, "I'm not saying I'm going to argue with you, or that I'm refusing to learn the material, or anything.  I just wanted you to know where I was coming from."

I said, "Okay.  That's fine, and thanks for being up front with me.  But do you mind if I ask you a couple of questions?"

She said, "Not at all."

So I asked her where the argument I'd presented in class fell apart for her.  What part of the evidence or logical chain didn't work?

She said, "None of it.  It's all logical and makes perfect sense."

I must have goggled again, because she continued, "I understand your argument, and it's logically sound.  I don't disbelieve in the evidence you told us about.  But I still don't believe in evolution."

The upshot of it was that for her, belief and rationality did not intersect.  She believed what she believed, and if rational argument contradicted it, that was that.  She didn't argue, she didn't look for counterevidence; she simply dismissed it.  Done.

The research by Ståhl and van Prooijen suggests that the issue with her is that she had no motivation to apply rationality to this situation.  She certainly wasn't short of cognitive ability; she outperformed most of the students in the class (including, I might add, on the test on evolutionary theory).  But there was no motive for her to apply logic to a situation that for her, was beyond the reach of logic.  You got there by faith, or not at all.

To this day, of all the students I've taught, this young lady remains one of the abiding puzzles.  Her ability to compartmentalize her brain that way -- I'll apply logic here, and it gives me the right answers, but not here, because it'll give me the wrong answers -- is so foreign to my way of thinking that it borders on the incomprehensible.  For me, if science, logic, and rationality work as a way of teasing out fact from falsehood, then -- they work.  You can't use the same basic principles and have them alternate between giving you true and false conclusions, unless the method itself is invalid.

Which, interestingly, is not what she was claiming.

And this is a difficulty that I have a hard time seeing any way to surmount.  Anyone can be taught some basic critical thinking skills; but if they have no motivation to apply them, or (worse) if pre-existing religious or political beliefs actually give them a motivation not to apply them, the argument is already lost.

So that's a little depressing.  Sorry.  I'm still all for teaching cognitive skills (hell, if I weren't, I'd be seriously in the wrong profession).  But what to do about motivation is a puzzle.  It once again seems to me that, like my student's faith-based belief, being motivated to use logic to understand your world is something you have to choose deliberately.

You get there because you choose to accept rational argument, or you don't get there at all.

Saturday, January 21, 2017

Protecting the arts from ideology

It's the end of first semester at my school, which means my Critical Thinking students are finishing up and ready to move on, and I'm preparing to start with a whole new group in a week and a half.  The first-semester students are currently working on their final papers, in which they critically analyze how their thinking has changed since the beginning of the class.

I received one paper early -- they're not officially due until next Thursday -- and one paragraph from it stood out.  The student wrote:
One thing that has become apparent to me through this course is that you can't separate critical thinking from creativity.  Critical thinking really means applying creativity and a broader perspective to everything -- seeing that there are many paths to understanding, and for most things in life, there is no single right answer.  This is why I believe that cutting arts education, which is happening in many schools, will have negative impacts on every subject.  By eliminating the arts, we are taking away one of the fundamentally unique things about being human -- the ability to create something entirely new.  How can we find creative solutions to problems if we've been taught that the most creative endeavors have no value?
Well, first, her perceptivity absolutely took my breath away.  Her observations are not only spot-on, they are even more pertinent than she may have realized, because just yesterday an announcement was made that the Trump administration is considering balancing the federal budget by (amongst other things) eliminating the National Endowment for the Arts.

It brings to mind a similar move that was proposed in England during World War II -- to eliminate funding for the arts in favor of diverting the money to the military.  Winston Churchill famously responded, "Then what are we fighting for?"

Which is exactly it.  Our lives are made immeasurably richer by the arts -- not only the visual arts, but writing, music, theater, film, and dance.  The NEA has supported arts and artists of all genres, not to mention programs to encourage the next generation of creative young people.  So you might be asking yourself: why would the new administration target such an organization?

Make no mistake about it: this is an ideologically-based salvo.  It's not about saving money.  The NEA's slice of the federal budget last year was $148 million out of a $3.9 trillion total, a proportion that Philip Bump puts in perspective:
If you were at Thanksgiving and demanded a slice of pecan pie proportionate to 2016 NEA spending relative to the federal budget, you'd end up with a piece of pie that would need to be sliced off with a finely-tuned laser.  Put another way, if you make $50,000 a year, spending the equivalent of what the government spends on these three programs would be like spending less than $10.
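Bump's pie-slice image is easy to verify with a bit of arithmetic.  Here's a quick sketch using only the figures quoted above (the $50,000 household is Bump's own example; note that his "less than $10" covers three programs combined, not the NEA alone):

```python
nea_budget = 148_000_000            # 2016 NEA appropriation, per the post
federal_budget = 3_900_000_000_000  # total federal budget, per the post

# What fraction of federal spending is the NEA?
share = nea_budget / federal_budget
print(f"NEA share of the budget: {share:.6%}")  # roughly 0.0038%

# Scale that same share down to a household making $50,000 a year
household_income = 50_000
print(f"Household equivalent: ${share * household_income:.2f}")  # about $1.90
```

Even if the other two programs Bump mentions are a few times the NEA's size, the combined household equivalent still lands comfortably under his ten-dollar ceiling.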
The conservative powers-that-be have targeted the arts for one reason and one reason only: artists are not controllable.  If you give people the power to create, they will do so -- but they won't necessarily create something that makes your political party, religion, or gender comfortable.  One of the most widely-publicized examples of this is the NEA-supported work of American photographer Andres Serrano, who made headlines (and received death threats) for his piece Piss Christ, a photograph of a crucifix submerged in a jar of urine.

Sometimes the role of art is to shock, to jolt us out of our complacency.  I know as a writer, I am conscious of the fact that I'm writing to entertain -- but at the same time, if my readers' brains are the same when they're done with my book as they were when they started, I've failed.  All of the arts are about expanding our awareness -- twisting our minds around so we see things in a different way.

That twisting process isn't necessarily comfortable.  And for those who value conformity -- those who would like to see everyone follow the rules and march in tempo and draw inside the lines -- it can be profoundly frightening.  But that's exactly why we need the arts.  The capacity for turning your brain around and altering your perspective is not learned by rote.

And we'll need that sort of creativity, considering some of the issues we're currently facing.  As Albert Einstein put it, "We can't solve problems by using the same kind of thinking we used when we created them."


So this ideological shot-across-the-bow needs to be fought, and fought hard, even if you haven't always agreed with every project the NEA has supported.  We need our artists, and more importantly, we need our government and business leaders, our doctors, scientists, educators, and engineers to have the skills that the arts teach.  As my student put it -- if we devalue the arts, we devalue the creative approach to all aspects of life.

And to the artists, writers, musicians, actors, dancers, and all other creative people out there: keep creating.  Keep exploring, keep pushing the boundaries, keep making us see the universe in a different way.  Don't let your unique voice be silenced.  Even though things seem dark right now, recall what one of my favorite visionaries -- J. R. R. Tolkien -- put in the mouth of his iconic character Frodo Baggins, as he faced the overwhelming might of Mordor:  "They cannot win forever."

Monday, January 16, 2017

Sifting fact from fiction

President-elect Donald Trump's latest ploy, any time he is criticized in the press, is to claim that what they're saying is "fake news."  (That, and to threaten to revoke their right to cover his speeches.)

Five days ago, he tweeted (of course, because that's how adults respond to criticism) that the Russian dossier alleged to contain compromising information on him was "fake news and crap."  The, um, interaction he is alleged to have had with some Russian prostitutes was likewise "fake news, phony stuff, it did not happen."  About CNN, he said the "organization's terrible...  You are fake news."  He's banned reporters from The Washington Post from attending his events, calling the paper "incredibly inaccurate... phony and dishonest."

[image courtesy of the Wikimedia Commons]

There are two things that are troubling about this.

One is that Trump himself has been responsible for more than one demonstrably false claim intended to do nothing but damage his opponents.  Kali Holloway of AlterNet found fourteen, in fact.  Trump either created, or was responsible for publicizing, claims such as the following:
  • Barack Obama was a Kenyan Muslim and never attended Columbia University
  • Hillary Clinton was covering up a chronic debilitating illness and was too sick to serve
  • Ted Cruz's father was involved in the plot to kill John F. Kennedy
  • Thousands of Muslims in and around New York City had a public demonstration to cheer the events of 9/11
  • Supreme Court Justice Antonin Scalia was murdered
  • 97% of the murders in the United States are blacks killing other blacks (when confronted on this blatantly false claim, he said, "It was just a retweet... am I going to check every statistic?")
  • Millions of votes in the presidential election were cast illegally
  • Climate change is a Chinese hoax
  • Vaccines cause autism -- and the doctors opposing this fiction deliberately lied to cover it up
And so on and so forth.

So Trump calling out others for fake news should definitely be an odds-on contender for the "Unintentional Irony of the Year" award for 2017.

The more upsetting aspect of this, however, is that Trump is implying that you can't trust anything in the media -- except, of course, what comes out of his mouth.  The implication is that nothing you see on the news or read in the newspaper is true, and that the default stance should be to call it all fake.

This is a profoundly disturbing claim.  For one thing, as I've said many times before, cynicism is no more noble (or correct) than gullibility; disbelieving everything is exactly as lazy and foolish as believing everything.  For another, the media are really our only way of finding out what is happening in the world.  Without media, we would not only have no idea what was going on in other countries, our own government would be operating behind a smokescreen, their machinations invisible to everyone but those in on the game.

Which is a fine way to turn a democracy into a dictatorship.

There is some small kernel of truth to the accusation, however; it is true that all media are biased.  That CNN and MSNBC slant to the left and Fox and The Wall Street Journal slant to the right is so obvious that it hardly bears mention.  To jump from there to "everything they say is a lie," however, is to embrace a convenient falsehood that allows you to reject everything you hear and read except for what fits with your preconceived notions -- effectively setting up your own personal confirmation bias as the sine qua non of understanding.

The truth, of course, is more nuanced than that, and also far more powerful.  We are all capable of sifting fact from fiction, neither believing everything nor rejecting everything.  It's called critical thinking, and in these rather fractious times it's absolutely... well, critical.  As biologist Terry McGlynn put it, "When we teach our students to distinguish science from pseudoscience, we are giving them the skills to identify real and fake journalism."

I won't lie to you.  Sorting fact from fiction in the media (or anywhere else) is hard work, far harder than simply accepting what we'd like to believe and rejecting what we'd like to be false.  But it's possible, and more than that, it's essential.  Check sources -- even if (especially if) they're from your favorite media source.  Check them using sources that have a different slant.  Go to the original documents instead of merely reading what someone else has written about them.  Apply good rules of thumb like Ockham's Razor and the ECREE (Extraordinary Claims Require Extraordinary Evidence) principle.  Pay special attention to claims from people who have proven track records of lying, or people who are making claims outside of their area of expertise.

Donald Trump's snarling of "fake news, phony journalism" every time he's criticized should immediately put you on notice that what he's saying is questionable -- not (again) that it should be disbelieved out of hand, but that it should be scrutinized.  Over the next four years, people on both sides of the aisle are going to have to be on guard -- never in my memory has the country been so polarized, so ready to begin that precipitous slide into sectarian violence that once begun is damn near impossible to halt.  Our leaders are showing no inclination to address the problems we face honestly and openly -- so it falls to us as responsible citizens to start sifting through their claims more carefully instead of simply accepting whatever half-truths or outright lies fit our preconceived notions.

Wednesday, November 30, 2016

Educating more than the sheep

I have had frequent cause to bemoan the fact that we in the educational establishment are teaching 21st century students using a 19th century model.

Let me explain what I mean.  Back in the 19th and early 20th centuries, it was critical for a well-educated person to know lots of facts.  If you were conversing with a doctor about your health, and you didn't know the names of basic human organs and tissues, you were likely to be entirely lost, and unless you had a medical text handy, there was no way to figure it all out.  On a less dire level, even when I was a kid (1960s and 70s), if you didn't know something -- perhaps even a simple fact, like the name of the cellular structure that provides cells with energy -- you had to go look it up in an encyclopedia or textbook, if you were lucky enough to own them.  Failing that, you took a lengthy trip to the library to see if you could dig it up.

Or you just decided that it wasn't worth the time and stopped worrying about it.

(Nota bene: it's the mitochondria.)

Now?  Most students have access either to cellphones or to other internet-connected devices.  Access to facts and terminology is trivial.  Sometimes a student will ask me something I don't know the answer to -- such as yesterday, when someone wanted to know the gestation period of a sheep -- and within seconds, answers are being shouted out from all over the room.

(Nota bene: it's 152 days.)

[image courtesy of the Wikimedia Commons]

Far more important than simple facts are two things, one of which is taught less often than mere terminology, and the other of which is hardly taught at all.  The more common one is process.  Not just the name "mitochondria," but how they go about breaking down glucose to release energy for cellular function.  Not just the names of Mendel's laws of genetics, but why they work (and why there are cases where they don't -- thus, "non-Mendelian inheritance").

Process, though, is hard to teach.  It requires not only that the teacher thoroughly understand it, but that (s)he find ways to make the subject accessible to students.  It's much easier simply to teach laundry lists of disconnected facts and terms -- but I would question whether such a thing is actually "education."

Teaching process, though, is downright common when compared to the other more important skill, which is how to tell false claims from true ones.  Okay, fine, you can look something up on your cellphone, tell us the gestation period of a sheep in five seconds flat.  How do you know if it's right?  How could you tell if it were false?  What does it mean if the source of the information has a bias or an agenda -- admittedly unlikely in the case of pregnant sheep, but a huge deal with respect to science, current events, or politics?

The sad truth that today's students are not being taught to sift fact from fiction was highlighted by a study released last week by researchers at Stanford University.  It came to the rather horrifying conclusion that middle school, high school, and college students, when presented with various combinations of news articles, opinions, outright falsehoods, biased stories, "sponsored content" (i.e., advertisements), and unsupported claims, couldn't tell one from another.  Across the board, students scored very poorly on their ability to question source validity, discern bias, and tell real news from fake news.

"Many assume that because young people are fluent in social media they are equally savvy about what they find there," said Sam Wineburg, lead author of the study.  "Our work shows the opposite... What we see is a rash of fake news going on that people pass on without thinking.  And we really can't blame young people because we've never taught them to do otherwise."

To combat this, however, would take a major overhaul of the way we teach.  Unlikely, given the increasing reliance on easy-to-measure "learning standards" -- most of which are taught and assessed using shallow, vocabulary-based factoids, not deep understanding (which is hard to quantify, and therefore to the policy wonks at the state and federal Departments of Education, doesn't seem to matter).  Couple this with the ongoing slicing of funding from public schools, and you can easily see why there's a significant incentive to keep doing things the old way.

But as the study by Wineburg et al. shows, what we're doing is inadequate for preparing young people to be smart consumers of media in the 21st century.  It's no wonder "fake news" has gotten such traction; the consumers can't tell it from the real thing.  Unsurprising, too, that our tendency to place ourselves in echo chambers -- where we hear only opinions we already hold, and therefore are unlikely to question -- makes for increasing political polarization, and for people making decisions based on what they think they understand rather than on the actual facts.

If this is going to change, we'll need a bottom-up revamping of how teaching is done, and a rethinking of what it means to educate children in the 21st century.  Otherwise, we'll fall victim to the old adage -- "If you always do what you've always done, you'll always get what you've always got."

Saturday, October 8, 2016

Skeptic's curriculum

Thanks to a forward-thinking principal about ten years ago, my high school developed an electives program based on the philosophy that there needs to be more than one path to graduation.  He said to the teachers, "If there's a topic you're passionate about and have always wanted to teach, now's your chance.  Put together a proposal for the school board.  If it flies, go for it!"

This was the genesis of the Critical Thinking class that it is my privilege to teach.  I was given the green light to develop the curriculum, and (if I can indulge in a moment of self-congratulation here) it has become one of the most popular electives in the school.

Critical thinking is a skill, and like every skill, it (1) doesn't necessarily come naturally, but (2) becomes easier the more you do it.  As humans, we come pre-programmed with a whole host of cognitive biases we have to learn to work around -- dart-thrower's bias (the tendency to pay more attention to outliers), a natural bent for magical thinking, the unfortunate likelihood of our memories being malleable, inaccurate, or outright false.  But with time and effort, you can learn some strategies for sifting fact from fiction, and for detecting when you're being hoodwinked or misled.

In other words, a skeptical approach can be taught.

I'm delighted to say that great strides are being taken in this area outside of my little rural school district.  Right now, a pilot program in Uganda, led by Sir Iain Chalmers of the Cochrane Collaboration, is testing a new curriculum for critical thinking about health and medicine with 15,000 grade-school children.  Chalmers is unequivocal about the program's intent; what he wants, he says, is for kids to be able to "detect bullshit when bullshit is being presented to them."

[image courtesy of the Wikimedia Commons]

It's an essential skill.  Here in the west we have such purveyors of health woo as Dr. Oz, Joel Wallach,  Joseph Michael Mercola, and Vani "The Food Babe" Hari persuading people that their food is contaminated by "chemicals," their prescription medications are poisoning them, and that diseases are caused by everything from not having enough "natural minerals" to disturbances in quantum vibrations.  Modern medical practitioners, they tell us, are being held hostage by "Big Pharma" to fool us all and make money hand over fist, and all the while we get sicker and sicker.

Yes, I know that in the industrialized world we have the highest human life expectancy the world has ever seen, and we've virtually eradicated dozens of infectious diseases using exactly the sort of "allopathic" medicine that Oz and his cronies rail against.  This isn't about fact; it's about being swung around by your fears and emotions.

But we're not the only place in the world that has this problem.  Central Africa, where Chalmers's trial is being run, is a hotbed of superstition, with people rejecting vaccines and antibiotics in favor of "herbal remedies" based on fear.  Quack cures are common -- for example, putting cow dung on burns.  Allen Nsangi, a researcher in Uganda who is working with Chalmers on the project, said that this practice is "almost the best-known treatment."

The Uganda project was the brainchild of Andy Oxman, research director at the Norwegian Institute of Public Health. "Working with policymakers made it clear most adults don’t have time to learn, and they have to unlearn a lot of stuff," Oxman said.  "I’m looking to the future. I think it’s too late for my generation... My hope is that these resources get used in curricula in schools around the world, and that we end up with the children ... who become science-literate citizens and who can participate in sensible discussion about policy and our health."

All of which I find tremendously encouraging.  (Not the part about my generation being a lost cause, because I don't really think that's true, honestly.)  If we can equip children with a good skeptical toolkit, they'll be much less likely to get taken advantage of -- not only in the realm of health, but in every other way.  These skills aren't limited to one discipline.  Once you've adopted a skeptical outlook, you'll find that you apply it to everything.

At least that's my hope.  It's certainly what I've seen in my own classes.  As one of my students told me not long ago, "I thought at first that it was impossible to do what you were asking us to do -- to read and listen to evaluate, not just to memorize and regurgitate.  But now I can't help myself.  When I read something, I think, 'Okay, how do I know this is true?  What's the evidence?  Could there be another explanation?'"

Which is exactly it.  Skepticism isn't cynicism; disbelieving everything out of hand is as lazy as gullibility.  But it's essential that we learn to weigh what we're hearing rather than simply trusting that we're being told the truth.  As Satoshi Kanazawa put it: "There are only two legitimate criteria by which you may evaluate scientific ideas: logic and evidence."

Monday, January 12, 2015

An aura of divergent thinking

First, let me just say that I love my students.

Far from conforming to the slacker, disaffected teenage stereotype, I find that nearly all of my students are natural questioners, are interested in the world around them, and are willing to be engaged with learning.  We as teachers have only to hook on to that energy, avoid putting a bell jar over the flame of their inborn curiosity, and half the battle over "higher standards and academic achievement" will be won.

Take, for example, my Critical Thinking classes.  An ongoing exercise we do weekly throughout the semester is media analysis; students submit an analysis of an example from popular media as an illustration of some concept we've studied in the previous weeks.  We look in turn at print media, audio/visual media, and online media, but other than that, there are few strictures on what they can turn in to receive credit for this project.

It's amazing what they find.  Once tuned in to a few basic principles of media analysis, high schoolers rapidly become adept at sorting fact from fiction from outright bullshit.

As an example of the last-mentioned, take a look at the site one of my students submitted last week, a little gem called Reading Auras.  In particular, she drew my attention to the page, "Aura Dating for Seniors -- A New Way of Looking at Love."

If you're sitting there thinking, "No... that can't mean what it sounds like...", unfortunately, you're wrong.  This is precisely what it sounds like.

This site is suggesting that senior citizens find new love by comparing the color of their aura with that of a potential significant other.

[image courtesy of the Wikimedia Commons]

It starts out in a remarkably condescending fashion:
One of the more interesting applications of this knowledge is in dating, with special emphasis on senior dating.  Seniors are less familiar with the Internet and because of this they might not be able to give an accurate or complete description of their personality details and likes and dislikes.  Neither they would [sic] be able to describe too well what they are looking for in a partner.
Now that we've established that once you pass the age of 65, you are no longer articulate, let's take a look at the solution:
The aura personality map, in this case, would work like an automatic scanner that reads and translates all that the seniors could not put in words.  Besides, matching the auras is much more accurate in finding the right partner than any other method.  This is because each color of the aura would provide information about the person that would be a better guide to find the right person. 
The relationships that come out from aura colors matching are more meaningful for it is based on the vital energy sources.  This means there would be a better chemistry between the matched senior dates right from the beginning, which in turn would have better chances of developing into a long term significant relationship.
Well, this is correct in one sense; if I were re-entering the Dating Game, I'd want to know right away whether a potential partner thought I had a nice-looking aura, because no way would I want to become romantically involved with someone who sees nonexistent halos around people.

On this site, we also find out that it's not a good thing if you have a brown aura, unless it's "caramel brown," which means that you're "fun;" that you can complement your aura readings by reading a person's tongue, because the tongue's "size, shape, color, and topography" tells you a lot (for example, if your tongue is blue, you have circulatory problems); that your pets have auras, and that if you tune in to your pet's aura (s)he will "show more pleasure than usual;" and that children are naturally adept at seeing auras, and we should encourage them in this rather than dissuading them with silly old narrow-minded rationalist nonsense like teaching them to sort fact from fantasy.

This last-mentioned is at least within hailing distance of the truth.  There is one skill at which children outstrip most adults by a mile, and that's divergent, creative thinking.  A study by Robert McGarvey -- which gained traction largely because the phenomenal speaker Sir Ken Robinson used it as an example of how schools fail -- shows that by one measure of divergent thinking, preschoolers score an average of 84%, and second graders only 10%.  This is, Robinson says, because by second grade, kids have already learned "that there is one answer, and it's in the back of the book -- but don't look."

In my experience, though, you can resurrect this long-suppressed capacity for creative, critical thinking, but it requires teachers to do something that many of us find pretty scary -- to let go of the reins a little.  Turned loose on academics, most students re-engage their curiosity and capacity for divergent thinking quickly.  Take, for example, a recent study showing that when students are given the opportunity to make choices about what they read for their classes, they read more often and more enthusiastically.  Who wouldn't?  It doesn't take a Rhodes Scholar to see that autonomy is a motivator.  As literacy advocate Pam Allyn put it, "You become a lifelong reader when you're able to make choices about the books you read, and when you love the books you read.  You tend to get better at something you love to do."

But our response, as educators, has been to tighten down more, to place more restrictions on how students learn and on how they demonstrate that they have learned, all behind the rallying cry of "raising standards."

As my student's analysis of the aura website showed, when given the opportunity to dig into a topic, students are capable of doing so with gusto.  My student's presentation of her media submission to the class began with the statement, "This may be the most extreme example of confirmation bias that I've ever seen -- these people are literally seeing what they want to see."

But are we, as educators, doing the inverse of that -- not seeing what we would prefer not to see?  More student autonomy, more divergent thinking, more ways of getting to answers, more ways of expressing them?

And how much of that reluctance comes from our conviction that there should be only a single way to learning?

It seems fitting to end with a quote from Sir Ken Robinson:
We have to go from what is essentially an industrial model of education, a manufacturing model, which is based on linearity and conformity and batching people.  We have to move to a model that is based more on principles of agriculture.  We have to recognize that human flourishing is not a mechanical process; it's an organic process.  And you cannot predict the outcome of human development.  All you can do, like a farmer, is create the conditions under which they will begin to flourish.

Saturday, June 21, 2014

Seeds of doubt

C'mon, people, it's time to grow up a little.

When we're toddlers, we accept things without question.  If our parents say something, we pretty much believe that it must be true.  (Whether we do what they tell us afterwards, though, is another issue.)  After a time, we start experimenting, and testing the world -- sometimes with unfortunate results, such as when we decide to find out why Mommy says that Mr. Finger and Mr. LightSocket can't be friends.

But this highlights an important principle, which is that our first and best way to find out about things is by finding evidence.  "Show me why" is a pretty important first step to knowledge.

It's not the last step, though.  After the "show me why" stage we should move on to "but how do you know it's true?", which is a deeper and more sophisticated question.  Okay, from the evidence of my eyes, it looks like the Sun is moving across the sky.  In order to move past that to the correct explanation, we have to ask the question, "What if there is a better explanation that still accounts for all of the evidence?"

And in this case, of course, it turns out that there is.

There are other facets to this mode of inquiry.  What confounding factors could there be?  What if there are uncontrolled variables?  What if the person who made the original claim was lying?  What if my preconceived biases made me misjudge the evidence, or (perhaps) ignore some of the evidence entirely?  What if there is correlation between A and B, but instead of A causing B, B causes A -- or, perhaps, some third factor caused them both?
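That last trap -- a hidden third factor driving both A and B -- is easy to demonstrate with a quick simulation.  This is a minimal sketch in Python (the variables and numbers are mine, purely for illustration): C influences both A and B, A and B never interact, and yet they come out strongly correlated.

```python
import random

random.seed(42)
n = 10_000

# C is the hidden confounder; A and B each depend on C plus their own noise.
# Crucially, A never appears in the formula for B, or vice versa.
c = [random.gauss(0, 1) for _ in range(n)]
a = [ci + random.gauss(0, 0.5) for ci in c]
b = [ci + random.gauss(0, 0.5) for ci in c]

def pearson(x, y):
    """Pearson correlation coefficient, computed from scratch."""
    mx, my = sum(x) / len(x), sum(y) / len(y)
    cov = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
    vx = sum((xi - mx) ** 2 for xi in x)
    vy = sum((yi - my) ** 2 for yi in y)
    return cov / (vx * vy) ** 0.5

r = pearson(a, b)
print(f"correlation between A and B: {r:.3f}")
```

The correlation comes out close to 0.8 even though A and B are causally unconnected -- which is exactly why "correlated" can never simply be read as "caused."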

This whole process is what is collectively known as "Critical Thinking."  What is unfortunate, though, is that a lot of people seem to be stuck at the "I see evidence, so it must be true" stage, which is probably why the whole WiFi-kills-plants thing is making the rounds of social media... again.  Just a couple of days ago, a friend of mine ran across it, and asked the right question: "can this actually be true?"


The claim is that five ninth-graders from Denmark had noticed that if they slept near their WiFi routers, they "had trouble concentrating in school the next day."  Because clearly, if ninth graders are distracted, it must be because of WiFi.  So the kids allegedly set up an experiment with cress seeds, placed some near a router, and had others in a "room without radiation," and had the results pictured above.

Well.  The whole thing is suspect from the get-go, because we're told nothing about other conditions the seeds were experiencing -- light, humidity, temperature, air flow, and so forth.  Was the "room without radiation" well-lit?  Were the seeds near the router warmer than the supposed control group?  There are a hundred things about this so-called experiment that we're not being told, and yet we're supposed to buy the results -- in spite of the fact that "control all variables but one, or the results are suspect" is the first thing taught in high school science classes.  (For a nice take-apart of this "experiment," take a look here -- and note, especially, that attempts to replicate the girls' experiment have not produced any results.)

What else?  First, it's from Spirit Science, a notorious peddler of woo.  Second, unless they were in a lead-lined vault, I doubt whether the control seeds were actually in a "room without radiation."  Even if you're some distance from the nearest router, you (and your room) are constantly being pierced by radio waves, which pass easily through most solid objects -- if they didn't, old-fashioned pre-cable televisions and almost all modern radios wouldn't work inside houses or cars.  Then there's the issue of how many thousands of WiFi routers in the world are sitting near perfectly healthy house plants -- for years, not just for thirteen days.  And even if WiFi did kill cress seeds, there's no guarantee that it would have the same (or any) effect on humans.  Don't believe me?  Go for a nice swim in the ocean, and then pour a cup of seawater on your marigolds, and see if the results are the same.  (In all seriousness, researchers face this all the time when developing medications -- therapies that work well in vitro or on lab animals might have different effects on human subjects.)

So to the people who are unquestioningly passing this around, just stop.  Exercise something past the You-Showed-Me-A-Picture-So-It's-True level of critical thinking.  If you see something that seems suspect, ask someone who might know the answer (as my friend did with this claim).  Or, in this day of information accessibility, you could simply Google "cress seeds WiFi experiment debunked" and you'll find everything you need to know.

We all were toddlers once, and no harm done, unless you count unfortunate encounters with light sockets.  But let's exercise a little higher-level thinking here, and not just accept whatever comes down the pike.

Friday, June 29, 2012

The critics of critical thinking

I fear for the future of education.

I am about tennish-or-so years from retirement, depending on whether New York State decides in the interim to offer any retirement incentives to get us old guys out, and also whether there's any money to pay for my pension by the time I get there.  Be that as it may, I do find myself wondering sometimes how much longer I'll be able to do this job in this increasingly hostile climate.  Teachers are, more and more, being treated with distrust by the people charged with their governance, and are micromanaged to a fare-thee-well.  As of next school year, New York teachers are going to be given a numerical grade at the end of the year -- the school year starts in two months and the state has yet to determine the formula by which this grade will be calculated.

The worst part, though, is the increasingly intense effort by legislators to control what we teach, despite the fact that they're not the ones who have training in pedagogy (or, necessarily, any expertise in educational policy).  And I'm not just talking here about the repeated attempts by fundamentalist elected officials to mandate the teaching of creationism in biology classrooms; I'm talking about something far scarier, and further reaching. 

Yesterday, a friend of mine who lives in Texas sent me a link to the Texas GOP website, which contains a summary of their official platform.  (The platform itself is a pdf, so here's a link to a webpage where you can access it if interested.)  And on page 13, under "Educating Our Children," we find the following:
Knowledge-Based Education – We oppose the teaching of Higher Order Thinking Skills (HOTS) (values clarification), critical thinking skills and similar programs that are simply a relabeling of Outcome-Based Education (OBE) (mastery learning) which focus on behavior modification and have the purpose of challenging the student’s fixed beliefs and undermining parental authority.
This was one of those "I can't be reading that correctly" moments for me; I read it three times, and finally said, with some incredulity, "Nope, that's what it actually says."  They're against critical thinking?  They're against values clarification?  Education should never challenge a student's fixed beliefs?

I'm sorry, Texas GOP.  That's not just wrong, it's dangerously wrong.  Might I remind you that the most successful historical example of what you're proposing was the Hitler Youth program in Nazi Germany?

Even the word education, at its origin, doesn't mean "shut up and memorize this;" the word comes from the Latin verb educere, which means "to lead out, to draw out."  The idea is to give students ownership and pride in their own learning, to encourage them to draw out from their own minds creative solutions to problems and novel syntheses of the facts they've learned.  In order to accomplish this, critical thinking is... well, critical.  Great innovation does not come from blindly accepting the fixed beliefs and authority of your parents' generation -- it comes from questioning your own assumptions, and putting what you know together in a new, unexpected way.

And for me personally, I'm not going to stop challenging.  In fact, I teach a semester-long elective class called Critical Thinking that is one of the most popular electives in the school, and on the first day of class, I walk in and say, "Hi, class.  My name is Mr. Bonnet.  Why should you believe anything I say?"

After a moment's stunned silence, someone usually says, "Because you're a teacher."  (Every once in a while some wag will shout, "We don't!"  To which I respond, "Good!  You're on the right track.")  To those who say, "Because you're a teacher," I say, "Why does that matter?  Could a teacher be wrong?  Could a teacher lie?"

Of course, they acquiesce (some of them with a bit of discomfort).  So then I repeat my question; why would you believe what I'm saying?

This starts us off on an exploration of how you tell truth from lies; how you detect spin, marketing, bias, and half-truth; how to recognize logical fallacies; how to think critically in the realm of ethics and morals; and we end by taking apart the educational system, to give a thoughtful look at its successes and failures.  And (importantly!) I never once interject my own beliefs; I needle everyone equally.  When a student presses me to tell the class what I believe on a particular subject, my stock response is, "What I believe is irrelevant.  My job is to challenge you to examine your own beliefs, not to superimpose mine."

And this sort of thing is, apparently, what the Texas GOP would like to see eliminated from schools.  We mustn't have kids doubting the wisdom of the Powers-That-Be.  We must keep education in the realm of the vocabulary list and worksheet packet.  We mustn't challenge the status quo.  (And the darker, more suspicious side of my brain adds, "And we mustn't have the younger generation recognizing it when they're being lied to or misled.")

Well, I'm sorry.  You're wrong.  What you're suggesting is the very antithesis of education.  And the day I'm told that I can't do this any more -- that my teaching can't provoke, can't knock kids' preconceived notions off balance, can't ask the all-important question "Why do you think that?" -- that will be my last day in the classroom, because there won't be any place left in education for teachers like me.

Tuesday, June 19, 2012

Ten days till book release!

I'm going to take a brief diversion from our regularly-scheduled analysis of irrational nonsense to do a public service announcement (or shameless self-promotion, depending on how you see it).  I will be publishing (for Kindle [Amazon] and Nook [Barnes & Noble]) a collection of essays, the best of Skeptophilia, ten days from now -- it will be available on Friday, June 29, Lord willin' an' the creek don't rise.

Reasons you will want to buy this book:
  • It will be an opportunity to have all of your favorite essays from this blog in one place.
  • It contains lovely photographs of UFOs, spirits, and animals that probably don't exist, the latter including Bownessie, Japanese Sky Jellyfish, the Beast of Gévaudan, and Florida Skunk Apes.
  • You will find out why I still occasionally get hate mail from a bunch of irate British ghost hunters.
  • You will hear why a pissed off Young-Earth Creationist sent me a three-page long screed in which he referred to me as a "worthless wanker."
  • The cover photograph, which was designed and shot by the phenomenal Alex Solla, features me wearing a kickass sequined turban, to wit:

The collection contains 120 essays, each of them a wry, humorous, and occasionally incredulous look at why people believe crazy, counterfactual nonsense.

I hope you'll support my ongoing mission to foster critical thinking, rationalism, and skepticism -- both by continuing to read this blog, and also by buying this book (and reviewing it and recommending it to all of your friends).

Okay, that's it for the advertisement, at least for now.  Tomorrow, we'll be back to our regularly-scheduled hijinks. 

Saturday, April 21, 2012

You're my type

A few days ago I posted an article about a claim that Rh negative individuals are descended from aliens, or possibly Jesus, and that this allows them to have a variety of superpowers.  The upshot of writing this is that my blog has been bombarded by a slew of advertisements revolving around blood types (not to mention aliens and Jesus), including one that claimed that before dating, you should always check your potential romantic interest's blood type.

Intrigued, I clicked the link, and after about a half-hour's rooting around online (during which thousands of innocent cells in my prefrontal cortex were subjected to unmentionable agony) I found this site, which seems to have the most detail about the whole thing.  It turns out that for some years now, woo-woos in Japan have claimed that your blood type (just the A/B/O group, not the Rh group; almost no one in Japan is Rh negative) influences your personality.  And of course, there's no way that Americans are going to read about any damnfool unscientific idea without a significant number of them going, "Wow, I never thought of that!"  Especially if the idea originated in Japan, which always seems to add a nice cachet of credibility.  So this has led to a whole new branch of personality-analysis pseudoscience, as if astrology wasn't enough.

According to Natalie Josef, the writer on the above website, not only does your blood type tell you your personality and who you should try to hook up with, it also predicts what career you should pursue:
Type O
You are the social butterflies. Often popular and self-confident, you are very creative and always seem to be the center of attention. You make a good impression on people and you’re often quite attractive. Organized and determined, your stubbornness will help you reach your goals. You make good leaders. Lovewise, O is most compatible with O and AB. Common career choices: banker, politician, gambler, minister, investment broker, and pro athlete.
Type A
Type As may seem calm on the outside, but inside, you’re filled with anxiety and worry. You’re perfectionists and often shy and sensitive. Usually introverted, you’re stable and thoughtful. You make good listeners and are sensitive to color and your surroundings. You like to be fashionable and are up on the latest trends, but never flashy or gaudy. You like romantic settings and often shun reality for fantasy worlds. A is most compatible with A and AB in the love department. Common career choices: accountant, librarian, economist, writer, computer programmer, and gossip columnist.
Type B
You can be very goal-oriented and often complete the ambitious tasks set before you. Outgoing and very charming, you’re good at reading people and providing support. Though critical of appearance (but not your own), you aren’t picky and are unlikely to dwell over the little things. Type Bs are impulsive individualists who often create their own path in life. You are very strong and optimistic. B is most compatible with B and AB lovers. Common career choices: cook, hairdresser, military leader, talk show host, and journalist.
Type AB
Not surprisingly, ABs can be quite dualistic, possessing both A and B traits. You may be shy and outgoing, and hesitant and confident. You often stand out from others, don’t like labels, and are nice and easy going. You are logical and determined to do things correctly. Usually trustworthy, you like to help others. You often speak in a serious manner. Your patience, concentration, and intelligence are admirable. AB can find a soul mate with any other blood type. Common career choices: bartender, lawyer, teacher, sales representative, and social worker.
Well, I'm a type A, and I have to admit that I am a bit of a border collie, personality-wise; but as far as being "fashionable," all I can say is that usually I go to work looking like I've been put through a dryer without "Cling-Free."  I probably own an iron, but I have no idea where it is, and my idea of color matching usually revolves around the concept of "everything goes well with khaki."  And in the career department, "writer" is an obvious hit, but the other ones ("Gossip columnist?" "Accountant?" What the hell?) are, shall we say, not very accurate.

What strikes me about all of this is the usual dart-thrower's bias phenomenon; we tend to notice the hits and ignore the misses.  But really, come on.  Are you really claiming that there are only four basic personality types?  Even the astrologers divide all of humanity twelve ways; the best you can do is four?
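How easy is it to "feel" accurate by chance alone?  Here's a toy sketch -- my own back-of-the-envelope numbers, not anything from the article -- assuming each blood-type description lists eight vague traits, each of which fits a randomly chosen person about half the time:

```python
import random

random.seed(1)

# Assumed numbers for illustration: 8 generic traits per description,
# each matching any given person with probability 0.5.
trials = 100_000
traits_per_desc = 8
p_match = 0.5

# For each simulated reader, count how many traits happen to fit them
hits = [
    sum(random.random() < p_match for _ in range(traits_per_desc))
    for _ in range(trials)
]

# Fraction of readers for whom at least half the description "rings true"
at_least_half = sum(h >= 4 for h in hits) / trials
print(f"fraction matching at least half the traits: {at_least_half:.2f}")
```

Under those assumptions, roughly two-thirds of readers will find that at least half of any given description fits them -- and since we notice the hits and forget the misses, that's more than enough to feel like uncanny accuracy.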

Then, after reading the article, I made the mistake of scrolling down to the comments.  This is, as I have mentioned before, usually a mistake.  My favorite one was the second comment, which picked up on the article's point that in Japan, believers in the whole blood-type-is-destiny idea don't like ABs very much.  This reader was upset by that:
Kudos on your article Natalie. I love learning something new all the time. I'm an AB+ as well, plus Asian astrology sign of Fire Horse. Not only did they abort as many unborn fire horses back in 1966 as they were able, (fire was considered an undesirable element with horse sign) but now I find out they also wouldn't want me due to my blood type! However, I have to say I love Asian food!
Okay.  Sure.  "Fire horses."  "Fire horse" + AB = "really bad."  But at least I like shrimp fried rice!  Yay!

I have to admit to deep mystification as to why an obviously absurd idea could possibly convince anyone, and I'm forced to the conclusion that the main problem is that a large fraction of humanity has no real understanding of the principles of scientific induction.  We are so immersed in a world of advertising claims, political sound-bites, and media glitz that "well, that sounds right!" has become the gold standard for belief.  Remarkably few people, upon reading a claim, seem even to take the next step, which is to ask the question, "how do I know that claim is true?", much less go on to asking, "if it is true, how could that possibly work?"  All in all, it makes me realize that as a science teacher, I have my work cut out for me.