Skeptophilia (skep-to-fil-i-a) (n.) - the love of logical thought, skepticism, and thinking critically. Being an exploration of the applications of skeptical thinking to the world at large, with periodic excursions into linguistics, music, politics, cryptozoology, and why people keep seeing the face of Jesus on grilled cheese sandwiches.
Showing posts with label logic. Show all posts

Thursday, May 8, 2025

Fact blindness

[Spoiler alert!  This post contains spoilers for the most recent Doctor Who episode, "Lucky Day."  If you're planning on watching it and would prefer not to know about the episode's plot, watch it first -- but don't forget to come back and read this.]

In his book The Magician's Nephew, C. S. Lewis writes the trenchant line, "The trouble with trying to make yourself stupider than you actually are is that you usually succeed."

In one sentence, this sums up the problem I have with cynics.  Cynicism is often glorified, and considered a sign of intelligence -- cynics, so the argument goes, have "seen through" the stuff that has the rest of us hoodwinked.  It's a spectrum, they say, with gullibility (really dumb) on one end and cynicism (by analogy, really smart) on the other.

In reality, of course, cynicism is no better than gullibility.  I wouldn't go so far as to call either one "dumb" -- there are a lot of reasons people fall into both traps -- but they're both equally lazy.  It's just as bad to disbelieve and dismiss everything without thought as it is to believe and accept everything without thought.

The difficulty is that skepticism -- careful consideration of the facts before either believing or disbelieving a claim -- is hard work, so both gullibility and cynicism can easily become habits.  In my experience, though, cynicism is the more dangerous, because in this culture it's become attractive.  It's considered edgy, clever, tough, a sign of intelligence, of being a hard-edged maverick who isn't going to get taken advantage of.  How often do you hear people say things like "the media is one hundred percent lies" and "all government officials are corrupt" and even "I hate all people," as if these were stances to be proud of?

I called them "traps" earlier, because once you have landed in that jaundiced place of not trusting anything or anyone, it's damn hard to get out of.  After that, even being presented with facts may not help; as the old saw goes, "You can't logic your way out of a position you didn't logic your way into."  Which brings us to the most recent episode of Doctor Who -- the deeply disturbing "Lucky Day."

The episode revolves around the character of Conrad Clark (played to the hilt by Jonah Hauer-King), a podcast host who has become obsessed with the Doctor and with UNIT, the agency tasked with managing the ongoing alien incursions on Earth.  Conrad's laser focus on UNIT, it turns out -- in a twist I did not see coming -- isn't because he is supportive of what they do, but because he disbelieves it.


To Conrad, it's all lies.  There are no aliens, no spaceships, no extraterrestrial technology, and most critically, no threat.  It's all been made up to siphon off tax money to enrich the ones who are in on the con.  And he is willing to do anything -- betray the kindness and trust of Ruby, who was the Doctor's confidant; threaten UNIT members who stand in his way; even attempt to murder his friend and helper Jordan who allowed him to infiltrate UNIT headquarters -- in order to prove all that to the world.

It's a sharp-edged indictment of today's click-hungry podcasters and talk show celebrities, like Joe Rogan, Alex Jones, and Tucker Carlson, who promote conspiracies with little apparent regard for the harm they cause -- and how hard it can be to tell whether they themselves are True Believers or are just cold, calculating, and in it for the fame and money.  (And it's wryly funny that in the story, it's the people who disbelieve in aliens who are the delusional conspiracy theorists.)

The part that struck me the most was at the climax of the story, when Conrad has forced his way into UNIT's Command Central, and has UNIT's redoubtable leader, Kate Lethbridge-Stewart, held at gunpoint.  Kate releases an alien monster not only to prove to Conrad she and the others have been telling the truth all along, but to force his hand -- to make him "fish or cut bait," as my dad used to say -- and finally, finally, when the monster has Conrad pinned to the floor and is about to bite his face off, he admits he was wrong.  Ruby tases the monster (and, to Conrad's reluctant "thank you," tells him to go to hell -- go Ruby!).

But then, as he stands up and dusts himself off, he looks down at the monster and sneeringly says, "Well, at least your props and costumes are getting better."  And the monster suddenly lurches up and bites his arm off.

That's the problem, isn't it?  Once you've decided to form your beliefs irrespective of facts and logic, no facts or logic can ever make you budge from that position.

The world is a strange, chaotic place, filled with a vast range of good and bad, truth and lies, hard facts and fantasy, and everything in between.  If we want to truly understand just about anything, we can't start from a standpoint either of gullible belief or of cynical disbelief.  Yes, teasing apart what's real from what's not can be exhausting, especially in human affairs, where motives of greed, power, and bigotry can so often twist matters into knots.  But if, as I hope, your intent is to arrive at the truth and not at some satisfying falsehood that lines up with what you already believed, it's really the only option.

I'm reminded of another passage from Lewis, this one from the end of his novel The Last Battle.  In it, the main characters and a group of Dwarves, led by one Diggle, have been taken captive and held in a dark, filthy stable.  All around them, the world is coming to an end; the stable finally collapses to reveal that they've all been transported to a paradisiacal land, and that the dire danger is, miraculously, over.  But the Dwarves, who had decided that everyone -- both the Good Guys and the Bad Guys -- were lying to them, still can't believe it, to the extent that they're certain they're still imprisoned:

"Are you blind?" said Tirian.

"Ain't we all blind in the dark?" said Diggle.

"But it isn't dark!" said Lucy.  "Can't you see?  Look up!  Look round!  Can't you see the sky and the tree and the flowers?  Can't you see me?"

"How in the name of all humbug can I see what ain't there?  And how can I see you any more than you can see me in this pitch darkness?"

Further attempts to prove it to them meet with zero success.  They've become so cynical even the evidence of their own eyes and ears doesn't help.  At that point, they are -- literally, in the context of the story -- fact blind.  Finally Diggle snarls:

"How can you go on talking all that rot?  Your wonderful Lion didn't come and help you, did he?  Thought not.  And now -- even now -- when you've been beaten and shoved into this black hole, just the same as the rest of us, you're still at your old game.  Starting a new lie.  Trying to make us believe we're none of us shut up, and it ain't dark, and heaven knows what."

Ultimately Lucy and Tirian and the others have to give up; nothing they can say or do has any effect.  Aslan (the lion referenced in the above passage) sums it up as follows:

"They will not let us help them.  They have chosen cunning instead of belief.  Their prison is only in their own minds, yet they are in that prison; and so afraid of being taken in that they cannot be taken out."
****************************************


Friday, November 1, 2024

Wrongness

I get a lot of negative comments.

It comes with the territory, I suppose, and I knew when I started writing this blog fourteen years ago that I would have to develop a thick skin.  Given the subject matter, there's hardly a post I do that won't piss someone off.  Here's a sampling of comments, and a brief description of the topic that elicited them:
  • You are either ignorant or just stupid.  I'm putting my bet on the latter.  (after a post on machines that are supposed to "alkalinize" water to make it more healthful)
  • Narrow-minded people like you are the worst problem this society faces.  (after a post on "crystal healing")
  • I am honestly offended by what you wrote.  (after a post on alternative medicine)
  • I can't say I warm to your tone.  (after a post on ghost hunting)
  • That is the most ignorant thing I have ever read.  I could feel my IQ dropping as I read it.  (after a post in which I made a statement indicating that I think recent climate change is anthropogenic in origin)
  • I hate smug dilettantes like you.  (after a post on mysticism vs. rationalism)
  • You are a worthless wanker, and I hope you rot in hell.  (from a young-earth creationist)
My skin isn't thick enough that some of these don't sting.  For example, the one that called me a "smug dilettante" has a grain of truth to it; I'm not a scientist, just a retired science teacher, and if my educational background has a flaw it's that it's a light year across and an inch deep.  Notwithstanding that in a previous century people like me were called "polymaths," not "dabblers" or "dilettantes," the commenter scored a point, whether he knew it or not.  I'm well-read, and have a decent background in a lot of things, but I'm not truly an expert in anything.

Other disagreements on this list have been resolved by discussion, which is honestly what I prefer to do.  The comments that came from the posts on alternative medicine and ghost hunting generated fruitful discussion, and understanding (if not necessarily agreement) on both sides.

Most of the time, though, I just don't engage with people who choose to use the "Comments" section (or email) as a venue for snark.  You're not going to get very far by calling me ignorant, for example.  I make a practice of not writing about subjects on which I am ignorant, so even if I make an offhand comment about something, I try to make sure that I could back it up with facts if I needed to.  (Cf. this site, apropos of the individual who thinks I am ignorant for accepting the anthropogenic nature of recent climate change.  Plus, I once had the amazing Bill McKibben give me a thumbs-up for one of my climate change posts, which counts for a great deal.)

That said, what a lot of people don't seem to recognize about me is the extent to which my understanding of the world is up for grabs.  Like anyone, I do have my biases, and my baseline assumptions -- the latter including the idea that the universe is best understood through the dual lenses of logic and evidence.


But everything else?  My attitude is, if you want to try to convince me about Bigfoot or chakras or crystals or astrology or your particular take on religion or anything else, knock yourself out.  But you'd better have the evidence on your side, because even if I am a dilettante, I have read up on the topics on which I write.

I am as prone as the next guy, though, to getting it wrong sometimes.  And I am well aware of the fact that we can slide into error without realizing it.  As journalist Kathryn Schulz said, in her phenomenal lecture "On Being Wrong" (which you should all take fifteen minutes and watch as soon as you're done reading this):
How does it feel to be wrong?  Dreadful, thumbs down, embarrassing.  Those are great answers.  But they're answers to a different question.  (Those are) the answers to the question, "How does it feel to realize you're wrong?"  Realizing you're wrong can feel like all of that, and a lot of other things.  It can be devastating.  It can be revelatory.  It can actually be quite funny...  But just being wrong?  It doesn't feel like anything...  We're already wrong, we're already in trouble, but we still feel like we're on solid ground.  So I should actually correct something I said a moment ago: it does feel like something to be wrong.  It feels like being right.
To those who are provoked, even pissed off by what I write: good.  We never discover our errors -- and I'm very much including myself in this assessment -- without being knocked askew once in a while.  Let yourself be challenged without having a knee-jerk kick in response, and you have my word that I'll do the same.  And while I don't like having my erroneous thinking uncovered any more than anyone else, I will take a deep breath and admit it when I screw up.  I've published retractions in Skeptophilia more than once, which has been a profoundly humbling but entirely necessary experience.

So keep those cards and letters coming.  Even the negative ones.  I'm not going to promise you I'll change my mind on every topic I'm challenged on, but I do promise that I'll consider what you've said.

On the other hand, calling me a "worthless wanker" didn't accomplish much but making me choke-snort a mouthful of coffee all over my computer.  So I suppose that the commenter even got his revenge there, if only in a small way.

****************************************


Wednesday, January 27, 2021

Overcoming the snap

One of the most frustrating things about conspiracy theorists is how resistant they are to changing their minds, even when presented with incontrovertible evidence.

Look, for example, at the whole "Stop the Steal" thing.  There are a significant number of Republicans who still won't acknowledge that Biden won the election fair and square, despite the fact that the opposite claim -- that there was widespread voter fraud that favored the Democrats, and an organized effort by the Left to make it seem like Trump lost an election he actually "won in a landslide" -- has gone to court in one form or another over sixty times, and in all but one case the lawsuit was thrown out because of a complete lack of evidence.  The judges who made these decisions include both Republicans and Democrats; the legal response to "Stop the Steal" has been remarkably bipartisan.

Which, you'd think, would be enough, but apparently it isn't.  An amazingly small number of Republicans have said publicly that they were wrong, that there was little to no fraud -- certainly not enough to sway the election -- and that Biden clearly was the victor.  Mostly, the lack of evidence and the losses in court have caused the True Believers to double down, making them even surer that a vast conspiracy robbed Trump of his win, and that the absence of any factual credibility is the work of an even vaster conspiracy covering it all up.

Essentially, people have gone from "believe this because there's evidence" to "believe this despite the fact there's no evidence" to "believe this because there's no evidence."

[Image licensed under the Creative Commons SkepticalScience, Conspiracy Theories Fallacy Icon, CC BY-SA 4.0]

Once you've landed in the last-mentioned category, it's hard to see what possible way there'd be to reach you.  But there may be hope, to judge by a study that came out last week in The Journal of Personality and Social Psychology.

In "Jumping to Conclusions: Implications for Reasoning Errors, False Belief, Knowledge Corruption, and Impeded Learning," by Carmen Sanchez of the University of Illinois Urbana-Champaign and David Dunning of the University of Michigan (of Dunning-Kruger fame), we find out that there is a strong (and fascinating) correlation between four features of the human psyche:

  • Jumping to conclusions -- participants were given a task in which a computerized character was fishing in a lake.  The lake had mostly red fish and a few gray fish, and the researchers looked at how quickly the test subject was confident about predicting the color of the next fish pulled from the lake.
  • Certainty about false beliefs -- volunteers were given a test of their knowledge of American history, and for each four-answer multiple choice question they were asked how confident they were in their answer.  The researchers looked at people who got things wrong -- while simultaneously being certain they were right.
  • Understanding of basic logic -- participants were given a variety of logic puzzles, such as simple syllogisms (All fish can swim; sharks are fish; therefore sharks can swim), and asked to pick out which ones were sound logic and which were faulty.
  • Belief in conspiracy theories -- test subjects were given a variety of common conspiracy theories, such as the belief that cellphones cause cancer but it's being covered up by big corporations, and asked to rank how likely they thought the beliefs were to be true.
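The fishing task is, at heart, a Bayesian inference problem: each fish pulled from the lake is a piece of evidence, and the question is how much evidence you demand before committing to a conclusion.  Here's a minimal sketch in Python -- note that the two-lake setup and the 80/20 color proportions are my own illustrative assumptions, not the study's actual parameters:

```python
def posterior_mostly_red(draws, p_red=0.8, prior=0.5):
    """Posterior probability that the lake is 'mostly red,' given a
    sequence of draws ('R' = red, 'G' = gray).  Assumes two candidate
    lakes -- 80% red vs. 20% red -- equally likely beforehand."""
    like_red_lake = 1.0   # likelihood of the draws if mostly red
    like_gray_lake = 1.0  # likelihood of the draws if mostly gray
    for fish in draws:
        if fish == 'R':
            like_red_lake *= p_red
            like_gray_lake *= 1 - p_red
        else:
            like_red_lake *= 1 - p_red
            like_gray_lake *= p_red
    numerator = like_red_lake * prior
    return numerator / (numerator + like_gray_lake * (1 - prior))
```

After a single red fish the posterior is only 0.8, and after two reds it's about 0.94 -- so announcing certainty that early is exactly the "jumping to conclusions" the researchers were measuring.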

They found that the faster you are to jump to conclusions on the fish test, the worse you are at logic, and the more certain you are about your beliefs even if they are wrong -- and, most critically, the more likely you are to believe spurious, zero-evidence claims.

So far, nothing too earth-shattering, and I think most of us could have predicted the outcome.  But what makes this study fascinating is that Sanchez and Dunning looked at interventions that could slow people down and make them less likely to jump to false conclusions -- and therefore, less likely to feel certain about their own false or counterfactual beliefs.

The intervention had four parts:

  • An explanation of the "jumping to conclusions" phenomenon, including an explanation of why it happens in the brain and the fact that we are all prone to this kind of thing.
  • An acknowledgement of the difficulty of making a correct decision based on incomplete information.  Test subjects were shown a zoomed-in photo, and then it was zoomed out a little bit at a time, and the test subjects had to decide when they were sure of what they were looking at. 
  • An exercise in studying optical illusions.  Here, the point was to illustrate the inherent flaws of our own sensory-integrative mechanisms, and how focusing on one thing can make you miss details elsewhere that might give you more useful information.
  • A short video of a male jogger who compliments a female street artist, and gets no response.  He repeats himself, finally becoming agitated and shouting at her, but when she reacts with alarm he turns and runs away.  Later, he finds she has left him a picture she drew, along with a note explaining that she's deaf -- leaving the guy feeling pretty idiotic and ashamed of himself.  This was followed up by asking participants to write down snap judgments they'd made that later proved incorrect, and what additional information they'd have needed in order to get it right.

This is where I got a surprise, because I've always thought of believers in the counterfactual as being essentially unreachable.  And the intervention seems like pretty rudimentary stuff, something that wouldn't affect you unless you were already primed to question your own beliefs.  But what Sanchez and Dunning found is that the individuals who received the intervention did much better on subsequent tasks than the control group did -- they were more accurate in assessing their own knowledge, slower to make snap judgments, and less confident about crediting conspiracy theories.

I don't know about you, but I find this pretty hopeful.  It once again reinforces my contention that one of the most important things we can do in public schools is to teach basic critical thinking.  (And in case you didn't know -- I have an online critical thinking course through Udemy that is available for purchase, and which has gotten pretty good reviews.)

So taking the time to reason with people who believe in conspiracies can actually be productive, and not the exercise in frustration and futility I thought it was.  Maybe we can reach the "Stop the Steal" people -- with an intervention that is remarkably simple.  It's not going to fix them all, nor eradicate such beliefs entirely, but you have to admit that at this point, any movement in the direction of rationality is worth pursuing.

****************************************

Just last week, I wrote about the internal voice most of us live with, babbling at us constantly -- sometimes with novel or creative ideas, but most of the time (at least in my experience) with inane nonsense.  The fact that this internal voice is nearly ubiquitous, and what purpose it may serve, is the subject of psychologist Ethan Kross's wonderful book Chatter: The Voice in our Head, Why it Matters, and How to Harness It, released this month and already winning accolades from all over.

Chatter not only analyzes the inner voice in general terms, but looks at specific case studies where the internal chatter brought spectacular insight -- or short-circuited the individual's ability to function entirely.  It's a brilliant analysis of something we all experience, and gives some guidance not only into how to quiet it when it gets out of hand, but to harness it for boosting our creativity and mental agility.

If you're a student of your own inner mental workings, Chatter is a must-read!




Friday, November 17, 2017

Motivated reasoning

Last week there was a paper released in the journal Personality and Individual Differences called "Epistemic Rationality: Skepticism Toward Unfounded Beliefs Requires Sufficient Cognitive Ability and Motivation to be Rational."  Understandably enough, the title made me sit up and take notice, as this topic has been my bread and butter for years.  The authors, Tomas Ståhl (of the University of Illinois) and Jan-Willem van Prooijen (of the Vrije Universiteit Amsterdam), describe their work thus:
Why does belief in the paranormal, conspiracy theories, and various other phenomena that are not backed up by evidence remain widespread in modern society?  In the present research we adopt an individual difference approach, as we seek to identify psychological precursors of skepticism toward unfounded beliefs.  We propose that part of the reason why unfounded beliefs are so widespread is because skepticism requires both sufficient analytic skills, and the motivation to form beliefs on rational grounds...  [W]e show that analytic thinking is associated with a lower inclination to believe various conspiracy theories, and paranormal phenomena, but only among individuals who strongly value epistemic rationality...  We also provide evidence suggesting that general cognitive ability, rather than analytic cognitive style, is the underlying facet of analytic thinking that is responsible for these effects.
The first bit is hardly a surprise, and is the entire raison d'être of my Critical Thinking class.  Skepticism is not only a way of looking at the world, it's a skill; and like any skill, it takes practice.  Adopting a rational approach to understanding the universe means learning some of the ways in which irrationality occurs, and figuring out how to avoid them.

The second part, though, is more interesting, but also more insidious: in order to be a skeptic, you have to be motivated toward rational thought -- and value it.

Aristotle Teaching Alexander the Great (Charles Laplante, 1866) [image courtesy of the Wikimedia Commons]

This explains the interaction I had with one of my AP Biology students many years ago.  Young-Earth creationists don't, by and large, take my AP class.  My background is in evolutionary genetics, so most of them steer clear, sensing that they're in hostile territory.  (I will say in my own defense that I never treat students in a hostile manner; and the few times I have had a creationist take my class, it was a positive experience, and kept me on my toes to present my arguments as cogently as possible.)

This young lady, however, stood out.  She was absolutely brilliant, acing damn near every quiz I gave.  She had a knack for understanding science that was nothing short of extraordinary.  So we went through the unit on genetics, and I presented the introduction to the unit on evolution, in which I laid out the argument supporting the theory of evolution, explaining how it fits every bit of hard evidence we've got.

That day, she asked if she could talk to me after class.  I said, "Sure," and had no guess about what she might have wanted to talk to me about.

I was absolutely flabbergasted when she said, "I just want you to know that I'm a creationist."

I must have goggled at her for a moment -- after (at that point) two decades as a teacher, I had pretty good control over my facial expressions, but not that good.  She hastily added, "I'm not saying I'm going to argue with you, or that I'm refusing to learn the material, or anything.  I just wanted you to know where I was coming from."

I said, "Okay.  That's fine, and thanks for being up front with me.  But do you mind if I ask you a couple of questions?"

She said, "Not at all."

So I asked her where the argument I'd presented in class fell apart for her.  What part of the evidence or logical chain didn't work?

She said, "None of it.  It's all logical and makes perfect sense."

I must have goggled again, because she continued, "I understand your argument, and it's logically sound.  I don't disbelieve in the evidence you told us about.  But I still don't believe in evolution."

The upshot of it was that for her, belief and rationality did not intersect.  She believed what she believed, and if rational argument contradicted it, that was that.  She didn't argue, she didn't look for counterevidence; she simply dismissed it.  Done.

The research by Ståhl and van Prooijen suggests that the issue with her is that she had no motivation to apply rationality to this situation.  She certainly wasn't short of cognitive ability; she outperformed most of the students in the class (including, I might add, on the test on evolutionary theory).  But there was no motive for her to apply logic to a situation that for her, was beyond the reach of logic.  You got there by faith, or not at all.

To this day, of all the students I've taught, this young lady remains one of my abiding puzzles.  Her ability to compartmentalize her brain that way -- I'll apply logic here, and it gives me the right answers, but not here, because it'll give me the wrong answers -- is so foreign to my way of thinking that it borders on the incomprehensible.  For me, if science, logic, and rationality work as a way of teasing out fact from falsehood, then -- they work.  You can't use the same basic principles and have them alternate between giving you true and false conclusions, unless the method itself is invalid.

Which, interestingly, is not what she was claiming.

And this is a difficulty that I have a hard time seeing any way to surmount.  Anyone can be taught some basic critical thinking skills; but if they have no motivation to apply them, or (worse) if pre-existing religious or political beliefs actually give them a motivation not to apply them, the argument is already lost.

So that's a little depressing.  Sorry.  I'm still all for teaching cognitive skills (hell, if I wasn't, I'm seriously in the wrong profession).  But what to do about motivation is a puzzle.  It once again seems to me that like my student's attitude toward faith-based belief, being motivated to use logic to understand your world is something about which you have to make a deliberate choice.

You get there because you choose to accept rational argument, or you don't get there at all.

Tuesday, September 26, 2017

Right in the gut

I know I've said it before, but it bears saying again: the strength of science lies in its reliance on hard evidence as the sine qua non of understanding.

I've tried to embrace this outlook myself, insofar as a fallible and biased human can do so.  Okay, so every day I poke fun at all sorts of odd beliefs, sometimes pissing people off.  But you know what?  You want to convince me, show me some reliable evidence.  For any of the claims I've scoffed at.  Bigfoot.  Ghosts.  ESP.  Astrology.  Tarot divination.  Homeopathy.

Even the existence of god.

I'm convinceable.  All you have to do is show me one piece of irrefutable, incontrovertible evidence, and I'm sold.

The problem is, to my unending frustration and complete bafflement, most people don't approach the world that way.  Instead, they rely on their gut -- which seems to me to be a really good way to get fooled.  I'm a pretty emotional guy, and I know my gut is unreliable.

Plus, science just doesn't seem to obey common sense at times.  As an example, consider the Theory of Relativity.  Among its predictions:
  • The speed of light is the ultimate universal speed limit.
  • Light moves at the same speed in every reference frame (i.e., your own speed relative to the beam of light doesn't matter; you'll still measure it as traveling at about 300,000,000 meters per second).
  • When you move, time slows down.  The faster you move, the slower time goes.  So if you took off in a rocket ship to Alpha Centauri at 95% of the speed of light, when you came back from your round trip you'd find that while only about three years had passed for you, more than nine years would have passed on Earth.
  • When you move, to a stationary person your mass increases and your length in the direction of motion contracts.  The faster you move, the more pronounced this effect becomes.
And so on.  But the kicker: all of these predictions of the Theory of Relativity have been experimentally verified.  As counterintuitive as this might be, that's how the world is.  (In fact, relativistic effects have to be taken into account to have accurate GPS.)
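The time-dilation claim can be worked out directly.  Here's a quick Python sketch, using 4.37 light-years for the one-way distance to Alpha Centauri and the 95%-of-lightspeed figure, and ignoring acceleration at the turnaround:

```python
import math

def lorentz_gamma(beta):
    """Lorentz factor for a speed expressed as a fraction of c."""
    return 1.0 / math.sqrt(1.0 - beta ** 2)

def round_trip_times(distance_ly, beta):
    """Return (Earth-frame years, traveler years) for a round trip
    of distance_ly light-years each way at speed beta * c."""
    earth_years = 2 * distance_ly / beta            # time = distance / speed
    traveler_years = earth_years / lorentz_gamma(beta)  # time dilation
    return earth_years, traveler_years

earth, traveler = round_trip_times(4.37, 0.95)
# gamma at 0.95c is about 3.2, so the traveler ages roughly a third
# as much as the folks back home: about 9.2 years pass on Earth,
# but only about 2.9 aboard the ship.
```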

None of which we would know now if people relied solely on their gut to tell them how things work.

Despite all this, there are people who still rely on impulse and intuition to tell them what's true and what's not.  And now a study jointly conducted by researchers at Ohio State University and the University of Michigan has found that if you do this, you are more prone to being wrong.

[image courtesy of the Wikimedia Commons]

Kelly Garrett and Brian Weeks decided to look into the connection between how people view evidence, and their likelihood of falling for incorrect information.  They looked at survey data from almost 3,000 people, in particular focusing on whether or not the respondents agreed with the following statements:
  • I trust my gut to tell me what’s true and what’s not. 
  • Evidence is more important than whether something feels true.
  • Facts are dictated by those in power.
They then correlated the responses with the participants' likelihood of believing a variety of conspiracy theories.  Unsurprisingly, they found that the people who relied on gut feelings and emotions to determine the truth were far more likely to fall for conspiracies and outright untruths.

"Misperceptions don’t always arise because people are blinded by what their party or favorite news outlet is telling them," Weeks said.  "While trusting your gut may be beneficial in some situations, it turns out that putting faith in intuition over evidence leaves us susceptible to misinformation."

"People sometimes say that it’s too hard to know what’s true anymore," Garrett said.  "That’s just not true.  These results suggest that if you pay attention to evidence you’re less likely to hold beliefs that aren’t correct...  This isn’t a panacea – there will always be people who believe conspiracies and unsubstantiated claims – but it can make a difference."

I'd say it makes all the difference.  And in the current political environment -- where accusations of "fake news" are thrown around right and left, and what people consider to be the truth depends more on political affiliation than it does on rational fact -- it's more than ever absolutely essential.

Wednesday, May 31, 2017

The fact of the matter

A couple of days ago I made the mistake of participating in that most fruitless of endeavors: an online argument with a total stranger.

It started when a friend of mine posted the question of whether the following quote was really in Hillary Clinton's book, It Takes a Village:


It isn't, of course, and a quick search was enough to turn up the page on Snopes that debunks the claim.  I posted the link, and my friend responded with a quick thanks and a comment that she was glad to have the straight scoop so that she wasn't perpetuating a falsehood.  And that should have been that.

And it would have been if some guy hadn't commented, "Don't trust Snopes!!!"  A little voice in the back of my head said, "Don't take the bait...", but a much louder one said, "Oh, for fuck's sake."  So I responded, "Come on.  Snopes is one of the most accurate fact-checking sites around.  It's been cross-checked by independent non-partisan analysts, and it's pretty close to 100% correct."

The guy responded, "No, it's not!"

You'd think at this point I'd have figured out that I was talking to someone who learned his debate skills in Monty Python's Argument Clinic, but I am nothing if not persistent.  I found the analysis I had referred to in my previous comment, and posted a clip from a summary of it on the site Skeptical Science:
Jan Harold Brunvand, a folklorist who has written a number of books on urban legends and modern folklore, considered the site so comprehensive in 2004 as to obviate launching one of his own.[10] 
David Mikkelson, the creator of the site, has said that the site receives more complaints of liberal bias than conservative bias,[23] but insists that the same debunking standards are applied to all political urban legends.  In 2012, FactCheck.org reviewed a sample of Snopes’ responses to political rumors regarding George W. Bush, Sarah Palin, and Barack Obama, and found them to be free from bias in all cases.  FactCheck noted that Barbara Mikkelson was a Canadian citizen (and thus unable to vote in US elections) and David Mikkelson was an independent who was once registered as a Republican.  “You’d be hard-pressed to find two more apolitical people,” David Mikkelson told them.[23][24]  In 2012, The Florida Times-Union reported that About.com‘s urban legends researcher found a “consistent effort to provide even-handed analyses” and that Snopes’ cited sources and numerous reputable analyses of its content confirm its accuracy.[25]
And he responded, "I disagree with you, but I respect your right to your opinion."

At that point, I gave up.

But I kept thinking about the exchange, particularly his use of the word "opinion."  It's an odd way to define the term, isn't it?  It's an opinion that I think single-malt scotch tastes good with dark chocolate.  It's an opinion that I detest the song "Stayin' Alive."

But whether Snopes is accurate or not is not an opinion.  It is either true, or it is not.  It's a little like the "flat Earth" thing.  If you believe, despite the overwhelming evidence, that the Earth is anything but an oblate spheroid, that is not "your opinion."

You are simply "wrong."

Now, I hasten to add that I don't think all of my own beliefs are necessarily correct.  After all, I haven't cross-checked Snopes myself, so I'm relying on the expertise of Brunvand et al. and trusting that they did their job correctly.  To the best of my knowledge, Snopes is accurate; and if anyone wants me to think otherwise, they need to do more than say "No, it isn't" every time I open my mouth.

But to call something like that an "opinion" implies that we all have our own sets of facts, even though many of them contradict each other, with the result that we all do what writer Kathryn Schulz calls "walking around in our little bubbles of being right about everything."  It's a little frightening how deep this mindset goes -- up to and including Donald Trump's shrieking "Fake news!" every time he hears something about him or his administration that he doesn't like.

I can understand wanting reality to be a different way than it is.  Hell, I'd rather teach Defense Against the Dark Arts at Hogwarts than biology in a public high school.  But wishin' don't make it so, as my grandma used to say, and once you grow up you need to face facts and admit it when you're wrong.  And, most importantly, recognize that the evidence won't always line up with your desires.  As John Adams put it, "Facts are stubborn things; and whatever may be our wishes, our inclinations, or the dictates of our passion, they cannot alter the state of facts and evidence."

Saturday, August 30, 2014

Tarring with one brush

I frequently visit the r/atheism subreddit as a way of keeping abreast of current happenings in the world of the irreligious.  Although I find a good many of the articles linked on the site to be interesting, there's one frequent type of post that drives me crazy.

Almost every time I visit the site, there is at least one article that has to do with some religious person doing a bad thing.  Today when I checked, there was an article about a teacher at a Baptist religious school who is accused of raping one of his (male) students, and an article about the leader of an evangelical Christian megachurch in Nigeria who is being divorced by his wife for "adultery and unreasonable behavior."

And every time these sorts of stories are posted, there are numerous comments to the effect that this sort of behavior shows that the religious worldview is wrong.

Can we be clear on something, here?  Finding people who do bad things has no bearing on whether their views on god's existence are correct or not.  People who preach holiness and then victimize their fellow humans are hypocrites.  Depending on what kind of victimization they perpetrated, they may also be evil.

But neither of those has any relevance to the correctness of their philosophy.

It's not, of course, only something atheists do.  This kind of illogic is no respecter of worldview.  This is, in part, why we atheists hate it when one of our number says something outrageous.  (Richard Dawkins' recent statements regarding Down syndrome and abortion are a good case in point.)  It feeds the unfortunate tendency for people to tar all atheists with the same brush -- as if either (1) my agreement with Dawkins about god's existence means I agree with him on everything, or (2) Dawkins' views on the ethics of carrying a Down syndrome fetus to term are an inescapable conclusion of not believing in a higher power.

Neither one of these statements is logically correct.

You can be an atheist and be an utter asshole.  You can be an atheist and be wrong about damn near everything else.  Conversely, you can be a kind, compassionate, moral atheist whose other views are brilliantly well thought-out and rational.

And if you agree with the above statements -- which, I hope, includes virtually all of the people reading this -- then the implication is that we shouldn't do the same thing to the religious.

Cherry-picking a few hypocritical nasties who are Christian leaders does not bolster the atheist viewpoint, any more than pointing out that Stalin was an atheist bolsters the Christian one.  Now mind you, I don't think there's anything wrong with calling out a hypocrite on his hypocrisy; we gain nothing by covering up the truth, as (it is to be hoped) the Vatican is finally learning with respect to pedophile priests.

But we have to be careful to separate the logical arguments for and against a particular philosophical view from our finger-pointing at the moral lapses of the people who hold those views.  The two are not the same, and neither side does itself any favors by blurring those boundaries.

Don't get me wrong.  I still think the support for the religious worldview is thin at best.  I'd much rather trust the evidence to lead me where logic and rationality demand, and thus far, that's very much in the direction of there not being some sort of divine Prime Mover.


But that says nothing about whether or not I am a moral person.  And this is why using the transgressions of Christians as an argument for atheism doesn't gain us anything.  All it means is that some of us don't understand the rules of logic ourselves.

Monday, August 11, 2014

Wrongness

I get a lot of negative comments.

It comes with the territory, I suppose, and I knew when I started writing this blog four years ago that I would have to develop a thick skin.  Given the subject matter, there's hardly a post I do that won't piss someone off.  Here's a sampling of comments, and a brief description of the topic that elicited them:

  • You are either ignorant or just stupid.  I'm putting my bet on the latter.  (after a post on machines that are supposed to "alkalinize" water to make it more healthful)
  • Narrow-minded people like you are the worst problem this society faces.  (after a post on "crystal healing")
  • I am honestly offended by what you wrote.  (after a post on alternative medicine)
  • I can't say I warm to your tone.  (after a post on ghost hunting)
  • That is the most ignorant thing I have ever read.  (after a post in which I made a statement indicating that I think recent climate change is anthropogenic in origin)
  • I hate smug dilettantes like you.  (after a post on mysticism vs. rationalism)
  • You are a worthless wanker, and I hope you rot in hell.  (from a young-earth creationist)
My skin isn't thick enough that some of these don't sting.  For example, the one that called me a "smug dilettante" has a grain of truth to it; I'm not a scientist, just a science teacher, and if my educational background has a flaw it's that it's a light year across and an inch deep.  Notwithstanding that in a previous century people like me were called "polymaths," not "dabblers" or "dilettantes," the commenter scored a point, whether he knew it or not.  I'm well-read, and have a decent background in a lot of things, but I'm not truly an expert in anything.

Other disagreements on this list have been resolved by discussion, which is honestly what I prefer to do.  The comments that came from the posts on alternative medicine and ghost hunting generated fruitful discussion, and understanding (if not necessarily agreement) on both sides.

Most of the time, though, I just don't engage with people who choose to use the "Comments" section (or email) as a venue for snark.  You're not going to get very far by calling me ignorant, for example.  I make a practice of not writing about subjects on which I am ignorant (e.g. politics), so even if I make an offhand comment about something, I try to make sure that I could back it up with facts if I needed to.  (Cf. this site, apropos of the individual who thinks I am ignorant for accepting the anthropogenic nature of recent climate change.)

That said, what a lot of people don't seem to recognize about me is the extent to which my understanding of the world is up for grabs.  Like anyone, I do have my biases, and my baseline assumptions -- the latter including the idea that the universe is best understood through the dual lenses of logic and evidence.


But everything else?  My attitude is, if you want to try to convince me about Bigfoot or chakras or crystals or astrology or anything else, knock yourself out.  But you'd better have the evidence on your side, because even if I am a dilettante, I have read up on the topics on which I write.

I am as prone as the next guy, though, to getting it wrong sometimes.  And I am well aware of the fact that we can slide into error without realizing it.  As journalist Kathryn Schulz said, in her phenomenal lecture "On Being Wrong" (which you should all take ten minutes and watch as soon as you're done reading this):
How does it feel to be wrong?  Dreadful, thumbs down, embarrassing.  Those are great answers.  But they're answers to a different question.  (Those are) the answers to the question, "How does it feel to realize you're wrong?"  Realizing you're wrong can feel like all of that, and a lot of other things.  It can be devastating.  It can be revelatory.  It can actually be quite funny...  But just being wrong?  It doesn't feel like anything...  We're already wrong, we're already in trouble, but we still feel like we're on solid ground.  So I should actually correct something I said a moment ago: it does feel like something to be wrong.  It feels like being right.
To those who are provoked, even pissed off by what I write: good.  We never discover our errors -- and I'm including myself in this assessment -- without being knocked askew once in a while.  Let yourself be challenged without having a knee-jerk kick in response, and you have my word that I'll do the same.  And while I don't like having my erroneous thinking uncovered any more than anyone else, I will own up when I screw up.  I've published retractions in Skeptophilia more than once, which has been a profoundly humbling but entirely necessary experience.

So keep those cards and letters coming.  Even the negative ones.  I'm not going to promise you I'll change my mind on every topic I'm challenged on, but I do promise that I'll consider what you've said.

On the other hand, calling me a "worthless wanker" didn't accomplish much but making me choke-snort a mouthful of coffee all over my computer.  So I suppose that the commenter even got his revenge there, if only in a small way.

Tuesday, April 1, 2014

Dreams, wishful thinking, and religious belief

This post is, honestly, a question rather than an answer.

I know I come across as critical of religion at times, and in my own defense I have to say that usually it has to do with the kinds of things that religion incites people to do -- such as Pat Robertson's recent pronouncement that Christians are being oppressed by gays, and that Jesus would have been in favor of stoning gays to death, and evangelist Tristan Emmanuel's recommendation that Bill Maher should be publicly whipped because he's an atheist.

But as far as the religious beliefs themselves, mostly what I feel is incomprehension.  When I've asked people why they believe in god -- something I tend not to do, being that I'm not so excited about being publicly whipped myself -- I usually get answers that fall into one of the following categories:
  1. Personal revelation -- the individual has had some kind of experience that convinces him/her that a deity exists.
  2. Authority -- being raised in the church, and/or respecting its leaders and their views, have led the person to accept those beliefs as true.
  3. It's appealing -- they'd like there to be a god, so there is one.
My problem with all of that is that I'm not especially confident of my own brain's ability, in the absence of hard evidence, to tell truth from fiction.   I know there have been times that I have desperately wanted something to be true -- usually in the realm of personal relationships -- but my own dubious ability to read the signs correctly, plus a regrettable tendency toward wishful thinking, led me to the wrong answer on more than one occasion.

So how likely would I be to land on the right answer with respect not only to whether or not a god exists, but what his/her/its nature is, given the thousands of different answers humans have come up with over the centuries?  It'd be pretty embarrassing, for example, to spend my life worshiping Yahweh, and then die and find out too late that I should have been making sacrifices to Anubis or something.

[image courtesy of Jeff Dahl and the Wikimedia Commons]

I ran into an especially good example of this yesterday on the site Charisma News, where a writer tried to explain how to know if your dreams come from god or not.  Because, I suppose, if you buy into that worldview, there are three choices: (1) your dream comes from god, and you should obey whatever it says; (2) your dream comes from the devil, and you should not do whatever it says; or (3) your dream is just a dream and you shouldn't worry about it.  I suspect that most of mine fall into the last category, because they tend to be bizarre, like my dream a couple of nights ago wherein I was trying to fight off a werewolf by spraying it in the face with a garden hose.

But Audrey Lee tells us in the Charisma News article that it's a real problem, and we don't want to get it wrong:
It would be naive and irresponsible to suggest that all spiritual dreams result in a true God connection.  Dreamers who mistake their own subconscious thoughts or even demonic influence as divine instruction can make grim and historic mistakes.  Recently a woman in a rural village sacrificed her child in the river out of obedience to what she thought was a dream from God.
So, yeah.  That'd be bad.  Lee goes on to tell us that there are four criteria that we should use to determine if our dreams are god-induced: (1) the dream's content doesn't contradict the bible; (2) it's "convicting" [sic]; (3) it lingers in the memory; and (4) it predicts things that come to pass.

So based on these four criteria, I'd guess the werewolf-and-garden-hose dream doesn't measure up except for the fact that I still remember it.  But it does raise a question, which is, couldn't you have a non-bible-contradicting dream that you remember and find convincing, and it still is just a dream?  Doesn't the whole thing still turn on your kind of looking at it and saying, "Yeah, seems right to me?", without anything resembling hard evidence?

I simply don't find that sort of thing a reliable protocol for determining the truth.  Maybe it's because I don't trust myself enough; but I think that our brains come pre-installed with so many ways of getting it wrong that we need to have an external standard in order to be certain.  For me, that standard is science -- i.e., evidence, logic, and rationality.  None of the "internal ways of knowing" have ever really made sense to me.

Now, I'll admit up front that I'm no philosopher, and deeper minds than mine may well have a better answer to all of this.  If so, I'm open to listening.  But until then, I still can't see any dependable way to get at the truth other than hard evidence -- much as my wishful thinking would like to say otherwise.

Monday, March 4, 2013

The case of the telepathic mice

One area in which a lot of people could use some work is in how to draw logical connections.

Not that it's necessarily simple.  Given a lot of facts, the question, "Now what does this all mean?" can be decidedly non-trivial.  After all, if it were trivial, there would be only one political party, and the only job we skeptics would have would be uncovering what the facts actually are.  The deductive work, the drawing of a conclusion, would be quick and unanimous, and Washington DC would be a decidedly more congenial place.

To take a rather simpler example, let's look at the following picture, which has been making the rounds of social networks lately:


Even ignoring the rather dubious religious aspect, this seems to me to be a rather ridiculous conclusion.  Just because these foods vaguely resemble a human organ (really vaguely, in the case of the tomato and the heart), is their supposed beneficial effect on that organ why they look that way?  It doesn't take a rocket scientist, nor a botanist, to find a dozen counter-examples, of plants that look like a human organ, but which have no beneficial effects on that organ whatsoever.  (This whole idea goes back to medieval times, when it was known as the "Doctrine of Signatures."  It's why so many plants' names end in "-wort" -- wyrt was Old English for "plant," and the doctors of that time, whom we must hope had their malpractice insurance paid up, used lungwort, liverwort, spleenwort, and the rest to try to cure their patients.  No wonder the life expectancy back then was so low.)

On the other hand, Amanita mushrooms look a little like a penis, and if you eat one, you're fucked.  So maybe there's something to this after all.

In any case, let's move on to something a little trickier -- last week's story of the telepathic mice.

Miguel Nicolelis, of Duke University, announced last week that he'd been able to accomplish something that no one had done -- to create a device that allowed the electrical firings in one brain (in this case, a mouse) to be beamed to another brain, influencing that brain's firing.  In his paper, released in Nature, Nicolelis and his team describe engineering microelectrodes that were then implanted in the primary motor cortex of mouse #1.  These electrodes are capable of detecting the neural firing pattern in the mouse's brain -- specifically, of determining which of a pair of levers the mouse selects to pull.  A second mouse has a different set of implants -- ones which stimulate neurons rather than record from them.  If mouse #1 pulls the right-hand lever, and mouse #2 does, too, they both get a treat.  They can't see each other -- but the electrodes in the brain of mouse #1 send a signal, via the electrode array, to the electrodes in the brain of mouse #2, stimulating it to choose the correct lever.

Direct, brain-to-brain communication.  Obvious application to medicine... and the military.  But my problem is how it's been described in popular media.  Everyone's calling it "telepathy" -- making a number of psychic websites erupt in excited backslapping, claiming that this "scientifically proves telepathy to be real."  "They just showed what we've been claiming for decades," one thrilled woo-woo stated.

The problem is -- is this actually telepathy?  Well, in one limited sense, yes; the word, after all, comes from the Greek tele (distant) + pathéia (feeling).  So, yes, the mice were able to feel, or at least communicate, at a distance.  But remember that the only reason it worked was that both the encoder and the decoder mouse had electrode arrays stuck into their brains.  There's an understood mechanism at work here; Nicolelis knows exactly how the signal from mouse #1 got to mouse #2 and stimulated its brain to perform the task correctly.  This is in exact opposition to the usual claims of telepathy -- that somehow (no mechanism specified) one human brain can pick up information from another, sometimes over great distances.  Complex information, too; not just enough to know which lever to choose, but whole conversations, visual images, sounds, and emotions.

Oh, and some people think they can get into telepathic contact with their pets.  Which adds a whole new level of craziness to the claim.

So, actually, what Nicolelis got his mice to do isn't telepathy at all, at least not in the usual sense of the word.  But on a surface read, it would be easy to miss the difference -- to miss why (in fact) his experiment makes the claims of the telepaths less likely, not more.  If it takes fancy arrays of electrodes to allow the transmission of even the simplest of information, how on earth could two brains communicate far more complex information, without any help at all?  Add that to the fact that there has not been a single experiment that has conclusively demonstrated that telepathy, as advertised, actually exists (for an excellent, and unbiased, overview of the history of telepathy experiments, go here).  It seems very likely, just based on the evidence, that telepathy doesn't exist -- not between Nicolelis' mice, and certainly not between humans.

Just as well, really.  I'd really rather people not read my mind.  For one thing, my brain can be a little... distractible:


Most days, reading my mind would be the telepathic equivalent of riding the Tilt-o-Whirl.  So probably better that my thoughts remain where they are, bouncing randomly off the inside of my skull as usual.

Friday, August 3, 2012

Irony, irrationality, and self-contradiction

It is a source of immense frustration to me that people seem to be quite good at accusing those they disagree with of being irrational, while ignoring completely the irrationality of their own arguments.

And I'm not pointing fingers at any particular political or philosophical stance here; liberals and conservatives both seem to do this with equal frequency.  For example, take the recent Chick-fil-A kerfuffle.

Probably all of you know that the controversy started when Dan Cathy, CEO of Chick-fil-A, told the Baptist Press that his company is "very supportive... of the biblical definition of the family unit."  This started a firestorm of reaction, with gay rights advocates clamoring for a boycott (and organizing a "kiss-in," in which same-sex couples would kiss in a Chick-fil-A).  All of the "sanctity of marriage" folks responded by singing Cathy's praises.  Mike Huckabee organized a "Chick-fil-A Appreciation Day," and from the preliminary numbers, it looks like the company may have had its best sales day ever.

Now, I have no intent in this post to address the human rights issue; I've stated my opinion on that subject loud and clear in other posts.  What I'd like to look at here is the fact that Chick-fil-A's supporters characterized this as a free-speech issue -- that Cathy had a perfect right to state his opinion, and those supporting a boycott were advocating a restriction on constitutionally protected free speech.

Interesting that when the tables were turned, exactly the opposite happened.

Remember the "rainbow Oreo?"  Of course, the huge rainbow cookie itself was never manufactured; but a photoshopped image of an Oreo with rainbow layers was widely publicized, and Kraft Foods captioned the image, "Proudly Support Love."  Gay rights supporters gave the advertisements shouts of acclamation, while religious conservatives advocated boycotts, with one outraged customer stating, "I'll never eat an Oreo again" -- and the gay rights supporters objected to the conservatives' proposed boycotts on the basis of free speech!

It puts me in mind of Ted Rall's quote, "Everyone supports the free speech they agree with."

Honestly, my own position is that if you don't like a particular company's political stance, it is entirely your choice not to patronize it.  But in this country, a CEO -- like the rest of us -- has the constitutionally-protected right to state his or her opinion.  And this includes opinions that might not be popular.

The acceptance of contradictory stances (often while decrying the contradictory stances in our opponents) doesn't end there, however.  Take a look at this website, entitled "Confuse a Liberal Use Facts and Logic" (lack of punctuation is the author's).  A brief look at the statements there (I hesitate to dignify them with the name "arguments") will suffice, because the majority of them are classic examples of the Straw Man fallacy -- take an example of a view held by the most extreme of your opponents, exaggerate it, and then knock it down, and claim that thereby you have destroyed his/her entire political party's platform.  The most interesting ones, however, are:
  • Ask them why they oppose the death penalty but are okay with killing babies.
  • Ask them why homo****** parades displaying drag, tran******s and bestiality should be protected under the First Amendment, but manger scenes at Christmas should be illegal.
  • Ask them why criticizing a left-wing actor or musician for the things they say or do, and refusing to attend their concerts, buy their albums, or see their movies, amounts to censorship, but boycotting Rush Limbaugh's or Laura Ingraham's advertisers is free speech. 
Okay, fair enough (even though I have to wonder why this guy thinks that "sexual" is a dirty word and needs to be bleeped out; but let's ignore that for the moment).  Does he really not see that the same arguments could be flipped around, and would be equally contradictory?  "Thou shalt not kill" means, so far as I can see, "thou shalt not kill;" if you're using that to argue against abortion, you have a lot of explaining to do if you support the death penalty.  (One commenter said, when confronted with this question, "A fetus never brutally murdered an innocent person," which is true but doesn't answer the question.)  Liberals who support gay-pride parades and the like as free speech, but object to a manger scene at Christmas, are espousing a contradiction, sure; especially if the manger scene is in someone's yard or in a privately-owned business, and the issues of taxpayer money and church/state separation don't enter into it.  But the reverse is an equal contradiction -- as long as the gay paraders follow the law, they are just as covered under free speech as the Christmas crèche creators are.  And conservatives are just as guilty of #3 as the liberals are; ask the Dixie Chicks.

The bottom line is that you have no real right to call out your opponents for holding self-contradictory stances while you're doing the same thing.  Both sides do it, with equal abandon, and neither one seems to notice as long as these crimes against logic are being committed by people whose position on the issues they already agree with.  And if you haven't already had enough irony in your diet from reading this, I'll end with a quote from Jesus (Matthew 7:5):  "Thou hypocrite!  First cast out the beam out of thine own eye; and then shalt thou see clearly to cast out the mote out of thy brother's eye."