Skeptophilia (skep-to-fil-i-a) (n.) - the love of logical thought, skepticism, and thinking critically. Being an exploration of the applications of skeptical thinking to the world at large, with periodic excursions into linguistics, music, politics, cryptozoology, and why people keep seeing the face of Jesus on grilled cheese sandwiches.
Showing posts with label belief. Show all posts

Monday, June 9, 2025

All rights reversed

In his book Nothing's Sacred: The Truth About Judaism, media scholar Douglas Rushkoff discusses his concept of "open-source religion," which he contrasts to the more traditional, handed-down-from-on-high types:
An open-source religion would work the same way as open-source software development: it is not kept secret or mysterious at all.  Everyone contributes to the codes we use to comprehend our place in the universe.  We allow our religion to evolve based on the active participation of its people...  An open-source relationship to religion would likewise take advantage of the individual points of view of its many active participants to develop its more resolved picture of the world and our place within it...  [R]eligion is not a pre-existing truth but an ongoing project.  It may be divinely inspired, but it is a creation of human beings working together.  A collaboration.

Which all sounds lovely and democratic and ecumenical, but it brings up the problem of how exactly you can tell if the "codes" contributed by people are correct or not.  In science, there's a standard protocol -- alignment of a model with the known data, and the use of the model to make predictions that then agree with subsequent observations -- but here, I'm not sure how you could apply anything like that.  The fact that religion seems, at its heart, to be an intensely individual experience, varying greatly from one person to another, suggests that reconciling each person's contributions may not be so easy.  Wars have been fought and lives lost over people's notions about the nature of God; saying "let's all collaborate" is a little disingenuous.

This is problematic not only between the world's major religions, but within them.  How, for example, could you bring together my Unitarian Universalist friend, who is more or less a pantheist; another friend who is a devout and very traditional Roman Catholic; and someone who is an evangelical biblical literalist who thinks everyone who doesn't believe that way is headed to the Fiery Furnace for all eternity?  All three call themselves Christian, but they all mean something very different by it.

The Discordians' clever labeling of their own founding doctrine as "All Rights Reversed" -- quote, reprint, or jigger around anything you want, it's all yours to do with as you please -- sounds good, but in practice, it relies on an undeserved trust in the minds of fallible humans of varying backgrounds and educational levels, who sometimes can't even agree on what the evidence itself means.

It's not that I'm certain that my own "there's probably no all-powerful deity in charge" is correct, mind you.  It's more that -- as eminent astrophysicist Neil deGrasse Tyson put it -- "humans are rife with all sorts of ways of getting it wrong," and that assessment very much includes me.  I'm wary of other people's biases, and far more wary of my own.  Physicist Richard Feynman said, "The first principle is that you must not fool yourself, and you are the easiest person to fool."  Even C. S. Lewis saw the danger in the "everyone's voice counts" approach.  He wrote, "A great deal of democratic enthusiasm descends from the ideas of people like Rousseau, who believed in democracy because they thought mankind so wise and good that everyone deserved a share in the government.  The danger of defending democracy on those grounds is that they’re not true."

George Carlin put it another way.  He said, "Think of a guy you know who has 'average intelligence.'  Then keep in mind that half of humanity is stupider than that guy."

The problem is that just about every religious person in the world (1) believes what they do because they were told about it by someone else, and (2) believes they've got it one hundred percent right and everyone else is wrong.  And, as Richard Dawkins troublingly points out, what people do believe is often a matter of nothing more than geography.  I was raised Roman Catholic because I grew up in a French-speaking part of southern Louisiana.  If I'd been born to Saudi parents in Riyadh I'd have been Muslim; to Thai parents in Bangkok, I'd likely be Buddhist; to Israeli parents in Tel Aviv, I'd be Jewish; and so on.  I'm suspicious of the whole enterprise because, even given the same universe to look at, people all come up with different answers.

[Image licensed under the Creative Commons Sowlos, Religious symbols-4x4, CC BY-SA 3.0]

And it's not only the religions with lots of adherents; there are countless fringe groups that have spun their own wild takes on how the world works.  Some, like the guy in Tennessee who believed that God told him to build the world's biggest treehouse church, are more amusing than dangerous.  (For what it's worth, the treehouse church was shut down because it was a poorly-constructed safety hazard, and a month later burned to the ground under mysterious circumstances.)  Others, like Jim Jones's People's Temple and the mystical cult that grew up around Carlos Castaneda, are downright deadly.  I have to admit the "open-source religion" idea is good at least from the standpoint of throwing the question back on your own intellect rather than saying, "Just believe what the priest/minister/imam/holy man is telling you," but it does leave open the possibility of getting it very, very wrong.

As Susan B. Anthony put it, "I distrust those people who know so well what God wants them to do because I notice it always coincides with their own desires."

Again, as I said earlier, it's not that I'm sure myself.  Part of my hesitancy is because I'm so aware of my own capacity for error.  Even though I left Catholicism in my twenties and, for the most part, haven't looked back, I have to admit that there's still an attraction there, something about the mystery and ritual of the church of my childhood that keeps me fascinated.

All the baggage that comes with it -- the patriarchalism and sectarianism and misogyny and homophobia -- not so much.

So right now I'll remain a de facto atheist, although in some ways a reluctant one.  The idea that the universe has some deeper meaning, that things happen because there's a Grand Plan (even if it is, in Aziraphale's words, "Ineffable"), has undeniable appeal.  But if there's one thing I've learned in my sixty-four years, it's that the universe is under no compulsion to arrange itself so as to make me happy.

Or, as my beloved grandma used to say, "Wishin' don't make it so."

****************************************


Saturday, May 10, 2025

Mystery, certainty, and heresy

I've been writing here at Skeptophilia for fourteen years, so I guess it's to be expected that some of my opinions have changed over that time.

I think the biggest shift has been in my attitude toward religion.  When I first started this blog, I was much more openly derisive about religion in general.  My anger is understandable, I suppose; I was raised in a rigid and staunchly religious household, and the attitude of "God as micromanager" pervaded everything.  It brings to mind the line from C. S. Lewis's intriguing, if odd, book The Pilgrim's Regress: "...half the rules seemed to forbid things he'd never heard of, and the other half forbade things he was doing every day and could not imagine not doing; and the number of rules was so enormous that he felt he could never remember them all."

But those fourteen years, coupled with exploring a great many ideas (both religious and non-religious) along the way, have altered my perspective some.  I'm still unlikely ever to become religious myself, but I now see the question as a great deal more complex than the black-and-white attitude I had back then.  My view now is more that everyone comes to understand this weird, fascinating, and chaotic universe in their own way and time, and who am I to criticize how someone else squares that circle?  As long as religious people accord me the same right to my own beliefs and conscience as they have, and they don't use their doctrine to sledgehammer in legislation favoring their views, I've got no quarrel.

This comes up, of course, because of the election of a new Pope, Leo XIV, to lead the Roman Catholic Church.  I watched the scene unfold two days ago, and I have to admit it was kind of exciting, even though I'm no longer Catholic myself.  The new Pope seems like a good guy.  He's already pissed off MAGA types -- the white smoke had barely dissipated from over St. Peter's before the ever-entertaining Laura Loomer shrieked "WOKE MARXIST POPE" on Twitter -- so I figure he must be doing something right.  I guess in Loomer's opinion we can't have a Pope who feeds the poor or treats migrants as human beings or helps the oppressed.

Or, you know, any of those other things that were commanded by Jesus.

The fact remains, though, that even though I have more respect and tolerance for religion than I once did, I still largely don't understand it.  After Pope Leo's election, I got online to look at other Popes who had chosen the name "Leo," and following that thread all the way back to the beginning sent me down a rabbit hole of ecclesiastical history that highlighted how weird some of the battles fought in the church have been.

The first Pope Leo ruled back in the fifth century, and his twenty-one-year reign was a long and arduous fight against heresy.  Not, you understand, against people doing bad stuff, but against people believing wrongly -- at least in Leo's opinion.

Pope Leo I (ca. 1670) by Francisco Herrera [Image is in the Public Domain]

The whole thing boils down to the bizarre argument called "Christology" -- doctrine concerning the nature of Jesus.  Leo's take on this was that Jesus was the "hypostatic union" of two natures, God-nature and human nature, in one person, "with neither confusion nor division."  But this pronouncement immediately resulted in a bunch of other people saying, "Nuh-uh!"  You had the:

  • Monophysites, who said that Jesus only had one nature (divine);
  • Dyophysites, who said that okay, Jesus had two natures, but they were separate from each other;
  • Monarchians, who said that God is one indivisible being, so Jesus wasn't a distinct individual at all;
  • Docetists, who said that Jesus's human appearance was only a guise, without any true reality;
  • Arians, who said that Jesus was divine in origin but was inferior to God the Father;
  • Adoptionists, who said that Jesus only became the Son of God at his baptism; and
  • probably a dozen or so others I'm forgetting about.

So at Leo's urging the Council of Chalcedon was convened, and the result was that most of these were declared heretical.  This gave the church leaders license to persecute the heretics, which they did, with great enthusiasm.  But what occurs to me is the question, "How did they know any of this?"  They were all working off the same set of documents -- the New Testament, plus (in some cases) some of the Apocrypha -- but despite that, all of them came to different conclusions.  Conclusions they were so certain of that they were completely confident using them to justify the persecution of people who believed differently (or, in the case of the heretics themselves, that they believed so strongly they were willing to be imprisoned or executed rather than change their minds).

Myself, I find it hard to imagine much of anything that I'm that sure of.  I try my hardest to base my beliefs on the evidence and logic insofar as I understand them at the time, but all bets are off if new data comes to light.  That's why although I consider myself a de facto atheist, I'm hesitant to say "there is no God."  The furthest I'll go is that from what I know of the universe, and what I've experienced, it seems to me that there's no deity in charge of things. 

But if God appeared to me to point out the error of my ways, I'd kind of be forced to reconsider, you know?  It's like the character of Bertha Scott -- based very much on my beloved grandmother -- said, in my novella Periphery:

"Until something like this happens, you can always talk yourself out of something."  Bertha chuckled.  "It’s like my daddy said about the story of Moses and the burning bush.  I remember he once said after Mass that if he was Moses, he’d’a just pissed himself and run for the hills.  Mama was scandalized, him talking that way, but I understood.  Kids do, you know.  Kids always understand this kind of thing...  You see, something talks to you out of a flaming bush, you can think it’s God, you can lay down and cry, you can run away, but the one thing you can’t do is continue to act like nothing’s happened."

So while my own views are, in some sense, up for grabs, my default is to stick with what I know from science.  And the fifth-century wrangling by the first Pope Leo over the exact nature of Jesus strikes me as bizarre.  As former Secretary of the Treasury Robert Rubin put it, "Some people are more certain of everything than I am of anything."

Be that as it may, I wish all the best to this century's Pope Leo.  Like I said, he looks like a great choice, and a lot of my Catholic friends seem happy with him.  As far as my own mystification about a lot of the details of religion, it's hardly the only thing about my fellow humans I have a hard time understanding.  But like I said earlier, as long as religious people don't use their own certainty to try to force me into belief, I'm all about the principle of live and let live.

****************************************


Thursday, May 8, 2025

Fact blindness

[Spoiler alert!  This post contains spoilers for the most recent Doctor Who episode, "Lucky Day."  If you're planning on watching it and would prefer not to know about the episode's plot, watch it first -- but don't forget to come back and read this.]

In his book The Magician's Nephew, C. S. Lewis writes the trenchant line, "The trouble with trying to make yourself stupider than you actually are is that you usually succeed."

In one sentence, this sums up the problem I have with cynics.  Cynicism is often glorified, and considered a sign of intelligence -- cynics, so the argument goes, have "seen through" the stuff that has the rest of us hoodwinked.  It's a spectrum, they say, with gullibility (really dumb) on one end and cynicism (by analogy, really smart) on the other.

In reality, of course, cynicism is no better than gullibility.  I wouldn't go so far as to call either one "dumb" -- there are a lot of reasons people fall into both traps -- but they're both equally lazy.  It's just as bad to disbelieve and dismiss everything without thought as it is to believe and accept everything without thought.

The difficulty is that skepticism -- careful consideration of the facts before either believing or disbelieving a claim -- is hard work, so both gullibility and cynicism can easily become habits.  In my experience, though, cynicism is the more dangerous, because in this culture it's become attractive.  It's considered edgy, clever, tough, a sign of intelligence, of being a hard-edged maverick who isn't going to get taken advantage of.  How often do you hear people say things like "the media is one hundred percent lies" and "all government officials are corrupt" and even "I hate all people," as if these were stances to be proud of?

I called them "traps" earlier, because once you have landed in that jaundiced place of not trusting anything or anyone, it's damn hard to get out of.  After that, even being presented with facts may not help; as the old saw goes, "You can't logic your way out of a position you didn't logic your way into."  Which brings us to the most recent episode of Doctor Who -- the deeply disturbing "Lucky Day."

The episode revolves around the character of Conrad Clark (played to the hilt by Jonah Hauer-King), a podcast host who has become obsessed with the Doctor and with UNIT, the agency tasked with managing the ongoing alien incursions on Earth.  Conrad's laser focus on UNIT, it turns out -- in a twist I did not see coming -- isn't because he is supportive of what they do, but because he disbelieves it.


To Conrad, it's all lies.  There are no aliens, no spaceships, no extraterrestrial technology, and most critically, no threat.  It's all been made up to siphon off tax money to enrich the ones who are in on the con.  And he is willing to do anything -- betray the kindness and trust of Ruby, who was the Doctor's confidant; threaten UNIT members who stand in his way; even attempt to murder his friend and helper Jordan who allowed him to infiltrate UNIT headquarters -- in order to prove all that to the world.

It's a sharp-edged indictment of today's click-hungry podcasters and talk show celebrities, like Joe Rogan, Alex Jones, and Tucker Carlson, who promote conspiracies with little apparent regard for whom it harms -- and how hard it can be to tell if they themselves are True Believers or are just cold, calculating, and in it for the fame and money.  (And it's wryly funny that in the story, it's the people who disbelieve in aliens who are the delusional conspiracy theorists.)

The part that struck me the most was at the climax of the story, when Conrad has forced his way into UNIT's Command Central, and has UNIT's redoubtable leader, Kate Lethbridge-Stewart, held at gunpoint.  Kate releases an alien monster not only to prove to Conrad she and the others have been telling the truth all along, but to force his hand -- to make him "fish or cut bait," as my dad used to say -- and finally, finally, when the monster has Conrad pinned to the floor and is about to bite his face off, he admits he was wrong.  Ruby tases the monster (and, to Conrad's reluctant "thank you," tells him to go to hell -- go Ruby!).

But then, as he stands up and dusts himself off, he looks down at the monster and sneeringly says, "Well, at least your props and costumes are getting better."  And the monster suddenly lurches up and bites his arm off.

That's the problem, isn't it?  Once you've decided to form your beliefs irrespective of facts and logic, no facts or logic can ever make you budge from that position.

The world is a strange, chaotic place, filled with a vast range of good and bad, truth and lies, hard facts and fantasy, and everything in between.  If we want to truly understand just about anything we can't start out from a standpoint either of gullible belief or cynical disbelief.  Yes, teasing apart what's real from what's not can be exhausting, especially in human affairs, where motives of greed, power, and bigotry can so often twist matters into knots.  But if, as I hope, your intent is to arrive at the truth and not at some satisfying falsehood that lines up with what you already believed, it's really the only option.

I'm reminded of another passage from Lewis, this one from the end of his novel The Last Battle.  In it, the main characters and a group of Dwarves, led by one Diggle, have been taken captive and held in a dark, filthy stable.  All around them, the world is coming to an end; the stable finally collapses to reveal that they've all been transported to a paradisiacal land, and that the dire danger is, miraculously, over.  But the Dwarves, who had decided that everyone -- both the Good Guys and the Bad Guys -- were lying to them, still can't believe it, to the extent that they're certain they're still imprisoned:

"Are you blind?" said Tirian.

"Ain't we all blind in the dark?" said Diggle.

"But it isn't dark!" said Lucy.  "Can't you see?  Look up!  Look round!  Can't you see the sky and the tree and the flowers?  Can't you see me?"

"How in the name of all humbug can I see what ain't there?  And how can I see you any more than you can see me in this pitch darkness?"

Further attempts to prove it to them meet with zero success.  They've become so cynical even the evidence of their own eyes and ears doesn't help.  At that point, they are -- literally, in the context of the story -- fact blind.  Finally Diggle snarls:

"How can you go on talking all that rot?  Your wonderful Lion didn't come and help you, did he?  Thought not.  And now -- even now -- when you've been beaten and shoved into this black hole, just the same as the rest of us, you're still at your old game.  Starting a new lie.  Trying to make us believe we're none of us shut up, and it ain't dark, and heaven knows what."

Ultimately Lucy and Tirian and the others have to give up; nothing they can say or do has any effect.  Aslan (the lion referenced in the above passage) sums it up as follows:

"They will not let us help them.  They have chosen cunning instead of understanding.  Their prison is only in their own minds, yet they are in that prison; and so afraid of being taken in that they cannot be taken out."
****************************************


Friday, August 30, 2024

Word association

There's an odd claim circulating on social media these days.  This is the form of it I've seen most frequently:

[Screenshot of the post: a woman named Emerald claiming that the luciferase in COVID vaccines is Satanic and is used to track the people who receive it]
First, just to get this out of the way: there is no luciferase in vaccines.  Luciferase is a bioluminescent enzyme found in a variety of organisms, from dinoflagellates to fireflies, and was named not for Lucifer but because the root of luciferase (and Lucifer as well, of course) is a Latin word meaning "light-bearer."  Luciferase isn't used for "tracking" people (how the hell would that even work?  Would you be trackable because you'd glow in the dark?), but it is used as a luminescent marker in antibody assays in vitro.

As easy as it is to laugh at Emerald for her obvious ignorance of (1) how vaccines work, (2) how bioluminescent markers are used, and (3) basic linguistics, what interests me more is how odd a claim this really is.  Because the idea here is that the name of the enzyme somehow creates a link between it and Satan, and this marks you -- in the sense used in the Book of Revelation.  

You know, the "Mark of the Beast."

I ran into another example of this kind of thinking a few weeks ago, from a man who recounted being in line at a convenience store when the woman ahead of him had her total rung up, and it came to $6.66.  She got a scared look on her face and said, "Oh, no, I don't like that total.  Better throw in a corndog."

The man who posted about it marveled at what a badass she is -- going into battle with the Forces of Darkness, armed with a corndog.

How do people come to believe so fervently in associations like this?  Clearly they were both taught in a religious context, since both of them made reference to the End Times, but how do you get to the point where any association with words or numbers connected with the Bad Place -- even an obviously accidental or circumstantial one -- causes an immediate and powerful fear response?

A study by Fatik Mandal (of Bankura College, India) found an interesting pattern:

Superstitious beliefs help to decrease [people's] environmentally-induced stress.  Superstition produces a false sense of having control over outer conditions, reduces anxieties, and is prevalent in conditions of absence of confidence, insecurity, fear and threat, stress, and anxiety.  When the events are interpretable, environment is transparent, and conditions are less ambiguous, individuals become less superstitious.

This was supported by a study in 2022 by Hoffmann et al., which suggested that holding superstitions -- especially ones that have the backing of authority figures (e.g. church leaders) -- gives you a sense of control over circumstances that are actually uncertain, random, or inherently uncontrollable.

But what still strikes me as odd is that the reason these people were fearful in the first place was because the church leaders had convinced them that the Antichrist and the Four Apocalyptic Horsepersons and other assorted special offers were on their way, so they'd better get ready to fight.  The superstitions about avoiding vaccines and convenience store bills totaling $6.66 were incidental, and only occurred because the people holding them had already been convinced that the Book of Revelation was actually true.

So this can be summed up as, "Here's how not to be afraid about this thing that I just now made you afraid of."  Which strikes me as just plain weird.

What's certain, though, is how far back in our history this sort of thinking goes.  A study in 2023 by Amar Annus of the University of Chicago looked at the origins of superstitions in the Middle East, and found that the associations between certain words and (usually bad) outcomes have a deep history, and are no more rational than the ones people hold today.  In the literature of ancient Mesopotamia, we see ample evidence of detailed superstitions, but:

Only exceptionally are we able to detect any logical relationship between portent and prediction...  In many cases, subconscious association seems to have been at work, provoked by certain words whose specific connotations imparted to them a favorable or an unfavorable character, which in turn determined the general nature of the prediction.

Because those connotations aren't logical, they have to be learned -- transmitted orally or in written form from one generation to another, and undoubtedly embellished as time goes on.  At that point, in just about every culture, you end up with adepts who claim that they know better than anyone else how to interpret the omens, and avoid the unpleasant outcome that would pertain if you get it wrong.  Annus writes about a Mandaean priest in Iraq who spoke with the anthropologist Ethel Drower in the 1920s, and who boasted,

If a raven croaks in a certain burj (= astrological house), I understand what it says, also the meaning when the fire crackles or the door creaks.  When the sky is cloudy and there are shapes in the sky resembling a mare or a sheep, I can read their significance and message.  When the moon is darkened by an eclipse, I understand the portent; when a dust-cloud arises, black, red, or white, I read these signs, and all this according to the hours and the aspects.

So it seems like part of it has to do with powerful or charismatic people saying, "Look, I understand everything way better than you do, and you'd damn well better listen to what I'm saying."  

If you can hook in strong emotions like fear, so much the better.  At that point it turns into a Pascal's Wager sort of thing; what if the scary stuff this guy is saying actually turns out to be true?  What if getting the vaccine does mark me as one of Satan's own?

Better not take the chance.

Of course, the solution to all this is knowledge and rationality, but I'm not sure how well that'd work with someone who already has accepted the fundamentally irrational premises of superstition.  As has been so often commented before, you can't logic your way out of a belief you didn't logic your way into.

So I'm not sure how helpful all this is in the bigger picture.  Superstition has always been with us, and probably always will be.  The best you can do is arm yourself against it in whatever way you can.

Here.  Have a corndog.

****************************************


Saturday, January 28, 2023

The roots of conspiracy

It's all too easy to dismiss conspiracy theorists as just being dumb, and heaven knows I've fallen into that often enough myself.

Part of the problem is that if you know any science, so many conspiracy theories just seem... idiotic.  That 5G cell towers cause COVID.  That eating food heated up in a microwave causes cancer.  As we just saw last week, that Satan's throne is located in Geneva and that's why the physicists at CERN are up to no good.

And sure, there's a measure of ignorance implicit in most conspiracy theories.  To believe that Buffalo Bills player Damar Hamlin's on-field collapse was caused by the COVID vaccine -- as both Charlie Kirk and Tucker Carlson stated -- you have to be profoundly ignorant about how vaccines work.  (This claim led to a rash of people on Twitter who demanded that anything with mRNA in it be officially banned, apparently without realizing that mRNA is in every living cell and is a vital part of your protein-production machinery.  And, therefore, it is not only everywhere in your body, it's present in every meat or vegetable you've ever consumed.)

But simple ignorance by itself doesn't explain it.  After all, we're all ignorant about a lot of stuff; you can't be an expert in everything.  I, for example, know fuck-all about business and economics, which is why it's a subject I never touch here at Skeptophilia (or anywhere else, for that matter).  I'm fully aware of my own lack of knowledge on the topic, and therefore anything I could say about it would have no relevance whatsoever.

Scientists have been trying for years to figure out why some people fall for conspiracies and others don't.  One theory that at least partially explains it is that conspiracy theorists tend to score higher than average on the "dark triad" of personality traits -- narcissism, Machiavellianism, and psychopathy -- but that isn't the whole answer, because there are plenty of people who score high on those assessments who don't espouse crazy ideas.

But now a psychologist at the University of Regina, Gordon Pennycook, thinks he has the right answer.

The defining characteristic of a conspiracy theorist isn't ignorance, narcissism, or sociopathy; it's overconfidence.

Pennycook designed a clever test to suss out people's confidence levels when given little to nothing to go on.  He showed volunteers photographs that were blurred beyond recognition, and asked them to identify what the subject of the photo was.  ("I don't know" wasn't an option; they had to choose.)  Then, afterward, they were asked to estimate the percentage of their guesses they thought they'd gotten right.

That self-assessment correlated beautifully with belief in conspiracy theories.

"Sometimes you're right to be confident," Pennycook said.  "In this case, there was no reason for people to be confident...  This is something that's kind of fundamental.  If you have an actual, underlying, generalized overconfidence, that will impact the way you evaluate things in the world."

The danger, apparently, is not in simple ignorance, but in ignorance coupled with "of course I understand this."  It reminds me of the wonderful study done by Leonid Rozenblit and Frank Keil about a phenomenon called the illusion of explanatory depth -- that many of us have the impression we understand stuff when we actually have no idea.  (Rozenblit and Keil's examples were common things like the mechanisms of a cylinder lock and a flush toilet, how helicopters fly and maneuver, and how a zipper works.)  Most of us could probably venture a guess about those things, but would add, "... I think" or "... but I could be wrong." 

The people predisposed to belief in conspiracy theories, Pennycook says, are the ones who would never think of adding the disclaimer.

That kind of overconfidence, often crossing the line into actual arrogance, seems to be awfully common.  I was just chatting a couple of weeks ago with my athletic trainer about that -- he told me that all too often he runs into people who walk into his gym and proceed to tell him, "Here's what I think I should be doing."  I find that attitude baffling, and so does he.  I said to him, "Dude, I'm hiring you because you are the expert.  Why the hell would I pay you money if I already knew exactly how to get the results I want?"

He said, "No idea.  But you'd be surprised at how often people come in with that attitude."  He shook his head.  "They never last long here."

The open question, of course, is how you inculcate in people a realistic self-assessment of what they do know, and an awareness that there's lots of stuff about which they might not be right.  In other words, a sense of intellectual humility.  To some extent, I think the answer is in somehow getting them to do some actual research (i.e. not just a quick Google search to find Some Guy's Website that confirms what they already believed).  For example, reading scientific papers, finding out what the actual experts have discovered.  Failing that -- and admittedly, a lot of scientific papers are tough going for non-specialists -- at least reading a damn Wikipedia page on the topic.  Yeah, Wikipedia isn't perfect, but the quality has improved dramatically since it was founded in 2001; if you want a quick overview of (for example) the Big Bang theory, then just read the first few paragraphs of the Wikipedia page on the topic, wherein you will very quickly find that it does not mean what the creationists are so fond of saying, that "nothing exploded and made everything."

Speaking of being overconfident on a topic about which they clearly know next to nothing.

In any case, I'll just exhort my readers -- and I'm reminding myself of this as well -- always to keep in mind the phrase "I could be wrong."  And yes, that applies even to your most dearly held beliefs.  It doesn't mean actively doubting everything; I'm not trying to turn you into wishy-washy wafflers or, worse, outright cynics.  But periodically holding our own beliefs up to the cold light of evidence is never a bad thing.

As prominent skeptic (and professional stage magician) Penn Jillette so trenchantly put it: "Don't believe everything you think."

****************************************


Monday, May 2, 2022

The illusion of cynicism

"All politicians are liars."

"I don't trust anyone."

"You have to watch your back constantly."

"Nothing you read in media is true."

When I taught Critical Thinking -- one of my favorite classes to teach -- I found that it was much harder to counteract cynicism than it was gullibility.  Just about everyone knows that gullibility is a mistake; if you "fall for anything," or "believe whatever's told to you," you are automatically considered to be less smart or less sophisticated (at least by people who aren't gullible themselves).  Many of my students thought that the primary reason to learn critical thinking strategies was to make themselves less likely to get suckered by lies and half-truths.

This is itself half true.  As I told my classes, cynicism is exactly as lazy as gullibility.  Disbelieving everything without consideration is no wiser than believing everything without consideration.  It's why I hate the use of the word "skeptic" to mean doubter.  A true skeptic believes what the evidence supports.  The people who disbelieve in anthropogenic climate change, for example, aren't skeptics; they're rejecting the evidence collected over decades, and the theories that have passed the rigors of peer review to become accepted by 97% of the scientific establishment.

But somehow, cynicism has gained a veneer of respectability, as if there's something brave or smart or noble about having the sour attitude that no one and nothing can be trusted.  This was the subject of a paper that appeared in the journal Personality and Social Psychology Bulletin last week, called "The Cynical Genius Illusion: Exploring and Debunking Lay Beliefs About Cynicism and Competence."  The authors, Olga Stavrova of Tilburg University and Daniel Ehlebracht of the University of Cologne, studied a huge amount of data, and found that the public tends to think cynics and scoffers are smarter than average -- but on actual tests of intelligence, people identified as cynics tend to perform more poorly.  The authors write:
Cynicism refers to a negative appraisal of human nature—a belief that self-interest is the ultimate motive guiding human behavior.  We explored laypersons’ beliefs about cynicism and competence and to what extent these beliefs correspond to reality.  Four studies showed that laypeople tend to believe in cynical individuals’ cognitive superiority.  A further three studies based on the data of about 200,000 individuals from 30 countries debunked these lay beliefs as illusionary by revealing that cynical (vs. less cynical) individuals generally do worse on cognitive ability and academic competency tasks.  Cross-cultural analyses showed that competent individuals held contingent attitudes and endorsed cynicism only if it was warranted in a given sociocultural environment.  Less competent individuals embraced cynicism unconditionally, suggesting that—at low levels of competence—holding a cynical worldview might represent an adaptive default strategy to avoid the potential costs of falling prey to others’ cunning.

So a strategy that might have come about because of a desire to avoid being hoodwinked morphs into the conviction that everyone is trying to hoodwink you.  While I understand why someone would want to avoid the former, especially if (s)he's fallen prey in the past, assuming everyone is out to get you is not only the lazy way out, it's factually wrong.

[Image licensed under the Creative Commons Wetsun, Cynicism graffiti, CC BY 2.0]

You know, I think that's one of the most important things I've learned from all the traveling I've done; that everywhere you go, there are good people and bad, kind people and unkind, and that regardless of differences of culture the vast majority of us want the same things -- food, shelter, security, love, safety for our families and friends, the freedom to voice our opinions without fear of repercussions.  The people I've run into who really, honestly had ill intent toward me (or toward anyone) have been extremely few.

I'll admit, though, that maintaining a healthy, balanced skepticism is hard at times, especially given the polarization of the media lately.  We are very seldom presented with a fair assessment of what's happening, especially insofar as what the opposite side is doing.  Much of the media is devoted to whipping up hatred and distrust of the "other" -- convincing listeners/readers that the opposite party, the other religion(s), the other races or ethnic groups, are unequivocally bad.  Presenting the more complex, nuanced view that there are a few horrible people in every group but that most people are on balance pretty okay, takes a lot more work -- and doesn't attract sponsorship from the corporations who are profiting off the fear, panic, and anger.

It's nice that the Stavrova and Ehlebracht paper supports what I've been claiming for years.  And I'd like to ask you to make a practice of this -- setting aside your preconceived notions and what you've heard from the media, simply looking at the facts and evidence rather than the spin.  I think you'll find that the world is neither the Pollyanna paradise that the gullible believe nor the horrid hellscape in the cynics' minds, but somewhere in that wide middle ground.

And that honestly, it's a much better place to live than either extreme.

**************************************

Wednesday, October 27, 2021

Antique ghosts

I've long been curious how far back beliefs in the supernatural go.

From what we know about ancient religions and mythologies, belief in something that transcends ordinary human experience seems nearly ubiquitous.  I suppose that without an adequate scientific model to explain natural phenomena, or even the tools to study them, deciding that it must be supernatural forces at work is natural enough.  What puzzles me, though, is how detailed some of those beliefs are.  The ancient Norse didn't just say that there was some powerful guy up there causing thunder somehow; they said it was Thor throwing a giant hammer called "Mjölnir" that was forged by the dwarves Brokk and Sindri, but while they were forging it a fly bit Sindri on the forehead and made him stop pumping the bellows (the fly was Loki in disguise, trying to fuck things up as usual), so the hammer came out with a handle that was too short.

So I guess our belief in the supernatural is not the only thing that goes back a long way.  So does our penchant for telling elaborate stories.

A recent analysis of a 3,500 year old Babylonian artifact that had been gathering dust in the British Museum has shown that our belief not only in gods but in ghosts has a long history.  The tablet came to the attention of Irving Finkel, one of the world's authorities on cuneiform and ancient languages of the Mesopotamian region.  And when Finkel took a look at the tablet, he realized the previous translation had been incomplete and at least partly incorrect.  The text is about how to get rid of a ghost -- and the front of it has the faint outlines of a male ghost being led at the end of a rope by a woman.

The gist of the text is that the way to deal with a haunting is to give the ghost what (s)he wants.  In this case, the ghost is horny and wants some female companionship.  How exactly the owners of the haunted house would talk a (living) woman into being the ghost's lover is an open question.

"It’s obviously a male ghost and he’s miserable," Finkel explained.  "You can imagine a tall, thin, bearded ghost hanging about the house did get on people’s nerves.  The final analysis was that what this ghost needed was a lover...  You can’t help but imagine what happened before.  'Oh God, Uncle Henry’s back.'  Maybe Uncle Henry’s lost three wives.  Something that everybody knew was the way to get rid of the old bugger was to marry him off...  It’s a kind of explicit message.  There’s very high-quality writing there and immaculate draughtsmanship.  That somebody thinks they can get rid of a ghost by giving them a bedfellow is quite comic."

The exact details of the ritual are as complex as the story of the forging of Thor's hammer.  You're to make figurines of a man and a woman, then, "... dress the man in an everyday shift and equip him with travel provisions.  You wrap the woman in four red garments and clothe her in a purple cloth.  You give her a golden brooch.  You equip her fully with bed, chair, mat and towel; you give her a comb and a flask.  At sunrise towards the sun you make the ritual arrangements and set up two carnelian vessels of beer.  You set in place a special vessel and set up a juniper censer with juniper.  You draw the curtain like that of the diviner.  You [put] the figurines together with their equipment and place them in position… and say as follows, Shamash [god of the sun and judge of the underworld by night]."

It ends by cautioning, "Don't look behind you."

Once again, you have to wonder how they figured all this out.  Were there other ghosts they tried to get rid of, but they only clothed the female figurine in three red garments, and it didn't work?  Did they serve the beer in alabaster vessels instead of carnelian, and the ghost said, "Well, fuck that, I'm not leaving"?  What happens if you use cedar wood instead of juniper?  Most importantly, what happens if you look behind you?

Whatever the source of those details, it certainly demonstrates the antiquity of myth-making.  "All the fears and weaknesses and characteristics that make the human race so fascinating, assuredly were there in spades 3,500 years ago," Finkel said.

Here in our modern world, we tend to blithely dismiss such beliefs as "primitive" or "unsophisticated," but it bears keeping in mind that they were trying to explain what they experienced, just like we do.  Our scientific advancements have allowed us to peer deeper, and (more importantly) to make tests of our explanations to see if they fit the data -- but what the beliefs of the ancients lacked in rigor they made up for in a strange and intricate beauty.  And this little tablet gives us a window into a long-gone civilization -- making me wonder what other artifacts are still out there to be discovered, and what else we might be able to learn about the myths and folklore of our distant ancestors.

**********************************

Some of the most enduring mysteries of linguistics (and archaeology) are written languages for which we have no dictionary -- no knowledge of the symbol-to-phoneme (or symbol-to-syllable, or symbol-to-concept) correspondences.

One of the most famous cases where that seemingly intractable problem was solved was the near-miraculous decipherment of the Linear B script of Crete by Alice Kober and Michael Ventris, but it bears keeping in mind that this wasn't the first time this kind of thing was accomplished.  In the early years of the nineteenth century, this was the situation with the Egyptian hieroglyphics -- until the code was cracked using the famous Rosetta Stone, by the dual efforts of Thomas Young of England and Jean-François Champollion of France.

This herculean, but ultimately successful, task is the subject of the fascinating book The Writing of the Gods: The Race to Decode the Rosetta Stone, by Edward Dolnick.  Dolnick doesn't just focus on the linguistic details, but tells the engrossing story of the rivalry between Young and Champollion, ending with Champollion beating Young to the solution -- and then dying of a stroke at the age of 41.  It's a story not only of a puzzle, but of two powerful and passionate personalities.  If you're an aficionado of languages, history, or Egypt, you definitely need to put this one on your to-read list.

[Note: if you purchase this book using the image/link below, part of the proceeds goes to support Skeptophilia!]


Monday, July 5, 2021

Lost in the shadows

Some historical discoveries sit in that gray area between evocative and frustrating as hell -- evocative because they give us a glimpse into a long-gone culture, frustrating because the great likelihood is we'll never know anything more about it.

It's why I will never get over the loss of the Great Library of Alexandria.  Destroyed piece by piece, starting with a strike against secular intellectuals by King Ptolemy VIII Physcon in 145 B.C.E. and an apparently accidental fire during Julius Caesar's attack on Egypt a hundred years later, the library lost most of its holdings -- and its reputation -- and was gone entirely by the end of the third century C.E.  At its height it had books from all over Europe, North Africa, the Middle East, and Asia, and almost certainly contained the complete catalogue of works by such luminaries as Aeschylus, Sophocles, Euripides, Theophrastus, and Aristotle -- the vast majority of which no longer exist.

It's as if we only knew about Shakespeare because of fragmentary copies of Cymbeline and Timon of Athens.  All his other works are gone, known only by title -- or perhaps completely unknown.  That's the situation we're in with most early authors.  The most painful part is that they're gone forever, irreclaimable, disappeared beyond rescue into the murky waters of our past.

That was the reaction I had to a discovery I found out about because a friend of mine sent me a link a couple of days ago, regarding an archaeological discovery in Finland.  Near the town of Järvensuo, northwest of Helsinki, a team of archaeologists from the University of Turku and University of Helsinki found a 4,400-year-old shaman's staff, the top of which was carved into the likeness of a snake.  It resembles depictions of ritual staffs in cave art from the area, so there isn't much doubt about what it is.


It's a pretty spectacular discovery.  "My colleague found it in one of our trenches last summer," said research team member Satu Koivisto.  "I thought she was joking, but when I saw the snake’s head it gave me the shivers."

The discovery brings up inevitable questions about how it was used, and what it tells us about religions and beliefs back then.  "There seems to be a certain connection between snakes and people," said team member Antti Lahelma.  "This brings to mind northern shamanism of the historical period, where snakes had a special role as spirit-helper animals of the shaman…  Even though the time gap is immense, the possibility of some kind of continuity is tantalizing: Do we have a Stone Age shaman's staff?"

Tantalizing indeed, in the full sense of the word.  Like the lost books of the Library of Alexandria, the knowledge, culture, and rituals of the people who used this staff are almost certainly gone forever.  While we can speculate, those speculations are unlikely to be complete (or even correct).  Imagine taking a random assortment of objects from our culture -- a pair of glasses, a stop sign, a computer mouse, a spoon, a garden rake -- and from those alone trying to figure out who we were, what we believed, what we did.

Note that I'm not diminishing the significance and interest of the find, which is pretty amazing.  It's just that it makes me even more cognizant of what we've lost.  It's inevitable, I know that -- nothing lasts forever, not artifacts, not knowledge, not culture.  It's just frustrating realizing how little we know, and worse still, how little we can know.

Maybe that's why I became a fiction author.  If you can't figure stuff out, make stuff up, that's my motto.  It doesn't replace what we've lost, but at least it provokes our imaginations to wonder what things were like back then, to ponder the lives of our distant ancestors, to picture what the world must have looked like to them.

It's better than nothing.  And until we create a time machine, I guess that'll have to be enough.

*************************************

Most people define the word culture in human terms.  Language, music, laws, religion, and so on.

There is culture among other animals, however, perhaps less complex but just as fascinating.  Monkeys teach their young how to use tools.  Songbirds aren't born knowing their songs; they learn them from adults -- and much like human language, if the song isn't learned during a critical window as they grow, they never become fluent.

Whales, parrots, crows, wolves... all have traditions handed down from previous generations and taught to the young.

All, therefore, have culture.

In Becoming Wild: How Animal Cultures Raise Families, Create Beauty, and Achieve Peace, ecologist and science writer Carl Safina will give you a lens into the cultures of non-human species that will leave you breathless -- and convinced that perhaps the divide between human and non-human isn't as deep and unbridgeable as it seems.  It's a beautiful, fascinating, and preconceived-notion-challenging book.  You'll never hear a coyote, see a crow fly past, or look at your pet dog the same way again.

[Note: if you purchase this book from the image/link below, part of the proceeds goes to support Skeptophilia!]


Saturday, March 20, 2021

Secrecy failure equation

Every once in a while a piece of scientific research comes along that is so clever and elegant that I read the entire paper with a smile on my face.

This happened today when I bumped into the study by David Robert Grimes (of the University of Oxford) published in PLoS ONE entitled, "On the Viability of Conspiratorial Beliefs."  What Grimes did, in essence, was to come up with an equation that models the likelihood of a conspiracy staying secret.  And what he found was that most conspiracies tend to reveal themselves in short order from sheer bungling and ineptitude.  In Grimes's words:
The model is also used to estimate the likelihood of claims from some commonly-held conspiratorial beliefs; these are namely that the moon-landings were faked, climate-change is a hoax, vaccination is dangerous and that a cure for cancer is being suppressed by vested interests. Simulations of these claims predict that intrinsic failure would be imminent even with the most generous estimates for the secret-keeping ability of active participants—the results of this model suggest that large conspiracies (≥1000 agents) quickly become untenable and prone to failure.
Grimes wasn't just engaging in idle speculation.  He took various examples of conspiracies that did last for a while (for example, the NSA Prism Project that was exposed by Edward Snowden) and others that imploded almost immediately (for example, the Watergate coverup) and derived a formula that expressed the likelihood of failure as a function of the number of participants and the time the conspiracy has been in action.  When considering claims of large-scale coverups -- e.g., chemtrails, the faking of the Moon landing, the idea that climatologists are participating in a climate change hoax -- he found the following:
The analysis here predicts that even with parameter estimates favourable to conspiratorial leanings that the conspiracies analysed tend rapidly towards collapse.  Even if there was a concerted effort, the sheer number of people required for the sheer scale of hypothetical scientific deceptions would inextricably undermine these nascent conspiracies.  For a conspiracy of even only a few thousand actors, intrinsic failure would arise within decades.  For hundreds of thousands, such failure would be assured within less than half a decade.  It’s also important to note that this analysis deals solely with intrinsic failure, or the odds of a conspiracy being exposed intentionally or accidentally by actors involved—extrinsic analysis by non-participants would also increase the odds of detection, rendering such Byzantine cover-ups far more likely to fail.
Which is something I've suspected for years.  Whenever someone comes up with a loopy claim of a major conspiracy -- such as the bizarre one I saw a couple of weeks ago that the Freemasons collaborated in faking the deaths of Larry King and Rush Limbaugh -- my first thought (after "Are you fucking kidding me?") is, "How on earth could you keep something like that hushed up?"  People are, sad to say, born gossips, and the sheer number of people who would have to remain silent about such a thing -- not to mention the number required for faking the Moon landing or creating a climate change hoax -- makes it nearly certain that the whole thing would blow up in short order.
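The intuition behind Grimes's result can be sketched with a toy model.  This simplified version assumes each conspirator independently leaks with some small annual probability; the value used below is merely of the same order as Grimes's empirically fitted parameter, not his exact model:

```python
def exposure_probability(n_people: int, years: float, p_leak: float = 4e-6) -> float:
    """Probability of at least one leak within the given time span,
    assuming each conspirator independently leaks with annual
    probability p_leak (a rough stand-in, not Grimes's full model)."""
    # Probability that every conspirator stays quiet for every year:
    p_silent = (1.0 - p_leak) ** (n_people * years)
    return 1.0 - p_silent

# Bigger conspiracies become untenable far sooner:
for n in (100, 1_000, 100_000):
    print(f"N = {n:>7}: P(exposed within 25 yr) = {exposure_probability(n, 25):.3f}")
```

Even with a leak probability this tiny, a hundred-thousand-person conspiracy is all but guaranteed to be exposed within a couple of decades, which is the qualitative shape of Grimes's conclusion.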

[Image licensed under the Creative Commons allen watkin from London, UK, Weird graffiti (3792781972), CC BY-SA 2.0]

It's nice, though, that I now have some mathematical support, instead of doing what I'd done before, which was flailing my hands around and shouting "It's obvious."  Grimes's elegant paper gives some serious ammunition against the proponents of conspiracy theories, and that's all to the good.  Anything we can do in that direction is helpful.

The problem is, Grimes's study isn't likely to convince anyone who isn't already convinced.  The conspiracy theorists will probably just think that Grimes is one of the Illuminati, trying to confound everyone with his evil mathe-magic.  Grimes alluded to this, in his rather somber closing paragraphs:
While challenging anti-science is important, it is important to note the limitations of this approach.  Explaining misconceptions and analysis such as this one might be useful to a reasonable core, but this might not be the case if a person is sufficiently convinced of a narrative.  Recent work has illustrated that conspiracy theories can spread rapidly online in polarized echo-chambers, which may be deeply invested in a particular narrative and closed off to other sources of information.  In a recent Californian study on parents, it was found that countering anti-vaccination misconceptions related to autism was possible with clear explanation, but that for parents resolutely opposed to vaccination attempts to use rational approach further entrenched them in their ill-founded views.  The grim reality is that there appears to be a cohort so ideologically invested in a belief that for whom no reasoning will shift, their convictions impervious to the intrusions of reality.  In these cases, it is highly unlikely that a simple mathematical demonstration of the untenability of their belief will change their view-point.
And there's also the problem that the conspiracy theorists view themselves as stalwart heroes, the only ones brave enough to blow the whistle on the Bad Guys.  My guess is that most of the adherents to conspiracy theories would read Grimes's paper, assume that the equation is correct, and conclude they're the geniuses who are exposing the conspiracy and causing it to fail.  You really can't win with these people.

Be that as it may, it's heartening to know that we now have some theoretical support for the idea that most conspiracy theories are bullshit.  Even if it doesn't change anyone's mind, it cheered me up considerably, and I'm thankful for that much.

***************************************

I've always been in awe of cryptographers.  I love puzzles, but code decipherment has seemed to me to be a little like magic.  I've read about such feats as the breaking of the "Enigma" code during World War II by a team led by British computer scientist Alan Turing, and the stunning decipherment of Linear B -- a writing system for which (at first) we knew neither the sound-to-symbol correspondence nor even the language it represented -- by Alice Kober and Michael Ventris.

My reaction each time has been, "I am not nearly smart enough to figure something like this out."

Possibly because it's so unfathomable to me, I've been fascinated with tales of codebreaking ever since I can remember.  This is why I was thrilled to read Simon Singh's The Code Book: The Science of Secrecy from Ancient Egypt to Quantum Cryptography, which describes some of the most amazing examples of people's attempts to design codes that were uncrackable -- and the ones who were able to crack them.

If you're at all interested in the science of covert communications, or just like to read about fascinating achievements by incredibly talented people, you definitely need to read The Code Book.  Even after I finished it, I still know I'm not smart enough to decipher complex codes, but it sure is fun to read about how others have accomplished it.

[Note: if you purchase this book using the image/link below, part of the proceeds goes to support Skeptophilia!]



Wednesday, March 10, 2021

Shooting the bull

There's a folk truism that goes, "Don't try to bullshit a bullshitter."

The implication is that people who exaggerate and/or lie routinely, either to get away with things or to create an overblown image of themselves, know the technique so well that they can always spot it in others.  This makes bullshitting a doubly attractive game; not only does it make you slick, impressing the gullible and allowing you to avoid responsibility, it makes you savvy and less likely to be suckered yourself.

Well, a study published this week in The British Journal of Social Psychology, conducted by Shane Littrell, Evan Risko, and Jonathan Fugelsang, has shown that like many folk truisms, this isn't true at all.

In fact, the research supports the opposite conclusion.  At least one variety of regular bullshitting makes you more likely to fall for bullshit from others.

[Image licensed under the Creative Commons Inkscape by Anynobody, composing work: Mabdul ., Bullshit, CC BY-SA 3.0]

The researchers identified two main kinds of bullshitting, persuasive and evasive.  Persuasive bullshitters exaggerate or embellish their own accomplishments to impress others or fit in with their social group; evasive ones dance around the truth to avoid damaging their own reputations or the reputations of their friends.

Because bullshitting has a positive shine for many, the researchers figured most people who engage in either type wouldn't be shy about admitting it, so they used self-reporting to assess the bullshit levels and styles of the eight hundred participants.  They then gave each a more formal measure of cognitive ability, metacognitive insight, intellectual overconfidence, and reflective thinking, then a series of pseudo-profound and pseudoscientific statements mixed in with real profound and truthful statements, to see if they could tell them apart.

The surprising result was that the self-reported persuasive bullshitters were significantly worse at detecting pseudo-profundity than the habitually honest, while the evasive bullshitters were actually better than average.
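As a hypothetical sketch of that kind of measure -- the ratings below are invented, and the study's actual scales are more involved -- one simple way to score how well a participant distinguishes genuine profundity from pseudo-profundity:

```python
from statistics import mean

# Invented ratings (1 = not profound, 5 = very profound) that a single
# participant assigned to each statement type; real instruments differ.
ratings_genuine = [4, 5, 4, 3]   # genuinely profound statements
ratings_pseudo  = [2, 1, 2, 2]   # randomly generated pseudo-profound ones

# Discrimination score: how much higher genuine items are rated than
# pseudo-profound ones.  A score near zero would mean the rater
# can't tell them apart -- the pattern found in persuasive bullshitters.
discrimination = mean(ratings_genuine) - mean(ratings_pseudo)
print(f"discrimination score: {discrimination:.2f}")
```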

"We found that the more frequently someone engages in persuasive bullshitting, the more likely they are to be duped by various types of misleading information regardless of their cognitive ability, engagement in reflective thinking, or metacognitive skills," said study lead author Shane Littrell, of the University of Waterloo.  "Persuasive BSers seem to mistake superficial profoundness for actual profoundness.  So, if something simply sounds profound, truthful, or accurate to them that means it really is.  But evasive bullshitters were much better at making this distinction."

Which supports a contention that I've had for years; if you lie for long enough, you eventually lose touch with what the truth is.  The interesting fact that persuasive and evasive bullshitting aren't the same in this respect might be because evasive bullshitters engage in this behavior because they're highly sensitive to people's opinions, both of themselves and of others.  This would have the effect of making them more aware of what others are saying and doing, and better at sussing out what people's real motives are -- and whether they're being truthful or not.  But persuasive bullshitters are so self-focused that they aren't paying much attention to what others say, so any subtleties that might clue them in to the fact that they're being bullshitted slip right by.

I don't know whether this is encouraging or not.  I'm not sure if the fact that it's easier to lie successfully to a liar is a point to celebrate by those of us who care about the truth.  But it does illustrate the fact that our common sense about our own behavior sometimes isn't very accurate.  As usual, approaching questions from a skeptical scientific angle is the best.

After all, no form of bullshit can withstand that.

****************************************

Last week's Skeptophilia book-of-the-week was about the ethical issues raised by gene modification; this week's is about the person who made CRISPR technology possible -- Nobel laureate Jennifer Doudna.

In The Code Breaker: Jennifer Doudna, Gene Editing, and the Future of the Human Race, author Walter Isaacson describes the discovery of how the bacterial enzyme complex called CRISPR-Cas9 can be used to edit genes of other species with pinpoint precision.  Doudna herself has been fascinated with scientific inquiry in general, and genetics in particular, since her father gave her a copy of The Double Helix and she was caught up in what Richard Feynman called "the joy of finding things out."  The story of how she and fellow laureate Emmanuelle Charpentier developed the technique that promises to revolutionize our ability to treat genetic disorders is a fascinating exploration of the drive to understand -- and a cautionary note about the responsibility of scientists to do their utmost to make certain their research is used ethically and responsibly.

If you like biographies, are interested in genetics, or both, check out The Code Breaker, and find out how far we've come into the science-fiction world of curing genetic disease, altering DNA, and creating "designer children," and keep in mind that whatever happens, this is only the beginning.

[Note: if you purchase this book using the image/link below, part of the proceeds goes to support Skeptophilia!]