Skeptophilia (skep-to-fil-i-a) (n.) - the love of logical thought, skepticism, and thinking critically. Being an exploration of the applications of skeptical thinking to the world at large, with periodic excursions into linguistics, music, politics, cryptozoology, and why people keep seeing the face of Jesus on grilled cheese sandwiches.

Wednesday, January 3, 2018

All by myself

A couple of days ago, a friend and loyal reader of Skeptophilia sent me a link, and asked, "What would an introvert make of this?"

He asked me for good reason.  I've been an introvert all my life, but it's only gotten more pronounced as I've aged.  I can be shy and socially awkward to the point of deer-in-the-headlights panic.  I go to parties, mostly on my (much more outgoing) wife's urging, but unless I know everyone there -- not that likely -- I'll be the guy with a glass of scotch in my hand, looking around for a dog to socialize with.  I've been known to spend an entire evening at a social gathering, and speak twice -- "Hi, thanks for inviting me," and "Thanks, it was great."

The upside: I'm a good listener.  But still.

In any case, the link was to a paper in the journal Current Opinion in Psychology called "Social Baseline Theory: The Social Regulation of Risk and Effort," by James A. Coan and David A. Sbarra, which makes an interesting claim: humans are fundamentally social animals, so if you want to see someone acting in a normal way -- at his or her "social baseline" -- don't use the usual psychological convention of observation in solitude; watch how the person behaves in a group.

The authors write:
According to [social baseline theory], the human brain assumes proximity to social resources—resources that comprise the intrinsically social environment to which it is adapted.  Put another way, the human brain expects access to relationships characterized by interdependence, shared goals, and joint attention.  Violations of this expectation increase cognitive and physiological effort as the brain perceives fewer available resources and prepares the body to either conserve or more heavily invest its own energy.  This increase in cognitive and physiological effort is frequently accompanied by distress, both acute and chronic, with all the negative sequelae for health and well being that implies.
The implications for psychological research, if this is true, are obvious.  Take, for example, how functional MRI research is generally conducted:
In functional magnetic resonance imaging (fMRI) research, a standard convention is to compare an experimental treatment to a “resting baseline” characterized by simply lying alone in the scanner.  This convention is predicated on the reasonable assumption that experimental treatments present stimuli otherwise absent from the sensorium while participants are alone.  But inspection of brain activity in several studies... now suggests the brain responds to being alone as if sensory stimuli have been added, not taken away.  That is, the brain looks more “at rest” when social resources are obviously available.  This presents a puzzle potentially resolvable by considering proximity to a familiar other the brain’s true “baseline” state, and being alone as more like an experimental treatment—a context that adds perceived work for the brain to do.
Which certainly adds a layer of complication to studying the workings of the brain.

However, my friend's question is well taken.  Wouldn't there be considerable variation in this response, so much so that it wouldn't represent a baseline in any kind of general manner?  Taking myself as an example -- and other introverts I've talked to agree -- it requires far more energy to be with people than to be alone.  I love my family and friends, but I'm never completely relaxed when I'm around other people.  Interacting is work.  It's rewarding work, and the connections I've made to people are not something I'd ever want to give up, but it's definitely more taxing than being alone.

Solitude (2006) [image courtesy of photographer Aiko Matsuoka and the Wikimedia Commons]

I notice this especially given my day job as a high school biology teacher.  My students are wonderful, and I definitely enjoy their energy and their interest, but being around them is simply exhausting for me.  At the end of the day, usually the last thing I want is to be around more people.  What I really want is to shut myself in my office, and put on some music, and relax.

Preferably with a dog and a glass of scotch.

So it's an intriguing idea, but I'm inclined to question its conclusions, at least as they apply to humanity as a whole.  I suppose it will always be hard to come up with any broad-brush generalizations with which to characterize the human mind.  Part of what makes us so interesting to each other is that no two of us react in precisely the same way.

And now I must draw this to a close, because I've got to go to work.  First week back at school after a long break.  We'll see how it goes.  I don't have any other options, of course, given that I need the paycheck and it's a little early in the day to start drinking.

Tuesday, January 2, 2018

Steve, Steve, Jennifer, and Onesimus

Do you think I look like a "Gordon"?

According to some recent research, I might not have at birth, but I sure do now.  A study, conducted by Yonat Zwebner (which name I am not making up), Nir Rosenfeld, and Ruth Mayo of the Hebrew University of Jerusalem, Anne-Laure Sellier of HEC Paris, and Jacob Goldenberg of Columbia University, found that the old idea that people are named to match their facial features may actually work in reverse -- the name they're given might actually influence their features.

The paper, "We Look Like Our Names: The Manifestation of Name Stereotypes in Facial Appearance," released a few months ago in the journal Attitudes and Social Cognition, describes a peculiar pattern the researchers found when matching names with faces: a person's name apparently subtly alters such malleable traits as hairstyle, style of glasses, and resting facial expression.  The authors write:
Research demonstrates that facial appearance affects social perceptions.  The current research investigates the reverse possibility: Can social perceptions influence facial appearance?  We examine a social tag that is associated with us early in life—our given name.  The hypothesis is that name stereotypes can be manifested in facial appearance, producing a face-name matching effect, whereby both a social perceiver and a computer are able to accurately match a person’s name to his or her face.  In 8 studies we demonstrate the existence of this effect, as participants examining an unfamiliar face accurately select the person’s true name from a list of several names, significantly above chance level.  We replicate the effect in 2 countries and find that it extends beyond the limits of socioeconomic cues.  We also find the effect using a computer-based paradigm and 94,000 faces.  In our exploration of the underlying mechanism, we show that existing name stereotypes produce the effect, as its occurrence is culture-dependent.  A self-fulfilling prophecy seems to be at work, as initial evidence shows that facial appearance regions that are controlled by the individual (e.g., hairstyle) are sufficient to produce the effect, and socially using one’s given name is necessary to generate the effect.  Together, these studies suggest that facial appearance represents social expectations of how a person with a specific name should look.  In this way a social tag may influence one’s facial appearance.
Which is interesting in and of itself, but it makes me wonder about how this might be reflected in the changing of naming patterns over time, not to mention the other factors that drive name choice -- such as the fact that some names "run in the family."

I can say from experience that it's hard to decide on a name, which probably explains the plethora of baby name books out there on the market.  Parents want something that will be a source of pride for the child, and will give the child a sense of identity.  (Except, apparently, in my own parents' case, as I was named after my dad, something for which I still haven't forgiven them.)  But sometimes, in that search for uniqueness, parents land on a name that falls into the "It Seemed Like A Good Idea At The Time" department.

Which probably explains why a recent study of 3,000 parents in Britain revealed the startling finding that twenty percent of parents regret the names they chose for their children.

His name is Oliver, but you probably already knew that.  [image courtesy of the Wikimedia Commons]

Even more common names sometimes have their downsides, suggests another study, done back in 2010 by David Figlio of Northwestern University.  Figlio and his team first did phonemic studies of thousands of names, sorting them into "masculine-sounding" and "feminine-sounding" groups.  They then looked at data from schools, and found a striking trend: boys given feminine-sounding names (e.g. Ashley, Shannon) were significantly more likely to cause discipline problems, and girls given masculine-sounding names (e.g. Madison, Morgan) were far less likely to choose academically rigorous courses of study.

Are names destiny?  There certainly have been general shifts in naming patterns; what is popular with one generation is out in the next, which is why some names end up sounding "old fashioned."  I recall a comic strip from the 1970s, depicting the typical group photo shot of a first grade class, the teacher sitting primly at the end of the first row.  The caption read: "Top Row: Steve, Steve, Jennifer, Steve, Jennifer, Jennifer, Steve.  Middle Row: Jennifer, Jennifer, Steve, Jennifer, Steve, Steve, Steve.  Bottom Row: Jennifer, Steve, Jennifer, Jennifer, Steve, Jennifer, Steve, and Mrs. Bertha Q. Wackenhorst."

This one struck a special note for me.  My grandmother's given name was Bertha Viola, and amongst her siblings were Roxzella Vandell, Orsa Osburne, Flossie Doris, Fanny Elinore, and Clarence Arnold.  Thank heaven their last name was Scott; with an odd-sounding last name, any of those combinations would have been unfortunate indeed.

I find it interesting to consider why the rather harsh-sounding, mostly Germanic names that were in vogue in the late 19th century are mostly gone.  These days you see few, if any, children named Hilda, Ethel, Edgar, Harold, Arthur, Gertrude, Archibald, and so on.  These were amongst the most popular names during the last decades of the 1800s and the first of the 1900s, and yet by the 1950s all of them were virtually gone from the baby name books.  Did parents of that era think that giving a child a strong-sounding name would be an asset in making their way in the world?  If so, that gives us an interesting insight into the worldview of turn-of-the-century America.

Some names make you wonder what the parents were thinking at the time.  The parents of Chanda Lear should, in my opinion, be kicked.  I also find myself wondering why parents would choose a relatively common name and then spell it strangely.  I suppose the desire is to impart a sense of uniqueness and individuality to the name, but the sheer inconvenience of it would (for me, at least) outweigh any sense of pride in having a name that has a twist in the way it's spelled.  This seems to be more common with girls' names, for some reason.  Naming a child Khrystee, Liane (pronounced like Leanne), or Erykah -- all monikers borne by former students of mine -- just seems to be asking for a lifetime of having your name misspelled.

However, it's not always the given name that results in a cross to bear for the individual, and a humorous effect for the rest of us.  Working for a registrar's office, one of my first jobs after graduating from college, I ran into transcripts for Turki Hasher, Celestina Crapp, Timothy Turnipseed, Carl Tolfree, and James Hollopeter.  Family allegiance notwithstanding, I can't imagine why Cloyd Dick IV wouldn't change his name.

Then, there's the never-to-be-forgotten woman I heard about because when she got married, it made the national news.  Her maiden name was Phoebe P. Peabody.  She married a guy named Paul Beebe, and decided to go with a hyphenated married name, so she became Phoebe P. Peabody-Beebe.  Which to my ears sounds like Morse code.  Even without the hyphen, she'd still have been Phoebe Beebe, so I guess it's commendable that she decided to go big or go home.

As I mentioned earlier, I rather dislike my own first name, but not enough to go through the hassle of changing it.  But just considering what it would be like to go through life as Basile Bastard or Nancy Anne Seancey or Earless Romero (all real names, I swear) makes me unlikely to complain.  And if you think things are bad now, go back in history, and you run into some truly wacky ones.  My wife's ancestry boasts a woman named Albreda de Brumpton.  My own includes a German dude named Poppo von Rot.  My cousin in New Mexico descends from a Georgia plantation owner named Onesimus Futch.  My son thinks this sounds like an insult. ("You... you... onesimus futch, you!!!")  So, it could be worse.

A great deal worse.

Monday, January 1, 2018

The first word

Happy New Year to all of my devoted readers.  I appreciate you more than you know, and don't say it often enough.  I hope 2018 is a wonderful, rewarding, and productive year for you all.

And I sure as hell hope it's better than 2017.  I usually end the year with a retrospective of interesting stories month-by-month, and this time I thought, "Like I want to relive the last twelve months.  Once was enough."  While some good things happened, both personally and on a larger scale, 2017 was by and large a slow-motion train wreck.  Mostly what 2017 brought to the forefront was two things -- the power of ignorant people in large groups to sink to the lowest common denominator of human behavior, and our ability to elect incompetent, immoral, and unqualified people to public office, and to continue to support them even as they tear the house down around our ears.

Which, now that I come to think of it, are kind of the same thing.

So I'm not going to focus on that, being that I already focused on it plenty in posts I did in 2017.  Let's look ahead, instead.  Maybe it's time to think about our dreams and aspirations, to appeal to our highest impulses instead of our lowest ones.  I'm not a big believer in "visualize it and you can achieve it" -- that's always sounded like wishful thinking to me -- but you sure as hell can't achieve something if you don't believe it's possible.

Or, to quote William Lonsdale Watkinson, "It is far better to light a single candle than to curse the darkness."

[image courtesy of the Wikimedia Commons]

So here are a few things I'd like to see in 2018.

Let's start with the big picture.  I know "Peace on Earth" is a bit of a lofty goal, so how about: putting more time, effort, energy, and money into the things that improve people's quality of life instead of those that increase suffering, marginalization, and inequity?  Instead of building walls and deporting children and splitting up families, let's work on fixing the conditions that create refugees.  Instead of ceding more power to the corporations that are destroying the environment in the name of short-term profit, let's use the technology -- much of which is already cheap and available -- to convert to renewable energy, high-efficiency resource use, and low waste stream.  Instead of demonizing Planned Parenthood for their role in providing abortions (an extremely small part of what they do), let's work on eliminating the need for abortions by providing high-quality sex education and free access to birth control.  Instead of blaming schools and teachers for the poor performance of students, let's empower educators to make changes to the system based upon research in the psychology of learning -- treating as professionals the people who we've hired to spend thirteen years guiding and caring for our children.

If I could pick out one thing, however, that more than anything else created the shitstorm of 2017, it was the way that fear pushed so many of us into not listening to those with whom we disagreed -- or worse, considering them to be actively evil.  We stopped looking at the other political party, or people of another religion (or no religion at all), as being different, and started considering them the enemy, as people who were deliberately spreading (dare I say it) "fake news" for their own malign purposes.  2017 was the year of the echo chamber, the year that we started being afraid to switch the channel from MSNBC to Fox News or vice versa for fear we'd hear something that challenged our preconceived notions or made us uncomfortable.  It was the year of the Republitards and Democraps, the year we started looking at half of our fellow citizens as ethically bankrupt, morally degenerate, or stupid.

This works to the advantage of a group of people, and let me clue you in on something: it's not the average, middle-class working man or woman.  The ones who benefit by keeping you in fear are the oligarchs and plutocrats, who make you feel like if you don't keep voting them into office, The Bad Guys are gonna get you.  If you're scared that Party X is going to destroy your way of life, you'll keep voting for Party Y regardless of who they are -- a sexual predator, a cheat, a liar, a scoundrel, a narcissistic bully.  We have got to get back to the place where character and vision count for more than party affiliation.

This may all sound pretty pie-in-the-sky, but the thing is, it's all doable.  These are all things we can control, if we'll stop buying the horrible message that we're powerless.  As Christopher Robin said in Winnie the Pooh, "Promise me you will always remember that you are braver than you believe, stronger than you seem, and smarter than you think."

I will end with an exhortation.  Treat the people around you with a little more patience, compassion, and trust.  Most of us want the same things -- a stable place to live, clean food and water, love and acceptance, safety for our family and friends.  The people who want to hurt you are few in number, far fewer than the sensationalized media and clickbait websites would have you believe.  I've traveled a great deal, including places where most of the people had different faces than mine, spoke different languages, followed different belief systems.  Virtually everyone I came into contact with met smiles with smiles, kindness with kindness, generosity with generosity.  I think we could go a long way toward fixing our problems if we just stopped looking at the majority of our fellow humans as the enemy.

I'll wish for you all a bright new year.  To quote another great philosopher of our time, Anne Shirley of Anne of Green Gables: "Isn't it nice to think that tomorrow is a new day with no mistakes in it yet?"  Much more so an entire year of tomorrows.

Make the most of them you can.

Saturday, December 30, 2017

Clean sweep

It will hardly merit mention to regular readers of this blog that, given an odd circumstance, I will look first for a rational, scientific explanation.  Although my field is biology, I know enough of the basics of the other sciences to have a good shot at coming up with a plausible explanation for most of what I see -- or, failing that, at least to recognize when a proposed explanation doesn't make sense.

Which brings me to the strange case of the standing brooms.

Apparently, over the last few months there have been multiple reports of brooms remaining upright after being set on end, sometimes for hours.  People report that the brooms resisted falling over even when bumped or pushed, and several folks stated that it felt like a "strange force" was keeping them upright.


Naturally, once this sort of thing starts to be reported, we have a veritable explosion of silly explanations. Here is a sampling of ones I saw on various websites:
  • planetary alignment creating a change in the gravitational pull
  • solar flares
  • static electricity
  • Mercury going into retrograde motion
  • ghosts
  • the position of the Moon
  • the position of the broom relative to "ley lines"
  • tapping into "psychic energy currents"
Reading the impassioned exponents of each of those so-called explanations made me want to weep softly and bang my head on my computer keyboard, but I decided to gird my loins and see if I could find anyone who had a more sensible approach.  I found a wonderful and clear explanation on the site ThoughtCo, written only a couple of months ago, which attributes the phenomenon to simple physics -- almost any object will stand upright if it has a flat surface of some kind and its center of gravity stays over its base of support.  VoilĂ  -- a standing broom!
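The stability criterion ThoughtCo describes can be sketched in a few lines of Python.  This is only a minimal illustration of the principle; the broom dimensions below are assumptions I've made up, not measurements:

```python
# Static stability: a rigid object stands as long as the horizontal
# position of its center of gravity falls inside its base of support.
# The numbers below are illustrative assumptions, not real broom specs.

def stands_upright(cog_offset_cm: float, base_radius_cm: float) -> bool:
    """True if the center of gravity's horizontal offset from the center
    of the base is smaller than the base radius -- i.e., the weight
    vector still passes through the base of support."""
    return abs(cog_offset_cm) < base_radius_cm

# A broom with flat-cut bristles spread ~6 cm around the handle's axis:
print(stands_upright(cog_offset_cm=1.5, base_radius_cm=6.0))  # stands
print(stands_upright(cog_offset_cm=7.0, base_radius_cm=6.0))  # topples
```

No "strange force" required -- a slight nudge that keeps the center of gravity over the bristles leaves the broom standing; push it past the edge of the base and it falls, exactly as ordinary mechanics predicts.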

Of course, woo-woos never give up that easily.  Or sometimes at all.  The "comments" section was filled with rants about how no, it wasn't simple physics, because the broom would only stand up on second Tuesdays when the Moon was full and the appropriate words were chanted.  It can't just be a simple explanation!  It can't!

It is a mystery to me why so many people don't find the world as it is sufficiently wonderful and weird -- they feel like they have to make stuff up, push natural phenomena into supernatural molds, turn everything into some kind of paranormal mystery.  Isn't what actual, reputable scientists are currently discovering -- especially in fields like quantum mechanics, cosmology, neurology, and nanotechnology -- awe-inspiring enough?  Why do you need to muddy the whole situation by making stuff up, or coming up with loony explanations for what you see?

Now, mind you, I'm not saying that there aren't things that haven't been explained yet.  There are plenty, and good science is always pushing the envelope of what's known.  But I am confident that any real phenomenon is ultimately going to be explainable by science, because that's what science does.  It may seem supernatural now, but that's just because we don't yet comprehend what's going on.  As Robert Heinlein said, "Magic is science we don't understand yet."

But the brooms, alas, aren't even that; it's just simple mechanics at work.  No need to invoke solar flares or planets in retrograde.   I'm glad, actually; the whole thing brought up memories of Fantasia, which I'd really rather not think about.  That movie scared the hell out of me when I was a kid.

Friday, December 29, 2017

Unalloyed truth

A couple of weeks ago, the New York Times had an article about a decades-long Pentagon investigation of the UFO phenomenon.  While I don't doubt that such a program exists, the article also claims that there are warehouses full of "alien alloys" that have been declared unanalyzable.

The conclusion, of course, can only be that they came from outer space.

The article's authors, Helene Cooper, Ralph Blumenthal, and Leslie Kean, write:
Under [NASA employee Robert] Bigelow’s direction, [Bigelow Aerospace Company] modified buildings in Las Vegas for the storage of metal alloys and other materials that [military intelligence expert Luis] Elizondo and program contractors said had been recovered from unidentified aerial phenomena.  Researchers also studied people who said they had experienced physical effects from encounters with the objects and examined them for any physiological changes...  
“We’re sort of in the position of what would happen if you gave Leonardo da Vinci a garage-door opener,” said Harold E. Puthoff, an engineer who has conducted research on extrasensory perception for the C.I.A. and later worked as a contractor for the program.  “First of all, he’d try to figure out what is this plastic stuff.  He wouldn’t know anything about the electromagnetic signals involved or its function.”
I have two responses to this.

First, we are way beyond da Vinci in our understanding of the universe and in the development of technology to study it; this is a serious false analogy.  Second, once you claim that there are actual artifacts to study, you've moved beyond the realm of anecdote into something that's scientifically verifiable.  At that point, you'd better have the goods -- and be willing to admit it if it turns out that the answer isn't what you hoped it would be.

The week after the article went public, Scientific American's Rafi Letzter wrote a response to it, saying much the same thing (although in far greater detail).  Letzter writes:
"I don't think it's plausible that there's any alloys that we can't identify," Richard Sachleben, a retired chemist and member of the American Chemical Society's panel of experts, told Live Science.  "My opinion? That's quite impossible." 
Alloys are mixtures of different kinds of elemental metals.  They're very common -- in fact, Sachleben said, they're more common on Earth than pure elemental metals are -- and very well understood.  Brass is an alloy.  So is steel.  Even most naturally occurring gold on Earth is an alloy made up of elemental gold mixed with other metals, like silver or copper... 
"There are databases of all known phases [of metal], including alloys," May Nyman, a professor in the Oregon State University Department of Chemistry, told Live Science.  Those databases include straightforward techniques for identifying metal alloys.  If an unknown alloy appeared, Nyman said it would be relatively simple to figure out what it was made of.
Well, as we've seen over and over, the woo-woos are nothing if not persistent.  Just a couple of days ago, a response to the response appeared over at Mysterious Universe.  The gist of the article is "there are too alien artifacts and UFOs," but there was one bit of it that stood out from the rest.  The author of the article, Brett Tingley, writes:
While I’m sure that's true enough of everything we’ve found on our planet, I just have to wonder: given the vastness of the universe, is it actually impossible for unknown elements or alloys to exist?  Seven new elements have been discovered here on Earth in the last thirty years, while the majority have been discovered in the last four hundred.  On a long enough timeline, who knows what tomorrow’s science will uncover?
This is a roundabout example of the Argument from Ignorance: we don't know, so the explanation must be _________ (fill in the blank with your favorite loopy claim, paranormal phenomenon, or deity).  Normally, the Argument from Ignorance is hard to counter except to point out that our ignorance of something isn't indicative of anything but our ignorance; you can't use it to prove anything.  But wound up in here is an interesting bit that we can analyze from a scientific perspective; the claim that there could be undiscovered elements in "the vastness of the universe."

Here's the problem.  Mendeleev constructed the first periodic table of the elements by noticing some odd patterns -- that there were groups of elements that had similar chemical properties.  After some years of messing about to figure out what was going on, he was able to construct a grid that placed these elements into columns and rows.  And, most interestingly, there were holes -- places in the grid where there should have been an element, but none had thus far been discovered.

And one by one, those holes were filled.  Then advances in nuclear physics allowed the creation of the transuranic elements -- the ones beyond uranium, atomic number 92, which are short-lived radioactive substances that do not occur naturally (any of them created by the supernovae that gave rise to the elements in the Solar System would long ago have decayed away).  We're now up to element 118, oganesson.
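The decay argument is easy to check with a back-of-envelope calculation: exponential decay leaves a fraction of 0.5^(t / half-life) after time t.  A quick Python sketch, using rounded figures (plutonium-244's half-life of roughly 80 million years -- about the longest-lived of the transuranics -- and 4.6 billion years since the Solar System's raw material formed):

```python
# Why primordial transuranic elements are gone: after time t, radioactive
# decay leaves a fraction 0.5 ** (t / half_life) of the original amount.
# Figures are rounded approximations, used only for the order of magnitude.

def fraction_remaining(t_years: float, half_life_years: float) -> float:
    return 0.5 ** (t_years / half_life_years)

age_of_solar_system = 4.6e9   # years since the supernova-seeded nebula
pu244_half_life = 8.0e7       # years, roughly; one of the longest-lived transuranics

f = fraction_remaining(age_of_solar_system, pu244_half_life)
print(f"Pu-244 remaining: {f:.1e}")  # a fraction on the order of 1e-18 -- effectively none
```

That's about 57 half-lives, which wipes out even the most durable transuranic; anything shorter-lived vanished far sooner.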


So Tingley is right that there have been new elements discovered in the last thirty years.  The problem is that most of them have extremely short half-lives and are highly radioactive, so the idea that UFO debris could be made of any of these newly discovered (newly created, really) elements is ridiculous.  But how about the other piece of his claim, that there could be other stable elements we haven't discovered yet?

Sorry, but that doesn't work, either; the periodic table has no holes left to fill, as you can see on the above illustration.  We can be extremely confident that we've got 'em all, and the only additions will be at the unstable and short-lived upper end.  So despite Geordi LaForge on Star Trek: The Next Generation constantly blathering on about how the phaser beams can't damage the alien ship because it's made out of an alloy of the elements gorblimeyum and gobsmackite, this isn't really possible.

Thus our labeling of Star Trek as "fiction."

I'm pretty certain that if the metallurgists and chemists were to examine the warehouse full of debris, they'd find any metal fragments to be composed of plain old ordinary metallic elements.  Now, there could be some piece of alien technology in there -- Puthoff's "garage door opener" -- but my guess is that if there was such incontrovertible evidence of alien visitations, the scientists would know about it.

Sorry for raining on your parade, if you're a UFO enthusiast.  I get your angst.  I would like nothing better than to have proof of extraterrestrial intelligence (or, even better, extraterrestrial visits, because that would mean that the aliens had figured out how to manage travel across interstellar space).  But until we have more than talk about "mysterious alien alloys," I think we need to once again table this entire discussion.

Thursday, December 28, 2017

Daubenmire, pants afire

Things have come to a sorry state when it's the strict Christians who are advocating lying.

I wish I was making this up.  Look, I know I'm not religious myself, but if someone subscribes to a belief system that encourages them to be more honest, to treat their fellow humans with greater respect, to be more generous and compassionate, I've got no quarrel with it whatsoever.  But it's shocking how often it goes the other way -- religion being used as an excuse to exercise some of our worst instincts, including prejudice, insularity, bigotry, and suspicion.

And, apparently, dishonesty.  Evangelist and Christian activist "Coach" Dave Daubenmire, on his radio program Pass the Salt, was ranting against the people who voted against Roy Moore in the Alabama State Senate election, and said something that was more than a little troubling:
When I hear people say, "Well, Judge Moore is not worthy of the office if he’s lying about what he did," I want to grab them and I want to slap them upside the stinking head.  Judge Moore is trying to infiltrate an ungodly system and the stakes in this campaign are so great for the cause of Christ and Judge Moore is being lambasted by the holier-than-thou Christians who think [the Bible] says we can never lie. 
It’s best to lie if it advances the kingdom of God.  There, I said it.
Well, first: "think" the bible says you're not supposed to lie?  I mean, there's an entire freakin' commandment about not bearing false witness.  And I found the following without even trying hard:
  • There are six things that the Lord hates, seven that are an abomination to him: haughty eyes, a lying tongue, and hands that shed innocent blood, a heart that devises wicked plans, feet that make haste to run to evil, a false witness who breathes out lies, and one who sows discord among brothers. (Proverbs 6:16-19)
  • You shall not steal; you shall not deal falsely; you shall not lie to one another. (Leviticus 19:11)
  • The getting of treasures by a lying tongue is a fleeting vapor and a snare of death. (Proverbs 21:6)
  • Therefore, having put away falsehood, let each one of you speak the truth with his neighbor, for we are members one of another. (Ephesians 4:25)
  • No one who practices deceit shall dwell in my house; no one who utters lies shall continue before my eyes. (Psalm 101:7)
  • But as for the cowardly, the faithless, the detestable, as for murderers, the sexually immoral, sorcerers, idolaters, and all liars, their portion will be in the lake that burns with fire and sulfur, which is the second death. (Revelation 21:8)
  • A false witness will not go unpunished, and he who breathes out lies will perish. (Proverbs 19:9)
Which sounds pretty unequivocal, even to a godless heathen like myself.


Also, consider what it is that Daubenmire is excusing Moore from lying about.  Moore has steadfastly denied allegations of sexual harassment against girls as young as fourteen.  So it's not like he lied about how much beer he drank last night.  These lies are about hurting children, for fuck's sake.

Okay, yeah, I know at this point they're only allegations.  But what's interesting is that Daubenmire never argues that Moore didn't do these things.  He's saying that even if he did, and lied about it, he still deserves to be in the Senate because he will "advance the kingdom of God."

All I can say is, if the kingdom of God has Moore and Daubenmire as spokesmen, maybe the "ungodly system" would be a step up.

Oh, and before I get off the topic; there's another quote from the bible that doesn't so much apply to lying in general as it does to people like Daubenmire and Moore.  It's 1 John 4:1, do you know it?
Beloved, do not believe every spirit, but test the spirits to see whether they are from God, for many false prophets have gone out into the world.

Wednesday, December 27, 2017

Religious mutants

A couple of days ago, a reader of Skeptophilia sent me a link along with an email, the gist of which was, "Ha ha, how are you gonna argue your way out of this one, Mr. Smarty-Pants Atheist?"

The link was to a recent article in Newsweek entitled, "Religious People Live Healthier, Longer Lives -- While Atheists Collect Mutant Genes."  Notwithstanding the mental image this created -- of us atheists having stamp-collection-like binders of mutant genes on bookshelves in our studies -- the whole premise sounded idiotic.  The article quotes study co-author Edward Dutton as saying:
Maybe the positive relationship between religiousness and health is not causal—it's not that being religious makes you less stressed so less ill.  Rather, religious people are a genetically normal remnant population from preindustrial times, and the rest of us are mutants who'd have died as children back then...  [The Industrial Revolution caused us to develop] better and better medical care, easier access to healthy food and better living conditions.  Child mortality collapsed down to a tiny level and more and more people with more and more mutant genes have survived into adulthood and had children...  Religiousness makes you more pro-social, and you become more religious when you're stressed.  Religious people would have been sexually selected for because their pro-social, moral, unstressed nature would be attractive.
Well, my background is in evolutionary genetics, so I thought, "Here's a claim I'm qualified to evaluate."

Let's look first at his contention that religious people are healthier.  Turns out there's some weak correlation there, but only if you look at First-World countries.  In the United States, for example, comparing religious and non-religious people of similar socioeconomic status, there's a small improvement in health and longevity among the religious ones.  (Whether there's any kind of causal relationship there, however, very much remains to be seen.)  But if you look at the human race as a whole, comparing largely non-religious countries (Sweden, Finland, Iceland) with largely religious ones (Bangladesh, Malaysia, Egypt) gives you exactly the opposite pattern.  There's as much evidence that ill people in questionable living conditions seek out religion as solace as there is that religion itself makes you healthier.

[image courtesy of the Wikimedia Commons]

The last part of the claim, that religion is due to some kind of sexual selection, moves us into even muddier waters.  If this claim were true, people would be eliminating potential mates on the basis of their being non-religious, something I see no evidence of whatsoever.  Also, there's the problem of people like me -- the child of a dad who was a Pascal's-wager kind of guy and a mom who was more or less the Cajun Mother Teresa.  So I almost certainly inherited "religiosity genes" (whatever those are).  My first wife, and the mother of my children, was an agnostic who didn't really care about the question of god one way or the other, and at the time our two sons were born, I was still trying like hell to find a reason to believe -- a battle I gave up when my youngest son was about five.

So how do you classify me, on the Religious Mutant Gene scale?

Anyhow, as befits a good skeptic, I decided to go to the source, and went to the paper by Dutton et al. in the journal Evolutionary Psychological Science that makes the original claim.  The paper has the rather histrionic title, "The Mutant Says in His Heart, 'There Is No God': the Rejection of Collective Religiosity Centred Around the Worship of Moral Gods Is Associated with High Mutational Load," and although the entire paper is behind a paywall, the abstract reads as follows:
Industrialisation leads to relaxed selection and thus the accumulation of fitness-damaging genetic mutations.  We argue that religion is a selected trait that would be highly sensitive to mutational load.  We further argue that a specific form of religiousness was selected for in complex societies up until industrialisation based around the collective worship of moral gods.  With the relaxation of selection, we predict the degeneration of this form of religion and diverse deviations from it.  These deviations, however, would correlate with the same indicators because they would all be underpinned by mutational load.  We test this hypothesis using two very different deviations: atheism and paranormal belief.  We examine associations between these deviations and four indicators of mutational load: (1) poor general health, (2) autism, (3) fluctuating asymmetry, and (4) left-handedness.  A systematic literature review combined with primary research on handedness demonstrates that atheism and/or paranormal belief is associated with all of these indicators of high mutational load.
Mutational load is a real thing -- it's the number of lethal (or at least significantly deleterious) genes we carry around, the effects of which we are usually protected from by our diploidy (we've got two copies of every gene, and if one doesn't work, chances are the other one does).  But there is no indication that high mutational load is connected with autism (jury's still out on what exactly causes autism) or left-handedness, and "poor general health" is such a mushy term that if you select your data set carefully enough you could probably correlate it with anything you like, up to and including astrological sign.  (There is some indication that left-handedness correlates with some medical conditions, such as migraine, autoimmune disorders, and learning disability; but the heritability of left-handedness even when both parents are left-handed is only 29% anyhow, and what exactly causes it is still unknown.)

But then I did what (again) all skeptics should do, namely take a look at the paper's sources.  I noticed two things right away -- first, that the sources from highly respected journals like Nature were only tangentially connected to Dutton et al.'s claim (such as an article on the heritability of longevity in Nature Genetics), and second, that the authors are really good at citing their own work.  No fewer than ten of the sources were authored or co-authored by Dutton or the other two authors of the Evolutionary Psychological Science paper, Curtis Dunkel and Guy Madison.

Then I scrolled a little further, and found these listed as sources:
So you're writing a serious paper in a (presumably) serious journal, and you want us to accept your claim, and you cite Yelp, Yahoo Answers, The Telegraph, The Guardian, The Daily Mail Fail, and -- for fuck's sake -- The Jesus Tribune?

What this makes me wonder -- besides the obvious question of how Dutton et al. pull this stuff out of their asses without cracking up -- is about the reliability of the journal Evolutionary Psychological Science itself.  I wasn't able to find any meta-analysis of EPS's reliability online; that sort of self-policing by academia is sorely lacking.  But this paper has all the hallmarks of a pay-to-play publication in a journal that honestly doesn't give a flying fuck about the study's quality.  It's hard to imagine any study that cites The Jesus Tribune making it into Science, for example.

So predictably, I'm unimpressed.  Nothing in my understanding of population genetics lends the slimmest credence to this claim.  It's unsurprising that Newsweek picked up the story, although one would hope that even popular media outlets would be a little more careful about what they print.  In any case, we atheists don't have to worry about being poorly-fit, unhealthy, left-handed, autistic mutants.  We're no more likely to be any of the above than the rest of the general population is.  Although, I have to say that while we're talking fiction, if mutations could work like they do in The X-Men, I'd be all for 'em.  I want a mutation that gives me wings.  Big, feathery hawk wings arising from my shoulders.  It'd make fitting into a shirt difficult, but that's a price I'm willing to pay.