Skeptophilia (skep-to-fil-i-a) (n.) - the love of logical thought, skepticism, and thinking critically. Being an exploration of the applications of skeptical thinking to the world at large, with periodic excursions into linguistics, music, politics, cryptozoology, and why people keep seeing the face of Jesus on grilled cheese sandwiches.

Thursday, November 9, 2017

A self-portrait drawn by others

As you might imagine, I get hate mail pretty frequently.

Most of it has to do with my targeting somebody's sacred cow, be it homeopathy, fundamentalist religion, astrology, climate change denial, or actual sacred cows.  And it seems to fall into three general categories:
  • Insults, some of which never get beyond the "you stupid poopyhead fuckface" level.  These usually have the worst grammar and spelling.
  • Arguments that are meant to be stinging rebuttals.  They seldom are, at least not from the standpoint of adding anything of scientific merit to the conversation, although their authors inevitably think they've skewered me with the sharp rapier of their superior knowledge.  (Sometimes I get honest, thoughtful comments or criticisms on what I've written; I have always, and will always, welcome those.)
  • Diatribes that tell me what I actually believe, as if I'm somehow unaware of it.
I've gotten several of the latter in the last few weeks, and it's these that I want to address in this post, because they're the ones I find the most curious.  I've got a bit of a temper myself, so I can certainly understand the desire to strike back with an insult at someone who's angered you; and it's unsurprising that a person who is convinced of something will want to rebut anyone who says different.  But the idea that I'd tell someone I was arguing with what they believed, as if I knew it better than they did, is just plain weird.

Here are a handful of examples, from recent fan mail, to illustrate what I'm talking about:
  • In response to a post I did on the vitriolic nonsense spouted by televangelist Jim Bakker: "Atheists make me want to puke.  You have the nerve to attack a holy man like Jim Bakker.  You want to tear down the foundation of this country, which is it's [sic] churches and pastors, and tell Christian Americans they have no right to be here."
  • In response to my post on a group of alt-med wingnuts who are proposing drinking turpentine to cure damn near everything: "You like to make fun of people who believe nature knows best for curing us and promoting good health.  You pro-Monsanto, pro-chemical types think that the more processed something is, the better it is for you.  I bet you put weed killer on your cereal in the morning."
  • In response to the post in which I described a push by EPA chief Scott Pruitt to remove scientists from the EPA advisory board and replace them with corporate representatives: "Keep reading us your fairy tales about 'climate change' and 'rising sea levels.'  Your motives are clear, to destroy America's economy and hand over the reigns [sic] to the wacko vegetarian enviro nuts.  Now that we've got people in government who are actually looking out for AMERICAN interests, people like you are crapping your pants because you know your [sic] not in control any more."
  • And finally, in response to a post I did on the fact that the concept of race has little biological meaning: "You really don't get it, do you?  From your picture you're as white as I am, and you're gonna stand there and tell me that you have no problem being overrun by people who have different customs and don't speak English?  Let's see how you feel when your kid's teacher requires them to learn Arabic."
So, let's see.  That makes me a white English-only wacko vegetarian enviro nut (with crap in my pants) who eats weed killer for breakfast while writing checks to Monsanto and plotting how to tear down churches so I can destroy the United States.

Man, I've got a lot on my to-do list today.

I know it's a common tendency to want to attribute some set of horrible characteristics to the people we disagree with.  It engages all that tribal mentality stuff that's pretty deeply ingrained in our brains -- us = good, them = bad.  The problem is, reality is a hell of a lot more complex than that, and it's only seldom that you can find someone who is so bad that they have no admixture whatsoever of good, no justification for what they're doing, no explanation at all for how they got to be the way they are.  We're all mixed-up cauldrons of conflicting emotions.  It's hard to understand ourselves half the time; harder still to parse the motives of others.

So let me disabuse my detractors of a few notions.

While I'm not religious myself, I really have a live-and-let-live attitude toward religious folks, as long as they're not trying to impose their religion on others or using it as an excuse to deny others their rights as humans.  I have religious friends and non-religious friends and friends who don't care much about the topic one way or the other, and mostly we get along pretty well.

I have to admit, though, that being a card-carrying atheist, I do have to indulge every so often in the dietary requirements as set forth in the official Atheist Code of Conduct.


Speaking of diet, I'm pretty far from a vegetarian, even when I'm not dining on babies. In fact, I think that a medium-rare t-bone steak with a glass of good red wine is one of the most delicious things ever conceived by the human species.  But neither am I a chemical-lovin' pro-Monsanto corporate shill who drinks a nice steaming mug of RoundUp in the morning.  I'll stick with coffee, thanks.

Yes, I do accept climate change, because I am capable of reading and understanding a scientific paper and also do not think that because something is inconvenient to American economic expediency, it must not be true.  I'd rather that the US economy doesn't collapse, mainly because I live here, but I'd also like my grandchildren to be born on a planet that is habitable in the long term.

And finally: yes, I am white.  You got me there.  If I had any thought of denying it, it was put to rest when I did a 23andMe test and found out that I'm... white.  My ancestry is nearly all from western Europe, unsurprising given that three of my grandparents were of French descent and one of Scottish descent.  But my being white doesn't mean that I always have to place the concerns of other white people first, or fear people who aren't white, or pass laws making sure that America stays white.  For one thing, it'd be a little hypocritical if I demanded that everyone in the US speak English, given that my mother and three of my grandparents spoke French as their first language; and trust me when I say that I would have loved my kids to learn Arabic in school.  The more other cultures you learn about in school, the better, largely because it's hard to hate people when you realize that they're human, just like you are.

So anyway.  Nice try telling me who I am, but you got a good many of the details wrong.  Inevitable, I suppose, when it's a self-portrait drawn by someone else.  Next time, maybe you should try engaging the people you disagree with in dialogue, rather than ridiculing, demeaning, dismissing, or condescending to them.  It's in general a nicer way to live, and who knows?  Maybe you'll learn something.

And if you want to know anything about me, just ask rather than making assumptions.  It's not like I'm shy about telling people what I think.  Kind of hiding in plain sight, here.

Wednesday, November 8, 2017

Elegy for a dying language

In the village of Ayapa in southern Mexico there are two old men who don't much like each other, and despite living only 500 meters apart, they haven't spoken in years.  One, Manuel Segovia, is described as being "a little prickly"; the other, Isidro Velazquez, is said to be stoic and a bit of a recluse.

All of which would be nothing more than a comical vignette of small-town life, except for the fact that they are the last two fluent speakers of the Ayapaneco language.  And, in fact, they have recently decided to put their feud behind them so they can work together to preserve it.

Ayapaneco is one of 68 indigenous languages in Mexico.  It is from the Mixe-Zoque family of languages, which are spoken by people of Olmec descent.  It survived the conquest of Mexico by the Spanish, but was finally done in by the institution of compulsory Spanish education in the 20th century and has been dwindling ever since.

My question of the day is: should we care?

Current estimates are that there are over 6,000 languages in daily use by native speakers (which excludes languages such as Latin, which are in daily use in schools but of which no one is a native speaker).  A great many of these are in danger of extinction -- they are spoken only by a handful of people, mostly the elderly, and the children aren't being raised fluent.  It's an eye-opening fact that 96% of the world's languages are spoken by 4% of the world's people, and the other 96% of the world's people speak the other 4% of the world's languages.

Run that one around in your head for a while.

At the top of the list is Mandarin, the most widely-spoken language in the world.  English, predictably, follows.  Of the people who speak neither Mandarin nor English, a substantial fraction speak Hindi, Spanish, Russian, or some dialect of Arabic.  Most of the rest of the world's languages?  Inconsequential -- at least in numbers.



15th century manuscript in medieval Gaelic [image courtesy of the Wikimedia Commons]

Linguists, obviously, care deeply about this.  Michael Krauss, professor emeritus of the University of Alaska at Fairbanks, has stated, "... it is catastrophic for the future of mankind.  It should be as scary as losing 90% of the biological species."

Is he right?  The argument for preserving languages is mostly derived from a cultural knowledge perspective; language is a way of encoding knowledge, and each different sort of code represents a unique body of that knowledge.  It's sort of an expanded version of the Sapir-Whorf hypothesis, which states that the language you speak alters how you think (and vice versa).  That argument has its points, but it is also specious in the sense that most languages can encode the same knowledge somehow, and therefore when the last native speaker of Ayapaneco dies, we won't necessarily have lost that culture's knowledge.  We may have lost the ability to figure out how that knowledge was encoded -- as we have with the Linear A writing of Crete -- but that's not the same as losing the knowledge itself.

The analogy to biodiversity is also a bit specious.  Languages don't form some kind of synergistic whole, as the species in an ecosystem do, where the loss of any one thread can cause the whole thing to come unraveled.  In fact, you might argue the opposite -- that having lots of unique languages in an area (such as the hundreds of mutually incomprehensible native languages in Australia) can actually impede cultural communication and understanding.  Species loss can destroy an ecosystem -- witness what's happening in Haiti and Madagascar.  It's a little hard to imagine language loss as having those same kinds of effects on the cultural landscape of the world.

Still, I can't help wishing for the extinction to stop.  It's just sad -- the fact that the number of native speakers of the beautiful Irish Gaelic and Breton languages is steadily decreasing, that there are languages (primarily in Australia and amongst the native languages of North and South America) for which the last native speakers will die in the next five to ten years without ever having a linguist study, or even record, what they sounded like.  I don't have a cogent argument from a utilitarian standpoint about why this is a bad thing.  It's aesthetics, pure and simple -- languages are cool.  The idea that English and Mandarin can swamp Twi and Yanomami is probably unavoidable, and it even follows the purely Dawkinsian concept of the competition between memes.  But I don't have to like it, any more than I like the fact that my bird feeders are more often visited by starlings than by indigo buntings.

Tuesday, November 7, 2017

Stopping the rumor machine

Twenty-six people are dead in yet another mass shooting, this one in a Baptist church in Sutherland Springs, a small community 21 miles from San Antonio, Texas.

The killer, Devin Patrick Kelley, died near the scene of the crime.  He had been fired upon by a local resident as he fled the church, and was later found in his car, dead of a gunshot wound.  It is at present undetermined if the bullet that killed him came from the resident's gun, or if it was a self-inflicted wound.

Devin Patrick Kelley

Wiser heads than mine have already taken up the issue of stricter gun control, especially in cases like Kelley's.  Kelley was court-martialed in 2012 for an assault on his wife and child, spent a year in prison, and was discharged for bad conduct.  All I will say is that I find it a little hard to defend an assault rifle being in the hands of a man who had been convicted of... assault.

I also have to throw out there that the whole "thoughts and prayers" thing is getting a little old.  If thoughts and prayers worked, you'd think the attack wouldn't have happened in the first place, given that the victims were in a freakin' church when it occurred.

But that's not why I'm writing about Kelley and the Sutherland Springs attack.  What I'd like to address here is how, within twelve hours of the attack, there was an immediate attempt by damn near everybody to link Kelley to a variety of groups, in each case to conform to the claimant's personal bias about how the world works.

Here are just a few of the ones I've run into:
  • Someone made a fake Facebook page for Kelley in which there was a photograph of his weapon, a Ruger AR-556, with the caption, "She's a bad bitch."
  • Far-right-wing activists Mike Cernovich and Alex Jones immediately started broadcasting the claim that Kelley was a member of Antifa.  This was then picked up by various questionable "news" sources, including YourNewsWire.com, which trumpeted the headline, "Texas Church Shooter Was Antifa Member Who Vowed to Start Civil War."
  • Often using the Alex Jones article as evidence, Twitter erupted Sunday night with a flurry of claims that Kelley was a Democrat frustrated by Donald Trump's presidential win, and was determined to visit revenge on a bunch of god-fearing Republicans.
  • An entirely different bunch of folks on Twitter started the story that Kelley was actually a Muslim convert named Samir al-Hajeeda.  Coincidentally, Samir al-Hajeeda was blamed by many of these same people for the Las Vegas shootings a month ago.  It's a little hard to fathom how anyone could believe that, given the fact that both gunmen died at the scene of the crime.
  • Not to be outdone, the website Freedum Junkshun claimed that Kelley was an "avid atheist" named Raymond Peter Littlebury, who was "on the payroll of the DNC."
And so on and so forth.

Look, I've made the point before.  You can't stop this kind of thing from zinging at light speed around the interwebz.  Fake news agencies gonna fake news, crazies gonna craze, you know?  Some of these sources were obviously pseudo-satirical clickbait right from the get-go.  I mean, did anyone even look at the name of the site Freedum Junkshun and wonder why they spelled it that way?

And for heaven's sake, Mike Cernovich and Alex Jones?  At this point, if Cernovich and Jones said the grass was green, I'd want an independent source to corroborate the claim.

So it's not the existence of these ridiculous claims I want to address.  It's the people who hear them and unquestioningly believe them.

I know it's easy to fall into the confirmation bias trap -- accepting a claim because it's in line with what you already believed, be it that all conservatives are violent gun nuts, all liberals scheming slimeballs, all Muslims potential suicide bombers, all religious people starry-eyed fanatics, all atheists amoral agents of Satan himself.  It takes work to counter our tendency to swallow whole any evidence of what we already believed.

But you know what?  You have to do it.  Because otherwise you become prey to the aforementioned crazies and promoters of fake news clickbait.  If you don't corroborate what you post, you're not supporting your beliefs; you're playing right into the hands of people who are trying to use your singleminded adherence to your sense of correctness to achieve their own ends.

At the time of this writing, we know next to nothing about Devin Patrick Kelley other than his military record and jail time.  We don't know which, if any, political affiliation he had, whether or not he was religious, whether he was an activist or simply someone who wanted to kill people.  So all of this speculation, all of these specious claims, are entirely vacuous.

Presumably at some point we'll know more about Kelley.  At the moment, we don't.

So please please please stop auto-posting these stories.  At the very least, cross-check what you post against other sources, and check out a few sources from different viewpoints.  (Of course if you cross-check Breitbart against Fox News, or Raw Story against ThinkProgress, you're gonna get the same answer.  That's not cross-checking, that's slamming the door on the echo chamber.)

Otherwise you are not only falling for nonsense, you are directly contributing to the divisiveness that is currently ripping our nation apart.

As the brilliant physicist Richard Feynman put it: "You must be careful not to believe something simply because you want it to be true.  Nobody can fool you as easily as you can fool yourself."

Monday, November 6, 2017

Tut tut

Most of you are probably familiar with the famous "King Tut's Curse."

The story goes that when British archaeologist Howard Carter opened the hitherto undisturbed tomb of King Tutankhamen, the "Boy King" of Egypt during the 18th dynasty, it unleashed a curse on the men who had desecrated it -- resulting in the deaths of (by some claims) twenty of the expedition members.

Tutankhamen was the son of the famous "Heretic King" Akhenaten; born around 1341 BCE, he died at about the age of eighteen, around 1323 BCE.  Some archaeologists speculate that he was murdered, but current forensic anthropology seems to indicate that he died of a combination of malaria and complications from a badly broken leg.

King Tutankhamen's death mask [image courtesy of photographer Carsten Frenzl and the Wikimedia Commons]

Be that as it may, shortly after Tut's tomb was opened, people associated with the expedition began to die.  The first was Lord Carnarvon, who had funded Carter's expedition; he cut himself badly while shaving and died shortly thereafter of sepsis from an infection.  While it's easy enough to explain a death from infection in Egypt prior to the advent of modern antibiotics, the deaths continued after the members of the expedition returned to London:
  • Richard Bethell, Carter's personal secretary, was found smothered in a Mayfair club.
  • Bethell's father, Lord Westbury, fell to his death from his seventh-floor flat -- where he had kept artifacts from the tomb his son had given him.
  • Aubrey Herbert, half-brother of the first victim Lord Carnarvon, died in a London hospital "of mysterious symptoms."
  • Ernest Wallis Budge, of the British Museum, was found dead in his home shortly after arranging for the first public show of King Tut's sarcophagus. 
And so on.  All in all, twenty people associated with the expedition died within the first few years after returning to England.  (It must be said that Howard Carter, who led the expedition, lived for another sixteen years; and you'd think that if King Tut had wanted to smite anyone, it would have been Carter.  And actually, a statistical study done of Egyptologists who had entered pharaohs' tombs found that their average age at death was no lower than that of the background population.)

Still, that leaves some decidedly odd deaths to explain.  And historian Mark Benyon thinks he's figured out how to explain them.

In his book London's Curse: Murder, Black Magic, and Tutankhamun in the 1920s West End, Benyon lays the deaths of Carter's associates in London -- especially Bethell, Westbury, Herbert, and Budge, all of whom died by foul play -- at the feet of none other than Aleister Crowley.

Crowley, the self-proclaimed "Wickedest Man on Earth," was a sex-obsessed heroin addict who had founded a society called "Thelema."  Thelema's motto was "Do what thou wilt," which narrowly edged out Crowley's second favorite, which was "Fuck anything or anyone that will hold still long enough."  His rituals were notorious all over London for drunken debauchery, and few doubted then (and fewer doubt now) that there was no activity so depraved that Crowley wouldn't happily indulge in it.

Aleister Crowley, ca. 1912 [image courtesy of the Wikimedia Commons]

One of Crowley's obsessions was Jack the Ripper.  He believed that the Ripper murders had been accomplished through occult means, and frequently was heard to speak of Jack the Ripper with reverence.  Benyon believes that when Crowley heard about Howard Carter's discoveries, he was outraged -- many of Thelema's rituals and beliefs were derived from Egyptian mythology -- and he came up with the idea of a series of copycat murders to get even with the men who had (in his mind) desecrated Tutankhamen's tomb.

It's an interesting hypothesis.  Surely all of the expedition members knew of Crowley; after all, almost everyone in London at the time did.  At least one (Budge) was an occultist who ran in the same circles as Crowley.  That Crowley was capable of such a thing is hardly to be questioned.  Whether Benyon has proved the case is debatable; pinning down all the details is probably impossible at this point, rather like the dozens of explanations put forward for the Ripper murders themselves.  But even at first glance his hypothesis makes better sense than the Pharaoh's Curse malarkey, and it certainly makes me inclined to file the "Mummy's Curse" under "Another woo-woo claim plausibly explained by logic and rationality."

Saturday, November 4, 2017

Grammar wars

In linguistics, there's a bit of a line in the sand drawn between the descriptivists and the prescriptivists.  The former believe that the role of linguists is simply to describe language, not establish hard-and-fast rules for how language should be.  The latter believe that grammar and other linguistic rules exist in order to keep language stable and consistent, and therefore there are usages that are wrong, illogical, or just plain ugly.

Of course, most linguists don't fall squarely into one camp or the other; a lot of us are descriptivists up to a point, after which we say, "Okay, that's wrong."  I have to admit that I'm of a more descriptivist bent myself, but there are some things that bring out my inner ruler-wielding grammar teacher, like when I see people write "alot."  Drives me nuts.  And I know it's now become acceptable, but "alright" affects me exactly the same way.

It's "all right," dammit.

However, some research just published in Nature last week shows that, if you're of a prescriptivist disposition, eventually you're going to lose.

In "Detecting Evolutionary Forces in Language Change," Mitchell G. Newberry, Christopher A. Ahern, Robin Clark, and Joshua B. Plotkin of the University of Pennsylvania show that language change is inevitable and unstoppable -- even the toughest prescriptivist out there isn't going to halt the adoption of new words and grammatical forms.

The researchers analyzed over a hundred thousand texts from 1810 onward, looking for changes in morphology -- for example, the decrease in the use of past tense forms like "leapt" and "spilt" in favor of "leaped" and "spilled."  The conventional wisdom was that irregular forms (like pluralizing "goose" to "geese") persist because they're common; less common words, like "turf" -- which used to pluralize to "turves" -- eventually regularize because people don't use the word often enough to learn the irregular plural, and eventually the regular plural ("turfs") takes over.

The research by Newberry et al. shows that this isn't true -- when there are two competing forms, which one wins is more a matter of random chance than commonness.  They draw a very cool analogy between this phenomenon, which they call stochastic drift, and the genetic drift experienced by evolving populations of living organisms.

"Whether it is by random chance or selection, one of the things that is true about English – and indeed other languages – is that the language changes,” said Joshua Plotkin, who co-authored the study.  "The grammarians might [win the battle] for a decade, but certainly over a century they are going to be on the losing side.  The prevailing view is that if language is changing it should in general change towards the regular form, because the regular form is easier to remember.  But chance can play an important role even in language evolution – as we know it does in biological evolution."
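If you want to see the drift analogy in action, here's a minimal toy simulation -- my own illustration, not the authors' code -- of neutral drift between two competing word forms, modeled Wright-Fisher style: each "generation" of speakers adopts a form by copying a random speaker from the previous generation.  With no selection at all, one form always fixes eventually and the other goes extinct; which one wins is pure chance.

```python
import random

def drift_to_fixation(pop_size=200, p_initial=0.5, seed=1):
    """Neutral (Wright-Fisher-style) drift between two competing word
    forms, e.g. 'leapt' vs. 'leaped'.  Each generation, every speaker
    adopts a form by copying a random speaker from the previous
    generation.  Returns (winning_form, generations_elapsed)."""
    rng = random.Random(seed)
    p = p_initial  # current frequency of form A ('leapt')
    generations = 0
    while 0.0 < p < 1.0:
        # Binomial resampling: count how many of the new generation's
        # speakers happen to copy a form-A user.
        count_a = sum(rng.random() < p for _ in range(pop_size))
        p = count_a / pop_size
        generations += 1
    return ("leapt" if p == 1.0 else "leaped"), generations

# Even starting from a 50/50 split, one form always takes over in the
# end; different seeds may crown different winners, with no selection
# (no "easier to remember" advantage) involved anywhere.
for seed in range(5):
    winner, gens = drift_to_fixation(seed=seed)
    print(f"seed {seed}: '{winner}' fixed after {gens} generations")
```

Run it a few times and the point becomes vivid: no pressure toward the regular form is needed for a variant to win or vanish, which is exactly the role the authors argue chance plays in real language change.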

So in the ongoing battles over grammatical, pronunciation, and spelling change, the purists are probably doomed to fail.  It's worthwhile remembering how many words in modern English are the result of such mangling; both "apron" and "umpire" came about because of an improper split of the indefinite article ("a napron" and "a noumpere" became "an apron" and "an umpire").  "To burgle" came about because of a phenomenon called back formation -- when a common linguistic pattern gets applied improperly to a word that sounds like it has the same basic construction.  A teacher teaches, a baker bakes, so a burglar must burgle.  (I'm surprised, frankly, given how English yanks words around, that we don't have carpenters carpenting.)


Anyhow, if this is read by any hard-core prescriptivists, all I can say is "I'm sorry."  It's a pity, but the world doesn't always work the way we'd like it to.  But even so, I'm damned if I'm going to use "alright" and "alot."  A line has to be drawn somewhere.

Friday, November 3, 2017

The persistence of belief

Two studies published last week were profoundly discouraging to people like me, who spend a lot of time trying to promote skepticism and critical thinking, and squelching loopy claims.

The first was a study of American beliefs done at Chapman University.  The study found that:
  • 55% of the Americans surveyed believed in ancient advanced civilizations such as Atlantis
  • 52% believed in ghosts, hauntings, or evil spirits
  • 35% believed that aliens visited the Earth long ago and influenced ancient civilizations
  • 26% believe that aliens are still visiting the Earth
  • 25% believe in telekinesis, the ability to move objects with your mind
  • 19% believe that psychics can foresee the future
  • 16% believe Bigfoot is real
Only a quarter of the people surveyed held no paranormal beliefs whatsoever.

If that's not discouraging enough, compare that to a Gallup poll this year that found only 19% of the Americans surveyed believed that evolution exists and operates through purely natural forces.  So yes: apparently more Americans believe Carrie is a historical documentary than believe in non-god-driven evolutionary biology.

[image courtesy of the Wikimedia Commons]

Then we had a paper called "Poor Metacognitive Awareness of Belief Change," by Michael B. Wolfe and Todd J. Williams of Grand Valley State University, that appeared in the Quarterly Journal of Experimental Psychology.  This study found that yes, you can sometimes change people's opinions with the facts.  They gave people (actual) studies to read showing that spanking is a lousy method of discipline -- it simply doesn't work, and has a number of well-documented bad side effects on children.  (And don't even start with me about "I was spanked as a child and I'm fine."  If that's true, I'm glad you turned out okay, but you should appreciate the fact that you were lucky -- the research is absolutely unequivocal about the negative effects and poor efficacy of spanking.)

And some people did change their minds.  Which is encouraging.  But when the subjects were questioned afterwards, the researchers found that the ones whose stance changed tended to misremember their original beliefs.

In other words: they reported that their beliefs hadn't shifted much -- that they'd believed their new position all along.

The authors write:
When people change beliefs as a result of reading a text, are they aware of these changes?...  [T]he relationship between the belief consistency of the text read and accuracy of belief recollections was mediated by belief change.  This belief memory bias was independent of on-line text processing and comprehension measures, and indicates poor metacognitive awareness of belief change.
Which is frustrating.  The implication is that most of us have such poor self-awareness that we don't even notice when our opinions change.  I suppose it's natural enough; it's hard for all of us to say, "Okay, I guess I was wrong."

But for cryin' in the sink, learning how to admit error is part of growing up.  The world is a complex, counterintuitive place, and we have fallible sensory organs and brains, so of course we're going to get it wrong sometimes.  Because of that, we have to learn not only to admit error, but to examine our own beliefs and biases with a high-power lens.  If you don't periodically look at your own most dearly-held beliefs and ask, "Could I be wrong about this?  How could I tell?  And what would that mean?", you are stumbling around in the dark with no clear way of figuring out where you've made a mistake.

So we skeptics have to toil on.  I'm not saying I'm right about everything -- far from it -- but I will maintain that skepticism, logic, and science are the best ways of sifting fact from fiction.  It's disappointing that we're still a nation where every other person you meet believes in haunted houses, but there is a remedy.  And if, as the second study suggests, the people we convince end up saying, "Meh, I never really believed in ghosts in the first place," I can accept that as the next best outcome to an outright admission of error.

Thursday, November 2, 2017

Living the dream

Last night I dreamed I was in my classroom.  It wasn't my real classroom, however -- it looked like a 19th century lecture hall.  Wooden desks, old cabinets containing jars with ground-glass stoppers, various pieces of equipment of uncertain purpose, some of which looked like (and may in fact have been) torture equipment.  My son lived in an apartment above my classroom, with his wife, which is especially curious because he's not married.  I was teaching a lesson on the reproductive systems of monkeys, but my students weren't listening.  Also, my son kept coming out on the balcony (of course there was a balcony) and interrupting my lecture to ask me questions about the rules of rugby.

After that, it got a little weird.

Neuroscientists have been trying to figure out the physiological function of dreams for years.  The contention is that they must be doing something important, because they're so ubiquitous.  Judging from my own dogs, even other species dream.  Sometimes they have exciting dreams, with muted little barks and twitching paws, often ending in a growl and a shake of the head, as if they're killing some poor defenseless prey; other times they have placid dreams, eliciting a sigh and a wagging tail, which ranks right up there amongst the cutest things I've ever seen.

But what purpose dreams serve has been elusive.  There's some contention that dreaming might help consolidate memory; that it may help to eliminate old synaptic connections that are no longer useful; and that it might function to reset neurotransmitter receptors, especially those connected with the neurotransmitter dopamine.  But last week, some neuropsychologists at Rutgers University found evidence of yet another function of dreaming: making people less likely to overreact in scary situations.

Tom Merry, "Gladstone Dreams About Queen Victoria's Dinner" (1886) [image courtesy of the Wellcome Library Gallery and the Wikimedia Commons]

In "Baseline Levels of Rapid-Eye-Movement Sleep May Protect Against Excessive Activity in Fear-Related Neural Circuitry," by Itamar Lerner, Shira M. Lupkin, Neha Sinha, Alan Tsai, and Mark A. Gluck, we learn that people who have been deprived of REM (rapid eye movement, the phase of sleep in which dreaming occurs) are more likely to experience extreme anxiety and PTSD-like symptoms than people who have been REMing normally, and also show higher activity in the amygdala -- the part of the brain associated with fear, anxiety, and anger.

The authors write:
Sleep, and particularly rapid-eye movement sleep (REM), has been implicated in the modulation of neural activity following fear conditioning and extinction in both human and animal studies.  It has long been presumed that such effects play a role in the formation and persistence of Post-Traumatic-Stress-disorder, of which sleep impairments are a core feature.  However, to date, few studies have thoroughly examined the potential effects of sleep prior to conditioning on subsequent acquisition of fear learning in humans.  Further, these studies have been restricted to analyzing the effects of a single night of sleep—thus assuming a state-like relationship between the two.  In the current study, we employed long-term mobile sleep monitoring and functional neuroimaging (fMRI) to explore whether trait-like variations in sleep patterns, measured in advance in both male and female participants, predict subsequent patterns of neural activity during fear learning.  Our results indicate that higher baseline levels of REM sleep predict reduced fear-related activity in, and connectivity between, the hippocampus, amygdala and ventromedial PFC during conditioning.  Additionally, Skin-Conductance-Responses (SCR) were weakly correlated to the activity in the amygdala.  Conversely, there was no direct correlation between REM sleep and SCR, indicating that REM may only modulate fear acquisition indirectly.  In a follow-up experiment, we show that these results are replicable, though to a lesser extent, when measuring sleep over a single night just prior to conditioning.  As such, baseline sleep parameters may be able to serve as biomarkers for resilience, or lack thereof, to trauma.
Which I find pretty fascinating.  I had sleep problems for years, finally (at least in part) resolved after a visit to a sleep lab and a prescription for a CPAP machine.  Turns out I have obstructive sleep apnea, apparently due to a narrow tracheal opening, and was waking up 23 times an hour.  I'm still not a really sound sleeper, but I feel like at least I'm not sleepwalking through life the way I was, pre-CPAP.  I also suffer from pretty severe social anxiety, and although I'm not convinced that the two are related, it is curious that the researchers found that a lack of REM ramps up anxiety.

However, even after fixing my apnea, my nights are still disturbed by bizarre dreams, for no particularly apparent reason.  I don't dream about things I'm anxious over, for the most part; my dreams are often weird and disjointed, with scenarios that make sense while I'm dreaming and seem ridiculous once I'm awake.  But what does it all mean?  I am extremely dubious about those "Your Dreams Interpreted" books that tell you that if you dream about a horse, it means you are secretly in love with your neighbor.  (I just made that up.  I have no idea what those books say about dreaming about horses, and I'm not sufficiently motivated to go find out.)  In any case, it's highly unlikely that even a symbolic interpretation of dream imagery would be consistent from person to person.

On a bigger scale, however, there is remarkable consistency in dream content from person to person.  We all have dreams of being chased, falling, flying, being in embarrassing situations, being in erotic situations.  But when you slice them more finely, the specifics of dreams vary greatly, even with people who are in the same circumstances, making it pretty unlikely that there's any kind of one-to-one correlation between dream imagery and events in real life.

So the study by Lerner et al. is fascinating, but it doesn't really explain the content of dreams, nor why they can be so absolutely convincing while you're in them, and entirely absurd after you wake up.  But I'd better wrap this up.  I gotta go do some research in case Lucas wants to chat with me, because I might be able to hold my own when the topic is monkey junk, but I know bugger-all about rugby.