Skeptophilia (skep-to-fil-i-a) (n.) - the love of logical thought, skepticism, and thinking critically. Being an exploration of the applications of skeptical thinking to the world at large, with periodic excursions into linguistics, music, politics, cryptozoology, and why people keep seeing the face of Jesus on grilled cheese sandwiches.

Wednesday, November 8, 2017

Elegy for a dying language

In the village of Ayapa in southern Mexico, there are two old men who don't much like each other; although they live only 500 meters apart, they haven't spoken in years.  One, Manuel Segovia, is described as "a little prickly"; the other, Isidro Velazquez, is said to be stoic and a bit of a recluse.

All of which would be nothing more than a comical vignette of small-town life, except for the fact that they are the last two fluent speakers of the Ayapaneco language.  And, in fact, they have recently decided to put their feud behind them so they can work together to preserve it.

Ayapaneco is one of 68 indigenous languages in Mexico.  It is from the Mixe-Zoque family of languages, which are spoken by people of Olmec descent.  It survived the conquest of Mexico by the Spanish, but was finally done in by the institution of compulsory Spanish education in the 20th century and has been dwindling ever since.

My question of the day is: should we care?

Current estimates are that there are over 6,000 languages in daily use by native speakers (which excludes languages such as Latin, which are in daily use in schools but of which no one is a native speaker).  A great many of these are in danger of extinction -- they are spoken only by a handful of people, mostly the elderly, and the children aren't being raised as fluent speakers.  It's an eye-opening fact that 96% of the world's languages are spoken by 4% of the world's people, and the other 96% of the world's people speak the other 4% of the world's languages.

Run that one around in your head for a while.

At the top of the list is Mandarin, the most widely spoken language in the world.  English, predictably, follows.  Of the people who speak neither Mandarin nor English, a substantial fraction speak Hindi, Spanish, Russian, or some dialect of Arabic.  Most of the rest of the world's languages?  Inconsequential -- at least in numbers.



15th century manuscript in medieval Gaelic [image courtesy of the Wikimedia Commons]

Linguists, obviously, care deeply about this.  Michael Krauss, professor emeritus of the University of Alaska at Fairbanks, has stated, "... it is catastrophic for the future of mankind.  It should be as scary as losing 90% of the biological species."

Is he right?  The argument for preserving languages is mostly derived from a cultural knowledge perspective; language is a way of encoding knowledge, and each different sort of code represents a unique body of that knowledge.  It's sort of an expanded version of the Sapir-Whorf hypothesis, which states that the language you speak alters how you think (and vice versa).  That argument has its points, but it is also specious in the sense that most languages can encode the same knowledge somehow, and therefore when the last native speaker of Ayapaneco dies, we won't necessarily have lost that culture's knowledge.  We may have lost the ability to figure out how that knowledge was encoded -- as we have with the Linear A writing of Crete -- but that's not the same as losing the knowledge itself.

The analogy to biodiversity is also a bit specious.  Languages don't form some kind of synergistic whole, as the species in an ecosystem do, where the loss of any one thread can cause the whole thing to come unraveled.  In fact, you might argue the opposite -- that having lots of unique languages in an area (such as the hundreds of mutually incomprehensible native languages in Australia) can actually impede cultural communication and understanding.  Species loss can destroy an ecosystem -- witness what's happening in Haiti and Madagascar.  It's a little hard to imagine language loss as having those same kinds of effects on the cultural landscape of the world.

Still, I can't help wishing for the extinction to stop.  It's just sad -- the fact that the numbers of native speakers of the beautiful Irish Gaelic and Breton languages are steadily decreasing, and that there are languages (primarily in Australia and amongst the native languages of North and South America) for which the last native speakers will die in the next five to ten years without a linguist ever having studied, or even recorded, what they sounded like.  I don't have a cogent argument from a utilitarian standpoint about why this is a bad thing.  It's aesthetics, pure and simple -- languages are cool.  The idea that English and Mandarin can swamp Twi and Yanomami is probably unavoidable, and it even follows the purely Dawkinsian concept of the competition between memes.  But I don't have to like it, any more than I like the fact that my bird feeders are more often visited by starlings than by indigo buntings.

Tuesday, November 7, 2017

Stopping the rumor machine

Twenty-six people are dead in yet another mass shooting, this one in a Baptist church in Sutherland Springs, a small community 21 miles from San Antonio, Texas.

The killer, Devin Patrick Kelley, died near the scene of the crime.  He had been fired upon by a local resident as he fled the church, and was later found in his car, dead of a gunshot wound.  It is at present undetermined if the bullet that killed him came from the resident's gun, or if it was a self-inflicted wound.

Devin Patrick Kelley

Wiser heads than mine have already taken up the issue of stricter gun control, especially in cases like Kelley's.  Kelley was court-martialed in 2012 for an assault on his wife and child, spent a year in military confinement, and received a bad-conduct discharge.  All I will say is that I find it a little hard to defend an assault rifle being in the hands of a man who had been convicted of... assault.

I also have to throw out there that the whole "thoughts and prayers" thing is getting a little old.  If thoughts and prayers worked, you'd think the attack wouldn't have happened in the first place, given that the victims were in a freakin' church when it occurred.

But that's not why I'm writing about Kelley and the Sutherland Springs attack.  What I'd like to address here is how, within twelve hours of the attack, there was an immediate attempt by damn near everybody to link Kelley to a variety of groups, in each case to conform to the claimant's personal bias about how the world works.

Here are just a few of the ones I've run into:
  • Someone made a fake Facebook page for Kelley in which there was a photograph of his weapon, a Ruger AR-556, with the caption, "She's a bad bitch."
  • Far-right-wing activists Mike Cernovich and Alex Jones immediately started broadcasting the claim that Kelley was a member of Antifa.  This was then picked up by various questionable "news" sources, including YourNewsWire.com, which trumpeted the headline, "Texas Church Shooter Was Antifa Member Who Vowed to Start Civil War."
  • Often using the Alex Jones article as evidence, Twitter erupted Sunday night with a flurry of claims that Kelley was a Democrat frustrated by Donald Trump's presidential win, and was determined to visit revenge on a bunch of god-fearing Republicans.
  • An entirely different bunch of folks on Twitter started the story that Kelley was actually a Muslim convert named Samir al-Hajeeda.  Coincidentally, Samir al-Hajeeda was blamed by many of these same people for the Las Vegas shootings a month ago.  It's a little hard to fathom how anyone could believe that, given the fact that both gunmen died at the scene of the crime.
  • Not to be outdone, the website Freedum Junkshun claimed that Kelley was an "avid atheist" named Raymond Peter Littlebury, who was "on the payroll of the DNC."
And so on and so forth.

Look, I've made the point before.  You can't stop this kind of thing from zinging at light speed around the interwebz.  Fake news agencies gonna fake news, crazies gonna craze, you know?  Some of these sources were obviously pseudo-satirical clickbait right from the get-go.  I mean, did anyone even look at the name of the site Freedum Junkshun and wonder why they spelled it that way?

And for heaven's sake, Mike Cernovich and Alex Jones?  At this point, if Cernovich and Jones said the grass was green, I'd want an independent source to corroborate the claim.

So it's not the existence of these ridiculous claims I want to address.  It's the people who hear them and unquestioningly believe them.

I know it's easy to fall into the confirmation bias trap -- accepting a claim because it's in line with what you already believed, be it that all conservatives are violent gun nuts, all liberals scheming slimeballs, all Muslims potential suicide bombers, all religious people starry-eyed fanatics, all atheists amoral agents of Satan himself.  It takes work to counter our tendency to swallow whole any evidence of what we already believed.

But you know what?  You have to do it.  Because otherwise you become prey to the aforementioned crazies and promoters of fake news clickbait.  If you don't corroborate what you post, you're not supporting your beliefs; you're playing right into the hands of people who are trying to use your single-minded adherence to your sense of correctness to achieve their own ends.

At the time of this writing, we know next to nothing about Devin Patrick Kelley other than his military record and jail time.  We don't know which, if any, political affiliation he had, whether or not he was religious, whether he was an activist or simply someone who wanted to kill people.  So all of this speculation, all of these specious claims, are entirely vacuous.

Presumably at some point we'll know more about Kelley.  At the moment, we don't.

So please please please stop auto-posting these stories.  At the very least, cross-check what you post against other sources, and check out a few sources from different viewpoints.  (Of course if you cross-check Breitbart against Fox News, or Raw Story against ThinkProgress, you're gonna get the same answer.  That's not cross-checking, that's slamming the door on the echo chamber.)

Otherwise you are not only falling for nonsense, you are directly contributing to the divisiveness that is currently ripping our nation apart.

As the brilliant physicist Richard Feynman put it: "You must be careful not to believe something simply because you want it to be true.  Nobody can fool you as easily as you can fool yourself."

Monday, November 6, 2017

Tut tut

Most of you are probably familiar with the famous "King Tut's Curse."

The story goes that when British archaeologist Howard Carter opened the hitherto undisturbed tomb of King Tutankhamen, the "Boy King" of Egypt during the 18th dynasty, it unleashed a curse on the men who had desecrated it -- resulting in the deaths of (by some claims) twenty of the expedition members.

Tutankhamen was the son of the famous "Heretic King" Akhenaten, and died at the age of eighteen, around 1323 BCE.  Some archaeologists speculate that he was murdered, but current forensic anthropology seems to indicate that he died of a combination of malaria and complications from a badly broken leg.

King Tutankhamen's death mask [image courtesy of photographer Carsten Frenzl and the Wikimedia Commons]

Be that as it may, shortly after Tut's tomb was opened, people associated with the expedition began to die.  The first was Lord Carnarvon, who had funded Carter's expedition; he cut himself badly while shaving and died shortly thereafter of sepsis.  While it's easy enough to explain a death from infection in Egypt prior to the advent of modern antibiotics, the deaths continued after the members of the expedition returned to London:
  • Richard Bethell, Carter's personal secretary, was found smothered in a Mayfair club.
  • Bethell's father, Lord Westbury, fell to his death from his seventh-floor flat -- where he had kept artifacts from the tomb his son had given him.
  • Aubrey Herbert, half-brother of the first victim Lord Carnarvon, died in a London hospital "of mysterious symptoms."
  • Ernest Wallis Budge, of the British Museum, was found dead in his home shortly after arranging for the first public show of King Tut's sarcophagus. 
And so on.  All in all, twenty people associated with the expedition died within the first few years after returning to England.  (It must be said that Howard Carter, who led the expedition, lived for another sixteen years; and you'd think that if King Tut had wanted to smite anyone, it would have been Carter.  And in fact, a statistical study of Egyptologists who had entered pharaohs' tombs found that their average age at death was no lower than that of the background population.)

Still, that leaves some decidedly odd deaths to explain.  And historian Mark Beynon thinks he's figured out how to explain them.

In his book London's Curse: Murder, Black Magic, and Tutankhamun in the 1920s West End, Beynon lays the deaths of Carter's associates in London -- especially Bethell, Westbury, Herbert, and Budge, all of which he argues were deaths by foul play -- at the feet of none other than Aleister Crowley.

Crowley, the self-proclaimed "Wickedest Man on Earth," was a sex-obsessed heroin addict who had founded a society called "Thelema."  Thelema's motto was "Do what thou wilt," which narrowly edged out Crowley's second choice, "Fuck anything or anyone that will hold still long enough."  His rituals were notorious all over London for drunken debauchery, and few people believed then (and fewer believe now) that there was any activity so depraved that Crowley wouldn't happily indulge in it.

Aleister Crowley, ca. 1912 [image courtesy of the Wikimedia Commons]

One of Crowley's obsessions was Jack the Ripper.  He believed that the Ripper murders had been accomplished through occult means, and was frequently heard to speak of Jack the Ripper with reverence.  Beynon believes that when Crowley heard about Howard Carter's discoveries, he was outraged -- many of Thelema's rituals and beliefs were derived from Egyptian mythology -- and came up with the idea of a series of copycat murders to get even with the men who had (in his mind) desecrated Tutankhamen's tomb.

It's an interesting hypothesis.  Surely all of the expedition members knew of Crowley; after all, almost everyone in London at the time did.  At least one of them (Budge) was an occultist who ran in the same circles as Crowley.  And that Crowley was capable of such a thing is hardly in question.  Whether Beynon has it right in every detail is probably impossible to prove at this point, rather like the dozens of explanations put forward for the Ripper murders themselves.  But even at first glance it makes far better sense than the Pharaoh's Curse malarkey, and it certainly inclines me to file the "Mummy's Curse" under "Another woo-woo claim plausibly explained by logic and rationality."

Saturday, November 4, 2017

Grammar wars

In linguistics, there's a bit of a line in the sand drawn between the descriptivists and the prescriptivists.  The former believe that the role of linguists is simply to describe language, not establish hard-and-fast rules for how language should be.  The latter believe that grammar and other linguistic rules exist in order to keep language stable and consistent, and therefore there are usages that are wrong, illogical, or just plain ugly.

Of course, most linguists don't fall squarely into one camp or the other; a lot of us are descriptivists up to a point, after which we say, "Okay, that's wrong."  I have to admit that I'm of a more descriptivist bent myself, but there are some things that bring out my inner ruler-wielding grammar teacher, like seeing people write "alot."  Drives me nuts.  And I know it's now become acceptable, but "alright" affects me exactly the same way.

It's "all right," dammit.

However, some research published in Nature last week shows that, if you're of a prescriptivist disposition, you're eventually going to lose.

In "Detecting Evolutionary Forces in Language Change," Mitchell G. Newberry, Christopher A. Ahern, Robin Clark, and Joshua B. Plotkin of the University of Pennsylvania describe that language change is inevitable, unstoppable, and even the toughest prescriptivist out there isn't going to halt the adoption of new words and grammatical forms.

The researchers analyzed over a hundred thousand texts from 1810 onward, looking for changes in morphology -- for example, the decrease in the use of past tense forms like "leapt" and "spilt" in favor of "leaped" and "spilled."  The conventional wisdom was that irregular forms (like pluralizing "goose" to "geese") persist because they're common; less common words, like "turf" -- which used to pluralize to "turves" -- eventually regularize because people don't use the word often enough to learn the irregular plural, and eventually the regular plural ("turfs") takes over.

The research by Newberry et al. shows that this isn't true -- when there are two competing forms, which one wins is more a matter of random chance than of commonness.  They draw a very cool analogy between this phenomenon, which they call stochastic drift, and the genetic drift experienced by evolving populations of living organisms.
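
Just to make the analogy concrete, here's a minimal sketch of that kind of neutral drift -- a Wright-Fisher-style simulation of two competing past-tense forms.  The population size, number of generations, and variable names are my own illustrative choices, not anything taken from the Newberry et al. paper:

```python
import random

def neutral_drift(n_speakers=200, p_leapt=0.5, generations=2000, seed=None):
    """Simulate neutral (stochastic) drift between two competing
    past-tense forms, e.g. 'leapt' vs. 'leaped'.  Each generation,
    every speaker adopts a form by sampling from the previous
    generation's usage frequencies -- the Wright-Fisher model, the
    same mathematics used to describe genetic drift in finite
    populations."""
    random.seed(seed)
    p = p_leapt
    for gen in range(generations):
        # Each speaker independently picks 'leapt' with probability
        # equal to its current share of usage; neither form has any
        # inherent advantage.
        count = sum(random.random() < p for _ in range(n_speakers))
        p = count / n_speakers
        if p in (0.0, 1.0):   # one form has gone to fixation
            return gen, p
    return generations, p

gen, p = neutral_drift(seed=42)
winner = "leapt" if p == 1.0 else ("leaped" if p == 0.0 else "still undecided")
print(f"After {gen} generations, 'leapt' frequency = {p:.2f} ({winner})")
```

Run it with a few different seeds and either form can take over, even though neither is favored -- which is exactly the point: one variant eventually wins, but which one is often pure chance rather than frequency or regularity.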

"Whether it is by random chance or selection, one of the things that is true about English – and indeed other languages – is that the language changes,” said Joshua Plotkin, who co-authored the study.  "The grammarians might [win the battle] for a decade, but certainly over a century they are going to be on the losing side.  The prevailing view is that if language is changing it should in general change towards the regular form, because the regular form is easier to remember.  But chance can play an important role even in language evolution – as we know it does in biological evolution."

So in the ongoing battles over grammatical, pronunciation, and spelling change, the purists are probably doomed to fail.  It's worthwhile remembering how many words in modern English are the result of such mangling; both "apron" and "umpire" came about because of an improper split of the indefinite article ("a napron" and "a numpire" became "an apron" and "an umpire").  "To burgle" came about because of a phenomenon called back formation -- when a common linguistic pattern gets applied improperly to a word that sounds like it has the same basic construction.  A teacher teaches, a baker bakes, so a burglar must burgle.  (I'm surprised, frankly, given how English yanks words around, that we don't have carpenters carpenting.)


Anyhow, if this is read by any hard-core prescriptivists, all I can say is "I'm sorry."  It's a pity, but the world doesn't always work the way we'd like it to.  But even so, I'm damned if I'm going to use "alright" and "alot."  A line has to be drawn somewhere.

Friday, November 3, 2017

The persistence of belief

Two studies published last week were profoundly discouraging to people like me, who spend a lot of time trying to promote skepticism and critical thinking, and squelching loopy claims.

The first was a study of American beliefs done at Chapman University.  The study found that:
  • 55% of the Americans surveyed believed in ancient advanced civilizations such as Atlantis
  • 52% believed in ghosts, hauntings, or evil spirits
  • 35% believed that aliens visited the Earth long ago and influenced ancient civilizations
  • 26% believed that aliens are still visiting the Earth
  • 25% believed in telekinesis, the ability to move objects with your mind
  • 19% believed that psychics can foresee the future
  • 16% believed Bigfoot is real
Only a quarter of the people surveyed held no paranormal beliefs whatsoever.

If that's not discouraging enough, compare that to a Gallup poll this year that found only 19% of the Americans surveyed believed that evolution exists and operates through purely natural forces.  So yes: apparently more Americans believe Carrie is a historical documentary than believe in non-god-driven evolutionary biology.

[image courtesy of the Wikimedia Commons]

Then we had a paper called "Poor Metacognitive Awareness of Belief Change," by Michael B. Wolfe and Todd J. Williams of Grand Valley State University, that appeared in the Quarterly Journal of Experimental Psychology.  This study found that yes, you can sometimes change people's opinions with the facts.  They gave people (actual) studies to read showing that spanking is a lousy method of discipline -- it simply doesn't work, and has a number of well-documented bad side effects on children.  (And don't even start with me about "I was spanked as a child and I'm fine."  If that's true, I'm glad you turned out okay, but you should appreciate the fact that you were lucky -- the research is absolutely unequivocal about the negative effects and poor efficacy of spanking.)

And some people did change their minds.  Which is encouraging.  But when the subjects were questioned afterwards, the researchers found that the ones whose stance changed tended to misremember their original beliefs.

In other words, they reported that their beliefs hadn't shifted much -- that whatever position they now held, they'd known it all along.

The authors write:
When people change beliefs as a result of reading a text, are they aware of these changes?...  [T]he relationship between the belief consistency of the text read and accuracy of belief recollections was mediated by belief change.  This belief memory bias was independent of on-line text processing and comprehension measures, and indicates poor metacognitive awareness of belief change.
Which is frustrating.  The implication is that most of us have such poor self-awareness that we don't even notice when our opinions change.  I suppose it's natural enough; it's hard for all of us to say, "Okay, I guess I was wrong."

But for cryin' in the sink, learning how to admit error is part of growing up.  The world is a complex, counterintuitive place, and we have fallible sensory organs and brains, so of course we're going to get it wrong sometimes.  Because of that, we have to learn not only to admit error, but to examine our own beliefs and biases with a high-power lens.  If you don't periodically look at your own most dearly-held beliefs and ask, "Could I be wrong about this?  How could I tell?  And what would that mean?", you are stumbling around in the dark with no clear way of figuring out where you've made a mistake.

So we skeptics have to toil on.  I'm not saying I'm right about everything -- far from it -- but I will maintain that skepticism, logic, and science are the best ways of sifting fact from fiction.  It's disappointing that we're still a nation where every other person you meet believes in haunted houses, but there is a remedy.  And if, as the second study suggests, the people we convince end up saying, "Meh, I never really believed in ghosts in the first place," I can accept that as the next best outcome to an outright admission of error.

Thursday, November 2, 2017

Living the dream

Last night I dreamed I was in my classroom.  It wasn't my real classroom, however -- it looked like a 19th century lecture hall.  Wooden desks, old cabinets containing jars with ground-glass stoppers, various pieces of equipment of uncertain purpose, some of which looked like (and may in fact have been) torture equipment.  My son lived in an apartment above my classroom, with his wife, which is especially curious because he's not married.  I was teaching a lesson on the reproductive systems of monkeys, but my students weren't listening.  Also, my son kept coming out on the balcony (of course there was a balcony) and interrupting my lecture to ask me questions about the rules of rugby.

After that, it got a little weird.

Neuroscientists have been trying to figure out the physiological function of dreams for years.  The contention is that they must be doing something important, because they're so ubiquitous.  Judging from my own dogs, even other species dream.  Sometimes they have exciting dreams, with muted little barks and twitching paws, often ending in a growl and a shake of the head, as if they're killing some poor defenseless prey; other times they have placid dreams, eliciting a sigh and a wagging tail, which ranks right up there amongst the cutest things I've ever seen.

But what purpose dreams serve has been elusive.  There's some contention that dreaming might help consolidate memory; that it may help to eliminate old synaptic connections that are no longer useful; and that it might function to reset neurotransmitter receptors, especially those for dopamine.  But last week, some neuropsychologists at Rutgers University found evidence of yet another function of dreaming: making people less likely to overreact in scary situations.

Tom Merry, "Gladstone Dreams About Queen Victoria's Dinner" (1886) [image courtesy of the Wellcome Library Gallery and the Wikimedia Commons]

In "Baseline Levels of Rapid-Eye-Movement Sleep May Protect Against Excessive Activity in Fear-Related Neural Circuitry," by Itamar Lerner, Shira M. Lupkin, Neha Sinha, Alan Tsai, and Mark A. Gluck, we learn that people who have been deprived of REM (rapid eye movement, the phase of sleep where dreaming occurs) are more likely to experience extreme anxiety and PTSD-like symptoms than people who have been REMing normally, as well as higher activity in the amygdala -- the part of the brain associated with fear, anxiety, and anger.

The authors write:
Sleep, and particularly rapid-eye movement sleep (REM), has been implicated in the modulation of neural activity following fear conditioning and extinction in both human and animal studies.  It has long been presumed that such effects play a role in the formation and persistence of Post-Traumatic-Stress-disorder, of which sleep impairments are a core feature.  However, to date, few studies have thoroughly examined the potential effects of sleep prior to conditioning on subsequent acquisition of fear learning in humans.  Further, these studies have been restricted to analyzing the effects of a single night of sleep—thus assuming a state-like relationship between the two.  In the current study, we employed long-term mobile sleep monitoring and functional neuroimaging (fMRI) to explore whether trait-like variations in sleep patterns, measured in advance in both male and female participants, predict subsequent patterns of neural activity during fear learning.  Our results indicate that higher baseline levels of REM sleep predict reduced fear-related activity in, and connectivity between, the hippocampus, amygdala and ventromedial PFC during conditioning.  Additionally, Skin-Conductance-Responses (SCR) were weakly correlated to the activity in the amygdala.  Conversely, there was no direct correlation between REM sleep and SCR, indicating that REM may only modulate fear acquisition indirectly.  In a follow-up experiment, we show that these results are replicable, though to a lesser extent, when measuring sleep over a single night just prior to conditioning.  As such, baseline sleep parameters may be able to serve as biomarkers for resilience, or lack thereof, to trauma.
Which I find pretty fascinating.  I had sleep problems for years, finally (at least in part) resolved after a visit to a sleep lab and a prescription for a CPAP machine.  Turns out I have obstructive sleep apnea, apparently due to a narrow tracheal opening, and was waking up 23 times an hour.  I'm still not a really sound sleeper, but I feel like at least I'm not sleepwalking through life the way I was, pre-CPAP.  I also suffer from pretty severe social anxiety, and although I'm not convinced that the two are related, it is curious that the researchers found that a lack of REM ramps up anxiety.

However, even after fixing my apnea, my nights are still disturbed by bizarre dreams, for no particularly apparent reason.  I don't dream about things I'm anxious over, for the most part; my dreams are often weird and disjointed, with scenarios that make sense while I'm dreaming and seem ridiculous once I'm awake.  But what does it all mean?  I am extremely dubious about those "Your Dreams Interpreted" books that tell you that if you dream about a horse, it means you are secretly in love with your neighbor.  (I just made that up.  I have no idea what those books say about dreaming about horses, and I'm not sufficiently motivated to go find out.)  In any case, it's highly unlikely that even a symbolic interpretation of dream imagery would be consistent from person to person.

On a bigger scale, however, there is remarkable consistency in dream content from person to person.  We all have dreams of being chased, falling, flying, being in embarrassing situations, being in erotic situations.  But when you slice them more finely, the specifics of dreams vary greatly, even with people who are in the same circumstances, making it pretty unlikely that there's any kind of one-to-one correlation between dream imagery and events in real life.

So the study by Lerner et al. is fascinating, but doesn't really explain the content of dreams, nor why they can be so absolutely convincing when you're in them, and entirely absurd after you wake up.  But I better wrap this up.  I gotta go do some research in case Lucas wants to chat with me, because I might be able to hold my own when the topic is monkey junk, but I know bugger-all about rugby.

Wednesday, November 1, 2017

Rock recordings

I find it a little astonishing that after seven years of blogging about, and scoffing at, the paranormal, I can still run into claims I've never heard of before.

This just happened yesterday, when a long-time loyal reader of Skeptophilia sent me an email with a link and a message saying, "What the hell is this?"  And thus I was introduced to the wacky claim that has become known as the "Stone Tape Theory."

First off, for the good of the order, let me officially lodge a complaint about calling this a "theory."  "Theory" has to be one of the most misused words in the English language, and that misuse never fails to get my goat.  Which explains why I become apoplectic with rage when someone says about evolution, "It's only a theory," as if the word meant "some loopy claim I came up with two nights ago in the corner pub after my third pint of beer, and which could as easily be wrong as right."

To a scientist, a theory is a system of ideas, supported by all the available evidence, that explains a natural phenomenon.  Saying "evolution is a theory" isn't an insult; it means "evolution is the best model we have."  If something is a theory, it's passed the test of scientific inquiry, and holds up to scrutiny.

Okay, now that I've gotten that out of my system, what is the "Stone Tape" thing?  It's an explanation of hauntings and other paranormal phenomena, which claims that they occur because a powerfully emotional or traumatic event happened on the site -- and those emotions were somehow transferred to the soil, rocks, trees, and so on that surround it.  That's why, proponents say, hauntings often cease when the building in which they occur is remodeled or torn down; you've gotten rid of the object(s) on which the events were imprinted.

[image courtesy of photographer Russ Hamer and the Wikimedia Commons]

So hauntings aren't actual spirits (whatever the hell an "actual spirit" would be).  There's no awareness or anything.  What you're seeing and hearing is like playing back a poorly-recorded tape of the events in question -- a sort of residuum of electrical energy from the emotions of the people in question.

I have a few responses to all this, as you might imagine.

First, if the electrical energy from our brains when we experience strong emotions were able to be "imprinted" on inanimate matter, you have to wonder why (1) it doesn't happen all the time, and (2) you don't get instant replays from people, including yourself, who are still alive.  For example, if all it took were strong emotions, you'd think there would be a residue of dismay from my AP classes when I passed back their first quiz, and that now every time I went into the room, there would be ghostly silhouettes of 21 students with ghastly expressions, murmuring, "I guess I better study next time."

But I've never heard of anything like this happening.  However, that is small potatoes compared to my other question, which is: how on earth could your emotional energy "imprint" on a rock?  I have some quartz crystals on the shelf in my classroom, and I decided this morning to spend five minutes thinking rageful thoughts at them.  I thought that since they were crystals, it might work better, because crystals, you know?  But afterwards there was no change in the quartz, no sense of having my anger re-radiated back out at me.  They looked exactly the same as before, and didn't seem at all flustered.

I got the same results with my coffee cup, a bag of potting soil, and a deer skull that sits on top of one of my glassware cabinets.

So we're confronted with this little problem called "lack of a mechanism."  There's no way that the tiny changes in electrical field generated by your thoughts -- changes so small that you need a sensitive machine, an electroencephalograph, to detect them -- could cause any changes in the matter in the area.  I have the same objection to claims of psychokinesis; the Law of Conservation of Energy, which is strictly enforced in most jurisdictions, implies that if you used your psychic powers to lift (say) a car, you would have to expend at least as much mental energy as the increase in gravitational potential energy the car experiences by being up in the air.

Which is a huge amount, something you can infer from imagining what would happen if the car suddenly released that potential energy by falling on you.
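
Just to put rough numbers on that reasoning -- the figures below are my own back-of-the-envelope assumptions, not measurements from anywhere:

```python
# Back-of-the-envelope arithmetic: energy needed to levitate a car,
# compared with the brain's entire power budget.  All figures are
# round illustrative assumptions.

m = 1500.0   # mass of a typical car, kg (assumed)
g = 9.8      # gravitational acceleration, m/s^2
h = 2.0      # height the car is lifted, m (assumed)

delta_E = m * g * h   # increase in gravitational potential energy, joules
print(f"Energy to lift the car: {delta_E / 1000:.1f} kJ")  # ~29.4 kJ

# The whole human brain runs on roughly 20 watts.  Even if every last
# joule of that could somehow be redirected at the car:
brain_power = 20.0    # watts, a commonly cited rough figure
print(f"Seconds of total brain output required: {delta_E / brain_power:.0f}")
# ~1470 seconds -- about 25 minutes of the brain's *entire* output,
# with nothing left over for trivia like breathing.
```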

So I'm not buying either the Stone Tape *grumble, grumble* "Theory" or claims of psychokinesis.  Not only is there no evidence that either one exists, there's no plausible mechanism by which either could occur without breaking basically every law of physics in the book.

On the other hand, if the next time I'm putting away glassware, the deer skull tumbles off the shelf and conks me in the head, I suppose it will serve me right.