Skeptophilia (skep-to-fil-i-a) (n.) - the love of logical thought, skepticism, and thinking critically. Being an exploration of the applications of skeptical thinking to the world at large, with periodic excursions into linguistics, music, politics, cryptozoology, and why people keep seeing the face of Jesus on grilled cheese sandwiches.

Saturday, July 12, 2025

Mental models and lying stones

Richard Feynman famously said, "The first principle is that you must not fool yourself -- and you are the easiest person to fool."

This insightful statement isn't meant to impugn anyone's honesty or intelligence, but to highlight that everyone -- and I'm sure Feynman was very much including himself in this assessment -- has biases that prevent them from seeing clearly.  We've already got a model, an internal framework by which we interpret what we experience, and that inevitably constrains our understanding.

As science historian James Burke points out, in his brilliant analysis of the scientific endeavor The Day the Universe Changed, it's a trap that's impossible to get out of.  You have to have some mental model for how you think the world works, or all the sensory input you receive would simply be chaos.  "Without a structure, a theory for what's there," Burke says, "you don't see anything."

And once you've settled on a model, it's nearly impossible to step outside of it.  You're automatically going to take some things as givens and ignore others as irrelevant, dismiss some pieces of evidence out of hand and accept others without question.  We're always taking what we experience and comparing it to our own mental frameworks, deciding what is important and what isn't.  When my wife finished her most recent art piece -- a stunning image of a raven's face, set against a crimson background -- and later that day I saw on social media another piece of art someone had posted, a raven against red, I shrugged and laughed and said, "Weird coincidence."

Quoth the Raven, pen/ink/watercolor by Carol Bloomgarden (2025) [Image used with permission]

But that's only because I had already decided that odd synchronicities don't mean anything.  If I had a mental model that considered such chance occurrences as spiritually significant omens, I would have interpreted that very, very differently.

Our mental frameworks are essential, but they can lead us astray as often as they land us on the right answer.  Consider, for example, the strange, sad case of Johann Beringer and the "lying stones."

Johann Bartholomeus Adam Beringer was a professor of medicine at the University of Würzburg in the early eighteenth century.  His training was in anatomy and physiology, but he had a deep interest in paleontology, and had a large collection of fossils he'd found during hikes in his native Germany.  He was also a devout Lutheran and a biblical literalist, so he interpreted all the fossil evidence as consistent with biblical events like the six-day creation, the Noachian flood, and so on.

Unfortunately, he also had a reputation for being arrogant, humorless, and difficult to get along with.  This made him several enemies, including two of his coworkers -- Ignace Roderique, a professor of geography and algebra, and Johann Georg von Eckhart, the university librarian.  So Roderique and von Eckhart hatched a plan to knock Beringer down a peg or two.

They found out where he was planning on doing his next fossil hunt, and planted some fake fossils along the way.

These "lying stones" are crudely carved from limestone.  On some of them, you can still see the chisel marks.


More outlandish still, Roderique and von Eckhart carved the word "God" in Hebrew on the backs of some of them.  Making it look like the artisan had signed His name, so to speak.

One colleague -- who was not in on the prank -- looked at the stones, and said to Beringer, "Um... are you sure?  Those look like chisel marks."  Beringer dismissed his objections, and in fact, turned them into evidence for his explanation.  Beringer wrote, "...the figures... are so exactly fitted to the dimensions of the stones, that one would swear that they are the work of a very meticulous sculptor...[and they] seem to bear unmistakable indications of the sculptor's knife."

They were so perfect, Beringer said, that they could only be the work of God.

So as astonishing as it may seem, Beringer fell for the ruse hook, line, and sinker.  Roderique and von Eckhart, buoyed up by their success, repeated their prank multiple times.  Finally Beringer had enough "fossils" that in 1726, he published a scholarly work called Lithographiae Wirceburgensis (The Writing-Stones of Würzburg).  But shortly after the book's publication -- it's unclear how -- Beringer realized he'd been taken for a ride.

He sued Roderique and von Eckhart for defamation -- and won.  Roderique and von Eckhart were both summarily fired, but it was too late; Beringer was a laughingstock in the scientific community.  He tried to recover all of the copies of his book and destroy them, but finally gave up.  His reputation was reduced to rubble, and he died twelve years later in total obscurity.

It's easy to laugh at Beringer's credulity, but the only reason you're laughing is because if you found such a "fossil," your mental model would immediately make you doubt its veracity.  In his framework -- which included a six-thousand-year-old Earth, a biblical flood, and a God who was perfectly capable of signing his own handiwork -- he didn't even stop to consider it.

The history of science is laden with missteps caused by biased mental models.  In 1790, a report of a fireball over France that strewed meteorites over a large region prompted a scientific paper that scoffingly dismissed the claim as "impossible."  Pierre Bertholon, editor of the Journal des Sciences Utiles, wrote, "How sad, is it not, to see a whole municipality attempt to certify the truth of folk tales… the philosophical reader will draw his own conclusions regarding this document, which attests to an apparently false fact, a physically impossible phenomenon."  DNA was dismissed as the carrier of the genetic code for decades, because of the argument that DNA's alphabet only contains four "letters," so the much richer twenty-letter alphabet of proteins (the amino acids) must be the language of the genes.  And well into the twentieth century, geologists didn't bother looking for evidence of continental drift until the 1950s, long after there'd been significant clues that the continents had, in fact, moved, largely because they couldn't imagine a mechanism that could be responsible.

Our mental models work on every level -- all the way down to telling us what questions are worth investigating.

So poor Johann Beringer.  Not to excuse him for being an arrogant prick, but he didn't deserve to be the target of a mean-spirited practical joke, nor does he deserve our derision now.  He was merely operating within his own framework of understanding, same as you and I do.

I wonder what we're missing, simply because we've decided it's irrelevant -- and what we've accepted as axiomatic, and therefore beyond questioning?

Maybe we're not so very far ahead of Beringer ourselves.

****************************************


Friday, July 11, 2025

Dream weavers

In Ursula LeGuin's amazing and disturbing novel The Lathe of Heaven, a very ordinary guy finds that he has a completely unordinary ability.

When he dreams, he wakes up and finds that whatever he dreamed has become reality.

George Orr, the protagonist, is terrified by this, as you might imagine.  It's not like he can control what he dreams; he isn't able to program himself to dream something pleasant in order to find he has it when he wakes.  No, it's more sinister than that.  Consider the bizarre, confusing, often frightening content of most of our dreams -- dreams that prompt you to say the next morning, "Where the hell did that come from?"

That is what makes up George's reality.

The worst part is that George is the only one who knows it's happening.  When his dream content alters reality, it alters everything -- including the memories -- of the people he knows.  When he wakes up and finds that the cityscape has changed and that some people he knew are gone, replaced with others he has never seen before, everyone else's memories changed as well.  George wakes to find he has a girlfriend, but she doesn't think it's sudden and weird; being George's girlfriend is what she remembers.

Only George sees that this is just the latest version of a constantly shifting reality.

So when he tries to explain to people what he can do, and (if possible) find a way to stop it, no one believes him.  No one... except a ruthless and ambitious psychiatrist who realizes that if he can figure out how to manipulate George's dreams, he can fashion a world to his own desires, using George as a tool to create the reality he wants.

LeGuin's terrifying vision of what happens when grasping amorality meets a naïve but useful skill is turning out to be not so far from reality.  George Orr's paranormal ability is the stuff of fiction, of course; but the capacity for influencing our dreams is not.

Nor, apparently, is the potential for using our dreams as a conduit for suggestions that might alter our behavior -- with or without our permission, possibly with or without our knowledge.

[Image licensed under the Creative Commons stephentrepreneur, Hurtle Square dreams, CC BY-SA 2.0]

I found out recently from a friend and loyal reader of Skeptophilia that a cohort of powerful corporations has teamed up to see if there's a way to insert advertising content into our dreams.  Xbox, Coors, Microsoft, and Burger King, among others, have been experimenting on volunteers to see if they can introduce targeted advertising while we're asleep, and (especially) while we're in the neurologically hyperactive REM (rapid eye movement) stage of sleep, in order to induce people to alter their behavior -- i.e., purchase the product in question -- once they wake up.  They've had some success; a test by Coors found that sixty percent of the volunteers were susceptible to having these kinds of product-based suggestions influence their dream content.

Forty sleep and dream researchers are now pushing back, and have drafted a document calling for regulation of what the corporate researchers are calling "dream incubation."  "It is easy to envision a world in which smart speakers—forty million Americans currently have them in their bedrooms—become instruments of passive, unconscious overnight advertising, with or without our permission," the authors state.

My fear is that the profit motive will outweigh any reluctance people might have toward having their dreams infiltrated by corporate content.  If cellular service providers were willing to give users a discount on their monthly fees, provided they agreed to allow themselves to be exposed to ads during the nighttime hours, how many people would say, "Sure, okay, go for it"?  I know for myself, I'm often willing to tolerate ads on games and video streaming services rather than paying extra to go ad-free.  I'd like to think that I'm able to tune out the ads sufficiently that they're not influencing my behavior, but what would happen if I'm exposed to them for all eight to ten hours that I'm sleeping every night?

Not all scientists are concerned about the technique's efficacy, however.  "Of course you can play ads to someone as they are sleeping, but as far as having much effect, there is little evidence," said Deirdre Barrett, a dream researcher at Harvard University.  "Dream incubation doesn’t seem very cost effective compared with traditional advertising campaigns."

Even if it doesn't have the manipulative capability the corporations are hoping for, the idea still scares the hell out of me.  When every moment of our days and nights becomes just another opportunity for monetization, where will it all stop?  "I am not overly concerned, just as I am not concerned that people can be hypnotized against their will," said University of Montreal dream researcher Tore Nielsen.  "If it does indeed happen and no regulatory actions are taken to prevent it, then I think we will be well on our way to a Big Brother state … [and] whether or not our dreams can be modified would likely be the least of our worries."

Which is exactly it.  As I've pointed out before, my main concern is about the increasing control corporate interests have over everything.  And here in the United States, the problem is that the people who could potentially pass legislation to limit what the corporations can get away with are being funded largely by corporate donors, so they're not eager to put the brakes on and watch that flow of cash suddenly dry up.  It's a catch-22 that would require the government to police its own behavior for no other reason than simple morality and ethics.

And you can guess how successful that is likely to be, especially considering the United States's current ethically-challenged administration.

I'm hoping that at least someone is listening, though. I tend to agree with Nielsen; I don't think our dream content will be as easily manipulable, or as behavior-altering, as the corporations hope.  But what I'm more worried about is that once we refuse to delineate a hard line around our personal lives, and say to them, "Here, and no further," we've opened ourselves up to there being no part of our personal space that isn't considered a target for monetization.

****************************************


Thursday, July 10, 2025

Unto the breach

Today I dodged a battle on social media, and I honestly don't know if it makes me a coward or just someone who tries to be prudent about which battles are even winnable.

The person in question, an acquaintance I only know through a mutual friend but who connected with me a couple of years ago for reasons unknown, has thrown out some questionable stuff before, but nothing as bad as this.  "There aren't many genders," she posted.  "There are TWO genders and many mental disorders."

After I stopped seeing red enough that I could tell what was on my computer screen, I pondered a variety of responses I could have made.  Among the top contenders:
  • "Wow, that's some weapons-grade stupidity, right there."
  • "Do you realize what a narrow-minded bigot this makes you sound like?"
  • "Get off your fucking high horse and do some research."
Then I calmed down a little more, and considered other, marginally less obnoxious responses:
  • "Maybe before you post stuff like this, you should talk to someone who is trans and get actual information on what it's like."
  • "I believe the Bible you claim to be so fond of has a lot more to say about charity, kindness, and passing judgment than it does about the biology of gender.  You should reread those verses."
  • "I hope like hell your grandchildren don't turn out to be LGBTQ.  For their sake, not for yours."
But finally I said nothing, and unfriended her.

I know it's the duty of every responsible person to confront racism, homophobia, bigotry, narrow-mindedness, and general idiocy.  Not doing so, leaving this kind of thing unchallenged, gives it tacit permission to continue.  I never would have let something like this go in my classroom; the few times I ever got really, truly angry at students during my thirty-two-year career were over issues like this.

But lord have mercy, I am tired.  I'm tired of seeing this kind of bullshit trumpeted as if it was a proclamation of an eternal truth.  I'm tired of trying to convince the anti-vaxxers and climate change deniers, the nitwits who claim the 2020 election was stolen and that Trump is the Second Coming of Jesus, the people who believe that the January 6 insurrectionists were Antifa and liberals in disguise.

Plus, there's the question of what good it would have done if I had confronted her on her nasty, sneering post.  She barely knows me; I think we've maybe talked in person once.  Since then I've had zero interactions with her, online or anywhere else.  Why would she listen to me?  More likely she'd write me off as another godless liberal, getting all bent out of shape because she dropped a Truth Bomb on me.  What is the chance that anything I could have said, polite or rude, would have changed her attitude one iota?

[Image licensed under the Creative Commons Blaine A. White, The Argument 01, CC BY-SA 4.0]

Still, I can't help but feel that I took the coward's way out.  If I'm not going to challenge stupidity and bigotry, it kind of gives the lie to the entire raison d'être of this blog I've written so diligently on for the last fifteen years.  Every time we let someone like her get away with something like this unchallenged, it does double damage -- it further convinces any LGBTQ people who read it that they don't have (or aren't deserving of) unequivocal support, and it gives any other bigots in the studio audience free license to perpetuate their own hateful views.

So I dodged my responsibility, and I'm still feeling a little sick about it.  I'm not going to go back and re-friend her just to have an opportunity to say, "Oh, and about that post...!", but I guess there's an outside (probably minuscule) chance that when she sees she's lost friends over it, she might reconsider.

But I still think I made the wrong decision.

Right now, I'm taking a deep breath and recommitting myself to fight like hell against this sort of thing.  I can't let bigotry slide, excuse it by saying "it's just their religion/politics/age," give it a pass because I'm afraid of what they might say in response or who else I might piss off.  Okay, I'm tired, but it's still a battle worth fighting -- and one that can be won, but only if we refuse to accept prejudice and hatred every damn time we see it.

Shakespeare put it far more eloquently, in Henry V:
Once more unto the breach, dear friends, once more;
Or close the wall up with our English dead.
In peace there's nothing so becomes a man
As modest stillness and humility:
But when the blast of war blows in our ears,
Then imitate the action of the tiger;
Stiffen the sinews, summon up the blood,
Disguise fair nature with hard-favour'd rage;
Then lend the eye a terrible aspect;
Let it pry through the portage of the head
Like the brass cannon; let the brow o'erwhelm it
As fearfully as doth a galled rock
O'erhang and jutty his confounded base,
Swill'd with the wild and wasteful ocean.
****************************************


Wednesday, July 9, 2025

Tracking the hailstones

One of the most shocking results from mathematics -- or even scholarship as a whole -- is Kurt Gödel's Incompleteness Theorem.

Like (I suspect) many of us, I first ran into this startling idea in Douglas Hofstadter's wonderful book Gödel, Escher, Bach: An Eternal Golden Braid, which I read when I was an undergraduate at the University of Louisiana.  I've since reread the whole thing twice, but I'm afraid the parts about formal logic and Gödel's proof are still a real challenge to my understanding.  The gist of it is that Gödel responded to a call by German mathematician David Hilbert to come up with a finite, consistent set of axioms from which all other true statements in mathematics could be derived (and, significantly, which excluded all false or paradoxical ones).  Gödel picked up the gauntlet, but not in the way Hilbert expected (or wanted).

He showed that what Hilbert was asking for was fundamentally impossible.

Put succinctly, Gödel proved that if you come up with an axiomatic system that can generate all true statements of mathematics, it will also generate some untrue ones; if you come up with a system that generates only true statements, there will always be true statements that cannot be proven from within it.  In other words, if a mathematical system is complete, it's inconsistent; if it is consistent, it's incomplete.
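
For the more formally inclined, here's one standard modern paraphrase (my wording, in the later Gödel-Rosser form that needs only consistency -- not how Gödel originally stated it):

    \textbf{First Incompleteness Theorem (paraphrase).}  If $F$ is a consistent,
    effectively axiomatized formal system strong enough to express elementary
    arithmetic, then there is a sentence $G_F$ in the language of $F$ such that
    \[
      F \not\vdash G_F \qquad \text{and} \qquad F \not\vdash \neg G_F ,
    \]
    that is, $F$ can neither prove nor refute $G_F$, so $F$ is incomplete.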

The result is kind of staggering, and the more you think about it, the weirder it gets.  Math is supposed to be cut and dried, black-and-white, where things are either provable (and therefore true) or they're simply wrong.  What Gödel showed was that this is not the case -- and worse, there's no way to fix it.  If you simply take any true (but unprovable) mathematical statements you find, and add them to the system as axioms, the new expanded system still falls prey to Gödel's proof.

It's the ultimate catch-22.

The problem is, there's no way to tell the difference between a true-but-thus-far-unproven statement and a true-but-unprovable statement.  There have been a number of conjectures that have baffled mathematicians for ages, and finally been proven -- the four-color map theorem and Fermat's last theorem come to mind.  But one that has resisted all attempts at a proof is the strange Collatz conjecture, also known as the hailstone sequence, proposed in 1937 by the German mathematician Lothar Collatz.

What's wild about the Collatz conjecture is that it's simple enough a grade-school student could understand it.  It says: start with any natural number.  If it's even, divide it by two.  If it's odd, multiply it by three and then add one.  Repeat the process until you reach 1.  Here's how it would work, starting with 7:

7 - 22 - 11 - 34 - 17 - 52 - 26 - 13 - 40 - 20 - 10 - 5 - 16 - 8 - 4 - 2 - 1.

You can see why it's called a "hailstone sequence;" like hailstones, the numbers rise and fall, sometimes buffeted far upwards before finally "falling to Earth."  And what Collatz said was that, subject to this procedure, every natural number will finally fall to 1.
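
If you want to play with the procedure yourself, here's a minimal Python sketch -- the function name is mine, and note that the loop quietly assumes the conjecture holds, since for a genuine counterexample it would never halt:

    def hailstone(n):
        """Return the Collatz ("hailstone") sequence starting at n and ending at 1."""
        if n < 1:
            raise ValueError("n must be a positive integer")
        sequence = [n]
        while n != 1:
            # Even: halve it.  Odd: triple it and add one.
            n = n // 2 if n % 2 == 0 else 3 * n + 1
            sequence.append(n)
        return sequence

    print(hailstone(7))
    # [7, 22, 11, 34, 17, 52, 26, 13, 40, 20, 10, 5, 16, 8, 4, 2, 1]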

Simple enough, right?  Wrong.  The best minds in mathematics have been stumped as to how to prove it.  The brilliant Hungarian mathematician Paul Erdős said, "Mathematics may not be ready for such a problem."  American mathematician Jeffrey Lagarias was even bleaker, saying, "[The Collatz conjecture] is completely out of reach of present-day mathematics."

What's weirdest is that there does seem to be a pattern -- a relationship between the number you start with and the number of steps it takes to reach 1.  Here's what the graph looks like, if you plot the number of steps as a function of the number you start with, for every number from 1 to 9,999:

[Image is in the Public Domain]

It certainly doesn't appear to be random, but this doesn't get us any closer to proving that all numbers descend to 1 in a finite number of steps.
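
If you'd like to reproduce a plot along these lines yourself, here's a rough sketch; it assumes the same Collatz rule as above, that matplotlib is installed, and the variable names are my own:

    import matplotlib.pyplot as plt

    def stopping_time(n):
        """Count the steps needed for n to reach 1 under the Collatz rule."""
        steps = 0
        while n != 1:
            n = n // 2 if n % 2 == 0 else 3 * n + 1
            steps += 1
        return steps

    starts = range(1, 10000)                      # 1 through 9,999
    steps = [stopping_time(n) for n in starts]
    plt.scatter(starts, steps, s=1)
    plt.xlabel("starting number")
    plt.ylabel("steps to reach 1")
    plt.show()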

The reason all this comes up is a recent paper in The Journal of Supercomputing showing that every number between 1 and 2 to the 71st power obeys the Collatz conjecture.  That's over two sextillion.  Of course, this still isn't proof; all it'd take is one single number, somewhere beyond the range checked, that either (1) keeps rising higher and higher forever, or (2) falls into a repeating loop that never reaches 1, to disprove it.  So until a formal proof (or disproof) is found, all mathematicians can do is keep extending the list of numbers tested.

But is the Collatz conjecture one of Gödel's inevitable true-but-unprovable statements?  No way to know, even if it never does get proven.  That's the brilliance -- and the frustration -- of Gödel's proof.  Such statements are forever outside the axiomatic system, so there's no way to get at them.

So much for mathematics being firm ground.

Anyhow, that's our mind-blowing bit of news for this morning.  A simple conjecture that has baffled mathematicians for almost ninety years, and is no closer to being solved now than it was when it was first proposed.  It's indicative of how weird and non-intuitive mathematics can be.  As Douglas Hofstadter put it, "It turns out that an eerie type of chaos can lurk just behind a facade of order -- and yet, deep inside the chaos lurks an even eerier type of order."

****************************************


Tuesday, July 8, 2025

Linguistic Calvinball

I've written here before about the monumental difficulty of translating written text when you (1) don't know what the character-to-sound correspondence is (including whether the script is alphabetic, syllabic, or ideographic), (2) don't know what language the script represents, and (3) don't know whether it's read left-to-right, right-to-left, or alternating every other line (boustrophedonic script).  This was what Arthur Evans, Alice Kober, and Michael Ventris were up against with the Linear B script of Crete.  That they succeeded is a testimony not only to their skill as linguists and to their sheer dogged persistence, but to the fact that they had absolutely astonishing pattern-recognition ability.  Despite my MA in linguistics and decent background in a handful of languages, I can't imagine taking on such a task, much less succeeding at it.

The problem becomes even thornier when you consider that what appears to be a script might be asemic -- something that looks like a real written language but is actually meaningless.  (Just a couple of months ago, I wrote here about an asemic text called A Book From the Sky that the creator himself said was nonsense, but that hasn't stopped people from trying to translate it anyhow.)

Which brings us to the Rohonc Codex.

The first certain mention of the Rohonc Codex is in the nineteenth century, although a 1743 catalog of the Rohonc (now the city of Rechnitz, Austria) Library might refer to it -- it says, "Magyar imádságok, volumen I in 12" ("Hungarian prayers in one volume, size duodecimo"). 

As you'll see, whether the text represents prayers -- or is even in Hungarian -- is very much an open question.  The size matches, at least; duodecimo refers to a book whose sheets are folded into twelve leaves, giving pages of approximately 127 millimeters by 187 millimeters, and given that some of the earliest guesses about the book's contents were that it was a prayerbook in archaic Hungarian, it's possible that the catalog entry refers to the Codex.  The paper it's written on appears to be sixteenth-century Venetian in origin, but of course this doesn't mean that's when the book was written -- only that it's unlikely to be any older than that.

One page of the Rohonc Codex [Image is in the Public Domain]

The drawings are rather crude, and the lettering doesn't resemble any known script, although various linguists have compared it to Hungarian runes, Dacian, a dialect of early Romanian, and some variant of Hindi.  Others think it's simply a forgery -- asemic, in other words -- with a sizable number attributing it to the antiquarian Sámuel Nemes, who was known to have forged other documents.

There's no sure connection between Nemes and the Rohonc Codex, however.  He's not known ever to have handled the document, and certainly never mentioned it.  So this seems as tentative as all the other explanations.

Attempts to analyze the statistical distribution of clusters of symbols -- invoking such patterns as Zipf's Law, the tendency across languages for a word's frequency to be inversely proportional to its rank -- have also failed to settle the question.
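
To give a flavor of what such a test looks like, here's a rough Python sketch of a Zipf-style rank-frequency check.  Fair warning: it's illustrative only.  The whitespace tokenization and the toy sample are my own stand-ins, since deciding what counts as a "word" in an undeciphered script is precisely the contested part, and a Zipf-like curve is at best weak evidence either way.

    from collections import Counter
    import math

    def zipf_slope(tokens):
        """Least-squares slope of log(frequency) vs. log(rank).
        Natural-language texts tend to give a slope near -1."""
        freqs = sorted(Counter(tokens).values(), reverse=True)
        if len(freqs) < 2:
            raise ValueError("need at least two distinct tokens")
        xs = [math.log(rank) for rank in range(1, len(freqs) + 1)]
        ys = [math.log(f) for f in freqs]
        mean_x, mean_y = sum(xs) / len(xs), sum(ys) / len(ys)
        cov = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
        var = sum((x - mean_x) ** 2 for x in xs)
        return cov / var

    # Toy example: a made-up transcription split on whitespace.
    sample = "ka ru ka mi ru ka to mi ka ru ne ka".split()
    print(zipf_slope(sample))   # a value near -1 would be Zipf-like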

As with A Book From the Sky, this hasn't stopped hopeful scholars from claiming success.  Some of them have been eye-rollingly bad, like the solution proposed in 1996 by one Attila Nyíri of Hungary.  Nyíri combined some Sumerian symbols with chance resemblances to the Latin alphabet, and used such expedients as rearranging letters and letting the same symbol correspond to more than one sound, and still came up with gibberish like, Eljött az Istened. Száll az Úr.  Ó.  Vannak a szent angyalok.  Azok.  Ó.  ("Your God has come.  The Lord flies.  Oh.  There are the holy angels.  Them.  Oh.")

I'm perhaps to be excused for being reminded of the Dick and Jane readers.  "Oh, Jane, see Spot.  See Spot run.  Oh, Spot, don't roll in that dead squirrel.  Oh."

Another attempt, this one only marginally more plausible, was made by Romanian linguist Viorica Enăchiuc, and hypothesized that the document (1) is read right-to-left and bottom-to-top, and (2) was written in a Dacian dialect of Latin.  This one came up with lines like Solrgco zicjra naprzi olto co sesvil cas  ("O Sun of the live let write what span the time"), which still isn't exactly what I'd call lucid writing.  

Then there's the Indian linguist Mahesh Kumar Singh, who said the Codex is written left-to-right and top-to-bottom in Hindi, using an obscure variant of the ancient Brahmi script.  Singh translated one passage as, He bhagwan log bahoot garib yahan bimar aur bhookhe hai / inko itni sakti aur himmat do taki ye apne karmo ko pura kar sake ("Oh, my God!  Here the people is very poor, ill and starving, therefore give them sufficient potency and power that they may satisfy their needs.")  His "translation," though, was immediately excoriated by other linguists, who said that he was playing fast-and-loose with the script interpretation, and had come up with symbol-to-sound correspondences that were convenient to how he wanted the translation to come out, not what was supported in other texts.

So the whole enterprise has turned into the linguistic version of Calvinball (from Bill Watterson's brilliant Calvin and Hobbes).  If you make up the rules as you go, and never play by the same rules twice, anything can happen.

The upshot of it all is that the Rohonc Codex is still undeciphered, if there's even anything there to decipher.  Like the more famous Voynich Manuscript, it retains its aura of attractive mystery, because most of us can't resist a puzzle, even if a lot of the best linguists think the script is nonsense.  Because how do you prove decisively that something isn't sensible language?

After all, there are still people who think that Donald Trump's speeches make sense, even when he says shit like, "I saw engines about three, four years ago.  These things were coming—cylinders, no wings, no nothing—and they’re coming down very slowly, landing on a raft in the middle of the ocean someplace, with a circle, boom!  Reminded me of the Biden circles that he used to have, right?  He’d have eight circles, and he couldn’t fill ’em up.  But then I heard he beat us with the popular vote.  He couldn’t fill up the eight circles.  I always loved those circles, they were so beautiful, so beautiful to look at."

So maybe "Oh.  There are the holy angels.  Them.  Oh," isn't so bad.

In any case, I'm sure there'll be further attempts to solve it.  Which falls into the "no harm if it amuses you" department.  And who knows?  Maybe there's a team made up of this century's Evans/Kober/Ventris triumvirate who will actually succeed.

All I know is that attempting it is way above my pay grade.

****************************************


Monday, July 7, 2025

Dord, fnord, and nimrod

We were having dinner with our younger son a while back, and he asked if there was a common origin for the -naut in astronaut and the naut- in nautical.

"Yes," I said.  "Latin nauta, meaning 'sailor.'  Astronaut literally means 'star sailor.'  Also cosmonaut, but that one came from Latin to English via Russian."

"How about juggernaut?" he asked.

"Nope," I said.  "That's a false cognate.  Juggernaut comes from Hindi, from the name of a god, Jagannath.  Every year on the festival day for Jagannath, they'd bring out his huge stone statue on a wheeled cart, and the (probably apocryphal) story is that sometimes it would get away from them, and roll down the hill and crush people.  So it became a name for a destructive force that gets out of hand."

Nathan stared at me for a moment.  "How the hell do you know this stuff?" he asked.

"Two reasons.  First, M.A. in historical linguistics.  Second, it takes up lots of the brain space that otherwise would be used for less important stuff, like where I put my car keys and remembering to pay the utility bill."

I've been fascinated with words ever since I was little, which probably explains not only my degree but the fact that I'm a writer.  And it's always been intriguing to me how words not only shift in spelling and pronunciation, but shift in meaning, and can even pop into and out of existence in strange and unpredictable ways.  Take, for example, the word dord, which for eight years was in the Merriam-Webster New International Dictionary as a synonym for "density."  In 1931, Austin Patterson, the chemistry editor for Merriam-Webster, sent in a handwritten editing slip for the entry for the word density, saying, "D or d, cont./density."  He meant, of course, that in equations, the variable for density could be either a capital or a lower-case letter d.  Unfortunately, the typesetter misread it -- possibly because Patterson's writing left too little space between words -- and thought that he was proposing dord as a synonym.

Well, the chemistry editor should know, right?  So into the dictionary it went.

It wasn't until 1939 that editors realized they couldn't find an etymology for dord, figured out how the mistake had come about, and had the word removed.  By then, though, it had found its way into other books.  It's thought that the error wasn't completely expunged until 1947 or so.

Then there's fnord, which is a word coined in 1965 by Kerry Thornley and Greg Hill as part of the sort-of-parody, sort-of-not Discordian religion's founding text Principia Discordia.  It refers to a stimulus -- usually a word or a picture -- that people are trained as children not to notice consciously, but that when perceived subliminally causes feelings of unease.  Government-sponsored mind-control, in other words.  It really took off when it was used in the 1975  Illuminatus! Trilogy, by Robert Shea and Robert Anton Wilson, which became popular with the counterculture of the time (for obvious reasons).

Fnord isn't the only word that came into being because of a work of fiction.  There's grok, meaning "to understand on a deep or visceral level," from Robert Heinlein's novel Stranger in a Strange Land.  A lot of you probably know that the quark, the fundamental particle that makes up protons and neutrons, was named by physicist Murray Gell-Mann after the odd line from James Joyce's Finnegans Wake, "Three quarks for Muster Mark."  Less well known is that the familiar word robot is also a neologism from fiction, from Czech writer Karel Čapek's play R.U.R. (Rossum's Universal Robots); robota in Czech means "hard labor, drudgery," so by extension, the word took on the meaning of the mechanical servant who performed such tasks.  Our current definition -- a sophisticated mechanical device capable of highly technical work -- has come a long way from the original, which was closer to slave.

Sometimes words can, more or less accidentally, migrate even farther from their original meaning than that.  Consider nimrod.  It was originally a name, referenced in Genesis 10:8-9 -- "Then Cush begat Nimrod; he began to be a mighty one in the Earth.  He was a mighty hunter before the Lord."  Well, back in 1940, the episode of Looney Tunes called "A Wild Hare" was released, the first of many surrounding the perpetual chase between hunter Elmer Fudd and the Wascally Wabbit.  In the episode, Bugs calls Elmer "a poor little Nimrod" -- poking fun at his being a hunter, and a completely inept one at that -- but the problem was that very few kids in 1940 (and probably even fewer today) understood the reference and connected it to the biblical character.  Instead, they thought it was just a humorous word meaning "buffoon."  The wild (and completely deserved) popularity of Bugs Bunny led to the original allusion to "a mighty hunter" being swamped; ask just about anyone today what nimrod means and they're likely to say something like "an idiot."


Interestingly, another of Bugs's attempted coinages meaning "a fool" -- maroon, from the hilarious 1953 episode "Bully for Bugs" -- never caught on in the same way.  When he said about the bull, "What a maroon!", just about everyone got the joke, probably because both the word he meant (moron) and the conventional definition of the word he said (a purplish-red color) are familiar enough that we realized he was mispronouncing a word, not coining a new one.


It's still funny enough, though, that I've heard people say "What a maroon!" when referring to someone who's dumb -- but as a quote from a fictional character, not because they think it's the correct word.

Languages shift and flow constantly.  Fortunately for me, since language evolution is my area of study.  It's why the whole prescriptivism vs. descriptivism battle is honestly pretty comical -- the argument over whether, respectively, linguists are recording the way languages should be used (forever and ever amen), or simply describing how they are used.  Despite the best efforts of the prescriptivists, languages change all the time, sometimes in entirely sudden and unpredictable ways.  Slang words are the most obvious examples -- when I was a teacher, I was amazed at how slang came and went, how some words would be en vogue one month and passé the next, while others had real staying power.  (And sometimes resurface.  I still remember being startled the first time I heard a student unironically saying "groovy.")

But that's part of the fun of it.  That our own modes of communication change over time, often in response to cultural phenomena like books, television, and movies, is itself an interesting feature of our ongoing attempt to be understood. 

And I'm sure Bugs would be proud of how he's influenced the English language, even if it was inadvertent.

****************************************


Saturday, July 5, 2025

Out of time

A friend of mine recently posted, "And poof!  Just like that, 1975 is fifty years ago."

My response was, "Sorry.  Wrong.  1975 is 25 years ago.  In five years, 1975 will still be 25 years ago.  That's my story, and I'm stickin' to it."

I've written here before about how plastic human memory is, but mostly I've focused on the content -- how we remember events.  But equally unreliable is how we remember time.  It's hard for me to fathom the fact that it's been six years since I retired from teaching.  On the other hand, the last overseas trip I took -- to Iceland, in 2022 -- seems like it was a great deal longer ago than that.  And 1975... well....  My own sense of temporal sequencing is, in fact, pretty faulty, and there have been times I've had to look up a time-stamped photograph, or some other certain reference point, to be sure when exactly some event had occurred.

Turns out, though, that just about all of us have inaccurate mental time-framing.  And the screw-up doesn't even necessarily work the way you'd think.  The assumption was -- and it makes some intuitive sense -- that memories of more recent events would be stronger than those from longer ago, and that's how your brain keeps track of when things happened.  It's analogous to driving at night, and judging the distance to a car by the brightness of its headlights; dimmer lights = the oncoming car is farther away.

But just as this sense can be confounded -- a car with super-bright halogen headlights might be farther away than it seems to be -- your brain's time sequencing can be muddled by the simple expedient of repetition.  Oddly, though, repetition makes an event seem like it happened further in the past than it actually did.

[Image licensed under the Creative Commons Isabelle Grosjean ZA, MontreGousset001, CC BY-SA 3.0]

A new study out of Ohio State University, published this week in the journal Psychological Science, shows that when presented with the same stimulus multiple times, the estimate of when the test subject saw it for the first time became skewed by as much as twenty-five percent.  It was a robust result -- holding across the majority of the hundreds of volunteers in the study -- and it came as a surprise to the researchers.

"We all know what it is like to be bombarded with the same headline day after day after day," said study co-author Sami Yousif.  "We wondered whether this constant repetition of information was distorting our mental timelines...  Images shown five times were remembered as having occurred even further back than those shown only two or three times.  This pattern persisted across all seven sets of image conditions...  We were surprised at how strong the effects were.  We had a hunch that repetition might distort temporal memory, but we did not expect these distortions to be so significant."

So when someone says "I know it happened that way, I remember it," it should be as suspect with respect to timing as it is to content.

"People should take away two things," Yousif said.  "(1) Time perception is illusory.  That is, our sense of when things occurred is systematically distorted in predictable ways.  (2) These distortions can be substantial, even if their causes are simple (i.e., the mere repetition of information)."

More and more it's seeming like what we think of as our rock-solid memory is an elaborate but rickety house of cards, composed of bits of accurate recollections mixed in with partial truths (real memories in the wrong sequence, or correctly sequenced memories that are being remembered imprecisely), along with a heaping helping of complete fiction.  Add to that the unsettling truth that unless you have a fixed, factual reference point, there's no way to tell the difference.

Makes you wonder how eyewitness testimony can still be used as the sine qua non of evidence in courts of law.

****************************************


Friday, July 4, 2025

Creatures from the alongside

In C. S. Lewis's novel Perelandra, the protagonist, Elwin Ransom, goes to the planet Venus.  In Lewis's fictional universe Venus isn't the scorched, acid-soaked hell we now know it to be; it's a water world, with floating islands of lush vegetation, tame animals, and a climate like something out of paradise.

In fact, to Lewis, it is paradise; a world that hasn't fallen (in the biblical sense).  Ransom runs into a woman who appears to be the planet's only humanoid inhabitant, and she exhibits a combination of high intelligence and innocent naïveté that is Lewis's expression of the Edenic state.  Eventually another Earth person arrives -- the scientist Weston, who is (more or less) the delegate of the Evil One, playing here the role of the Serpent.  And Weston tells the woman about humanity's love for telling stories:

"That is a strange thing," she said.  "To think abut what will never happen."

"Nay, in our world we do it all the time.  We put words together to mean things that have never happened and places that never were: beautiful words, well put together.  And then we tell them to one another.  We call it stories or poetry...  It is for mirth and wonder and wisdom."

"What is the wisdom in it?"

"Because the world is made up not only of what is but of what might be.  Maleldil [God] knows both and wants us to know both."

"This is more than I ever thought of.  The other [Ransom] has already told me things which made me feel like a tree whose branches were growing wider and wider apart.  But this goes beyond all.  Stepping out of what is into what might be, and talking and making things out there, alongside the world...  This is a strange and great thing you are telling me."

It's more than a little ironic -- and given Lewis's impish sense of humor, I'm quite sure it was deliberate -- that a man whose fame came primarily from writing fictional stories identifies fictional stories as coming from the devil, within one of his fictional stories.  Me, I'm more inclined to agree with Ralph Waldo Emerson: "Fiction reveals truth that reality obscures."

Our propensity for telling stories is curious, and it's likely that it goes a long way back.  Considering the ubiquity of tales about gods and heroes, it seems certain that saying "Once upon a time..." has been going on since before we had written language.  It's so familiar that we lose sight of how peculiar it is; as far as we know, we are alone amongst the nine-million-odd species in Kingdom Animalia in inventing entertaining falsehoods and sharing them with the members of our tribe.

The topic of storytelling comes up because quite by accident I stumbled on Wikipedia's page called "Lists of Legendary Creatures."  It's long enough that they have individual pages for each letter of the alphabet.  It launched me down a rabbit hole that I didn't emerge from for hours. 

And there are some pretty fanciful denizens of the "alongside world."  Here are just a few examples I thought were particularly interesting:

  • The Alp-luachra of Ireland.  This creature looks like a newt, and waits for someone to fall asleep by the side of the stream where it lives, then it crawls into his/her mouth and takes up residence in the stomach.  There it absorbs the "quintessence" of the food, causing the person to lose weight and have no energy.
  • The Popobawa of Zanzibar, a one-eyed shadowy humanoid with a sulfurous odor and wings.  It visits houses at night where it looks for people (either gender) to ravish.
  • The Erchitu, a were...ox.  In Sardinia, people who commit crimes and don't receive the more traditional forms of justice turn on the night of the full Moon into huge oxen, which then get chased around the place being poked with skewers by demons.  This is one tale I wish was true, because full Moon days in the White House and United States Congress would be really entertaining.
  • The Nekomata, a cat with multiple tails that lives in the mountain regions of Japan and tricks unwary solo travelers, pretending at first to be playful and then leading them into the wilds and either losing them or else attacking them.  They apparently are quite talented musicians, though.

Nekomata (猫又) from the Hyakkai-Zukan (百怪図巻) by Sawaki Suushi (1707) [Image is in the Public Domain]

  • The Gwyllgi, one of many "big evil black dog" creatures, this one from Wales.  The Gwyllgi is powerfully-built and smells bad.  If you added "has no respect for personal space" and "will chase a tennis ball for hours," this would be a decent description of my dog Guinness, but Guinness comes from Pennsylvania, not Wales, so maybe that's not a match.
  • The Sânziană of Romania, who is a fairy that looks like a beautiful young woman.  Traditionally they dance in clearings in the forest each year on June 24, and are a danger to young men who see them -- any guy who spies the Sânziene will go mad with desire (and stay that way, apparently).
  • The Ao-Ao, from the legends of the Guarani people of Paraguay.  The Ao-Ao is a creature that looks kind of like a sheep, but has fangs and claws, and eats people.  It is, in fact, a real baa-dass.

A statue of an Ao-Ao by Paraguayan sculptor Ramón Elias [Image is in the Public Domain]

  • The Tlahuelpuchi, of the Nahua people of central Mexico.  The Tlahuelpuchi is a vampire, a human who is cursed to suck the blood of others (apparently it's very fond of babies).  When it appears, it sometimes looks human but has an eerie glow; other times, it leaves its legs behind and turns into a bird.  Either way, it's one seriously creepy legend.
  • The Dokkaebi, a goblin-like creature from Korea.  It has bulging eyes and a huge, grinning mouth filled with lots of teeth, and if it meets you it challenges you to a wrestling match.  They're very powerful, but apparently they are weak on the right side, so remember that if you're ever in a wrestling match with a goblin in Korea.

So that's just the merest sampling of the creatures in the list.  I encourage you to do a deeper dive.  And myself, I think the whole thing is pretty cool -- a tribute to the inventiveness and creativity of the human mind.  I understand why (in the context of the novel) C. S. Lewis attributed storytelling to the devil, but honestly, I can't see anything wrong with it unless you're trying to convince someone it's all true.

I mean, consider a world without stories.  How impoverished would that be?  So keep telling tales.  It's part of what it means to be human.

****************************************


Thursday, July 3, 2025

Grace under pressure

In the 1992 Winter Olympics, there was an eighteen-year-old French figure skater named Laëtitia Hubert.  She was a wonderful skater, even by the stratospheric standards of the Olympics; she'd earned a silver medal at the French National Championships that year.  But 1992 was a year of hyperfocus on women's figure skating in particular, when such famous (and/or infamous) names as Nancy Kerrigan, Tonya Harding, Kristi Yamaguchi, Midori Ito, and Surya Bonaly were competing.

What I remember best, though, is what happened to Laëtitia Hubert.  She went into the Short Program as a virtual unknown to just about everyone watching -- and skated a near-perfect program, rocketing her up to fifth place overall.  From her reaction afterward it seemed like she was more shocked at her fantastic performance than anyone.  It was one of those situations we've all had, where the stars align and everything goes way more brilliantly than expected -- only this was with the world watching, at one of the most publicized events of an already emotionally-fraught Winter Olympics.

This, of course, catapulted Hubert into competition with the Big Names.  She went into the Long Program up against skaters of world-wide fame.  And there, unlike the pure joy she showed during the Short Program, you could see the anxiety in her face even before she started.

She completely fell apart.  She had four disastrous falls, and various other stumbles and missteps.  It is the one and only time I've ever seen the camera cut away from an athlete mid-performance -- as if even the media couldn't bear to watch.  She dropped to, and ended at, fifteenth place overall.

It was simply awful to watch.  I've always hated seeing people fail at something; witnessing embarrassing situations is almost physically painful to me.  I don't really follow the Olympics (or sports in general), but over thirty years later, I still remember that night.  (To be fair to Hubert -- and to end the story on a happy note -- she went on to have a successful career as a competitive skater, earning medals at several national and international events, and in fact in 1997 achieved a gold medal at the Trophée Lalique competition, bumping Olympic gold medalist Tara Lipinski into second place.)

I always think of Laëtitia Hubert whenever I think of the phenomenon of "choking under pressure."  It's a response that has been studied extensively by psychologists.  In fact, way back in 1908 a pair of psychologists, Robert Yerkes and John Dillingham Dodson, noted the peculiar relationship between pressure and performance in what is now called the Yerkes-Dodson curve; performance improves with increasing pressure (what Yerkes and Dodson called "mental and physiological arousal"), but only up to a point.  Too much pressure, and performance tanks.  There have been a number of reasons suggested for this effect, one of which is that it's related to the level of a group of chemicals in the blood called glucocorticoids.  The level of glucocorticoids in a person's blood has been shown to be positively correlated with long-term memory formation -- but just as with Yerkes-Dodson, only up to a point.  When the levels get too high, memory formation and retention crumbles.  And glucocorticoid production has been found to rise in situations that have four characteristics -- those that are novel, unpredictable, contain social or emotional risks, and/or are largely outside of our capacity to control outcomes.

Which sounds like a pretty good description of the Olympics to me.

What's still mysterious about the Yerkes-Dodson curve, and the phenomenon of choking under pressure in general, is how it evolved.  How can a sudden drop in performance when the stress increases be selected for?  Seems like the more stressful and risky the situation, the better you should do.  You'd think the individuals who did choke when things got dangerous would be weeded out by (for example) hungry lions.

But what is curious -- and what brings the topic up today -- is that a study in Proceedings of the National Academy of Sciences showed that humans aren't the only ones who choke under pressure.

So do monkeys.

In a clever set of experiments led by Adam Smoulder of Carnegie Mellon University, researchers gave monkeys a scaled set of rewards for completing tasks, and found a positive correlation between reward level and performance -- until they reached the point where success at a difficult task resulted in a huge payoff.  And just like with humans, at that point, the monkeys' performance fell apart.

The authors describe the experiments as follows:
Monkeys initiated trials by placing their hand so that a cursor (red circle) fell within the start target (pale blue circle).  The reach target then appeared (gray circle with orange shape) at one of two (Monkeys N and F) or eight (Monkey E) potential locations (dashed circles), where the inscribed shape’s form (Monkey N) or color (Monkeys F and E) indicated the potential reward available for a successful reach.  After a short, variable delay period, the start target vanished, cueing the animal to reach the peripheral target.  The animals had to quickly move the cursor into the reach target and hold for 400 ms before receiving the cued reward.
And when the color (or shape) cueing the level of the reward got to the highest level -- something that only occurred in five percent of the trials, so not only was the jackpot valuable, it was rare -- the monkeys' ability to succeed dropped through the floor.  What is most curious about this is that the effect didn't go away with practice; even the monkeys who had spent a lot of time mastering the skill still did poorly when the stakes were highest.

So the choking-under-pressure phenomenon isn't limited to humans, indicating it has a long evolutionary history.  This also suggests that it's not due to overthinking, something that I've heard as an explanation -- that our tendency to intellectualize gets in the way.  That always seemed to make some sense to me, given my experience with musical performance and stage fright.  My capacity for screwing up on stage always seemed to be (1) unrelated to how much I'd practiced a piece of music once I'd passed a certain level of familiarity with it, and (2) directly connected to my own awareness of how nervous I was.  I did eventually get over the worst of my stage fright, mostly from just doing it again and again without spontaneously bursting into flame.  But I definitely still have moments when I think, "Oh, no, we're gonna play 'Reel St. Antoine' next and it's really hard and I'm gonna fuck it up AAAAUUUGGGH," and sure enough, that's when I would fuck it up.  Those moments when I somehow prevented my brain from going into overthink-mode, and just enjoyed the music, were far more likely to go well, regardless of the difficulty of the piece.
 
One of my more nerve-wracking performances -- a duet with the amazing fiddler Deb Rifkin on a dizzyingly fast medley of Balkan dance tunes, in front of an audience of other musicians, including some big names (like the incomparable Bruce Molsky).  I have to add that (1) I didn't choke, and (2) Bruce, who may be famous but is also an awfully nice guy, came up afterward and told us how great we sounded.  I still haven't quite recovered from the high of that moment.

As an aside, a suggestion by a friend -- to take a shot of scotch before performing -- did not work.  Alcohol doesn't make me less nervous, it just makes me sloppier.  I have heard about professional musicians taking beta blockers before performing, but that's always seemed to me to be a little dicey, given that the mechanism by which beta blockers decrease anxiety is unknown, as are their long-term effects.  Also, I've heard more than one musician describe the playing of a performer on beta blockers as "soulless," as if the reduction in stress also takes away some of the intensity of emotional content we try to express in our playing.

Be that as it may, it's hard to imagine that a monkey's choking under pressure is due to the same kind of overthinking we tend to do.  They're smart animals, no question about it, but I've never thought of them as having the capacity for intellectualizing a situation we have (for better or worse).  So unless I'm wrong about that, and there's more self-reflection going on inside the monkey brain than I realize, there's something else going on here.

So that's our bit of curious psychological research of the day.  Monkeys also choke under pressure.  Now, it'd be nice to find a way to manage it that doesn't involve taking a mood-altering medication.  For me, it took years of exposure therapy to manage my stage fright, and I still have bouts of it sometimes even so.  It may be an evolutionarily-derived response that has a long history, and presumably some sort of beneficial function, but it certainly can be unpleasant at times.

****************************************