Skeptophilia (skep-to-fil-i-a) (n.) - the love of logical thought, skepticism, and thinking critically. Being an exploration of the applications of skeptical thinking to the world at large, with periodic excursions into linguistics, music, politics, cryptozoology, and why people keep seeing the face of Jesus on grilled cheese sandwiches.

Wednesday, July 10, 2024

The echo of evil

One of the more horrifying stories from my home state of Louisiana is the more-or-less true tale of Madame Delphine Macarty LaLaurie.

I qualify it with "more-or-less" because the story, gruesome even by New Orleans gothic standards, has certainly been embellished along the way.  Plus, as you'll see, there's a supernatural twist to the whole thing, and -- at least in my not-always-so-humble opinion -- that makes it fictional by default.  But with that caveat in place, here's what we know.

Delphine was born on the 19th of March, 1787, in New Orleans, to Louis Barthélemy de Macarty (or McCarty or McCarthy or MacCarthy) and his wife, Marie-Jeanne L'Érable.  Louis's father was from Ireland, but the rest of the family was French -- as well as influential and rich.  Her uncle by marriage was the governor of the Spanish colony of Louisiana, and a cousin later became mayor of New Orleans.  Delphine married three times; first to a prominent officer in the Spanish military named Ramón de Lopez y Angulo, then to a wealthy banker named Jean Blanque, and last to a doctor, Léonard Louis Nicolas LaLaurie.

Delphine Macarty LaLaurie [Image is in the Public Domain]

Until 1834, Delphine and her husband(s) showed every sign of being completely normal upper-class citizens, participating in the high society of the New Orleans French Quarter.  A few hints had gotten out about the LaLauries, especially Delphine, alleging that she mistreated her slaves -- but in that day and age, the mistreatment had to be pretty extreme before anyone would act on it, even if it were proven true.

Eventually, it was.  And the reality turned out to be so bad that even the privileged White people of the antebellum South were revolted.

In April of 1834, a fire broke out in the kitchen of the LaLaurie mansion.  Responding to calls for help, neighbors came in to extinguish the blaze -- and found the family cook chained to the stove by her ankle.  This spurred an investigation, and the police found the family slaves in deplorable shape, showing evidence of torture and deprivation.  At first Dr. LaLaurie responded to the inquiry with derision, saying, "some people had better stay at home rather than come to others' houses to dictate laws and meddle with other people's business," but when the condition of the slaves was made public, the outrage was so strong that a mob descended on the house.  The couple fled, eventually making their way to Paris, where they lived for the rest of their lives.  Dr. LaLaurie's death is unrecorded, but Delphine's appears in the Paris Archives, which record that she died on 7 December 1849, at the age of 62.  She never publicly acknowledged any guilt over how she and her husband had treated the slaves; in fact, a letter from Paulin Blanque, her son by her second marriage, states that his mother "never had any idea about the reason for her departure from the city."

So either Dr. LaLaurie was the real villain here, or Delphine was amoral and an accomplished liar.

Perhaps both.

Certainly the legend, though, favors the latter.  The tale of a depraved and sadistic woman had a cachet that grabbed people's attention, and the story began to grow by accretion.  The 1946 book Ghost Stories of Old New Orleans, by Jeanne deLavigne, went into explicit detail about what Delphine supposedly did -- I'll spare you the details, not only because they are downright disgusting, but because the more grotesque of the claims are entirely unsubstantiated by the records.  Now, I'm not saying the LaLauries were innocent, mind you; at the best, they were cruel, heartless people whose escape to Paris is the very definition of "getting off lightly."  But any time there's a claim like this, people always want to add to it -- and they have, throwing in enough gory details to do a slasher movie proud.

The LaLaurie house was rebuilt -- there wasn't much left but the frame after the fire and the attack by the enraged mob -- and over the years has been a private residence (the most recent owner was none other than Nicolas Cage), a music conservatory, a high school, a residence for delinquents, a bar, and a furniture store.  It's widely considered to be haunted, and features prominently on New Orleans ghost walks; some call it "the most haunted building in Louisiana," where at night you can hear the moans of the poor tortured slaves and the evil, cold laugh of the wicked Delphine, as she walks the hallways and staircases looking for new victims.

LaLaurie Mansion, 1140 Royal Street [Image licensed under the Creative Commons APK, LaLaurie Mansion, CC BY-SA 4.0]

The reason the topic comes up is that the home just went up for sale again -- asking price, a cool $10.25 million.  So if you have a good chunk of cash and want to live in one of the most notorious haunted houses in the Deep South, here's your chance.

Predictably, I don't put much stock in the paranormal side of this, but the author of the article about the sale makes a trenchant point: ghosts or no ghosts, isn't it pretty tasteless to use the evil reputation of the site as a way of jacking up the price?  After all, no one doubts that real human beings were treated horribly here, many of them ultimately dying of their injuries.  There's not even the relief of a just ending to fall back on; the LaLauries pretty much got off scot-free.  The article's author suggests that maybe the thing to do is turn the place into a museum chronicling the plight of slaves in the South, who even after they were nominally freed by the Emancipation Proclamation and the Civil War, still had to endure generations of prejudice, persecution, and injustice.  (And to our nation's enduring shame, in many places their descendants still do.)

It's a nice idea, but money will talk, as it always does.  Some rich person will buy the LaLaurie Mansion, and it'll still be featured on ghost tours, cashing in on a legacy of human suffering.  Whatever the horrible details of the story of Delphine and her husband, having a building standing in their name is still on some level celebrating them, leaving an echo of evil on the streets of the French Quarter.

I understand the argument about leaving up places with horrific historical associations as reminders, but this is a case where I think the most fitting thing is to raze the damn place and erase every last trace of Delphine LaLaurie.  She got off easy (extremely easy) in life -- perhaps eradicating her memory after death is a fitting end for someone who was judged as sadistic even by the cruel standards of her time and place.

****************************************



Tuesday, July 9, 2024

Jump scare preparation

I'm currently working my way through a rewatch of the old television series Kolchak: The Night Stalker.

The show was only on for one season (1974-1975, so when I was fourteen or so years old), and the main character was played to awkward, bumbling perfection by Darren McGavin.  


I dearly loved this show when I was a kid, and was devastated when it was cancelled, but in retrospect I have to admit that the worst episodes of its run are pretty dreadful.  Like the one I watched just yesterday while I was working out, called "The Vampire."  You will probably be unsurprised to learn that it's about a vampire.  (The show also had episodes called "The Zombie" and "The Werewolf" named for analogous reasons.  Some of the episodes had intriguing and creative titles, but these were not amongst them.)  "The Vampire," though -- well, to swipe a line from the inimitable Dorothy Parker, "to call the plot 'wafer-thin' would be to give grievous insult to wafer-makers."  The titular vampire attacks a few people, the cops are skeptical that it's the work of a vampire, the usual hijinks ensue, and Kolchak eventually dispatches her with a cross and a stake through the heart.

Roll end credits.

On the other hand, some of the best episodes of the show are downright brilliant -- and scary as hell.  "The Energy Eater" is about a thing that haunts the basement of a new, cutting-edge hospital facility, sucking up electrical energy; the scene where Kolchak is running down a long hallway being pursued by it, and behind him one by one the light bulbs are bursting, gradually plunging the place into darkness, is absolutely terrifying.  "The Spanish Moss Murders" features a guy in a sleep study whose brain waves are being manipulated -- inadvertently bringing his nightmare to life, a hideous swamp creature called Père Malfait who is made entirely of dripping clumps of Spanish moss.  

But none of them had the impact on me as a teenager that "Horror in the Heights" did.  The monster in that one lures you in by impersonating the person you trust the most.  I'll never forget one scene, where an elderly couple unexpectedly runs into their rabbi while they're walking home late one night from a movie theater.  The scene starts out from their point of view -- you see the smiling, paternal face of the rabbi as he walks down the sidewalk toward them.  But then the camera swivels around the trio, and when it gets to the side, you can see that only the front half of the rabbi is human -- the back half is a hulking, hairy beast.  It's wearing the rabbi's form like a full-body tie-on mask.

It's incredibly effective -- and is one of the creepiest scenes I have ever watched.

It's honestly kind of puzzling that I watch scary shows, though, because I am seriously suggestible.  When the movie The Sixth Sense was first released on DVD, my girlfriend (now wife) and I watched it at her house.  Then I had to make a forty-five minute drive, alone in my car at around midnight, then go (still alone) into my cold, dark, empty house.  I might actually have jumped into bed from four feet away so the evil little girl ghost wouldn't reach out from underneath and grab my ankle.  I also might have pulled the blankets up as high over me as I could without suffocating, following the time-tested rule that monsters' claws can't pierce a down comforter.

So yeah.  I might be a skeptic, but I am also a great big coward.

This was why I found some research published in the journal NeuroImage so fascinating.  It comes out of the University of Turku (Finland), where a team led by neuroscientist Lauri Nummenmaa had people watch movies like The Devil's Backbone and The Conjuring while hooked up to an fMRI scanner.

They asked the participants (all of whom said they watched at least one horror movie every six months) to rate the movies they watched for suspense and scariness, count the number of "jump scares," and evaluate their overall quality.  The scientists then looked at the fMRI results to see what parts of the brain were active when, and found some interesting patterns.

As the tension is increasing -- points where you're thinking, "Something scary is going to happen soon" -- the parts of the brain involved in visual and auditory processing ramp up activity.  Makes sense; if you were in a situation with real threats, and were worried about some imminent danger, you would begin to pay more attention to your surroundings, looking for clues to whether your fears were justified.  Then at the moment of jump scares, the parts of the brain involved in decision-making and fight-or-flight response spike in activity, as you make the split-second decision whether to run, fight the monster, or (most likely in my case) just piss your pants and then have a stroke.

Nummenmaa and his team found, however, that all through the movie, the sensory processing and rapid-response parts of the brain were in continuous cross-talk.  Apparently the brain is saying, "Okay, we're in a horror movie, so something terrifying is bound to happen sooner or later.  May as well prepare for it now."

What I still find fascinating, though, is why people actually like this sensation.  Even me.  I mean, my favorite Doctor Who episode -- the one that got me hooked on the series in the first place -- is the iconic "Blink," featuring the terrifying Weeping Angels, surely one of the scariest fictional monsters ever invented.


Maybe for a lot of us, it's so that when it's over, we can reassure ourselves that although we might have problems in our lives, at least we're not being disemboweled by a werewolf or abducted by aliens or whatnot.  I'm not sure if this is true for me, though.  Because long after the show has ended, I'm still convinced that whatever horrifying creature was rampaging through the story, it's still out there.

And it's looking for me.

So maybe I shouldn't watch scary shows.  It definitely takes a toll on me.  I remember when I saw the episode of The X-Files called "Patience," about an extremely creepy humanoid bat-creature that was hunting down, one by one, the men who had killed its mate years earlier.  At the end of the episode there's only one of them left alive, and he's gone to a cabin on a little island in a lake out in the middle of nowhere to hide.  There's a fire going in the fireplace, everything is deathly quiet.  He's freaking out, of course, jumping at the tiniest noise.  So when there's a thump and a smoldering piece of wood rolls out onto the floor, he is terrified at first, but then (very cautiously) goes to investigate.  No, nothing over by the fireplace, nothing up the chimney.  So he turns around...

... and the bat thing is standing right behind him.

Man, it was ages before I recovered from that scene.  That evening I was damn close to telling my dogs that they could just pee on the rug, because there was no way in hell I was opening the back door to let them out.  Who knew what could be out there?  Bat things, Weeping Angels, evil ghosts, invisible energy-suckers, swamp monsters covered with dripping Spanish moss.

Or... worst of all... maybe even the person I trust most, walking slowly toward me out of the darkness, wearing a big reassuring smile on their face.

****************************************



Monday, July 8, 2024

Beginner's mind

Last September, I started learning Japanese through Duolingo.

[Image licensed under the Creative Commons Grantuking from Cerrione, Italy, Flag of Japan (1), CC BY 2.0]

My master's degree is in historical linguistics, so I'm at least a little better than the average bear when it comes to languages, but still -- my graduate research focused entirely on Indo-European languages.  (More specifically, the effects of the Viking invasions on Old English and the Celtic languages.)  Besides the Scandinavian languages and the ones found in the British Isles, I have a decent, if rudimentary, grounding in Greek and Latin; even so, until last September, anything off of the Indo-European family tree was pretty well outside my wheelhouse.

The result is that there are features of Japanese that I'm struggling with, because they're so different from any other language I've studied.  Languages like Old English, Old Norse, Gaelic, Greek, and Latin are all inflected languages -- nouns change form depending on how they're being used in a sentence.  A simple example from Latin: in the two sentences "Canis felem momordit" ("The dog bit the cat") and "Felis canem momordit" ("The cat bit the dog"), you know who bit whom not by the order of the words, but by the endings.  The biter ends in -s, the bitee ends in -m.  The sentence would still be intelligible (albeit a little strange-sounding) if you rearranged the words.

Not so in Japanese.  In Japanese, not only does everything have to be in exactly the right order, just about every noun has to be followed by the correct particle, a short, more-or-less untranslatable word that tells you what the function of the previous word is.  They act a little like case endings do in inflected languages, and a little like prepositions in English, but with some subtleties that are different from either.  For example, here's a sentence in Japanese:

Tanaka san wa, sono sushiya de hirugohan o tabemashou ka?

Mr. Tanaka [honorific suffix indicating respect, always used when addressing another person] [particle marking the topic of the sentence -- here, the person being addressed], that sushi shop [particle marking the place where an action happens] lunch [particle marking the object of the sentence] should we eat [particle marking what came before as a question]? = "Mr. Tanaka, would you like to eat lunch at that sushi shop?"

Woe betide you if you forget a particle, use the wrong one, or put things out of order.  Damn near every time I miss something on Duolingo and get that awful "clunk" noise telling you that you screwed up, it's because I made a particle-related mistake.
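If it helps, you can think of each particle as a postfix tag on the phrase before it.  Here's a toy Python sketch of that idea (my own illustration for this post, not real natural-language processing -- the particle list is drastically simplified):

    # Map a few common Japanese particles to the grammatical role they mark,
    # much the way case endings work in an inflected language like Latin.
    PARTICLES = {
        "wa": "marks the topic of the sentence",
        "o":  "marks the direct object",
        "de": "marks the place where the action happens",
        "ni": "marks a destination or indirect object",
        "ka": "marks the sentence as a question",
    }

    def gloss(tokens):
        # Tag each token: particles get their role, everything else
        # is treated as a content word.
        return [(t, PARTICLES.get(t, "content word")) for t in tokens]

    sentence = "Tanaka-san wa sono sushiya de hirugohan o tabemashou ka"
    for word, role in gloss(sentence.split()):
        print(f"{word:12} -> {role}")

The point of the toy example: the particle carries the grammatical information that word order or case endings carry elsewhere, which is exactly why dropping one makes the sentence fall apart.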

And don't even get me started about the three different writing systems you have to learn.

This is the first time in a while I've been in the position of starting from absolute ground zero with something.  I guess I do have a bit of a leg up from having a background in other languages, but it's not really that much.  Being a rank beginner is humbling -- if you're going to get anywhere, you have to be willing to let yourself make stupid mistakes (sometimes over and over and over), laugh about it, and keep going.  I'm not really so good at that -- not only do I take myself way too damn seriously most of the time, I have that unpleasant combination of being (1) ridiculously self-critical and (2) highly competitive.  If you're familiar with Duolingo, you undoubtedly know about the whole XP (experience points) and "leagues" thing -- when you complete a lesson you earn XP (as long as you don't lose points in the lesson because you fucked up the particles again), and at the end of the week, you are ranked in XP against other learners, and depending on your score, you can move up into a new "league."

Or get "demoted."  Heaven forbid.  Given my personality, my attitude is "death before demotion."  As my wife pointed out, nothing happens if I get demoted -- it's not like the app reaches into my cerebrum and deletes what I've learned, or anything.  

She's right, of course, but still.

I'll be damned if I'm gonna let myself get demoted.

So last week I reached "Diamond League," which is the top tier.  Yay me, right?  Only now there's nowhere left to go.  But I have to keep hammering at it, because if I don't, I'll get dropped back into Obsidian League, and screw that sideways.

On the other hand, I keep at it because I also want to learn Japanese, right?  Of course right.

In Zen Buddhism, there's a concept called shoshin (初心), usually translated as "beginner's mind."  It means approaching every endeavor as if you were just seeing it for the first time, with excitement, anticipation -- and no preconceived notions of how it should go.  This is a hard lesson for me, harder even than remembering kanji.  I've had to get used to taking it slowly, realizing that I'm not going to learn a difficult and unfamiliar language overnight, and to come at it from a standpoint of curiosity and enjoyment.

It's not a competition, however determined I am to stay in the "Diamond League."  The process and the knowledge and the achievement should be the point, not a focus on some arbitrary standard of where I think I should be.

And some day, I'd like to visit the lovely country of Japan, and (maybe?) be able to converse a little in their language.  

[Image licensed under the Creative Commons Keihin Nike, Bunkyou Koishikawa Botanical Japanese Garden 1 (1), CC BY-SA 3.0]

When that day comes, I suspect if I can approach the whole thing with beginner's mind, I'll get a lot more out of the experience.  Until that time -- I could probably think of a few other aspects of my life that this principle could be applied to, as well.

****************************************



Friday, July 5, 2024

Twists and turns

One of the things I love the most about science is how one thing leads to another.

Someone notices something anomalous, and thinks to ask, "why?"  The answer to that question leads to more "whys" and "hows," and before long it's led you somewhere you never dreamed of, and opened up new vistas for understanding the universe.

Take, for example, the strange phenomenon of lunar swirls.

Swirls near Firsov Crater [Image is in the Public Domain courtesy of NASA/JPL]

Lunar swirls are pretty much what they sound like: undulating curls of light-colored rock and dust, often overlying craters and other topographic features, but seeming not to follow any obvious contour lines.  This is odder than it may appear at first.  We see lots of looping, curly stuff on Earth -- cirrus clouds, the twist of hurricanes and tornadoes, the meanders of rivers -- but all of those occur because of some fluid flowing, be it air or water vapor or liquid water.  The Moon has no atmosphere, and has never had flowing water; so what's causing the sinuous shapes?

The mystery deepened when lunar sampling missions found out that the light regions had somehow been magnetized.  This at least explained the color difference; the magnetized bits deflected the particles in the solar wind, causing them to hit nearby rocks instead.  This triggered a series of chemical reactions that darkened the rocks' surfaces, while the magnetized parts were spared and stayed light-colored.

But then the question was, how did the light-colored rocks get magnetized in the first place?

It happens easily enough on Earth; a lot of terrestrial rocks contain particles of magnetite (iron(II,III) oxide), and while the rock is molten the particles are free to move.  They respond like compass needles, aligning with the Earth's magnetic field, and when the lava cools the magnetite crystals are frozen in place, locking in a magnetic signature.  (You probably know that this property is how geologists found out that the Earth's magnetic field periodically flips -- something that was key to proving the plate tectonics model.)

The problem is twofold.  First, magnetite is rare in lunar rocks; second, and even more difficult to explain, the Moon has no global magnetic field.  So what are these magnetic crystals, and how did they align well enough to magnetize the rocks?

A possible answer was the subject of a paper this week in the Journal of Geophysical Research, describing a study out of Washington University.  A mineral called ilmenite, common on the Moon's surface, can react to form crystalline iron (which is highly magnetic).  As for how the crystals got aligned, the research team found a process that could produce enough of a magnetic field anomaly to do it -- if there was a flow of high-titanium magma underground.

"Our analog experiments showed that at lunar conditions, we could create the magnetizable material that we needed," said study co-author Michael Krawczynski. "So, it's plausible that these swirls are caused by subsurface magma...  If you're going to make magnetic anomalies by the methods that we describe, then the underground magma needs to have high titanium.  We have seen hints of this reaction creating iron metal in lunar meteorites and in lunar samples from Apollo.  But all of those samples are surface lava flows, and our study shows cooling underground should significantly enhance these metal-forming reactions."

So a formation on the lunar surface led to an inference about magnetism and the solar wind, and ultimately gave us information about the subsurface geology of the Moon.  I don't know about you, but I love this kind of stuff.  So many of us just look at things and shrug our shoulders, if we notice them at all.  And maybe that's what sets scientists apart; their capacity for seeing what the rest of us miss, and most importantly, wondering why things are the way they are.

It's pretty clear that science isn't just a list of vocabulary -- even though sadly, it's often taught that way.  Science is a verb.  As the brilliant polymath Jules Henri Poincaré put it, "Science is built up with facts as a house is with stones; but a collection of facts is no more a science than a heap of stones is a house."

****************************************



Thursday, July 4, 2024

The fork in the road

One of the most bizarre (and misunderstood) features of quantum physics is indeterminacy.

This is because we live in a macroscopic universe that -- most of the time, at least -- behaves in a determinate fashion.  Now, that doesn't mean we necessarily know everything about it.  For example, if we drop balls into a Galton board -- a device with a grid of pegs to deflect the ball's path -- eventually we'll get a normal distribution:

[Image licensed under the Creative Commons Matemateca (IME USP), Galton box, CC BY-SA 4.0]

With a device like a Galton board, we can accurately predict the probability of any given ball landing in a particular slot, but the actual path of the ball can't be predicted ahead of time.

Here's where the difficulty starts, though.  When people talk about quantum phenomena and describe them as probabilities, there's a way in which the analogy to macroscopic probability breaks down.  With a Galton board, our inability to predict a ball's path doesn't mean the system isn't completely deterministic; it comes from our (very) incomplete knowledge of the ball's initial state.  If you knew every last detail about the setup -- each ball's mass, spin, air resistance, elasticity, the angle and speed of release, the angle at which it strikes the first peg, as well as the position, shape, and composition of every peg -- at least in theory, you could predict with one hundred percent accuracy which slot it would land in.  The ball's path is completely controlled by deterministic Newtonian physics; it's only the complexity of the system and our lack of knowledge that makes it impossible to predict in practice.
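In fact, if all you care about is the statistics rather than the Newtonian detail, a Galton board is trivial to simulate.  Here's a minimal sketch in Python (a coin-flip model of the pegs, not a physics simulation):

    import random
    from collections import Counter

    def drop_ball(rows=12):
        # Coin-flip model: at each row of pegs the ball bounces right with
        # probability 1/2; the slot it lands in is the number of rightward bounces.
        return sum(random.random() < 0.5 for _ in range(rows))

    counts = Counter(drop_ball() for _ in range(10_000))
    for slot in range(13):
        print(f"slot {slot:2}: {'#' * (counts[slot] // 25)}")

Run it and the familiar bell curve appears: the slot counts follow a binomial distribution, which approaches the normal curve as the number of peg rows grows.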

This is not the situation with quantum systems.

When a particle travels from its source to a detector -- such as in the famous double-slit experiment -- it's not that the particle really and truly went through either slit A or slit B, and we simply don't happen to know which.  The particle, or more accurately, the wave function of the particle, took both paths at the same time, and how the detector is set up determines what we end up seeing.  Prior to being observed at the detector, the particle literally existed in all possible paths simultaneously, including ones passing through Bolivia and the Andromeda Galaxy.

To summarize the difference: in a determinate system, we may not be able to predict an outcome, but that's only because we have incomplete information about it.  In an indeterminate system, the probability field itself is the reality.  However tempting it is to say that a particle, prior to being observed, took a specific fork in the road and we just don't know which one, that picture completely misses the truth -- and misses how utterly bizarre the quantum world actually is.

People who object to this admittedly weird model of the world usually fall back on a single question, which is surprisingly hard to answer.  Okay, so on the one hand we have deterministic but complex systems, whose outcome is sensitively dependent on initial conditions (like the Galton board).  On the other, we have quantum systems which are probabilistic by nature.  How could we tell the difference?  Maybe in a quantum system there are hidden variables -- information about the system we don't have access to -- that make it appear indeterminate.  (This was Einstein's opinion, which he summed up in his famous statement that "God does not play dice with the universe.")

Unfortunately for Einstein, and for anyone else who is uncomfortable with the fact that the microscopic basis of reality is fundamentally at odds with our desire for a mechanistic, predictable universe, research at the Vienna University of Technology, described in a paper this week in Physical Review Letters, has shown that no such hidden-variable picture holds up.  Our reality is indeterminate.  The idea of particles having definite positions and velocities, independent of observation and measurement, is simply wrong.

The experiment hinges on something called the Leggett-Garg Inequality -- described in a 1985 paper by physicists Anthony James Leggett and Anupam Garg -- which clearly distinguishes between how classical (determinate) and quantum (indeterminate) systems evolve over time.  Correlations among measurements of the same system at three different times show a different magnitude depending on whether it is behaving in a classical or a quantum fashion.
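In its simplest form (this is the standard textbook statement of the inequality, not the specific protocol of the Vienna experiment), you measure a quantity Q that can only take the values +1 or -1 at three successive times, and compare the pairwise correlations:

    K \equiv C_{12} + C_{23} - C_{13} \le 1, \qquad C_{ij} = \langle Q(t_i)\, Q(t_j) \rangle

Any "macrorealist" theory -- one in which Q always has a definite value between measurements -- obeys the bound K ≤ 1, while quantum mechanics predicts values of K as large as 3/2.  Measure K above 1, and the classical picture is dead.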

The problem is, no one was able to figure out how to create a real-world test of it -- until now.  The team developed a neutron interferometer, which splits a neutron beam into two parts and then recombines it at a detector.  And the results of the experiment showed conclusively that contrary to our mental image of neutrons as hard little BBs, which of course would have to take either the left-hand or the right-hand path, every single neutron took both paths at the same time.  This violates the Leggett-Garg Inequality and is a crystal-clear hallmark of an inherently indeterminate system.

"Our experiment shows that nature really is as strange as quantum theory claims," said study co-author Stephan Sponar.  "No matter which classical, macroscopically realistic theory you come up with, it will never be able to explain reality.  It doesn't work without quantum physics."

Now, mind you, I'm not saying I completely understand this.  As Richard Feynman himself put it, "I think we can safely say that no one understands quantum physics."  (And if the great Feynman could say this, it doesn't leave much room for a rank amateur like me to pontificate about it.)  But perhaps the most fitting way to end is with a quote by the brilliant biologist J. B. S. Haldane: "The world is not only queerer than we suppose, it is queerer than we can suppose."

****************************************



Wednesday, July 3, 2024

Birdwalking through life

I have sometimes compared the sensation inside my brain to riding the Tilt-A-Whirl backwards.

I've had a combination of an extremely short attention span and insatiable curiosity since I was a kid.  I still remember when I was about ten and my parents splurged on a complete set of the Encyclopaedia Britannica, figuring (rightly) that it would come in handy during my education.  What they didn't figure on was my capacity for getting completely and inextricably lost in it.  I'd start out looking up some fact -- say, what year James Madison was elected president -- then get distracted by a nearby entry and head off toward that, and before you knew it I was sitting on the living room floor with a dozen of the volumes open to articles having to do with a string of only vaguely-connected topics.  I could start out with James Madison and end up in an entry for the flora and fauna of Cameroon, with no real idea how I'd gotten from one to the other.

That facet of my personality hasn't changed any in the intervening five-odd decades.  I still birdwalk my way through the world, something regular readers of Skeptophilia undoubtedly know all too well (and if you are a regular reader, thank you for putting up with my whirligig approach to life).  Now, of course, I don't need an Encyclopaedia Britannica; the internet is positively made for people like me, to judge by the winding path I took just yesterday.

It all started when I was doing some research into the origin of the word cynosure (meaning "something attention-getting, a guidepost or focal point") for my popular feature "Ask Linguistics Guy" over on TikTok.  I was pretty certain that the word came from the Greek κυνός, meaning "dog," but I wasn't sure of the rest of the derivation.  (I was right about κυνός, but for the rest of the story you'll have to check out my video, which I'll post later today on TikTok.)

But while looking up cynosure my eye was caught by the preceding entry in the etymological dictionary, cynocephaly.  Which means "having a dog's head."

Fig. 1: an example of cynocephaly.  Of course, he's kind of cyno-everything, so it probably doesn't count.  And if you are thinking that I'm only using this as an excuse to post a photograph of my extremely cute puppy, you're on to me.

More commonly, though, cynocephaly refers to having a dog's head on a human body, and it was apparently a fairly common belief back in the day that such beings existed.  In the fifth century B.C.E. the Greek writer Ctesias of Cnidus wrote a book in which he claimed that there was a whole race of cynocephalic people in India, which he was free to say because he'd apparently never been there and neither had any of his readers.  Other writers said that the Cynocephali lived in Libya or Serbia or Finland or Sumatra; you'd think the fact that none of those places are close to each other would have clued them in that there was something amiss, but no.  There was even a discussion in the ninth century, launched amongst the church fathers by a theologian named Ratramnus of Corbie, about whether dog-headed people would have eternal souls or not, because if they did, it was incumbent upon the Christians to find them and preach the Gospel to them.

As far as I know, this discussion came to nothing, mostly because the Cynocephali don't exist.

In any case, this got me on the track of looking into the attitudes of the medievals toward dogs, and my next stop was the story of Saint Guinefort.  If you've never heard of Saint Guinefort, I'm sure you're not alone; he was never officially beatified by the Catholic Church, because he's a dog.  The legend goes that a knight near Lyon had a greyhound named Guinefort, and he left his infant son in the care of the dog one day (that's some solid parenting, right there).  Well, when the knight returned, the cradle was overturned, and Guinefort's jaws were dripping blood.  The infuriated knight pulled his sword and killed the dog, assuming Guinefort had killed the baby.  Only then did he think to turn the cradle over (a real genius, this knight) -- and there was the baby, safe and sound, along with a dead viper covered with dog bites.  So the knight felt just terrible, and erected a shrine to Guinefort, who was venerated in the area as a saint, despite the local priests saying "Hey, you can't do that!" and even threatening to fine people who came there to pray.  The whole episode supposedly happened in the thirteenth century -- but people were still bringing their sick children to be blessed by Saint Guinefort in the 1940s!

From there I started looking into folklore surrounding protectors of children, and after several more jumps that I won't bore you with, I ended up reading about the mythical monster called Coco (or Cucuy) from Spain and Portugal.  The Coco is a hooded figure that is supposed to haunt houses with children, sometimes appearing only as a stray shadow cast by no physical object.  (Shades of the pants-wettingly terrifying Star Trek: The Next Generation episode "Identity Crisis," which if you haven't watched I highly recommend -- only don't watch it while you're alone.)

Fig. 2: "Wait a moment... whose shadow is that?"  *shudder*

Anyhow, the idea is that El Coco particularly goes after disobedient children, so the legend probably started as a way for parents to get their kids to behave.  The problem with these kinds of stories, though, is that it's a fine line between scaring kids enough to obey the rules and scaring them so much they refuse to sleep, which is why there are lullabies about keeping the Coco away.  Some are barely better than the legend itself:

Duérmete niño, duérmete ya...
Que viene el Coco y te comerá

(Sleep, child, sleep now...
For the Coco is coming and will eat you)
I don't know about you, but that would have pacified the absolute shit out of me when I was four years old.  I would have been so pacified I wouldn't have closed my eyes until I was in my mid-twenties.  Then there's this one, from Portugal:
Vai-te Coco. Vai-te Coco
Para cima do telhado
Deixa o menino dormir
Um soninho descansado
Dorme neném
Que a Coco vem pegar
Papai foi pra roça
Mamãe foi trabalhar


(Go away, Coco, go away, Coco
Up to the top of the roof
Let the little boy sleep
A peaceful little sleep
Sleep, little baby
Or the Coco will come and get you
Daddy went to the fields
Mommy went to work)
Because there's nothing like "hey, kid, your parents are gone, so you're on your own if the monsters come" to get a child to settle down.  Maybe they should have hired a greyhound or something.

Fig. 3: Que Viene el Coco, by Goya (1799).  The mom looks like she's about to say, "You can have the kids, I'm getting right the fuck outta here."  [Image is in the Public Domain]

In the "See Also" listings at the bottom of the page for El Coco was an entry for Madame Koi-Koi, who sounded interesting (and whom I had also never heard of).  So that was my next stop.  Turns out Madame Koi-Koi is -- and I am not making up the wording -- "one of the most popular boarding school ghosts in Nigeria, Ghana, and South Africa."  Myself, I wouldn't have thought there were enough boarding school ghosts to turn it into a competition, but shows you what I know.  Supposedly Madame Koi-Koi is the ghost of a wicked teacher who was killed by her own students because of her cruelty, and now she haunts schools.  She always wears high heels -- "Koi-Koi" is apparently imitative of the sound her heels make on the floor -- so at least you can hear her coming.  Her favorite thing is to corner students in the bathroom for some reason, especially at night.

Getting up to pee at two a.m. is a fraught affair in many African boarding schools.

Anyhow, I suppose I've recounted enough of my wanderings.  I'd like to tell you that I stopped there and then went and did something productive, but that would be a lie.  But at least you have a sense of what it's like in my head 24/7.

I hope you enjoyed the ride.  At least you can get off.

****************************************



Tuesday, July 2, 2024

Measure for measure

In yesterday's post we looked at one bizarre human obsession, which is drawing lines all over the place and pretending they represent something real.  Today we're going to look at another, which is our penchant for quantifying everything.

Certainly, accurate measurement is critical in science; data, for the most part, is numerical, and most models these days are mathematical representations of reality.  But still, there's a strange aspect to it, which British science historian James Burke got at in his brilliant series The Day the Universe Changed:

[T]he structural view of things at the time controls what science does at every level.  From the cosmic questions about the whole universe, to what bits of that universe are worth investigating, to how far you let the questions take you, what experiments to do, what evidence you can and can't accept.  And down at that detailed level, the control still operates, because it even tells you what instruments you should use.  And of course, at this stage, you're looking for data to prove your theory, so you design the kind of instruments to find the kind of data you reckon you're going to find.  The whole argument comes full circle when you get the raw data itself.  Because it isn't raw data.  It's what you planned to find from the start.

He goes on to make the important point that true leaps in understanding occur when the unexpected occurs, and some piece of the data doesn't fit with the existing model; then (assuming the data are verified and found to be correct), there's no choice but to revise the model -- or trash it entirely and start over.

[Image is in the Public Domain]

But what this has done is created a morass of different units of measurement, and I'm not referring solely to my own country's pig-headed insistence on avoiding the use of the metric system.  Imperial units -- feet, miles, pounds, quarts, and so on -- are certainly cumbersome (check out this hilarious video if you want to find out just how awkward they are), but they're not the weirdest ways that humans have chosen to subdivide the natural world.  So for your edification, here are a few of the stranger units of measurement I've run into:

  • the micromort -- defined as a one-in-a-million chance of death.  For example, smoking a cigarette and a half increases your chance of dying by about one micromort.
  • a jiffy is 1/60 of a second, from the vertical refresh period on NTSC analog video hardware running on American (60 Hertz) alternating current.  So next time someone tells you, "I'll be back in a jiffy," you can confidently respond, "I seriously doubt that."
  • so many people in Britain publicly compared the areas of geographical regions to the size of Wales that it led to a unit of area, the nanowales -- one billionth the area of Wales, or about 20.78 square meters.
  • the Sverdrup, named after Norwegian oceanographer Harald Sverdrup, at least has its basis in metric units.  It's a unit of flow rate, equal to one million cubic meters per second.  Being as huge as it is, you might imagine it has limited utility -- in fact, it's pretty much only used in oceanography and meteorology.  (For reference, the flow rate of the Gulf Stream varies between 30 and 150 Sverdrup, depending on where you measure it and what you consider its boundaries to be.)
  • the dolor is a unit of pain, defined as the just-noticeable difference between two levels of pain.  Because pain is so subjective, the unit has never been widely accepted in the medical community.
  • a millihelen is a unit of beauty, named after Helen of Troy -- the amount of beauty required to launch one ship.
  • when I taught dimensional analysis in physics, I had students practice converting from one set of units to another -- a useful skill when doing science.  I always made a point of having them convert velocities from meters per second to furlongs per fortnight, which firmly cemented in their brains that I have a screw loose.  (For what it's worth, a furlong is 660 feet, or about 201.17 meters; a fortnight is fourteen days, so 1,209,600 seconds.  Thus, the speed of light is about 1.8 terafurlongs per fortnight, a factoid you can bring out at the next cocktail party you attend, especially if you want people to find ways to avoid you for the rest of the evening.)  There's a quick sanity check of this conversion in the sketch following this list.
  • one mickey is the smallest resolvable movement possible with a computer mouse.  Most of them have a sensitivity of about five hundred mickeys per inch.
  • a Smoot is a unit of length, named after MIT student Oliver R. Smoot.  The story is that one day in 1958, Smoot got falling-down drunk, and his buddies (who were also snookered but not as badly as Smoot was) were basically dragging him home, and decided to measure the length of the Harvard Bridge in Smoot-lengths (about 170 centimeters).  The bridge, they found, was 364.4 Smoots in length plus a little bit, so there's now a plaque saying "364.4 Smoots and an ear" on the bridge.  (Smoot went on, I shit you not, to be the chairperson of the American National Standards Institute and president of the International Organization for Standardization.  Talk about being destined for a particular career.)
  • the weirdest unit of volume I've ever heard of is the Hubble-barn.  This combines the Hubble length -- the radius of the known universe -- with a unit of area called the barn, which is used to measure the scattering cross-section of atomic nuclei and is equal to 10^-28 square meters.  One Hubble-barn is the volume of a rectangular solid that has a square face with an area of one barn stretching across the entire known universe.  If you do the calculation, it's way less volume than you'd think -- on the order of 13.1 liters.
  • last, we have the ohnosecond, which is the time elapsed between making a mistake and recognizing it, such as pressing "send" on an email describing details of some illicit but highly pleasurable activities you want to experience with a coworker with whom you're having a clandestine dalliance, and realizing one beat too late that you hit "Reply All."
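If you'd like to check a couple of those conversions yourself, here's a quick back-of-the-envelope sketch in Python (my own arithmetic, using the values quoted in the list above, including one common estimate of the Hubble length):

    FURLONG_M   = 201.168             # 660 feet, in meters
    FORTNIGHT_S = 14 * 24 * 3600      # fourteen days, in seconds
    C_M_PER_S   = 2.998e8             # speed of light, meters per second

    # meters/second -> furlongs/fortnight: divide by meters-per-furlong,
    # multiply by seconds-per-fortnight
    c_ff = C_M_PER_S / FURLONG_M * FORTNIGHT_S
    print(f"c is about {c_ff:.2e} furlongs per fortnight")   # ~1.80e+12

    HUBBLE_LENGTH_M = 1.31e26         # the Hubble length, in meters
    BARN_M2         = 1e-28           # one barn, in square meters

    # volume of the Hubble-barn prism, converted from cubic meters to liters
    liters = HUBBLE_LENGTH_M * BARN_M2 * 1000
    print(f"one Hubble-barn is about {liters:.1f} liters")   # ~13.1

Both claims check out: about 1.8 terafurlongs per fortnight, and a shade over thirteen liters.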

So there you have it -- some ways to measure the world, some serious, some not so much.  In any case, I'd better wrap this up.  So far I've had only about 0.02 Hubble-barns of coffee, so I'm moving at a velocity of around a furlong per fortnight.  I should post this, and hope that there are at least a few ohnoseconds between hitting "Publish" and seeing what I've wrought.

****************************************