Skeptophilia (skep-to-fil-i-a) (n.) - the love of logical thought, skepticism, and thinking critically. Being an exploration of the applications of skeptical thinking to the world at large, with periodic excursions into linguistics, music, politics, cryptozoology, and why people keep seeing the face of Jesus on grilled cheese sandwiches.

Saturday, April 13, 2024

The stowaways

Aficionados of the Star Trek universe undoubtedly recall the iconic character Jadzia Dax.  Dax was a Trill -- a fusion of a humanoid host and a strange-looking brain symbiont.  The union of the two blended their personalities, resulting in what was truly a new, composite life form.


Star Trek is amazing in a lot of ways, not least because of its attention to current science and an uncanny prescience about where science is heading.  It turns out that we're all composite life forms.  We carry around something like 39 trillion bacterial cells in and on our bodies -- more than the number of human cells we have -- and the vast majority of them are either commensals (neither helpful nor harmful) or actually beneficial.  Each of our cells also contains mitochondria, the descendants of endosymbiotic bacteria that have inhabited the cells of eukaryotes for well over a billion years, and without which we couldn't release energy from our food molecules.  Plants have not only mitochondria but chloroplasts -- yet another lineage of bacteria that, like mitochondria, have their own DNA, took up residence in their hosts over a billion years ago, and have been there ever since.

But the rabbit hole goes a hell of a lot deeper than that.  By some estimates, between five and eight percent of our genomes are endogenous retroviruses -- genetic fragments left behind by viruses that spliced their DNA into ours.  Like our bacterial hitchhikers, a good many of these are either neutral or beneficial; for example, the production of bile, estrogen, and several proteins essential for the formation of the placenta is directly affected by endogenous retroviral genes.  A few do seem to be deleterious, and have roles in certain cancers, autoimmune diseases, and neurological disorders like ALS and schizophrenia.

What brings this topic up is an astonishing study led by Tyler Coale, of the University of California, Santa Cruz, that came out in the journal Science this week.  Coale's study found that there's yet another example of endosymbiosis -- this one a lot more recently evolved -- which turned a formerly free-living nitrogen-fixing bacterium into a true cellular organelle.

Nitrogen is critical for the production of both proteins and DNA.  Although 78% of the air we breathe is nitrogen, it's completely useless to us; we breathe it right back out.  All the nitrogen in our bodies' proteins and nucleic acids had to pass through a food chain that started with nitrogen-fixing bacteria, the only known organisms that can absorb nitrogen from the air and convert it to an organic compound.  Leguminous plants like beans, peas, alfalfa, and clover have a nifty symbiotic arrangement with nitrogen-fixing bacteria; they create nodules in their roots where the bacteria live, and the bacteria provide the plants with a ready source of nitrogen.
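
For the chemically inclined, the overall reaction catalyzed by the bacterial enzyme nitrogenase is usually summarized like this (a textbook equation, not something from the Coale paper):

$$\mathrm{N_2 + 8\,H^+ + 8\,e^- + 16\,ATP \;\longrightarrow\; 2\,NH_3 + H_2 + 16\,ADP + 16\,P_i}$$

The ammonia is then assimilated into amino acids -- and that's the doorway through which essentially all the nitrogen in living things enters the food chain.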

But in legumes, the two remain independent organisms.  What Coale and his colleagues discovered is a species of algae (Braarudosphaera bigelowii) in which the bacteria (UCYN-A) have evolved to become inseparable from the host cells.  In other words, they became an organelle, just like mitochondria and chloroplasts.

Although there's no canonical definition of organelle, most biologists include two must-haves: (1) coordinated division of the organelle within the cell; and (2) the evolution of a transport system that allows for specific tagging and importation of proteins into the organelle.  By those standards, UCYN-A is definitely an organelle.  

"Both boxes are checked by Coale," said Jeff Elhai, microbiologist at Virginia Commonwealth University.  "Even to the semantic purists, UCYN-A must be counted as an organelle, joining mitochondria, chloroplasts and chromatophores."

All these stowaways, in the cells of just about every living thing on Earth, call into question what exactly we mean not only by the word organelle but by the word organism.  The high-school-biology-class definition of an organism is "an individual life form of a species."  But is there any such thing?  The ostensibly individual life form called Gordon who is currently writing this post is made of (at least) equal numbers of human cells and cells from different species of bacteria, without many of which I'd be sick as hell, or possibly even dead.  Remove the symbiotic mitochondria from within my cells, and I'd definitely be dead -- within minutes.  Deeper still, at a minimum, one in twenty of the genes in my "human DNA" comes from viruses and bacteria.

Looked at closely, I'm as much a collection of spare parts as the Junk Man in Lost in Space.  Fortunately, I appear to run a bit more smoothly most days than he did.


In any case, calling me "a single organism" is so far from accurate it's almost laughable.

Honestly, it's kind of cool how interconnected everything is.  Back in the days of the first serious taxonomist, Swedish biologist Carl Linnaeus, scientists had the idea that all living things were categorizable into neat little cubbyholes.  Not only is that incorrect on the species level (something I wrote about in detail a couple of years ago), it's not even true on the individual level or on the level of genomes.  Life on Earth is a huge, tangled skein of threads.  The whole thing puts me in mind of a quote from John Muir: "Tug at a single thing in nature, and you find that it is hitched to everything else in the universe."

****************************************



Friday, April 12, 2024

The kakistocracy

Today I'd like to look at the state of Arizona, where this week a 4-2 decision by the state's Supreme Court made abortions illegal in any circumstance except to save a woman's life -- practically speaking, making them illegal period, because few doctors will want to risk their livelihood (or their freedom) based on whether a court will decide a particular abortion was a medical necessity.

This decision caused the state law to revert to a code passed in 1864 -- decades before women even had the right to vote.  It's an interesting historical filigree that the man who pushed the 1864 law through in the first place, then Speaker of the House for the Arizona Territory W. Claude Jones, was a notorious adulterer, philanderer, liar, and pedophile (he openly called himself a "pursuer of nubile females"), whose victims included a twelve-year-old Mexican girl and a fifteen-year-old who had recently arrived with her parents from Texas.  The court's decision also flies in the face of the fact that such restrictions are wildly unpopular; in a 2023 poll, only thirteen percent of Americans responded that abortion should be illegal in all circumstances, and just over sixty percent said that the United States Supreme Court's Dobbs decision (which overturned Roe v. Wade) was "a bad thing."

What's striking is that despite the fact that the majority of American citizens are at least pro-choice in some circumstances, they keep electing people who are somewhere to the right of Tomás de Torquemada.  Take, for example, Arizona State Senator Anthony Kern, who crowed, "Looks like our prayer team stirred up some God-haters," and led a prayer circle on the floor of the Senate in which -- I shit you not -- he "spoke in tongues."

Is it just me, or do these people sound like this?


A point I've made (many times) here in Skeptophilia is that I have no issue with what you believe, as long as you don't use those beliefs as a hammer to force others to comply.  On the other hand, I am under no obligation to refrain from saying those beliefs are ridiculous, especially when you make a point of exhibiting them in public.

Put another way: I always try to respect people, but ideas only deserve respect if they make sense and honor other people's rights.

A few days ago I saw a post on social media where a guy took exception to those of us who were making fun of Rapture-believers who thought the total eclipse on Monday was a sign of the End Times.  "Most Rapture-believers don't think that," he said (despite the fact that people like Marjorie Taylor Greene stated that the eclipse was a "sign from God to repent"), then sniffed, "People who are making fun of Rapture-believers are actually making fun of themselves."

Um, no.  We're actually making fun of the Rapture-believers.  If you hold silly beliefs, you can't blame other people for laughing.

The whole problem escalates when these people are elected to public office, and start using their bizarre worldviews to drive policy.  For example, a bill just passed the Louisiana House that would require all public school classrooms to post the Ten Commandments.  (And before you @ me about how the Ten Commandments are just guides to good behavior, and apply regardless of whether you're religious or not, allow me to remind you that the First Commandment is "I am the Lord thy God; you shall have no other gods before me.")  Another proposed bill in my former home state, HB777, would make it a criminal offense for a librarian to belong to the American Library Association -- because libraries have long stood for free access to information, which is absolutely anathema to the Far Right.  (Also because the ALA has championed the availability of books representing racial and LGBTQ+ diversity; apparently we can't have the world knowing there are people who aren't straight white Christians.)

I can only hope that Americans are becoming aware of the extent to which people who proudly espouse loony beliefs have taken control of the government, and that this will galvanize voters to turn out for the election this November.  I'm not talking about true conservatives (people like former congressman Joe Walsh) -- although I may not agree with him about all that much, I could have a reasonable discussion with him.  But I have zero common ground with irrational religious ideologues like current Speaker of the House Mike Johnson, and snarling hypocrites like Lauren Boebert, who publicly stated that she's all about "family values" and is "tired of this separation of church and state junk" but who apparently thinks it's A-OK to give her boyfriend a handjob in a public theater.

We have allowed ourselves to be controlled by a group of men and women whose outsized impact on our laws far exceeds their numbers.  We can turn this around -- but only if people get themselves to the polls.  We don't need elected officials like Anthony Kern babbling, "Ickety ackety ooh aah aah," then claiming those are God's words saying what a Very Good Boy He Is.  We need people capable of reasoned discourse, who -- even if they disagree -- can present their arguments based on facts and logic, not on some bizarre set of beliefs that make about as much sense as claiming that the universe is being controlled by a Giant Green Bunny From The Andromeda Galaxy.

Which means that we need to vote.  All of us.  Our system is far from perfect, but this year the choice is stark.  (Maybe it always is.)  There's a word, built from Greek roots, for the direction we're heading: kakistocracy -- government by the worst, the most unfit, or the most unscrupulous.  Remember the quote attributed to Plato: "The price of apathy toward public affairs is to be ruled by those who are actively evil."

Or, in the case of Anthony Kern, flat-out insane.  

****************************************



Thursday, April 11, 2024

Requiem for a visionary

I was saddened to hear of the death of the brilliant British physicist Peter Higgs on Monday, April 8, at the grand old age of 94.  Higgs is most famous for his proposal in 1964 of what has since come to be known as the "Higgs mechanism" (he was far too modest a man to name it after himself; that was the doing of colleagues who recognized his genius).  This springboarded off work by the Nobel Prize-winning Japanese physicist Yoichiro Nambu, who was researching spontaneous symmetry breaking -- Higgs's insight was to see that the same process could be used to argue for the existence of a previously unknown field, the properties of which seemed to explain why ordinary particles have mass.
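
In rough textbook form (this is the standard toy-model sketch of the mechanism, not Higgs's original 1964 notation), the idea runs like this.  Give a field $\phi$ a potential whose minimum sits away from zero,

$$V(\phi) = -\tfrac{1}{2}\mu^2\phi^2 + \tfrac{1}{4}\lambda\phi^4,$$

so the field settles into a nonzero vacuum value $v = \mu/\sqrt{\lambda}$; the symmetry of the potential is "spontaneously broken" by the choice of minimum.  A gauge field $A_\mu$ coupled to $\phi$ with strength $g$ then inherits a mass term from that vacuum value,

$$\tfrac{1}{2}g^2\phi^2 A_\mu A^\mu \;\longrightarrow\; \tfrac{1}{2}(gv)^2 A_\mu A^\mu, \qquad m_A = gv,$$

and the leftover fluctuation of $\phi$ around $v$ is itself a massive spin-zero particle: the Higgs boson.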

This was a huge leap, and by Higgs's own account, he was knocking at the knees when he presented the paper at a conference.  But it passed peer review and was published in the journal Physical Review Letters, and afterward stood up to repeated attempts to punch holes in its logic.  His argument required the existence of a massive spin-zero boson -- now known as the Higgs boson -- and he had to wait 48 years for it to be discovered at CERN by the ATLAS and Compact Muon Solenoid (CMS) experiments.  (The theory never pinned down the boson's mass, which is part of why the search took so long.)  When informed that the particle had finally been found, he responded with his typical humility, saying, "It's really an incredible thing that it's happened in my lifetime."

It surprised no one when he won the Nobel Prize in Physics the following year (2013).

Higgs at the Nobel Prize Awards Ceremony [Image licensed under the Creative Commons Bengt Nyman, Nobel Prize 24 2013, CC BY 2.0]

Higgs, however, was a bit of an anachronism.  He was a professor at Edinburgh University, but refused to buy into the competitive, grant-seeking, paper-production culture of academia.  He was also famously non-technological; he said he'd never sent an email, used a cellphone, or owned a television.  (He did say that he'd been persuaded to watch an episode of The Big Bang Theory once, but "wasn't impressed.")  He frustrated the hell out of the university's administration, responding to demands for a list of recent publications with the word "None."  Apparently it was only hedging on the administrators' part -- well-founded, as it turned out -- that kept him on the payroll.  "He might get a Nobel Prize at some point," one of them said.  "If not, we can always get rid of him."

In an interview, Higgs said that he'd never get hired in today's academic world, something that is more of an indictment against academia than it is of Higgs himself.  "It's difficult to imagine how I would ever have enough peace and quiet in the present sort of climate to do what I did in 1964," he said.  "After I retired it was quite a long time before I went back to my department.  I thought I was well out of it.  It wasn't my way of doing things any more.  Today I wouldn't get an academic job.  It's as simple as that.  I don't think I would be regarded as productive enough."

Reading about this immediately made me think about the devastating recent video by theoretical physicist Sabine Hossenfelder, a stinging takedown of how the factory-model attitude in research science is killing scientists' capacity for doing real and groundbreaking research:

It was a rude awakening to realize that this institute [where she had her first job in physics research] wasn't about knowledge discovery, it was about money-making.  And the more I saw of academia, the more I realized it wasn't just this particular institute and this particular professor.  It was generally the case.  The moment you put people into big institutions, the goal shifts from knowledge discovery to money-making.  Here's how this works:

If a researcher gets a scholarship or research grant, the institution gets part of that money.  It's called the "overhead."  Technically, that's meant to pay for offices and equipment and administration.  But academic institutions pay part of their staff from this overhead, so they need to keep that overhead coming.  Small scholarships don't make much money, but big research grants can be tens of millions of dollars.  And the overhead can be anything between fifteen and fifty percent.  This is why research institutions exert loads of pressure on researchers to bring in grant money.  And partly, they do this by keeping the researchers on temporary contracts so that they need grants to get paid themselves...  And the overhead isn't even the real problem.  The real problem is that the easiest way to grow in academia is to pay other people to produce papers on which you, as the grant holder, can put your name.  That's how academia works.  Grants pay students and postdocs to produce research papers for the grant holder.  And those papers are what the supervisor then uses to apply for more grants.  The result is a paper-production machine in which students and postdocs are burnt through to bring in money for the institution...

I began to understand what you need to do to get a grant or to get hired.  You have to work on topics that are mainstream enough but not too mainstream.  You want them to be a little bit edgy, but not too edgy.  It needs to be something that fits into the existing machinery.  And since most grants are three years, or five years at most, it also needs to be something that can be wrapped up quickly...

The more I saw of the foundations of physics, the more I became convinced that the research there wasn't based upon sound scientific principles...  [Most researchers today] are only interested in writing more papers...  To get grants.  To get postdocs.  To write more papers.  To get more grants.  And round and round it goes.

You can see why a visionary like Peter Higgs was uncomfortable in today's academia (and vice versa).  But it's also horrifying to think about the Peter Higgses of this generation -- today's up-and-coming scientific groundbreakers, who may not ever get a chance to bring their ideas to the world, sandbagged instead by a hidebound money-making machine that has amplified "publish-or-perish" into "publish-or-never-get-started."

In any case, the world has lost a gentle, soft-spoken genius, whose unique insights -- made at a time when the academic world was more welcoming to such individuals -- completed our picture of the Standard Model of particle physics, and whose theories led to an understanding of the fundamental properties of matter and energy we're still working to explore fully.  Ninety-four is a respectable age in pretty much anyone's opinion, but it's still sad to lose someone of such brilliance, who was not only a leading name in pure research, but unhesitating in pointing out the problems with how science is done.

It took 48 years for his theory about the Higgs mechanism to be experimentally vindicated; let's hope his criticisms of academia have a shorter gestation period.

****************************************



Wednesday, April 10, 2024

Ill winds

When you think about it, wind is a strange phenomenon.

In its simplest form, wind occurs when uneven heating of the surface of the Earth causes higher pressure in some places than in others, and the air flows from highs to lows.  But it's considerably more complex (and interesting) than that, because as surface-dwellers we often forget that there's a third dimension -- and that air can move vertically as well as horizontally.

I got to thinking of this because I've been reading Eric Pinder's fascinating, often lyrical book Tying Down the Wind: Adventures in the Worst Weather on Earth.  Pinder is a meteorologist who served as a weather observer on Mount Washington, New Hampshire, which clocks hurricane-force winds (greater than 119 kilometers per hour) one day in every three, and which holds second place for the highest anemometer-measured wind speed ever recorded on the Earth's surface: an almost unimaginable 372 kilometers per hour.  (The only higher reading came from Barrow Island, Australia, which on April 10, 1996, during Cyclone Olivia, hit 407 kilometers per hour.)

The fact that air moves vertically, of course, is why air moves horizontally.  When the Sun heats a patch of ground, the air above it warms and becomes less dense, causing it to rise.  This creates an area of low pressure, and air moves in from the side to replace the air moving upward.  This process, writ large, is what causes hurricanes; the heat source is the ocean, and the convection caused by that tremendous reservoir of heat energy not only generates wind, but when the water-vapor-laden air rises high enough, it undergoes adiabatic cooling, triggering condensation, cloud formation -- and torrential rain.
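
For a sense of scale (this is the standard meteorology formula, not anything specific to Pinder's book): a rising parcel of dry air cools at the dry adiabatic lapse rate,

$$\Gamma_d = \frac{g}{c_p} = \frac{9.8\ \mathrm{m/s^2}}{1004\ \mathrm{J/(kg \cdot K)}} \approx 9.8\ \mathrm{K\ per\ kilometer},$$

while saturated air cools more slowly -- roughly 6 K per kilometer -- because condensation keeps releasing latent heat into the parcel.  That difference is a big part of what powers a storm.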

The process can go the other direction, though.  A weather phenomenon that has long fascinated me is the convective microburst, which most often happens in hot, dry climates in midsummer, like the American Midwest.  The process goes something like this.  Rising air triggers cloud formation, and ultimately rain clouds.  When the droplets of water become heavy enough that the downward force of gravity exceeds the upward force of the updrafts, they fall -- but they drop into the layer of warm, dry air near the surface, so they evaporate on the way down, often never reaching the ground as rain.  Evaporation cools the surrounding air, making it denser -- and if the process happens fast enough, it creates a blob of air so much denser than its surroundings that it literally falls out of the sky, hits the ground, and explodes outward.  Wind speeds can go from nothing to 100 kilometers per hour in about fifteen seconds.  Then -- a couple of minutes later -- it's all over; the dust (and any airborne objects) settle back to Earth, and everyone in the vicinity staggers around trying to figure out what the hell just happened.
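
Here's a back-of-the-envelope sketch in Python (my own illustration, not from Pinder's book; the temperatures and fall height are assumptions picked for round numbers) of how a modest amount of evaporative cooling becomes a violent downdraft:

```python
import math

G = 9.81  # gravitational acceleration, m/s^2

def buoyant_acceleration(t_blob_k: float, t_env_k: float) -> float:
    """Downward acceleration of a cold blob surrounded by warmer air.
    For an ideal gas at equal pressure, g * (rho_blob - rho_env) / rho_blob
    reduces to g * (t_env - t_blob) / t_env."""
    return G * (t_env_k - t_blob_k) / t_env_k

t_env = 305.0   # a hot midsummer afternoon, about 32 C
t_blob = 295.0  # the same air after ~10 K of evaporative cooling

a = buoyant_acceleration(t_blob, t_env)
print(f"acceleration: {a:.2f} m/s^2")  # ~0.32 m/s^2

# Even that modest acceleration, acting over a ~1500 m fall (and ignoring
# drag and mixing, so this is an upper bound), builds up a lot of speed:
v = math.sqrt(2 * a * 1500.0)
print(f"impact speed: {v:.0f} m/s (~{v * 3.6:.0f} km/h)")  # ~31 m/s, ~112 km/h
```

The real outflow gets additional help from the weight of the falling precipitation itself, so a surface gust in the neighborhood of 100 kilometers per hour is entirely plausible.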

A convective microburst in Nebraska [Image licensed under the Creative Commons Couch-scratching-cats, Downburst 1, CC BY-SA 4.0]

Microbursts aren't the only weird weather phenomenon having to do with density flow.  Have you heard of katabatic winds?  If you haven't, it's probably because you live in an area where they don't happen, because they're really dramatic where they do.  Katabatic winds (from the Greek κατάβασις, "descent") occur when a layer of air aloft is chilled significantly -- on top of a mountain, for example, or (even better) over an ice sheet.  This raises the density of the air mass, creating a huge difference in gravitational potential energy from high to low.  The superchilled air pours downward, funneling through any gaps in the terrain; the effect is accentuated when there's a low pressure center nearby.  The katabatic winds off Antarctica (nicknamed "Herbies," for no reason I could find) and the ones off Greenland (known by the Inuit name piteraq) can be unpredictable, fast, and frigid, often driving layers of snow horizontally and creating sudden whiteout conditions.

Then there's the foehn (or föhn) wind, created when onshore air flow is pushed up against a mountain range.  This occurs in the southern Alps, central Washington and Oregon, parts of Greece and Turkey, and south-central China.  On the windward side of the mountains, the air rises and cools; this causes condensation and higher rainfall.  But when the air piles up and gets pushed over the mountain passes, it warms for two reasons: it's compressed as it descends on the other side, and it carries the heat energy that was released when its water vapor condensed on the way up.  The result is a warm, dry wind that pours downhill on the leeward side of the mountains -- the source of the "Chinook winds" that desiccate the northwestern United States east of the Cascades.
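
The classic back-of-the-envelope calculation (my numbers, chosen for illustration; actual lapse rates and crest heights vary) makes the effect concrete.  Saturated air ascending the windward side cools at roughly 6 K per kilometer, but descends the leeward side dry, warming at roughly 9.8 K per kilometer, so over a 2,000-meter range:

$$\Delta T \approx (9.8 - 6)\ \mathrm{K/km} \times 2\ \mathrm{km} \approx +8\ \mathrm{K}.$$

The air arrives at the foot of the lee slope about eight degrees warmer -- and far drier -- than it was at the same altitude on the windward side.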

Interestingly, foehn winds are associated with physiological problems -- headaches, sinus problems, and mood swings.  It's documented that prescriptions for anxiolytic medications go up when the foehn is blowing, and a study at the Ludwig Maximilians Universität München found that suicide and accident rates both rise by about ten percent during periods when there's a strong foehn.  No one knows exactly why.

In any case, those are a few interesting tidbits about a phenomenon we usually don't think about unless we're in the path of a hurricane or tornado.  Something to think about next time your face is brushed by a warm breeze.  We live at the bottom of a layer of moving fluid, driven by invisible forces that are usually benign.  Only occasionally do we see how powerful that fluid can be -- preferably, from a safe distance.

****************************************



Tuesday, April 9, 2024

Music of the heart

I've wondered for years why certain pieces of music elicit such a powerful emotional response.

Partly that's because I react powerfully myself, and kind of always have.  I vividly remember being about fifteen years old and being moved to tears the first time I heard Ralph Vaughan Williams's Fantasia on a Theme by Thomas Tallis:


Well, "moved to tears" is kind of an understatement.  "Sobbing" or "bawling" would be closer to the mark.

Then, there's the first time I heard the moment when the sedate, tranquil "Quoniam Tu Solus Sanctus" in J. S. Bach's Mass in B Minor suddenly launches into the wild, triumphant trumpets and chorus "Cum Sancto Spiritu":


This one elicited a different response, although just as intense.  I was lying on my sofa with headphones on, and when that transition happened I felt like I had been bodily lifted into the air.  These experiences were what prompted me to weave both of these pieces of music -- and a number of others -- into the narrative of my novel The Chains of Orion, as experienced by the character of the kind-hearted, music-loving robot Quine.  One of my coolest experiences as a writer was being told by a reader that he'd been so intrigued to find out why I'd chosen the pieces I'd used as a framework for Quine's story that every time another one was mentioned, he'd sit and listen to it -- and doing this had really enriched his experience of reading the book.

So music can generate some powerful emotions, but what's curious to me is how differently people can react.  I also recall a less-pleasant incident when, as a teenager, I got into a rip-roaring argument with my mom (who was one of those people who simply couldn't bear someone having a different opinion from her own) over whether Mason Williams's brilliant guitar piece Classical Gas was melancholy or not.  I find the minor-key riffs -- especially after the bright major-key brass passage in the middle -- to be deeply wistful, nostalgic, just this side of sad.  My mom's argument was basically "it's happy because it's fast," which to this day I don't understand.  (Although if I were to have the same conversation today, I'd be much quicker to let it go and say "okay, your opinion is your own."  Maybe my mom wasn't the only one who couldn't stand being contradicted.)


While it's still a mystery why some pieces of music affect certain people viscerally and leave others completely cold, a paper that came out last week in the journal iScience has taken at least a first step toward cataloguing how those experiences are perceived.  A team led by Tatsuya Daikoku of the University of Tokyo collected the impressions of over five hundred listeners to different chord progressions, to see whether there was any commonality in the sensations those progressions created.

And there was.  The authors write:
The relationship between bodily sensations and emotions can be elucidated from the perspective of the brain’s predictive processing.  Predictive processing operates on the principle that our brain constantly anticipates and predicts sensory inputs based on prior experiences.  When there’s a mismatch between the predicted and actual sensory input, a prediction error is generated.  Interoception, which refers to the brain’s perception of internal bodily states, plays a pivotal role in this context.  The brain generates emotions by minimizing prediction errors between the anticipatory signals derived from its internal model and the sensory signals through exteroceptive and interoceptive sensations.  Within the framework of music, when our musical predictions are not met, it can lead to a visceral, interoceptive response.  For instance, if we anticipate a musical chord progression based on our prior experiences and the music deviates from this expectation, it can generate a prediction error.  This error might manifest as a sudden change in heartbeat or a rush of emotions associated with surprise, both of which are interoceptive responses.

This certainly describes my mental levitation during Bach's Mass in B Minor.  
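
As a toy illustration of what "prediction error" means here (entirely my own sketch, not the Daikoku team's model; the probabilities are invented for the example), you can treat a listener's expectations as a simple Markov model over chord transitions and measure surprise as the information content of each transition:

```python
import math

# Hypothetical transition probabilities a Western listener might have
# internalized from a lifetime of exposure (made-up numbers, for illustration).
expectations = {
    ("G", "C"): 0.70,   # V -> I: the classic, highly expected cadence
    ("G", "Am"): 0.20,  # V -> vi: the "deceptive cadence" -- mildly surprising
    ("G", "Eb"): 0.02,  # V -> bVI: a chromatic swerve -- very surprising
}

def surprisal(prev: str, nxt: str) -> float:
    """Information content, in bits, of hearing chord `nxt` after `prev`."""
    p = expectations.get((prev, nxt), 0.01)  # small floor for unmodeled moves
    return -math.log2(p)

for (prev, nxt), p in expectations.items():
    print(f"{prev} -> {nxt}: p = {p:.2f}, surprisal = {surprisal(prev, nxt):.2f} bits")
```

On this toy account, the deceptive cadence carries a couple of bits more surprise than the expected resolution, and the chromatic swerve far more still -- qualitatively the shape of the visceral jolt the paper describes.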

I wonder, though, how much of that sense of unmet anticipation is dependent upon the musical tradition we've grown up with.  I get together with two musician friends every couple of weeks to play Balkan music -- a tradition not only with chord progressions that can sound strange to Western European ears, but with time signatures heavily favoring odd numbers.  (One piece we play has the time signature -- I kid you not -- 25/16.)  So for example, would the progressions in this lovely and haunting tune sound unsurprising -- and therefore less poignant -- to someone who grew up in rural Macedonia?


In any case, that was beyond the scope of the study, but it would be an interesting next step to include volunteers from cultures with very different musical traditions.

So I think I'll wrap this up.  Maybe put on some music.  Stravinsky's Firebird never fails to pick me up by the tail and whirl me around a bit.  On the other hand, for an emotional rollercoaster, there's nothing like Beethoven's Seventh Symphony, which takes us from the joyful gallop of the first movement directly into the wrenching pathos of the second.  Or maybe I'll opt for the eerie atmosphere of Debussy's piano piece The Drowned Cathedral.

So much music, so little time.

****************************************



Monday, April 8, 2024

The relic

The first thing I learned in my studies of linguistics is that languages aren't static.

It's a good thing, because my field is historical linguistics, and if languages didn't change over time I kind of wouldn't have anything to study.  There's an ongoing battle, of course, as to how much languages should change, and what kinds of changes are acceptable; this is the whole descriptivism vs. prescriptivism debate about which I wrote only last month.  My own view on this is that languages are gonna change whether you want them to or not, so being a prescriptivist is deliberately choosing the losing side -- but if lost causes are your thing, then knock yourself out.

Where it gets interesting is that the rates of language change can vary tremendously.  Some cultures are inherently protective of their language, and resist things like borrowed words -- a great example is Icelandic, which has changed so little in a thousand years that modern Icelanders can still read the Old Norse sagas with little more difficulty than we read Shakespeare.

Speaking of Shakespeare, it bears mention that the language of Shakespeare and his contemporaries isn't (as I heard some students call it) "Old English."  Old English is an entirely different language, not mutually intelligible with Modern English, and by Shakespeare's time had been an extinct language for about four hundred years.  Here's a sample of Old English:

Fæder ure þu þe eart on heofonum, si þin nama gehalgod.  To becume þin rice, gewurþe ðin willa, on eorðan swa swa on heofonum.

I wonder how many of you recognized this as the first two lines of the Lord's Prayer:

Our Father, who art in heaven, hallowed be thy name.  Thy kingdom come, thy will be done on Earth as it is in heaven.

There's been a discussion going on in linguistic circles for years about which dialect of English has changed the least -- not since the time of Old English, but at least since Elizabethan English, the dialect of Shakespeare's time.  We have a tendency, largely because of some of the famous performances of Hamlet and Macbeth and Richard III, to imagine Shakespeare's contemporaries as speaking something like the modern upper class of southeastern England, but that's pretty clearly not the case.  Analyses of the rhyme and rhythm schemes of Shakespeare's sonnets, for example, suggest that Shakespearean English was rhotic -- the /r/ in words like far and park was pronounced -- while the speech of southern England today is almost entirely non-rhotic.  Vowels, too, were probably different; today a typical English person pronounces words like path with an open back unrounded vowel /ɑ/ (a bit like the vowel in the word cop); in Shakespeare's time, it was probably closer to the modern American pronunciation, with a front unrounded vowel /æ/ (the vowel sound in cat).

Analysis of spoken English from dozens of different regions has led some linguists to conclude -- although the point is still controversial -- that certain Appalachian dialects, and some of the isolated island dialects of coastal North and South Carolina, are the closest to the speech of Shakespeare's day, at least in terms of pronunciation.  Vocabulary changes according to the demands of the culture -- as I said, there's no such thing as a static language.

[Image licensed under the Creative Commons Alumnum, Primary Human Languages Improved Version, CC BY-SA 4.0]

The reason all this comes up is that linguists have found another relic dialect -- this one preserving features from a great deal longer ago than Elizabethan English.  In the region of Trabzon in northern Turkey, there is a group of people who speak Romeyka -- a dialect of Pontic Greek that is thought to have changed little since the region was settled from classical-era Greece over two thousand years ago.

Since that time, Romeyka has been passed down orally, and its status as a cultural marker meant that like Icelandic, it has been maintained with little change.  Modern Greek, however, has changed a great deal in that same time span; in terms of syntax (and probably pronunciation as well), Romeyka is closer to what would have been spoken in Athens in Socrates's time than Modern Greek is.  "Conversion to Islam across Asia Minor was usually accompanied by a linguistic shift to Turkish, but communities in the valleys retained Romeyka," said Ioanna Sitaridou, of the University of Cambridge, who is heading the study.  "And because of Islamization, they retained some archaic features, while the Greek-speaking communities who remained Christian grew closer to Modern Greek, especially because of extensive schooling in Greek in the nineteenth and early twentieth centuries...  Romeyka is a sister, rather than a daughter, of Modern Greek.  Essentially this analysis unsettles the claim that Modern Greek is an isolate language."

The problem facing the researchers is that like many minority languages, Romeyka is vanishing rapidly.  Most native speakers of Romeyka are over 65; fewer and fewer young people are learning it as their first language.  It's understandable, of course.  People want their children to succeed in the world, and it's critical that they be able to communicate in the majority language in schools, communities, and jobs.

But the loss of any language, especially one that has persisted virtually unchanged for so long, still strikes me as sad.

It's a consolation, though, that linguists like Ioanna Sitaridou are working to record, study, and preserve these dwindling languages before it's too late.  Especially in the case of a language like Romeyka, where there is no written form; without recordings and scholarly studies, once it's gone, it's gone.  How many other languages have vanished like that, without a trace -- when no more children are raised to speak them, when the last native speakers die?  It's the way of things, I suppose, but it's still a tragedy: the loss of an entire culture's way of communicating.

At least with Romeyka, we have people working on its behalf -- trying to find out what we can of a two-thousand-year-old linguistic relic from the time of Alexander the Great.

****************************************



Saturday, April 6, 2024

Total eclipse of the brain

As most of you undoubtedly know, on Monday, April 8, there's going to be a total solar eclipse visible in much of North America.  I've been looking forward to this one for years, because as luck would have it the path of totality is really close to where I live; we have our eclipse glasses at the ready and are going to head up to the lovely town of Canandaigua, New York to see it.  Best of all, it looks like we should have decent weather, never a guarantee in our cloudy, rainy climate.

It's a rare and spectacular event -- rare, at least, from the perspective of being convenient without a great deal of travel.  There are two or three solar eclipses every year, but if the path of totality is in the middle of the Indian Ocean, most of us won't be able to see it.  So you'd think their frequency would convince people that as striking as the phenomenon is, it's perfectly natural and nothing to freak out about.

You would be wrong.

Conspiracy theories have been popping up like toadstools after a rainstorm, most of them dire predictions about what the eclipse Means.  Which is, of course, different from simply what it means; what it means is no more mysterious than an object casting a shadow, albeit a really big one.

What it Means, though?  Well...  *cue dramatic music* it could mean damn near anything.  And none of it good.

[Image licensed under the Creative Commons ESA/CESAR/Wouter van Reeven, CC BY-SA IGO 3.0, Total solar eclipse ESA425433, CC BY-SA 3.0 IGO]

Let's start with the people who think it's significant that the path of totality for this eclipse crosses the path of totality for the 2017 solar eclipse, and where they cross is near New Madrid, Missouri.  Geology and/or history buffs probably recognize this place as the site of the massive 1811 earthquake that rang church bells as far away as Richmond, Virginia and changed the course of the Mississippi River.  Well, "X marks the spot," right?  Of course right.  When the shadow of the Moon crosses New Madrid, it's going to set off a superquake that will flatten everything for miles around.

Because apparently, that's how dangerous shadows are, especially when they cross where other shadows were seven years ago.

"This has never happened before, two eclipse paths crossing at a single point over one town," one commenter screeched, despite the fact that a quick look at a solar eclipse map should show him this is blatant nonsense.  It also illustrates that he didn't pay any attention in high school geometry class, because crossing at a single point is kind of what non-parallel lines always fucking do.

Then, there's the Twitter user (sorry, I refuse to call it "X" because it sounds idiotic) who posted the following, receiving tens of thousands of upvotes and thousands of retweets:

Elon Musk changes Twitter's name to X.  His baby's mother, Grimes, posted a strange image on instagram before covid that literally told us covid was going to happen, all the way down to the 3 injections.  In that same image, a few rows beneath the covid 'prediction' is a solar eclipse.  Under it, a flower between two dragons.  2024 is the year of the dragon.  The lotus flower begins blooming in China on April 8th.  The eclipse is happening on April 8th.  That is way too many coincidences for me to feel comfortable, along with the Deagel projection of a 225 Million person decrease in the US by 2025.  It would appear some massive sacrifice could possibly be in the works.

Right!  Sure!  What?

One TikToker made an entirely different claim -- this one that the eclipse isn't going to last for four minutes or so, as we've been told, but for three to five days, and that during that time the entire Earth will be plunged into complete darkness.  "Photons and electromagnetic particles that travel at the speed of light and will act as a barrier or temporary shield around the Earth, preventing the light of the Sun or the stars from passing through it," the narrator tells us, because that's apparently how light works.  We're then told to avoid travel during that time, and that the astronomers aren't telling us the truth about the duration of the eclipse because "they don't want to cause mass panic."

And of course if there are conspiracies, you just know Alex Jones is going to get involved, and his contribution this time is noticing that the path of the eclipse passes near eight towns named Nineveh.  Because this is the name of a town in the Bible, it shows the eclipse is a sign from God.  (How an eclipse can be a sign from God meaning anything other than "Kepler and Newton were right," I have no idea.)  But Jones also believes that the Big Bad Government can't let this "biblical event" proceed as the Good Lord intended, and the Department of Homeland Security intends to "hijack the eclipse."

My expression while reading this

Then we have the people who think that the eclipse is a sign that the simulation we're all trapped in is breaking down, and therefore something something something biblical prophecies:

The computer simulation is ending, folks.  Say goodbye to the Matrix.  God says in the book of Luke that before he comes back, he will give us signs in the Sun and the Moon and the stars.  We also have the Moon that is turning to rust.  The Greek origin of that rust is hematite, which means blood.  He said the Moon will turn to blood before the terrible day of the war.  We have the Euphrates River drying up.  We have wars, we have rumors of wars, not to mention all the other biblical prophecies that have been fulfilled.  We are literally in the last seconds of the last days, y'all, and our God is so loving and kind he wants to warn us before he comes back...  This eclipse is not the Rapture, it is a direct warning from God...  We are watching a biblical prophecy play out.

Texas pastor Troy Brewer agrees, at least with the biblical part of it, but adds a nice ultranationalist christofascist spin on the whole thing:

Any time God Almighty speaks a word through the Sun, he’s talking to the nations.  Any time that the Lord would speak a word through the Moon, he is speaking to his covenant people prophetically.  That would either be Israel or it would be the bride of Christ.  Or any time that God Almighty is speaking through the stars, he is prophetically speaking to his children of inheritance...  Why would we call it the Great American Eclipse?  Because it's the first time since 1776 that an eclipse has only touched America.  Can anybody think of what happened in 1776?  Oh, I know.  It was the birth of our nation.  So this was definitely an American word from God.  And it was a word about the great nation of America...  The eclipse of 1776 was a one hour and 33 minute event from the second the shadow touched the United States to the second it left...  What is that?  Psalm 133.  "Oh, how good and pleasant it is for brethren to dwell together in unity."  It’s a call of unity for the body of Christ, whereas I want to tell you the warning of the second one is a call of civil war.  And then you have brother against brother in the second one.

Which conveniently ignores that (1) Monday's eclipse will also cross through Mexico and Canada; (2) there have been fifteen total solar eclipses on record that mostly affected the United States, most recently in 2017; and (3) how long the 1776 eclipse (or any solar eclipse) lasts depends on where you are relative to its path, so the whole Psalm 133 thing is idiotic.  But facts and reality just don't matter to these people, do they?  It's my considered opinion that Troy Brewer and his ilk have experienced a total eclipse of the brain, but one where the shadow is showing no sign of passing.

Anyhow, you get the picture.  Any time we have an interesting and uncommon astronomical event, it brings all the wackos yapping from the corners where they usually hide.  What never fails to astonish me, however, is that after the event is over, and nothing untoward takes place, it never discourages either them or their followers.  Doesn't that strike you as bizarre?  You make this grand and dire prediction, preach sermons about it or post it on Twitter or make TikTok videos (or whatever your preferred mode of communication to your devotees is), and then the big day comes, and... nothing happens.

If this was you, wouldn't you think, "Maybe I need to revise my worldview?"  I know I would.  But the weird thing is how that almost never happens.  I can damn near guarantee that Alex Jones and Troy Brewer and the TikTok anti-Matrix biblical apocalypse woman and the rest will not shift their opinions one iota when Monday comes and goes and there are no mass human sacrifices or Christian nationalist civil wars or megaquakes or three days of pitch darkness or computer simulation breakdowns or, heaven forbid, Moon rust.  They'll quiet down for a little, until we have another astronomical event, and then it'll be back to the yapping.

This time!  This is it!  We really mean it this time, you'll see!

Anyway, if you're able to get to the path of totality, I hope you enjoy the sky show.  Don't forget to wear proper eye protection (sunglasses are not enough).  And don't worry about the prophecies from the wingnuts.  We've made it through hundreds of ends-of-the-world already; we'll survive this one.

****************************************



Friday, April 5, 2024

Locked in place

The Moon orbits the Earth in such a way that the same side always faces us.  Put another way, its periods of revolution and rotation are the same; it takes the same amount of time for the Moon to turn once on its axis as it does to circle the Earth.

This seems like a hell of a coincidence, but there is (of course) a physical explanation for it.  Close orbits -- either of a planet around its host star, or a satellite around a planet -- generate a high tidal force, which is the gradient in the gravitational force experienced by the near side of the orbiting body as compared to the far side.  There's always going to be a tidal force; even tiny Pluto has a greater pull from the Sun on the near side than it does on the far side, but with a small body at that great a distance, the difference is minuscule.  (You're experiencing a tidal force right now; the Earth is pulling harder on your feet than on your head, assuming you're not upside down as you're reading this.)  But the Moon's proximity to the Earth means that the tidal force it experiences is comparatively huge.  So even if it once rotated faster than it revolved, the higher pull on the near side slowed its rotation down -- a sort of gravitational drag -- until the two matched exactly.
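
For anyone who wants the number behind that intuition, here's the standard Newtonian estimate (a textbook result, nothing specific to the Moon): for a satellite of radius $r$ and mass $m$ orbiting a body of mass $M$ at distance $d$, the difference between the pull at the satellite's near surface and at its center works out to roughly

$$\Delta F \approx \frac{2GMmr}{d^3}.$$

The key feature is the $d^3$ in the denominator: halve the distance and the tidal stretching goes up eightfold, which is why it's the close orbits that get locked.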

The result is called 1:1 tidal locking, and is why (apologies to Pink Floyd) there is no permanently dark side of the Moon.  There's a near-Earth and a far-Earth side, but no matter where you are on the Moon, you'll have a 28-day light/dark cycle.  However, the apparent position of the Earth in the sky doesn't change.  If, from where you stand on the Moon's surface, the Earth appears to hover thirty degrees above the western horizon, that's where it will always be from that vantage point.

It's been known for some time that planets can also be tidally locked.  Once again, it's more likely to happen when they orbit close to their host star, which means a lot of tidally-locked planets are probably so hot they're uninhabitable.  But the situation changes if the host star is a red dwarf -- a small, low-luminosity star of the kind that makes up almost three-quarters of the stars in the Milky Way.  These stars have such a low heat output that the "Goldilocks zone" -- the distance from the star at which conditions are "just right" for liquid water -- is very close in.
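
Here's a quick sketch of why that zone sits so close in (standard inverse-square flux scaling; the luminosities are approximate values I'm supplying for illustration):

```python
import math

def earth_equivalent_distance_au(luminosity_solar: float) -> float:
    """Distance (in AU) at which a planet receives the same stellar flux
    Earth gets from the Sun: flux goes as L / (4 * pi * d^2), so matching
    Earth's insolation gives d = sqrt(L_star / L_sun)."""
    return math.sqrt(luminosity_solar)

stars = [
    ("Sun", 1.0),
    ("typical red dwarf", 0.01),    # approximate
    ("Proxima Centauri", 0.0017),   # approximate bolometric luminosity
]

for name, lum in stars:
    print(f"{name}: ~{earth_equivalent_distance_au(lum):.3f} AU")
```

A tenth of an AU is well inside the orbit of Mercury -- close enough that tidal locking becomes very likely indeed.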

So a planet in a red dwarf's habitable zone might well also be tidally locked.

Think of how bizarre a situation that would be.  If the planet is at the right distance for the lit side to be comfortable, there'd be a region of perpetual twilight bounding it, and on the other side of that, permanent, freezing-cold night.  Not only that; this would create the convection cell from hell.  Weather down here on Earth is largely caused by uneven heating of the planet's surface; air warms and rises near the Equator, cools, eventually becoming cool enough to sink and completing the circle.  The Earth's rotation and topography complicate the situation, but basically, that convective rise-and-fall is what generates wind, clouds, rain, snow, and the rest of the meteorological picture.

On a tidally-locked planet, these processes would almost certainly be amplified beyond anything we ever see on Earth -- especially in the twilit boundary zone.  The constant heating of the bright side, and the constant loss of heat by radiation on the dark side, would cause the atmosphere on the bright side to rise, drawing in cold air from the dark side fast.  The result would be a screaming hurricane across the boundary.

At least, so we think.  We don't have any tidally-locked planets in our own solar system to study -- only airless tidally locked moons.


A study out of McGill University has confirmed the first tidally-locked exoplanet, LHS 3844b, a "super-Earth" identified as tidally locked by measuring the light coming off the planet at different points in its orbit -- something that allowed the researchers to estimate its temperature.

Artist's impression of the dark side of LHS 3844b [Image credit: NASA/JPL-Caltech/R. Hurt (IPAC)]

Chances are, LHS 3844b doesn't have much of an atmosphere, so the convective hellscape I described above might not apply to it.  Still, the idea that astronomers have identified that an exoplanet is tidally locked is kind of astonishing.  The first exoplanet was only discovered in 1992; in the intervening thirty-odd years not only have we found thousands of them, we're now getting so good at analyzing them we can figure out the size of their orbits, how fast they rotate, and the probable composition of their atmospheres.

Our understanding of the universe has accelerated so much, it's hard even to imagine where it might be headed.  The idea that we could not only find an exoplanet around a distant star, but determine that the same side of the planet always faces the star, boggles the mind.

The future of astronomy is looking pretty stellar, isn't it?

****************************************



Thursday, April 4, 2024

The echoes of Carrhae

Back on the ninth of June, 53 B.C.E., seven legions of Roman heavy infantry were lured into the desert near the town of Carrhae (now Harran, Turkey) by what appeared to be a small retreating force of Parthian soldiers.  It was a trap, and the leader of the Roman forces, Marcus Licinius Crassus (who was one-third of the First Triumvirate, along with Julius Caesar and Pompey the Great) fell for it.  Well-armed and highly mobile Parthian horsemen swept down and kicked some legionnaire ass.  Just about all of the Roman soldiers were either captured or killed, and Crassus himself was executed -- in some accounts, by having molten gold poured down his throat.

Not the way I would choose to make my exit.  Yeowch.

A bust thought to be of the unfortunate Marcus Licinius Crassus [Image licensed under the Creative Commons Sergey Sosnovskiy, Bust of a Roman, Ny Carlsberg Glyptotek, CC BY-SA 4.0]

In any case, very few soldiers from Crassus's seven legions made it back to Italy.  They didn't all die, though, so what happened to the survivors?

This is where it gets interesting -- not only because historical mysteries are intrinsically intriguing, but as another example of "please don't believe whatever you see on the internet, and more importantly don't repost it without checking it for accuracy."

The Battle of Carrhae comes up because a couple of days ago I got one of those "sponsored" posts on Facebook that are largely clickbait based on what stuff you've shared or liked in the past.  With my interest in archaeology and history, I get a lot of links of the type, "Archaeologists don't want you to find out about this ONE WEIRD HISTORICAL FACT," as if actual researchers just hate it when people hear about what they're researching and love nothing better than keeping all of their findings secret from everyone.

In any case, the claim of this particular post was that the survivors of the Battle of Carrhae were absorbed into the Parthian Empire (plausible), but never really accepted there, so they decided after a while to pull up stakes and move east (possible), eventually making their way to northwestern China (hmmm...), where their descendants settled in a place called Liqian.  These men were recruited by the Chinese as mercenaries to fight against the Xiongnu in 36 B.C.E., and when the Xiongnu were roundly defeated, the grateful Chinese Emperor allowed the Romans to stay there permanently.

This idea was championed by historian Homer Dubs, professor of Chinese history at Oxford University, who as part of his argument claimed that the "fish-scale formation" used by the Chinese army against the Xiongnu had been copied from the Roman testudo formation -- a maneuver in which legionaries advance with their shields overlapping, to keep their opponents' spears and arrows from striking home.  The Romans had taught the Chinese a new tactic, Dubs said, and that's how they won the battle.

So far, I have no problem with any of this.  There's nothing wrong with researchers making claims, even far-fetched ones; that's largely how scientific inquiry progresses, with someone saying, essentially, "Hey, here's how I think this works," and all his/her colleagues trying their best to punch holes in the claim.  If the claim stands up to the tests of evidence and logic, then we have a working model of the phenomenon in question.

But the link I got on social media pretty much stopped with, "Hey, some Romans ended up in China, isn't that cool?"  There was no mention of the fact that (1) Dubs made his claim in 1941; (2) because there has never been a single Roman artifact -- not one -- found near Liqian, just about all archaeologists and historians think Dubs was wrong; and (3) a genetic test of a large sample of people around Liqian found not the slightest trace of European ancestry.  Everyone there, apparently, is mostly of Han Chinese descent, just as you'd expect.

And the genetic tests that conclusively put Dubs's claim to rest were conducted seventeen years ago.

Look, it's not that I don't get clickbait.  These sites like "Amazing Facts From History" exist to get people to click on them, boosting their numbers and therefore their ad revenue, irrespective of whether anything they're claiming is true.  In other words, if they can get you to click on it, they win.

But what I don't understand is the number of people who shared the link -- over five thousand, at the point I saw it -- and appended comments like, "This is so interesting!" and "History is so fascinating!", apparently uncritically accepting what the site claimed without doing what I did, a (literally) two-minute read of Wikipedia that brought me to the paper from The Journal of Human Genetics I linked above.  Not a single one of the hundreds of commenters said, "But this isn't true, and we've known it's not true for almost two decades."

I can almost hear the objections.  What's the harm of believing an odd claim about ancient history, even if the (very strong) evidence is that it's false?  To me, there is actual harm in it; it establishes a habit of credulity, of accepting what sounds cool or fun or weird or interesting without any apparent consideration of whether or not it's true.  Sure, there's no immediate problem with believing Roman soldiers settled in China.

But when you start applying that same lack of critical thinking to matters of your health, the environment, or politics, the damage accrues awfully fast.

So please do some fact-checking before you share.  Apply skepticism to what you see online -- even if (or maybe, especially if) what you're considering sharing conforms to your preconceived notions about how things work.  We can all fall prey to confirmation bias, and these days, with the prevalence of clickbait sites run by folks who don't give a rat's ass if what they post is real or not, it's an increasing problem.

Check before you share.  It's that simple.

****************************************