Skeptophilia (skep-to-fil-i-a) (n.) - the love of logical thought, skepticism, and thinking critically. Being an exploration of the applications of skeptical thinking to the world at large, with periodic excursions into linguistics, music, politics, cryptozoology, and why people keep seeing the face of Jesus on grilled cheese sandwiches.

Monday, July 31, 2023

The worm turns

In the episode of The X Files called "Ice," Fox Mulder and Dana Scully are sent with a small team of scientists to a remote Arctic research station to investigate the murder-suicide of its entire crew.  When they get there, they find one survivor -- the station's mascot, a dog, who shows signs of hyperaggressive behavior ominously reminiscent of what afflicted the researchers.

They eventually figure out what happened, but not before two of the people accompanying them are dead, and both Mulder and the third scientist are obviously afflicted with the same malady.  In digging up and thawing out permafrost, the researchers had inadvertently reanimated a deep-frozen parasitic nematode that causes drastic behavioral changes and is transmissible through bites.  They do find a way to get rid of the infection, saving the lives of Mulder, the infected scientist, and (thank heaven) the dog, but the U.S. government destroys the base before any further study of the worm or its origins can be done.

It's a highly effective and extremely creepy episode, doing what The X Files did best -- leaving you at the end with the feeling of, "This ain't actually over."

I was unwillingly reminded of "Ice" by two news stories this week.  In the first, scientists have "reawakened" -- deliberately this time -- a nematode that has been frozen for 46,000 years in the Siberian permafrost.

Dubbed Panagrolaimus kolymaensis, it's a previously unknown species.  This doesn't mean it's a truly prehistoric species; Phylum Nematoda is estimated to contain about a million species, of which only thirty thousand have been studied, classified, and named.  So it could well be that Panagrolaimus exists out there somewhere, in active (i.e. unfrozen) ecosystems, and the invertebrate zoologists just hadn't found it yet.

Still, it's hard not to make the alarming comparison to the horrific events in "Ice" (and countless other examples of the "reanimating creatures frozen in the ice" trope in science fiction).  This reaction is somewhat allayed by the fact that two-thirds of known nematode species are harmless to humans, and even the parasitic ones usually aren't life-threatening.  There are a few truly awful ones -- which, for the sakes of the more sensitive members of my audience, I'll refrain from giving details about -- but they're the exception, so chances are Panagrolaimus is harmless as well.

On the other hand, it doesn't mean that thawing frozen stuff out is risk-free, and the problem is, because of climate change, thawing is happening all over the world even without reckless scientists being involved.  The second study, conducted at the European Commission Joint Research Centre, appeared in a paper in PLOS Computational Biology and described a digital simulation of a partially frozen ecosystem containing living microbes in suspended animation.  They looked at how the existing community would be affected by the introduction of the reawakened species -- and the results were a little alarming.

It's tempting to think that because the entire ecosystem has changed since the microbes were frozen, reanimated species would have no way to compete with modern organisms that had evolved to live in current conditions.  In other words, the thawed species would be unable to cope with the new situation and would probably die out rapidly.  In fact, that did happen to some of them -- but in these models, the ancient microbes often survived, and three percent of them became dominant members of the ecosystem.

One percent actually outcompeted and wiped out modern species.

"Given the sheer abundance of ancient microorganisms regularly released into modern communities," the authors write, "such a low probability of outbreak events still presents substantial risks.  Our findings therefore suggest that unpredictable threats so far confined to science fiction and conjecture could in fact be powerful drivers of ecological change."

Now, keep in mind that this was only a simulation; no actual microbes have been resuscitated and released into the environment.
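The paper's model is far more elaborate than anything I could reproduce here, but the basic idea -- drop a rare invader with a randomly drawn competitive fitness into a resident community and let chance plus competition play out -- can be sketched in a few lines.  Every parameter below is invented purely for illustration, and the percentages it prints have nothing to do with the study's actual figures:

```python
import random

def invasion_trial(rng, n_resident=5, pop=200, steps=100):
    """One toy run: a rare 'ancient' invader enters a resident community.
    Fitnesses are drawn from the same distribution for everyone, and the
    community is resampled each generation in proportion to abundance
    times fitness (a crude Wright-Fisher-style competition model)."""
    species = list(range(n_resident + 1))            # last index = the invader
    fitness = [rng.lognormvariate(0, 0.3) for _ in species]
    counts = [pop // n_resident] * n_resident + [2]  # invader starts rare
    for _ in range(steps):
        weights = [c * f for c, f in zip(counts, fitness)]
        draw = rng.choices(species, weights=weights, k=pop)
        counts = [draw.count(s) for s in species]
    survived = counts[-1] > 0
    dominant = survived and counts[-1] == max(counts)
    displaced = survived and any(c == 0 for c in counts[:-1])
    return survived, dominant, displaced

rng = random.Random(42)
trials = [invasion_trial(rng) for _ in range(300)]
for label, idx in [("survived", 0), ("became dominant", 1),
                   ("displaced a resident", 2)]:
    frac = sum(t[idx] for t in trials) / len(trials)
    print(f"invader {label}: {frac:.1%}")
```

Even in a toy like this, most invaders die out quickly, but a minority survive, and a smaller minority end up dominating or driving a resident extinct -- which is the qualitative pattern the researchers reported.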


Anyhow, there you have it.  Something new from the "Like We Didn't Already Have Enough To Worry About" department.  Maybe I shouldn't watch The X Files.  How about Doctor Who?  Let's see... how about the episode "Orphan 55"?  *reads episode summary*  "...about a future Earth so devastated by climate change that the remnants of humanity have actually evolved to metabolize carbon dioxide instead of oxygen..."

Or maybe I should just shut off the television and hide under my blankie for the rest of the day.


Saturday, July 29, 2023

All in the family

Archaeologists and paleontologists are up against the same problem: bones and other fossils only get you so far.

There are cases where fossil evidence can give you some hints about behavior -- patterns of tracks, for example, or the rare case where the positions of the fossils themselves give you a picture of what was going on, like the recent discovery of an opossum-sized mammal, Repenomamus, attacking a much larger dinosaur, Psittacosaurus.  The pair of fossil skeletons were preserved, locked in a battle to the death -- the death of both, as it turned out, because they were both engulfed mid-fight in a mudslide.

But such lucky finds are rare, and inferences of behavior from fossils are usually sketchy at best.  This is why the study of a group of Neolithic human skeletons found near Gurgy-les-Noisats, France, 150 kilometers southeast of Paris, was so extraordinary.

DNA sequencing is now precise enough that the researchers were able to analyze the genomes of 94 of the 128 individuals buried at the site -- in enough detail not only to construct a seven-generation family tree for them, but to make a guess at what each individual looked like.

The analysis found that the bodies were buried in family groups -- the more closely two people were related, the closer together they were buried -- and that women who were not descendants of the original couple were mostly unrelated to anyone else at the site, suggesting they'd come into the family from other communities.  Just about all the males at the burial site, on the other hand, were related, leading the researchers to conclude that men in this community tended to stay put, while at least some women did not.

Another curious thing was that the study detected no half-sibling relationships.  All of the sibling groups were from the same mother and father.  In this family group, at least, monogamous relationships were the norm.

Of course, there's a lot we still don't know; while this is a stunning accomplishment, it still leaves a great many questions unanswered.  For example, were the "outsider" women brought in because of a custom of outbreeding, or by conquest/capture?  What were the religious practices and beliefs that led these people to bury family members near each other?  Was the monogamy shown in this family universal in this culture, or was this grouping an exception for some reason?

It's an intriguing piece of research.  "This type of work really breathes new life into our understanding of ancient peoples," said Kendra Sirak, an ancient-DNA specialist at Harvard Medical School in Boston, Massachusetts, who was not involved in the study.  "I'm especially curious about the man at the root of the family tree.  I would love to know what made this person so important."

And given that a significant percentage of my ancestry comes from central and western France, I have to wonder if anyone in this family tree is a direct ancestor of mine.  There's no way to find out, of course, but the thought did cross my mind.  It's kind of eerie to think that when I look at those facial reconstructions, one of those faces looking back at me might be my great-great (etc.) grandparent.


Friday, July 28, 2023

The first step

The UFO community -- and, honestly, a great many other people -- are buzzing today because of the U.S. congressional hearing on Wednesday about what we are now supposed to call "UAPs" -- "unidentified aerial phenomena."

While I still do tend to agree with Neil deGrasse Tyson's comment that "if they're unidentified, that means you don't know what they are... and if you don't know what they are, that's where the conversation should stop," I have to say that even I and other folks who are accustomed to giving the side-eye to the hype are paying attention.  What strikes me about the people who testified is that they are not your stereotypical wild-eyed "I saw it in my back yard and no one believes me!" types.  They're staid military men with excellent reputations, who have now put those reputations on the line to bring to the attention of Congress -- and the public -- that there has been a coverup for years not only of sightings of UAPs, but recovery of material from downed craft.

Including what one of the whistleblowers, David Grusch, called "non-human biologicals."

It's kind of amusing how reluctant they are to use the "A" word or the "E" word, because as my wife pointed out, our dogs are "non-human biologicals."  But it was abundantly clear what -- or, rather, whom -- he was talking about.

I have to admit that some of the testimony was pretty eye-opening.  Navy pilot Ryan Graves, one of the people who testified, said that he and the people in his squadron had "frequently encountered objects... dark gray or black cubes inside a clear sphere," and that "if everyone could see the sensor and video data I witnessed, our national conversation would change."  Graves said he himself saw one of these cube-within-a-sphere objects hovering perfectly still -- in hurricane-force winds.  Another, David Fravor, said the craft he had personally seen were "far superior to anything that we had at the time, have today or are looking to develop in the next ten years."

The members of Congress who attended the hearing all seemed to be taking the testimony completely seriously, which is itself a little shocking considering the partisan rancor accompanying damn near everything these days.  These craft -- whatever they are -- are being treated as a serious security concern, which I have to admit is accurate enough even if they aren't extraterrestrial in origin.  

I'm not ready to say we're being invaded by the Daleks or Skithra or Slitheen or what-have-you, but I have to admit that if what these people saw is of human make, the reports are downright peculiar.  Assuming the multiple sightings aren't simply fabrications or misinterpretations of natural phenomena -- and there are so many detailed accounts and records like radar and video footage that I don't see how you could discount them all -- the only other option is that they're advanced human technology (presumably not from the United States).  But it's a little hard to imagine some other country (China and Russia are the two whose names come up the most frequently) having technology that much more advanced than ours.

If I'm right about that, and I hasten to state that I'm no expert, we're thrown back on two possibilities.  Either these are some combination of glitches, misinterpretations, and lies, or they really are of non-human origin.

See?  Even I don't want to use the "A" word or the "E" word.

But unfortunately, a lot of the details -- including the hard evidence, like the pieces of downed craft and the "non-human biologicals" Grusch mentioned -- are still classified, and all three of the men who testified were very evasive about giving details in public.  And, of course, therein lies the problem; until we actually have material (biological or not) of extraterrestrial origin available for scientists to study, and written up in peer-reviewed journals, there aren't many of us skeptics who are going to be convinced.

Still, it's definitely grabbed a lot of people's attention, including people who ordinarily scoff at claims of UFOs and aliens and so on.  I hope that whatever comes out of this, we can drop some of the secrecy and bring out into the open whatever actual evidence there is.  If we really do have alien spacecraft buzzing about and keeping an eye on us -- if even some of the claims, going back to 1947 and the Roswell Incident, are true -- then it seems like the public has a right to know.

So as a first step, the hearing was great, but it can't just stop there -- or worse, be concluded behind closed doors.  All that fosters is The X Files-style conspiracy theories, wild speculation by people who don't honestly have any solid facts, and more frustration from those of us skeptics who would just like to see, once and for all, whether there is evidence, and if so, what it actually is.


Thursday, July 27, 2023

The face in the mirror

Like many people, I've at times been in the position of having to interact with narcissists.

I'll not name names, but two, in particular, stand out.  One of them frequently said things like, "What I say is the law of the land" (without any apparent awareness of irony, because this is also an excellent example of someone being a Big Fish in a Little Pond).  This individual did have "advisors" -- for a time I was one of them -- but in point of fact never took a single piece of advice or admitted to being wrong about anything.  Ever.  Worse, every interaction became about being perceived as the most knowledgeable, smart, funny, edgy, savvy person in the room, so every conversation turned into a battle for dominance, unless you refused to play (which, eventually, is what I did).

The second had a different strategy, albeit one that still resulted in the role of Center of the Entire Fucking Universe.  For this person, negative attention caused a complete emotional breakdown, which resulted in everyone having to circle the wagons simply to restore order.  Worse still was when something this individual said upset me, because then the focus shifted to someone else's needs, which was completely unacceptable.  My expression of annoyance, anger, or frustration was turned around into my having unreasonable expectations, which precipitated another emotional breakdown, returning me to the role of caregiver and he-who-pours-oil-on-the-waters.

It's a relief that neither of these two is part of my life any more, because being around narcissists is, among other things, absolutely exhausting.  The incessant focus on self means that no one else's needs, and often no one else's opinions, ever get heard.  Both of these people did considerable damage to others around them, without there ever being any sign of concern for the chaos they were sowing or the emotional scars they were inflicting.  (There was plenty of deflection of the blame toward the ones who were hurt, however; "it's their own fault" was another phrase I heard numerous times.)  Worst of all, neither one had any apparent awareness of being narcissistic.  I heard both expressing, at one time or another, how puzzling and unfair it was that they couldn't keep friends or maintain good relationships with business associates.

Funny how that happens when you don't consider anyone but yourself, and funnier still that neither one ever seemed to realize what the common factor in all of their difficulties was.

This lack of self-awareness makes narcissism difficult to study, because it's hard to analyze a condition that the patient doesn't know (s)he's got.  But a team at the University of Graz (Austria), led by psychologist Emanuel Jauk, has not only looked at what it means to be narcissistic -- they've done neuroimaging studies to see what's going on in a narcissist's brain.  The result was an eye-opening paper that appeared in Nature.

"Narcissism is a topic of increasing interest to science and the public, probably because cultural changes in the past decades favor narcissistic behavior," Jauk says.  "Our study was aimed at taking a closer look at the self-image of narcissistic individuals using neuroscience, which might help to unveil its less conscious aspects."

The results were fascinating.  In the authors' words:
Subclinical narcissism is a personality trait with two faces: According to social-cognitive theories it is associated with grandiosity and feelings of superiority, whereas psychodynamic theories emphasize vulnerable aspects like fluctuating self-esteem and emotional conflicts...  While social-cognitive theory would predict that self-relevant processing should be accompanied by brain activity in reward-related areas in narcissistic individuals, psychodynamic theory would suggest that it should be accompanied by activation in regions pointing to negative affect or emotional conflict.  In this study, extreme groups of high and low narcissistic individuals performed a visual self-recognition paradigm during fMRI.  Viewing one’s own face (as compared to faces of friends and strangers) was accompanied by greater activation of the dorsal and ventral anterior cingulate cortex (ACC) in highly narcissistic men.  These results suggest that highly narcissistic men experience greater negative affect or emotional conflict during self-relevant processing and point to vulnerable aspects of subclinical narcissism that might not be apparent in self-report research.
The upshot is that this study suggests narcissism doesn't result in feelings of pleasure when you think of or view yourself; it increases your anxiety.  "Narcissism," Jauk explains, "in terms of an inflated self-view, goes along with negative affect towards the self on an involuntary level."

Which certainly makes sense given my interactions with narcissists.  Above all, neither of the individuals I mentioned ever seemed all that happy.  It appeared that the returning focus on self came out of insecurity, fear, and anxiety rather than conceit -- that it was more about reassurance than it was about praise.

So the condition itself is a little misnamed, isn't it?  The word "narcissism" comes from the Greek myth of Narcissus, a young man so beautiful that he fell in love with his own reflection and couldn't tear his eyes away -- he eventually pined away and died, and the gods took pity on him and turned him into the flower that now bears his name.

Narcissus by Caravaggio (1598)  [Image is in the Public Domain]

The reality is sadder.  Narcissists, apparently, think of themselves not out of self-love, but out of a constant uneasy sense that they aren't actually beautiful, intelligent, competent, or desirable.

Which is kind of a miserable way to live.  Knowing this defuses a lot of the anger I harbor from my experiences with the narcissists I described earlier.  For all of their desperation for attention, at their core they were unhappy, deeply fearful people.

The authors make reference to an alternate version of the Narcissus myth that is more in line with what true narcissists experience.  They write:
In another prominent version by Pausanias, the myth has a different ending: Narcissus is gazing at himself, when suddenly a leaf falls into the water and distorts the image.  Narcissus is shocked by the ugliness of his mirror image, which ultimately leads him to death.

This more tragic ending is much closer to what the study found:

Considering the two versions of the ancient myth of Narcissus, our results are in favor of the less prominent version, in which Narcissus is shocked to death by the ugliness of his mirror image when a leaf drops into the water.  This myth can be seen to metaphorically reflect the ongoing critical self-monitoring that narcissists display when confronted with self-relevant material, presumably due to a lowered intrinsic coupling between self-representation and self-reward/liking.
Which makes me feel like narcissists, despite the considerable harm they can do, are more to be pitied than scorned.


Wednesday, July 26, 2023


With the insane weather we've had this summer -- which is showing no signs of calming down -- it's easy to forget about another inevitable outcome of anthropogenic climate change: sea level rise.

Part of the issue, of course, is that humans have a regrettable tendency only to pay attention to what's right in front of their faces, like the current worldwide extreme heat wave.  It's why researchers found in 2014 that public concern about climate change decreases during the winter, an attitude Stephen Colbert summed up as "I just had dinner, so there's no such thing as world hunger."

And sea level rise is so gradual you really do have to have a long baseline even to notice it.  It's only in extremely low-lying places like Louisiana's Isle de Jean-Charles that people have been forced to notice -- and that only because the place looks very likely to cease to exist entirely in the next ten years.

Another reason the (well justified) panic over climate change has mostly focused on extreme high temperatures on land and hot sea surfaces fueling bigger storms is that climatologists thought we had something of a buffer, ice-melt-wise, in the Greenland Ice Sheet.  The ice sheet, they believed, had been unmelted for millions of years, which not only kept all that water locked up in solid form on land, but also helped stabilize the Arctic climate.

Note my use of the past tense.

[Image licensed under the Creative Commons Christine Zenino from Chicago, US, Greenland Glaciers outside of Ammassalik (5562580093), CC BY 2.0]

New research, based on an ice core that had been collected in the 1960s but then lost for nearly sixty years, showed something terrifying: four hundred thousand years ago, nearly the entire Greenland Ice Sheet melted, raising the sea level by several meters -- at a time when the carbon dioxide concentration was lower than it is today.  Using something called a luminescence signal -- a highly sensitive technique that measures the last time flakes of feldspar or quartz were exposed to light, and therefore were on the surface -- the researchers found what they are calling "bulletproof" evidence that layers thought to be continuously buried deep in the Greenland ice were exposed between 420,000 and 370,000 years ago.

If this happened today -- and the indications are that if we don't curb climate change fast, it will -- the results will be nothing short of catastrophic.

"If we melt just portions of the Greenland ice sheet, the sea level rises dramatically," said Tammy Rittenour, climatologist at Utah State University.  "Forward modeling the rates of melt, and the response to high carbon dioxide, we are looking at meters of sea level rise, probably tens of meters.  And then look at the elevation of New York City, Boston, Miami, Amsterdam. Look at India and Africa -- most global population centers are near sea level."

Considering that the average elevation of the state of Delaware is twenty meters -- and that Louisiana and Florida tie at thirty-three meters -- this should scare the absolute shit out of everyone.  (And as Rittenour said, even in those low-elevation states most of the population is still along the coast, so even a meter or two of rise would be catastrophic.)

And, typical of privileged people in industrialized countries, I've focused on where I live.  If you look at the top ten cities threatened by climate change, only one (Miami, Florida) is in the United States.  Two are in India (Kolkata and Mumbai), two are in Vietnam (Ho Chi Minh City and Hai Phong), two are in China (Guangzhou and Shanghai), and one each in Bangladesh (Dhaka), Myanmar (Rangoon), and Thailand (Bangkok).  Just counting the urban population of these ten cities leaves almost seventeen million people with the choice of relocating or drowning.

You think the refugee problem is bad now?  

And I'm not just talking about the dreaded "caravans of foreign refugees" the right-wingers here in the U.S. like to bring out every time there's focus on the fact that their entire platform lately has consisted of denying rights to people they don't like.  If the sea level rises even by a meter or two, every coastal city in the United States is in trouble -- so we're gonna have an internal refugee problem the likes of which we've never seen before.

People, we have got to figure this out.

We've had enough time to process it all, to come to the conclusion that yes, it's real, and no, it's not a "natural warm-up."  I'll end with a quote from British science historian James Burke's brilliant (and prescient) documentary After the Warming, which aired all the way back in 1989: "People spend money to insure their homes, their health, and their lives against far less likely occurrences.  That's all legislation to stop climate change turns out to be: planetary insurance...  Our attitude thus far has been like the guy in the old joke, have you heard it?  A man falls off the top of a twenty-story building, and someone on the seventeenth floor sticks his head out of the window and asks the guy how he's doing.  The man shrugs as he falls and says, 'So far, so good.'"


Tuesday, July 25, 2023

Things going "boom"

One thing that seems to be a characteristic of Americans, especially American men, is their love of loud noises and blowing shit up.

I share this odd fascination myself, although in the interest of honesty I must admit that I don't take it to the extent a lot of guys do.  I like fireworks, and I can remember as a kid spending many hours messing with firecrackers, bottle rockets, Roman candles, and so on.  (For the record, yes, I still have all of my digits attached and in their original locations.)  I don't know if you heard about the mishap in San Diego back on the Fourth of July in 2012, where eighteen minutes' worth of expensive fireworks all went off in about twenty seconds because of a computer screw-up.  It was caught on video (of course), and I think I've watched it maybe a dozen times.

Explosions never get old.  And for some people, they seem to be the answer to everything.

So I guess it's only natural, now that we're getting into hurricane season, that somebody inevitably comes up with the solution of stopping hurricanes by shooting something at them.  The first crew of rocket scientists who thought this would be a swell idea decided the best approach would be firing away at the hurricane with ordinary guns, neglecting two very important facts:
  1. Hurricanes, by definition, have extremely strong winds.
  2. If you fling something into an extremely strong wind, it can get flung back at you.
This prompted news agencies to diagram what could happen if you fire a gun into a hurricane:

So this brings "pissing into the wind" to an entirely new level.

Not to be outdone, another bunch of nimrods came up with an even better (i.e. more violent, with bigger explosions) solution; when a hurricane heads toward the U.S., you nuke the fucker.

I'm not making this up.  Apparently enough people were suggesting, seriously, that the way to deal with hurricanes was to detonate a nuclear bomb in the middle of them, that NOAA felt obliged to issue an official statement about why this would be a bad idea.

The person chosen to respond, probably by drawing the short straw, was staff meteorologist Chris Landsea.  Which brings up an important point: isn't "Landsea" the perfect name for a meteorologist?  I mean, with a surname like that, it's hard to think of what other field he could have gone into.  It reminds me of a dentist in my hometown when I was a kid, whose name was "Dr. Pulliam."  You have to wonder how many people end up in professions that match their names.  Like this guy:

And this candidate for District Attorney:

But I digress.

Anyhow, Chris Landsea was pretty unequivocal about using nukes to take out hurricanes.  "[A nuclear explosion] doesn't raise the barometric pressure after the shock has passed because barometric pressure in the atmosphere reflects the weight of the air above the ground," Landsea said.  "To change a Category 5 hurricane into a Category 2 hurricane, you would have to add about a half ton of air for each square meter inside the eye, or a total of a bit more than half a billion tons for a twenty-kilometer-radius eye.  It's difficult to envision a practical way of moving that much air around."
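Landsea's figure is easy to check with back-of-the-envelope arithmetic -- half a ton per square meter, multiplied by the area of a twenty-kilometer-radius circle:

```python
import math

# Landsea's figure: roughly half a ton of air per square meter,
# over the whole area of a twenty-kilometer-radius eye.
radius_m = 20_000
area_m2 = math.pi * radius_m ** 2        # about 1.26e9 square meters
tons_of_air = 0.5 * area_m2              # about 6.3e8 tons
print(f"{tons_of_air:.2e} tons of air")  # a bit more than half a billion tons
```

Sure enough: about 630 million tons of air, which is "a bit more than half a billion."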

And that's not the only problem.  An even bigger deal is that hurricanes are way more powerful than nuclear weapons, if you consider the energy expenditure.  "The main difficulty with using explosives to modify hurricanes is the amount of energy required," Landsea said.  "A fully developed hurricane can release heat energy at a rate of 5 to 20 x 10^13 watts and converts less than ten per cent of the heat into the mechanical energy of the wind.  The heat release is equivalent to a ten-megaton nuclear bomb exploding every twenty minutes."
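That bomb-versus-hurricane comparison checks out too, using the standard conversion of 4.184 x 10^15 joules per megaton of TNT:

```python
MEGATON_J = 4.184e15               # joules per megaton of TNT
bomb_energy = 10 * MEGATON_J       # a ten-megaton device: ~4.2e16 J
interval_s = 20 * 60               # one detonation every twenty minutes
power = bomb_energy / interval_s   # ~3.5e13 W
print(f"{power:.1e} W")            # same order as the quoted 5-20 x 10^13 W
```

One ten-megaton bomb every twenty minutes works out to about 3.5 x 10^13 watts -- right around the low end of the heat-release range Landsea quotes for a fully developed hurricane.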

And that's not even taking into account that releasing lots of radioactive fallout into an enormous, rapidly moving windstorm is a catastrophically stupid idea.

So yeah, you can shout "'Murika!" all you want, but even a moderate hurricane could kick our asses.  That may not be a bad thing; a reality check about our actual place in the hierarchy of the natural world could remind us that we are, honestly, way less powerful than nature.  An object lesson that the folks who think we can tinker around with atmospheric carbon dioxide levels with impunity might want to keep in mind.

Apparently Landsea's statement generated another flurry of suggestions of nuking hurricanes as they develop, before they get superpowerful.  The general upshot is that when Landsea rained on their parade (as it were), these people shuffled their feet and said, "Awww, c'mon!  Can't we nuke anything?"  But NOAA was unequivocal on that point, too.  Nuking tropical depressions as they form wouldn't work, not merely because only a small number of depressions ever become dangerous hurricanes, but because you're still dealing with an unpredictable natural force that isn't going to settle down just because you decided to bomb the shit out of it.

So there you are.  The latest, quintessentially American, suggestion for controlling the weather, as envisioned by people who failed ninth grade Earth Science.  As for me, the whole discussion has left me in the mood to blow stuff up.  At least vicariously.  Maybe I should go watch the wonderful video of the amazing (and real) "Barking Dog Reaction," since if I actually blow something up, my wife will probably object.  

That's the ticket.  Things going boom.  I like it.


Monday, July 24, 2023

Grammar wars

In linguistics, there's a bit of a line in the sand drawn between the descriptivists and the prescriptivists.  The former believe that the role of linguists is simply to describe language, not establish hard-and-fast rules for how language should be.  The latter believe that grammar and other linguistic rules exist in order to keep language stable and consistent, and therefore there are usages that are wrong, illogical, or just plain ugly.

Of course, most linguists don't fall squarely into one camp or the other; a lot of us are descriptivists up to a point, after which we say, "Okay, that's wrong."  I have to admit that I'm of a more descriptivist bent myself, but there are some things that bring out my inner ruler-wielding grammar teacher, like when I see people write "alot."  Drives me nuts.  And I know it's now become acceptable, but "alright" affects me exactly the same way.

It's "all right," dammit.

However, research from a paper in Nature shows that if you're of a prescriptivist disposition, you're eventually going to lose.

In "Detecting Evolutionary Forces in Language Change," Mitchell G. Newberry, Christopher A. Ahern, Robin Clark, and Joshua B. Plotkin of the University of Pennsylvania explain that language change is inevitable and unstoppable, and that even the toughest prescriptivist out there isn't going to halt the adoption of new words and grammatical forms.

The researchers analyzed over a hundred thousand texts from 1810 onward, looking for changes in morphology -- for example, the decrease in the use of past tense forms like "leapt" and "spilt" in favor of "leaped" and "spilled."  The conventional wisdom was that irregular forms (like pluralizing "goose" to "geese") persist when they're common; less frequently used words, like "turf" -- which used to pluralize to "turves" -- eventually regularize because people don't use the word often enough to learn the irregular plural, and eventually the regular plural ("turfs") takes over.

The research by Newberry et al. shows that this isn't true -- when there are two competing forms, which one wins is more a matter of random chance than commonness.  They draw a very cool analogy between this phenomenon, which they call stochastic drift, and the genetic drift experienced by evolving populations of living organisms.
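The analogy is easy to see in action.  Here's a minimal sketch (in Python, with made-up population sizes and no connection to the paper's actual methods) of neutral drift between two competing word forms -- say, "leapt" versus "leaped" -- in which each generation of speakers simply copies a randomly chosen speaker from the generation before, with neither form having any selective advantage:

```python
import random

def drift(n_speakers=200, p_irregular=0.5, generations=5000, seed=42):
    """Simulate neutral (Wright-Fisher-style) drift between two competing
    word forms.  Each generation, every speaker adopts the form of a
    randomly chosen speaker from the previous generation; neither form
    has any built-in advantage.  Returns the final frequency of the
    'irregular' form (0.0 or 1.0 means one form has completely won)."""
    random.seed(seed)
    p = p_irregular
    for _ in range(generations):
        # Binomial resampling: next generation's frequency is pure chance
        count = sum(1 for _ in range(n_speakers) if random.random() < p)
        p = count / n_speakers
        if p in (0.0, 1.0):  # one form has fixed; the other is extinct
            break
    return p

print(drift())
```

Run it with different seeds and you'll find that, starting from a dead-even 50/50 split, one form or the other typically drives its rival out of the population entirely -- but which one wins is sheer luck, exactly the way a selectively neutral allele can fix in a population through genetic drift.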

"Whether it is by random chance or selection, one of the things that is true about English – and indeed other languages – is that the language changes,” said Joshua Plotkin, who co-authored the study.  "The grammarians might [win the battle] for a decade, but certainly over a century they are going to be on the losing side.  The prevailing view is that if language is changing it should in general change towards the regular form, because the regular form is easier to remember.  But chance can play an important role even in language evolution – as we know it does in biological evolution."

So in the ongoing battles over changes in grammar, pronunciation, and spelling, the purists are probably doomed to fail.  It's worthwhile remembering how many words in modern English are the result of such mangling; both "uncle" and "umpire" came about because of an improper split of the indefinite article ("a nuncle" and "a numpire" became "an uncle" and "an umpire").  "To burgle" came about because of a phenomenon called back formation -- when a common linguistic pattern gets applied improperly to a word that sounds like it has the same basic construction.  A teacher teaches, a baker bakes, so a burglar must burgle.  (I'm surprised, frankly, given how English twists words around, that we don't have carpenters carpenting.)

Anyhow, if this is read by any hard-core prescriptivists, all I can say is "I'm sorry."  It's a pity, but the world doesn't always work the way we'd like it to.  But even so, I'm damned if I'm going to use "alright" and "alot."  A line has to be drawn somewhere.


Saturday, July 22, 2023

The celestial lighthouse

Last week I did a piece on three weird astrophysical phenomena -- odd radio circles, high-energy neutrino bursts, and fast blue optical transients -- all of which have thus far defied explanation.  And this week, a paper came out in Nature about a recent discovery adding one more to the list of unexplained celestial curiosities -- one which has the alien intelligence aficionados raising their Spock-like eyebrows in a meaningful manner (although I hasten to point out that there is no evidence that either this one, or the other three I mentioned, have anything to do with you-know-who).

However, the most recent discovery is downright bizarre.  To understand why, a bit of background.

There are many more-or-less understood phenomena in astrophysics that result in a sudden surge in electromagnetic output from an astronomical body.  Some are aperiodic, or at least infrequent, such as fast radio bursts, which were discovered back in 2007 by astrophysicists Duncan Lorimer and David Narkevic.  These are quick, transient pulses in the radio region of the spectrum, and are now thought to be due either to neutron star mergers or starquakes on the surface of magnetars.

Then there are the repeating ones, such as the fast blinking on-and-off of pulsars.  These are the rapidly whirling cores of collapsed massive stars, which funnel out beams of high-energy radiation aligned with the poles of their magnetic fields; because of the star's rotation, the beam appears to pulse, in some cases dozens of times a second.  They were discovered back in 1967 by the brilliant astronomer Jocelyn Bell Burnell, but because no one could figure out what might create a repeating signal that regular, and also because Burnell was a woman in a field almost entirely dominated by men, her discovery was derisively referred to as LGM ("Little Green Men") -- when it wasn't simply being written off as coming from some prosaic terrestrial source.  It was only when more of them were found that astronomers began to take her seriously.  In 1974, the Nobel Prize in Physics was awarded for the development of radio astronomy, and in particular, for the discovery of pulsars...

... to Antony Hewish and Martin Ryle.  Note who wasn't included.  Burnell has graciously stated that she "feels no bitterness toward the Nobel Committee," but in her place, I sure as hell would have.

The paper in Nature, however, describes an object that doesn't seem to fit any of the known types of electromagnetic pulses.  Called GPM J1839-10, it releases energy in the radio region of the spectrum.  But in terms of periodicity, it's somewhere between pulsars (which are so regular they've been proposed as celestial clocks) and fast radio bursts (which are apparently aperiodic).  GPM J1839-10 is slow -- its signal reaches a peak about every twenty-two minutes -- but it's not precisely regular.  The four-hundred-second window centered on that twenty-two-minute mark is when the peak is most likely to come, but sometimes the window passes with no peak at all.  The length of the pulses is also variable, usually between thirty and three hundred seconds.  And unlike both fast radio bursts and pulsars, the pulses themselves are quite low in energy.

As science writer John Timmer put it in Ars Technica, "The list of known objects that can produce this sort of behavior... consists of precisely zero items."

What's weirdest is that, going back through the records of astronomical observations, this object has been doing its thing for three decades, and only just now is attracting attention.  It sits out there in space, slowly flashing on and off like some sort of interstellar lighthouse, and the flat truth is that at the moment, no one has the slightest idea what it might be.

Of course, "We don't know" opens the door for a certain group of people to say "We do!"

As I've said before, no one would be more delighted than me if we did come across evidence of an extraterrestrial signal, but I strongly suspect this ain't it.  For one thing, the semi-regular blips it's putting out don't appear to contain any information; put a different way, the pattern isn't complex.  It could be a beacon, I suppose, but how you'd tell the difference between an alien-built celestial lighthouse and a star of some sort that is sending out pulses of radio waves is beyond me.  With nothing more to go on, by far the greater likelihood is that there is some natural explanation for this slowly-pulsing object -- we just haven't found it yet.

Even so, it's intriguing.  I've always loved a mystery, and this certainly is one.  It's possible that we've missed other objects of this type; the kind of detailed repeated scans of the sky in the radio region of the spectrum that it would take to detect a pulsation this slow have only begun to be done with any kind of thoroughness.  Like with Burnell's discovery of pulsars, it took finding others before astronomers had enough data to start putting together an explanation.

But if no others are found, what then?  It'll be added to the list of astronomical mysteries, of which there are plenty.  It's a big old universe, and filled with wonders, many of which we are only just beginning to understand.

And those are cool enough without the aliens.  Although, of course, I wouldn't object to the aliens as well.


Friday, July 21, 2023

Pas de deux

Ever heard of Antichthon?

Sometimes called "Counter-Earth," Antichthon is a hypothesized (now known to be nonexistent) planet in the same orbit as Earth, but on the other side of the Sun.  And, therefore, invisible to earthbound observers.  It was first proposed by the fourth century B.C.E. Greek philosopher Philolaus, who argued against the prevalent geocentric models of the day.  Philolaus thought not only that there was another planet on the opposite side of the Sun from the Earth, but that the Sun and all of the planets were orbiting around a "Central Fire" exerting an unseen influence at a distance -- thus, more or less accidentally, landing somewhere near the truth, as the entire Solar System does revolve around the center of the Milky Way galaxy.

None of Philolaus's ideas, however, were based upon careful measurements and observations; another popular notion of his time was that celestial mechanics was supposed to be beautiful, and therefore you could arrive at the right answer just by thinking about what the most elegant possible model is.  (Nota bene: I took a class called Classical Mechanics in college, and what I experienced was not "beauty" and "elegance."  Mostly what it seemed like to me was "incredibly difficult math" and "intense frustration."  So honestly, maybe Philolaus was on to something.  If I could have gotten a better grade in Classical Mechanics by dreaming up pretty but untestable claims about planets we couldn't see even if we wanted to, I'd'a been all over it.)

Anyhow, Antichthon doesn't exist, which we now know for sure both because probes sent out into the Solar System don't see a planet opposite the Earth when they look back toward the Sun, and because of the physics of orbiting bodies.  Kepler showed that the planets are in elliptical orbits, so even if Antichthon were out there, it wouldn't always be 180 degrees opposite to us, meaning that periodically it would peek out from behind the Sun and be visible to our telescopes.  Plus, an Earth-sized planet across from us would experience gravitational perturbations from Venus that would make its orbit unstable -- again, meaning it wouldn't stay put with the Sun in the way.

But there's no particular reason why there couldn't be two planets in the same orbit.  Way back in 1772, the brilliant astronomer and mathematician Joseph-Louis Lagrange found that there were stable points that small bodies could occupy, under the influence of two much larger orbiting objects (such as the Sun and the Earth).  There are, in fact, five such points, called "Lagrange points" in his honor:

[Image licensed under the Creative Commons Xander89, Lagrange points simple, CC BY 3.0]

And you can see that L3 is actually directly across from the Earth -- so Philolaus was ahead of his time.  (Once again, though, not because he'd done the mathematics, the way Lagrange did.  It was really nothing more than a shrewd guess.)  In fact, there are three points that could result in a stable configuration of two planets sharing an orbit -- L3, L4, and L5.
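To put rough numbers on the diagram, here's a quick back-of-the-envelope sketch in Python.  It uses the standard Hill-radius approximation for the distance to L1 and L2, treats L3 as sitting exactly opposite the Earth (very nearly true), and places L4 and L5 at the Earth's orbital distance, 60 degrees ahead of and behind it:

```python
import math

# Approximate positions of the five Sun-Earth Lagrange points, in the
# frame rotating with the Earth: Sun at the origin, Earth at (1 AU, 0).
M_SUN = 1.989e30      # kg
M_EARTH = 5.972e24    # kg
A = 1.496e11          # metres (1 AU)

# Hill-radius approximation for L1/L2: r ~ a * (m / 3M)^(1/3)
hill = A * (M_EARTH / (3 * M_SUN)) ** (1 / 3)

points = {
    "L1": (A - hill, 0.0),                   # between the Sun and Earth
    "L2": (A + hill, 0.0),                   # just beyond the Earth
    "L3": (-A, 0.0),                         # opposite side of the Sun
    "L4": (A * math.cos(math.radians(60)),   # 60 degrees ahead of Earth
           A * math.sin(math.radians(60))),
    "L5": (A * math.cos(math.radians(-60)),  # 60 degrees behind Earth
           A * math.sin(math.radians(-60))),
}

for name, (x, y) in points.items():
    print(f"{name}: x = {x/A:+.4f} AU, y = {y/A:+.4f} AU")
```

The upshot: L1 and L2 lie only about 0.01 AU from the Earth, while L4 and L5 sit at the Earth's own orbital distance -- which is why co-orbital "Trojan" bodies accumulate at those two points.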

The reason all this comes up is that scientists at the Madrid Center for Astrobiology have found for the first time a possible candidate for this elusive configuration -- around a T-Tauri type star called PDS 70 in the constellation of Centaurus.  The pair of planets, which appear to be gas giants, one of them three times the size of Jupiter, take 119 Earth years to circle their parent star once.

"Planets in the same orbit have so far been like unicorns," said study co-author Jorge Lillo-Box.  "They are allowed to exist by theory, but no one has ever detected them."

The discovery is so unusual that -- understandably -- the scientists are hesitant to state too decisively that it's proven.  Their paper, which appeared in the journal Astronomy and Astrophysics, indicates that they will continue to gather data from the ESO (European Southern Observatory) and ALMA (Atacama Large Millimeter Array) in Chile through 2026 to bolster their claim.

In any case, it's fascinating that a strange guess made 2,400 years ago by an obscure Greek philosopher, then shored up with rigorous mathematics by a French/Italian astronomer 250 years ago, may finally have been observed in the wild -- two planets locked in a celestial pas de deux, 370 light years away.  


Thursday, July 20, 2023

Drawing the line

A friend and loyal reader of Skeptophilia sent me a link to a YouTube video for my facepalming pleasure a couple of days ago, and being a generous sort, I wanted to share the experience with all of you.  The video is called "Nazca Lines Finally Solved!  The Answer is Amazing!", and is well worth watching in its entirety.  But if you understandably don't want to spend seven minutes of your life that you will never, ever get back watching the video, I'll provide you with a capsule summary and some editorial commentary from Yours Truly.

The Nazca Lines, you probably know, are a series of geoglyphs in southern Peru, which are large enough that their overall shape really can't be discerned except from the air.

[Image is in the Public Domain]

The relative impossibility of seeing the pattern except from above has led wingnuts such as Erich von Däniken (of Chariots of the Gods fame) to propose that they were made to signal aliens visiting Earth from other planets.  Why aliens would be impressed by our drawing a giant monkey on the ground, I have no idea.  It also bears mention that Nazca is hardly the only place in the world that has geoglyphs, and none of them have much to do with flying saucers.  There's the Cerne Abbas Giant of Dorset, England, for example, who is really really glad to see you:

[Image licensed under the Creative Commons PeteHarlow, Cerne-abbas-giant-2001-cropped, CC BY-SA 3.0]

Be that as it may, the guy in the video, one Damon T. Berry, thinks the Nazca lines are trying to tell us something.  What?  Well, he starts out with a bang by saying that "the universal language is constellations."  Whatever the fuck that means.  Given that the constellations are random assemblages of stars that would look completely different from another vantage point in space, it's hard to imagine anything "universal" about them except that they're, by default, part of the universe.

What Berry tells us then is that each of the glyphs has a code that points at a particular destination.  He starts with the glyph shaped like a bird, and then talks about birds representing flight (okay, I'm with you so far), and some of the glyphs being runways for flying machines (why the hell you'd make a runway shaped like a monkey, I have no idea), and then goes into a long part about how it's significant that the bird has four toes on one foot and five on the other.

"It is a bird," Berry says.  "It appears to be a bird.  But think like an alien.  Look closer at its feet."

I'm not sure why thinking like an alien involves looking at feet.  Maybe the aliens have some kind of weird foot fetish.  I dunno.

Anyhow, what does the fact of its having nine toes mean?  It means, Berry says, that "this is not a bird.  This is a constellation."  In fact, it's the constellation Aquila, a grouping of stars in the northern hemisphere which evidently looked like an eagle to some ancient Greeks who had just polished off their second bottle of ouzo.  The nine toes correspond to the nine brightest stars in the constellation, he says.

Then he moves on to another bird glyph, this one of a hummingbird.  Berry tells us in astonished tones that this bird has the same number of toes on each foot, as if that was an unusual condition or something.  He then says, and this is a direct quote: "The clue lies elsewhere... in the wings.  And the elongated wings are meant to draw your attention... to the wings."

I had to pause the video at this point to give myself a chance to stop guffawing.

We're then directed to count the feathers, and he comes up with eleven.  He includes the tail, but I'm not going to quibble about that because otherwise we'll be here all day.  He says that the number eleven can only mean one thing: the glyph points to the "constellation Columbia."

For the record, the constellation is actually Columba, not Columbia.  Cf. my comment about not quibbling.

The fact that Columba "has eleven stars" means there's an obvious correspondence.  Well, I have two things to say about that.
  1. Do you really think that there's nothing else in the universe that is made up of eleven parts?
  2. There are way more than eleven stars in Columba, it's just that the shape of the constellation (identified as a dove by the aforementioned ouzo-soaked Greeks) is generally outlined using the brightest eleven stars, just as Aquila was with the nine brightest as earlier described.
He then goes on to analyze the monkey glyph, and once again makes a big deal about the number of fingers and toes, which add up to fifteen.  This points to the "constellation of the monkey," which he draws for us.  It's fortunate that he does, because as I do not need to point out to any astronomy buffs out there, there is no constellation of the monkey.  As far as I can tell, he just took some random dots and connected them with straight lines to look vaguely like a monkey.

Whether ouzo was involved, I don't know.

He finishes up by basically saying that aliens are out there and will be coming to visit us from those constellations.  At this point, I started shouting at my computer, "You can't be 'from a constellation!'  The stars in a constellation have nothing to do with one another!"  This caused my dog, Rosie, to come into my office and give me the Canine Head Tilt of Puzzlement, meant to communicate the one concept she's capable of hosting in her brain ("What?").  I reassured her that I wasn't mad at her, that I was mad at the silly man on YouTube, and she accepted that and toddled off to interact with something on her intellectual level, like a dust bunny.

Anyhow.  At the end we're told we can learn more if we just watch his longer and more in-depth production, available on Amazon Prime, but I don't think I'm gonna.  I've heard enough.  Me, I'll go back to trying to figure things out through science instead of pulling random correspondences out of my ass.  Call me narrow-minded, but it seems in general like a better way to understand the universe, even if it doesn't involve counting an animal's toes and acting like it means something significant.