Skeptophilia (skep-to-fil-i-a) (n.) - the love of logical thought, skepticism, and thinking critically. Being an exploration of the applications of skeptical thinking to the world at large, with periodic excursions into linguistics, music, politics, cryptozoology, and why people keep seeing the face of Jesus on grilled cheese sandwiches.

Saturday, September 6, 2025

The lure of the unknown

Carl Sagan once said, "Somewhere, something incredible is waiting to be known."

I think that's one of the main things that attracted me to science as a child: its capacity to astonish.  I still remember reading kids' books on various scientific topics and being astounded to find out things like:

  • dinosaurs, far from being the "failed experiment" they're often characterized as, "ruled the Earth" (as it were) for about five hundred times longer than humans have even existed.  (I only much later found out that dinosaurs still exist; we call 'em birds.)
  • when supergiant stars end their lives, they detonate in a colossal explosion called a supernova that gives off in a few seconds as much energy as the Sun will emit in its entire lifetime.  What's left behind, if the star was massive enough, is a black hole, where the gravitational pull is so powerful even light can't escape.
  • bats can hear in a frequency range far above humans, and are so sensitive to their own vocalizations that they can hear the echoes of their own voices and distinguish them from the cacophony their friends and relatives are making.
  • when an object moves, its vertical and horizontal velocities are completely independent of each other.  If you shoot a gun horizontally over level ground, and simultaneously drop a bullet from the gun's muzzle height, the shot bullet and the dropped bullet will hit the ground at the same time.  (The quick sketch below shows why.)
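
A minimal sketch of that last item, in Python, ignoring air resistance and using a made-up muzzle height and speed.  The point is that the vertical equation is identical for both bullets; the horizontal velocity simply never appears in it:

    import math

    g = 9.81   # gravitational acceleration, m/s^2
    h = 1.5    # muzzle height in meters -- an assumed value
    v = 400.0  # fired bullet's horizontal speed, m/s -- also assumed

    # Vertical motion for BOTH bullets: y(t) = h - (1/2) * g * t^2.
    # Setting y = 0 and solving for t gives the fall time.
    t_fall = math.sqrt(2 * h / g)

    print(f"Both bullets land after {t_fall:.3f} seconds")
    print(f"The fired bullet is {v * t_fall:.0f} meters downrange when it lands")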

And that's all stuff we've known for years, because (not to put too fine a point on it) I'm so old that when I was a kid, the Dead Sea was just sick.  In the fifty years since I found out all of the above (and lots of other similar tidbits), scientists have discovered tons of new, and equally amazing, information about our universe and how it works.  We've even found out that some of what we thought we understood was wrong, or at least incomplete; a good example is photoperiodism, the ability of flowering plants to keep track of day length and thus flower at the right time of year.  It was initially thought that they had a system that worked a bit like a chemical teeter-totter.  A protein called phytochrome has a "dark form" and a "light form" -- the dark form changes to the light form during the day, and the reverse happens at night, so the relative amounts of the two might allow plants to keep track of day length.  But it turns out that all it takes is a flash of red light in the middle of the night to completely upend the plant's biological clock -- so whatever is going on is more complex than we'd understood.
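
If the teeter-totter idea is hard to picture, here's a toy two-state model in Python.  Every number in it -- the rate constants, the starting fraction, the length of the flash -- is invented purely for illustration (real phytochrome kinetics are far messier), but it shows why a brief flash of light in the middle of the night makes the night "look" shorter to the plant:

    def pfr_at_dawn(flash=False, dt=0.001):
        """One 24-hour cycle: light for hours 0-12, dark for 12-24.
        Returns the fraction of phytochrome in the light form (Pfr)
        at the end of the night.  All rate constants are invented."""
        pfr = 0.5                                # arbitrary starting fraction
        t = 0.0
        while t < 24.0:
            light = t < 12.0
            if flash and 18.0 <= t < 18.05:      # ~3-minute flash, mid-night
                light = True
            if light:
                pfr += 50.0 * (1.0 - pfr) * dt   # fast conversion to light form
            else:
                pfr -= 0.3 * pfr * dt            # slow reversion to dark form
            t += dt
        return pfr

    print(f"undisturbed night:      Pfr at dawn = {pfr_at_dawn():.3f}")
    print(f"with a mid-night flash: Pfr at dawn = {pfr_at_dawn(True):.3f}")

In the toy model, the flash converts most of the protein back to the light form partway through the night, so the dawn reading corresponds to a night only half as long as the one the plant actually experienced -- roughly the effect the red-flash experiments revealed.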

This sudden sense of "wow, we don't know as much as we thought!", far from being upsetting, is positively thrilling to scientists.  Scientists are some of the only people in the world who love saying, "I don't understand."  Mostly because they always follow it up with "... yet."  Take, for example, the discovery announced this week by the National Radio Astronomy Observatory of a huge cloud of gas and dust in our own Milky Way Galaxy that prior to this we hadn't even known was there.

It's been named the Midpoint Cloud, and it's about two hundred light-years across.  It lies along one of the dust lanes spiraling in toward Sagittarius A*, the supermassive black hole at the galaxy's center, and seems to act like a giant funnel drawing material inward toward the galactic center.

"One of the big discoveries of the paper was the giant molecular cloud," said Natalie Butterfield, lead author of the paper on the phenomenon, which appeared this week in The Astrophysical Journal.  "No one had any idea this cloud existed until we looked at this location in the sky and found the dense gas.  Through measurements of the size, mass, and density, we confirmed this was a giant molecular cloud.  These dust lanes are like hidden rivers of gas and dust that are carrying material into the center of our galaxy.  The Midpoint Cloud is a place where material from the galaxy's disk is transitioning into the more extreme environment of the galactic center and provides a unique opportunity to study the initial gas conditions before accumulating in the center of our galaxy."

[Image credit: NSF/AUI/NSF NRAO/P.Vosteen]

Among the amazing features of this discovery is that it contains a maser -- an intense, focused microwave source, in this case thought to be caused by compression and turbulence in the ammonia-rich gas of the cloud.  Additionally, there are several sites that seem to be undergoing collapse; we might be witnessing the birth of new stars.

What's astonishing to me is that this cloud is (1) humongous, (2) in our own galaxy, and (3) glowing like crazy in the microwave region of the spectrum, yet no one had any idea it was there until now.  How much more are we overlooking because we haven't tuned into the right frequency or turned our telescopes to the right coordinates?

The universe is a big place.  And, I suspect, it's absolutely full of surprises.  Hell, there are enough surprises lying in wait right here on the Earth; to give just one example, I've heard it said that we know more about the near side of the Moon than we do about the deep oceans.

How could anyone not find science fascinating?

This is also why I've never understood how anyone could turn science's capacity for progress into a criticism -- I used to hear it from students phrased as, "Why do we have to learn all this stuff when it could all be proven wrong tomorrow?"  Far from being a downside, science's capacity to update and self-correct is its most powerful strength.  How is it somehow better to cling to your previous understanding in the face of evidence to the contrary?

That, I don't think I'll ever come close to comprehending.

I'll end with another quote from a scientific luminary -- the brilliant physicist Richard Feynman -- that I think sums it all up succinctly: "I'd much rather have questions that cannot be answered than answers that cannot be questioned."

****************************************


Friday, September 5, 2025

Mind the gap

In 1869, explorer John Wesley Powell did the first systematic study of the geology of the Grand Canyon.  As impressive as it is, the Grand Canyon's not that complicated geologically; it's made of layers of sedimentary rock, most of them relatively undeformed, stacked one on top of the other from the oldest at the bottom to the newest at the top.  A layer cake of well over a billion years of Earth history, and a wonderful example of the principle of superposition -- that strata form from the bottom up.

However, Powell also noted something rather peculiar.  It's called the Great Unconformity.  In geologic parlance, an unconformity is a break in the rock record, where the layer below is separated from the layer above by a gap in time when either no rocks were deposited (in that location, at least), or the rocks that were laid down were later removed by some natural process.  At that stage in the science, Powell didn't know when exactly the Great Unconformity occurred, but it was obvious that it was huge.  Something had taken away almost a billion years' worth of rocks -- and, it was later found out, that same chunk of rock was missing not only at the future site of the Grand Canyon, but across most of North America.

It was an open question as to why this happened, but one leading hypothesis was that it was massive glaciation.  Glaciers are extraordinarily good at breaking up rocks and moving them around, as I find out every time I dig in my garden and my shovel runs into the remnants of the late Pleistocene continental glaciation.  At that point, where my house is would have been under about thirty meters of ice; the southern extent is the Elmira moraine, a line of low hills fifty kilometers south of here, left behind when the glaciers, pushing piles of crushed rock and soil ahead of them like a backhoe, began to melt back and left all that debris for us gardeners to contend with ten thousand years later.

There was a time in which the Earth was -- as far as we can tell -- completely covered by ice. The Cryogenian Period, during the late Precambrian, is sometimes nicknamed the "Snowball Earth" -- and the thawing might have been one contributing factor to the development of complex animal life, an event called the "Cambrian explosion," about which I've written before.

The problem was, the better the data got, the more implausible this sounded as the cause of the Great Unconformity.  The rocks missing in the Great Unconformity seem to have preceded the beginning of the Cryogenian Period by a good three hundred million years.  And while there were probably earlier periods of worldwide glaciation -- perhaps several of them -- the fact that the Cryogenian came and went and didn't leave a second unconformity above the first led scientists away from this as an explanation.

However, a paper in Proceedings of the National Academy of Sciences, written by a team led by Francis Macdonald of the University of Colorado Boulder, has come up with evidence supporting a different explanation.  Using samples of rock from Pikes Peak in Colorado, Macdonald's team employed a clever technique called thermochronology to estimate how much rock had been removed.  Thermochronology relies on the fact that some radioactive elements release helium-4 as a decay product, and helium (being a gas) diffuses out of the rock -- the warmer the rock, the faster the helium leaves.  So the amount of helium retained in the rock gives you a good idea of the temperature it experienced -- and thus how deeply it was buried, since temperature rises the deeper down you go.
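
Stripped of the real-world complications (helium loss actually follows temperature-dependent diffusion laws, and geothermal gradients vary from place to place), the chain of inference runs: helium retained tells you peak temperature, and peak temperature tells you burial depth.  Here's a minimal sketch of that last step, with illustrative numbers that are assumptions rather than the study's data:

    surface_temp_c = 10.0      # assumed mean surface temperature, deg C
    gradient_c_per_km = 25.0   # a typical continental geothermal gradient

    def burial_depth_km(peak_temp_c):
        """Depth of burial needed for a rock to reach a given temperature."""
        return (peak_temp_c - surface_temp_c) / gradient_c_per_km

    # Suppose helium loss implies a sample once sat at ~150 deg C (a made-up figure):
    print(f"~{burial_depth_km(150.0):.1f} km of rock once lay on top")   # ~5.6 km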

What this told Macdonald's team is that the Pikes Peak granite, from right below the Great Unconformity, had once been buried under several kilometers of rock that had then been eroded away.  And from the timing of the removal -- on the order of a billion years ago -- it seems that what was responsible wasn't glaciation, but the formation of a supercontinent.

But not Pangaea, which is what most people think of when they hear "supercontinent."  Pangaea formed much later, something like 330 million years ago, and is probably one of the factors that contributed to the massive Permian-Triassic extinction.  This was two supercontinents earlier, specifically one called Rodinia.  What Macdonald's team proposes is that when Rodinia formed from prior separate plates colliding, this caused a huge amount of uplift, not only of the rocks of the continental chunks, but of the seafloor between them.  A similar process is what formed the Himalayas, as the Indian Plate collided with the Eurasian Plate -- and is why you can find marine fossils at the top of Mount Everest.

[Image is in the Public Domain]

When uplift occurs, erosion increases, as water and wind take those uplifted bits, grind them down, and attempt to return them to sea level.  And massive-scale uplift results in a lot of rock being eroded.

Thus the missing layers in the Great Unconformity.

"These rocks have been buried and eroded multiple times through their history," study lead author Macdonald said, in an interview with Science Daily.  "These unconformities are forming again and again through tectonic processes.  What's really new is we can now access this much older history...  The basic hypothesis is that this large-scale erosion was driven by the formation and separation of supercontinents.  There are differences, and now we have the ability to perhaps resolve those differences and pull that record out."

What I find most amazing about this is how the subtle chemistry of rock layers can give us a lens into the conditions on the Earth a billion years ago.  Our capacity for discovery has expanded our view of the universe in ways that would have been unimaginable only thirty years ago.

And now we have a well-supported hypothesis accounting for one of the great geological mysteries -- what happened to the kilometers of rock missing from sedimentary strata all over North America.

John Wesley Powell, I think, would have been thrilled.

****************************************


Thursday, September 4, 2025

The silent battle

Today, from the "Who Could Have Predicted This Besides Everybody?" department, we have a study by psychologist David Blanchflower of Dartmouth College and colleagues, which found that in the United States the mental health of young people has declined enough to eliminate the "unhappiness hump" -- the long-standing pattern in which the youngest and oldest were overall the happiest, with dissatisfaction peaking in middle age.

Now the lowest levels of happiness are found in those between ages thirteen and twenty-five, with happiness climbing slowly but steadily with age thereafter.

I don't know about you, but this came as no surprise to me.  I've often thought that I would not want to be a teenager today.  Some things have improved markedly -- opportunities for women and acceptance of minorities and LGBTQ+ people, for example -- but so many new factors have cropped up making life riskier and more difficult that it's hardly to be wondered at that young people are anxious.  

Let's start with the fact that the current regime (1) is doing its level best to strip rights from anyone who isn't a straight white Christian male, (2) shows little regard for protecting what's left of the environment, and (3) is in the process of wrecking the economy with the ongoing tariff craziness.  (About the latter, Trump and his cronies have taken the toddler-ish approach of "I'll just lie about it and everyone will believe me!" by declaring that the deficit is gone, jobs are surging, and the economy is booming.  And don't believe the cash register; the prices of groceries and gasoline are down across the nation.  Oh, and these are not the droids you're looking for.)

I mean, I'm retired, and I find it all depressing.  A college student today facing the current job market would have to be willfully blind not to be anxious about their future.

What gets me, though, is how much you still hear the "suck it up and deal" response from the adults.  To take just one example -- why should recent graduates be asking for student loan forgiveness?  After all, we paid our student loans when we were that age, right?

Yep, we did.  There's a reason for that.  Between 1978 and 1982, my tuition at the University of Louisiana, along with all my textbooks, came to a total of about a thousand dollars a semester.  Now the average for tuition alone is around twelve thousand dollars a semester -- four times that if you go to a private school.  Housing prices have gone up drastically as well; in 1980, the average house sale price was seventy-five thousand dollars, and now it's four hundred thousand.  The truth is that purchasing a house shortly after entering the job market was a realistic goal for someone in my generation, but for the current generation, it simply isn't.  In fact, owning a home in the foreseeable future is out of reach for the majority of today's college graduates.
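
Here's that arithmetic made explicit -- a quick sanity check in Python, assuming a rough cumulative inflation factor of about 3.9 between 1980 and today (an approximation from the consumer price index; the dollar figures are the ones quoted above):

    INFLATION_1980_TO_NOW = 3.9   # approximate cumulative CPI factor -- an assumption

    def real_multiple(cost_then, cost_now):
        """How many times more expensive something is after adjusting for inflation."""
        return cost_now / (cost_then * INFLATION_1980_TO_NOW)

    print(f"tuition: {real_multiple(1_000, 12_000):.1f}x in real terms")    # ~3.1x
    print(f"housing: {real_multiple(75_000, 400_000):.1f}x in real terms")  # ~1.4x

Even with general inflation factored out, a semester's tuition costs roughly three times what it did in real dollars.  "We paid our loans off" just isn't a comparable claim.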

It's no wonder there's a "looming mental health crisis" -- to quote Blanchflower et al. -- amongst today's young people.

This crisis is exacerbated by people who seem bound and determined to paint this entire generation as "lazy" or "entitled," when in fact they are reacting the way just about any of us would when faced with impossible odds.

Just yesterday I saw someone post on social media how infuriating it was that the emergency room was "clogged" with teenagers having emotional breakdowns now that the fall semester of college has started, and that they were sick of these needy kids expecting everyone to drop everything and minister to their whims.  The truth is grimmer than that, and I can say this with some authority, as a person who has struggled with crippling anxiety and depression my entire life.  Depressed people don't fake being mentally ill to get attention; we fake being okay to avoid it.  When you see someone actually having a crisis, it is almost always because they have spent hours or days or weeks trying to suppress it, and eventually simply couldn't any more.

[Image licensed under the Creative Commons Sander van der Wel from Netherlands, Depressed (4649749639), CC BY-SA 2.0]

Might there be people who fake an episode in order to get care they don't really need?  Sure.  It's called Munchausen syndrome.  But it's really uncommon.  And in any case, isn't it better to give unnecessary care to one person who is pretending to be ill than to deny care to a hundred who do need it because you've decided they're all malingerers?

Maybe try having a scrap of fucking compassion.

Most of us mentally ill people are struggling along, trying to find a way to cope with a world that seems increasingly engineered to drag us down, while relying on a mental health care system that is drastically inadequate -- understaffed, overworked, and in general spread far too thin for the need.  For myself, I manage most days.  Some days I don't.  On those days I lean hard into something a therapist told me -- "the biggest lie depression tells you is that the lows are permanent."

But as far as the way we treat others goes, that we can fix.  We can work toward changing our society to lower the stressors on the upcoming generation.  We can support our mental health care professionals, who are trying the best they can under extreme difficulties.  And -- most of all -- we can recall what a family friend told me when I was six years old.  I'd come home from school with my knickers in a twist over some perceived wrong by a classmate, and our wise friend blindsided me by saying, "Don't be so hard on your friend.  You should always be kinder than you think you need to be, because everyone you meet is fighting a terrible battle that you know nothing about."

****************************************


Wednesday, September 3, 2025

The skull in the cave

"If humans came from monkeys, why are there still monkeys?"

If there is one phrase that makes me want to throw a chair across the room, it's that one.  (Oh, that and, "The Big Bang means that nothing exploded and became everything.")  Despite the fact that a quick read of any of a number of reputable sites about evolution would make it clear that the question is ridiculous, I still see it asked in such a way that the person evidently thinks they've scored some serious points in the debate.  My usual response is, "My ancestors came from France.  Why are there still French people?"  But the equivalence of the two seems to go so far over their heads that it doesn't even ruffle their hair.

Of course, not all the blame lies with the creationists and their ilk.  How many times have you seen, in otherwise accurate sources, human evolution depicted with an illustration like this?


It sure as hell looks like each successive form completely replaced the one before it, so laypeople are perhaps to be excused for coming away with the impression that this is always the way evolution works.  In fact, cladogenesis (branching evolution) is far and away the more common pattern, where species split over and over again, with different branches evolving at different rates or in different directions, and some of them becoming extinct.

If you're curious, this is the current best model we have for the evolution of hominins:

The cladogenesis of the hominin lineage; the vertical axis is time in millions of years before present  [Image licensed under the Creative Commons Dbachmann, Hominini lineage, CC BY-SA 4.0]

The problem also lies with the word species, which has far and away the mushiest definition in all of biological science.  As my evolutionary biology professor put it, "The only reason we came up with the idea of species as being these little impermeable containers is that we have no near relatives."  In fact, we now know that many morphologically distinct populations, such as the Neanderthals and Denisovans, freely interbred with "modern" Homo sapiens.  Most people of European descent have Neanderthal markers in their DNA; when I had my DNA sequenced a few years ago, I was pleased to find out I was above average in that regard, which is undoubtedly why I like my steaks medium-rare and generally run around half-naked when the weather is warm.  Likewise, many people of East Asian, Indigenous Australian, Native American, and Polynesian descent carry Denisovan DNA -- evidence that those hard-and-fast "containers" aren't so water-tight after all.

The reason all this comes up is a new study of the "Petralona Skull," a hominin skull found covered in dripstone (calcium carbonate) in a cave near Thessaloniki, Greece.  The skull has been dated to somewhere between 277,000 and 539,000 years ago -- the uncertainty comes from the range of estimates for how quickly the calcite layers formed.
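
The logic behind a bracket like that is simple division: take the measured thickness of the calcite crust and divide by the fastest and slowest plausible growth rates.  A minimal sketch, with numbers invented purely for illustration (chosen so the output lands near the published range -- they are not the actual measurements):

    thickness_mm = 35.0            # hypothetical dripstone thickness
    rate_fast_mm_yr = 0.000126     # hypothetical upper bound on growth rate
    rate_slow_mm_yr = 0.000065     # hypothetical lower bound on growth rate

    age_min = thickness_mm / rate_fast_mm_yr
    age_max = thickness_mm / rate_slow_mm_yr
    print(f"age between {age_min:,.0f} and {age_max:,.0f} years")

A factor-of-two uncertainty in the growth rate translates directly into a factor-of-two uncertainty in the age, which is why the published bracket is so wide.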

The Petralona Skull  [Image licensed under the Creative Commons Nadina / CC BY-SA 3.0]

Even with the uncertainty, this range puts it outside of the realm of possibility that it's a modern human skull.  Morphologically, it seems considerably more primitive than typical Neanderthal skulls, too.  So it appears that there was a distinct population of hominins living in southern Europe and coexisting with early Neanderthals -- one about which paleontologists know next to nothing.

Petralona Cave, where the skull was discovered [Image licensed under the Creative Commons Carlstaffanholmer / CC BY-SA 3.0]

So our family tree turns out to be even more complicated than we'd realized -- and there might well be an additional branch, not in Africa (where most of the diversification in hominins occurred) but in Europe.  

You have to wonder what life was like back then.  This would have been during the Hoxnian (Mindel-Riss) Interglacial, a period of warm, wet conditions, when much of Europe was covered with dense forests.  Fauna would have included at least five species of mammoths and other elephant relatives, the woolly rhinoceros, the cave lion, cave lynx, and cave bear, the "Irish elk" (which, as the quip goes, was neither), and the "hypercarnivorous" giant dog Xenocyon.

Among many others.

So as usual, the mischaracterization of science by anti-science types misses the reality by a mile, and worse, misses how incredibly cool that reality is.  The more we find out about our own species's past, the richer it becomes.

I guess if someone wants to dismiss it all with a sneering "why are there still monkeys?", that's up to them.  But me, I'd rather keep learning.  And for that, I'm listening to what the scientists themselves have to say.

****************************************


Tuesday, September 2, 2025

Cry me a river

Urban legends often have nebulous origins.  As author Jan Harold Brunvand describes in his wonderful book The Choking Doberman and Other Urban Legends, "Urban legends are kissing cousins of myths, fairy tales and rumors.  Legends differ from rumors because the legends are stories, with a plot.  And unlike myths and fairy tales, they are supposed to be current and true, events rooted in everyday reality that at least could happen...  Urban legends reflect modern-day societal concerns, hopes and fears...  They are weird whoppers we tell one another, believing them to be factual.  They maintain a persistent hold on the imagination because they have an element of suspense or humor, they are plausible, and they have a moral."

It's not that there's anything wrong with urban legends per se.  A lot of the time, we're well aware that they're just "campfire stories" that are meant to scare, amuse, or otherwise entertain, and (absent any further evidence) are just as likely to be false as true.  After all, humans have been storytellers for a very long time, and -- as a fiction writer -- I'd be out of a job if we didn't have an appetite for tall tales.

The problem arises when someone has a financial interest in getting folks to believe that some odd claim or another is true.  Then you have unethical people making money off others' credulity -- and often, along the way, obscuring or outright covering up any evidence to the contrary.  And it's worse still when the guilty party is part of the news media.

Which brings us to The Sun and the legend of the "Crying Boy."

Back in 1985 the British tabloid newspaper The Sun reported that a firefighter in Essex had more than once found undamaged copies of a painting of a crying child in houses that had otherwise been reduced to rubble by fires.  Upon investigation, they said, they found that the painting was by Italian painter Giovanni Bragolin.


If that wasn't weird enough, The Sun claimed they'd found out that Bragolin was an assumed name, and that the painter was a mysterious recluse named Franchot Seville.  Seville, they said, had found the little boy -- whose name was Don Bonillo -- after an unexplained fire had killed both of his parents.  The boy was adopted by a priest, but fires seemed to follow in his wake wherever he went, to the extent that he was nicknamed "El Diablo."  In 1970, the engine of a car the boy was riding in exploded, killing him along with the painter and the priest.

But, The Sun asked, did the curse follow even the paintings of the boy's tragic, weeping face?

It wasn't phrased as a headline, but we can still invoke Betteridge's Law, which holds that anything like that phrased as a question can be answered "no."  Further inquiries by less biased investigators found that the story had enough holes to put a Swiss cheese to shame.  There was no Don Bonillo; the model for the little boy was just some random kid.  Yes, Bragolin went by the pseudonym Franchot Seville, but Bragolin was itself an assumed name; the painter's real name was Bruno Amadio, and he was alive, well, and still painting children with big sad eyes right up until his death from natural causes in 1981 at age seventy.

As for the survival of the paintings, that turned out not to be much of a mystery, either.  Bragolin/Seville/Amadio cranked out at least sixty different crying-child paintings, from which literally tens of thousands of prints were made and then shipped out to department stores all across southern England.  They sold like hotcakes for some reason.  (I can't imagine why anyone would want a painting of a weepy toddler on their wall, but hey, you do you.)  The prints were made on heavy compressed cardboard and coated with fire-retardant varnish.  Investigators Steven Punt and Martin Shipp actually purchased one of the prints and deliberately tried to set it alight, but the thing wouldn't burn.  The surmise was that when the rest of the house went up in flames, the string holding the frame to the wall burned through and the print fell face-down on the floor, protecting it from damage.

Of course, a prosaic explanation like that was not in the interest of The Sun, which survives by keeping sensationalized stories alive for as long as possible.  So no mention was made of Punt and Shipp and the probable explanation for the paintings' survival.  Instead, they repeated the claims of a "curse," and told readers that if they owned a copy of The Crying Boy and wanted to get rid of it, The Sun would organize a public bonfire to destroy the prints forever.

How they were going to accomplish this, given that the whole shtick had to do with the fact that the painting couldn't be burned, I have no idea.  But this evidently didn't occur to the readers, because within weeks The Sun had received hundreds of copies.  A fire was held along the banks of the Thames in which the mailed-in prints were supposedly destroyed, an event about which a firefighter who had supervised the burning said, "I think there will be many people who can breathe a little easier now."

This in spite of the fact that the whole thing had been manufactured by The Sun.  There would have been no widespread fear, no need for people to "breathe uneasily," if The Sun hadn't hyped the claim to begin with -- and, more importantly, ignored completely the entirely rational explanation for the few cases where the painting had survived a house fire.

It's probably unnecessary for me to say that this kind of thing really pisses me off.  Humans are credulous enough; built-in cognitive tendencies like confirmation bias, dart-thrower's bias, and the argument from ignorance already make it hard enough for us to sort fact from fiction.  Okay, The Sun is a pretty unreliable source to start with, but the fact remains that thousands of people read it -- and, presumably, a decent fraction of those take its reporting seriously.

The fact that it would deliberately mislead is infuriating.

The result is that the legend still persists today.  There are online sites for discussing curses, and The Crying Boy comes up all too frequently, often with comments like "I would never have that in my house!"  (Well, to be fair, neither would I, but for entirely different reasons.)  As Brunvand points out in The Choking Doberman, one characteristic of urban legends is that they take on a life of their own.  Word of mouth is a potent force for spreading rumor, and once these sorts of tales get launched, they are as impossible to eradicate as crabgrass.

But what's certain is that we do not need irresponsible tabloids like The Sun making matters worse.

****************************************


Monday, September 1, 2025

Life, not as we know it

I've written here before about unusual paleontological discoveries -- illustrations of the fact that Darwin's lovely phrase "endless forms most beautiful and most wonderful" has applied throughout Earth's biological history.

We could also add the words "... and most weird."  Some of the fossils paleontologists have uncovered look like something from a fever dream.  A while back I wrote about the absolutely bizarre "Tully Monster" (Tullimonstrum gregarium), which is so different from all other known life forms that biologists can't even figure out whether it was a vertebrate or an invertebrate.  But Tully is far from the only creature that has defied classification.  Here are a few more examples of peculiar organisms whose placement on the Tree of Life is very much up for debate.

First, we have the strange Tribrachidium heraldicum, a creature with no certain relationship to any species of its time or since.  It had threefold symmetry -- itself pretty odd -- and its species name heraldicum comes from its striking resemblance to the triskelion design on the coat of arms of the Isle of Man:

Tribrachidium fossil from near Arkhangelsk, Russia [Image licensed under the Creative Commons Aleksey Nagovitsyn (User:Alnagov), Tribrachidium, CC BY-SA 3.0]

Despite superficial similarities to modern cnidarians (such as jellyfish) or echinoderms (such as sea urchins and starfish), Tribrachidium seems to be neither.  It -- along with a great many of the Ediacaran assemblage, the organisms that dominated the seas during the late Precambrian, between 635 and 538 million years ago -- is a mystery.

The Ediacaran is hardly the only time we have strange and unclassifiable life forms.  From much later, during the Carboniferous Period (on the order of three hundred million years ago), the Mazon Creek Formation in Illinois has brought to light some really peculiar fossils.  One of the most baffling is Etacystis communis, nicknamed the "H-animal":

Reconstruction of Etacystis [Image is in the Public Domain]

It's an invertebrate, but otherwise we're still at the "but what the hell is it?" stage with this one.  The best guess is that it might be a distant relative of the hemichordates ("acorn worms"), but even that is speculative.

Next we have Nectocaris.  The name means "swimming shrimp," but a shrimp it definitely was not.  It was next thought to be some kind of primitive cephalopod, perhaps related to cuttlefish or squid, but that didn't hold water, either.  It had a long fin down each side that it probably used for propulsion, and a funnel-shaped feeding tube (which you can see folded to the left in the photograph below):

Photograph of a Nectocaris fossil from the Burgess Shale Formation, British Columbia [Image is in the Public Domain]

All of the known Nectocaris fossils come from the Cambrian.  It's possible that they were cousins of modern chaetognaths ("arrow worms"), but once again, no one is really sure.

Another Cambrian animal that has so far defied classification is Allonnia, which was initially thought to be related to modern sponges, but its microstructure is so different that it and its kin are now placed in their own family, Chancelloriidae.  You can see why the paleontologists were fooled for a while:

Reconstruction of Allonnia from fossils recovered from the Chengjiang Formation, Yunnan Province, China [Image licensed under the Creative Commons, Yun et al. 2024 f05 (preprint), CC BY 4.0]

At the moment, Allonnia and the other chancelloriids are thought to represent an independent branch of Kingdom Animalia that went extinct in the mid-Cambrian and left no descendants -- or even near relatives.

Last, we have the bizarre Namacalathus hermanastes, which has been found in (very) late Precambrian rocks in such widely separated sites as Namibia, Canada, Paraguay, Oman, and Russia.  Check out the reconstruction of this beast:

[Image credit Zhuravlev, Wood, and Penny, Proceedings of the Royal Society B, November 2015]

It's been tentatively connected to the lophophorates (a group that includes the much more familiar brachiopods), but if so, it must be a distant relationship, because it looks a great deal more like something H. P. Lovecraft might have dreamed up:


Unlike the, um, "Yuggothians," though, Namacalathus was quite real.  And, apparently, widespread.

Those ancient seas must have contained plenty of nightmare fuel.

And those are just five examples of organisms that would have certainly impelled Dr. McCoy to say, "It's life, Jim, but not as we know it."  Given how infrequently organisms fossilize -- the vast majority die, decay away, and leave no traces, and the vagaries of geological upheaval often destroy the fossil-bearing strata that did form -- you have to wonder what we're missing.  Chances are, for every one species we know about, there are hundreds more we don't.

What even more bizarre life forms might we see if we could actually go back into the far distant past?

I guess we'll have to wait until someone invents a time machine to find out.

****************************************


Saturday, August 30, 2025

The universal language

Sometimes I have thoughts that blindside me.

The last time that happened was a couple of days ago, while I was working in my office and our puppy, Jethro, was snoozing on the floor.  Well, as sometimes happens to dogs, he started barking and twitching in his sleep, and followed it up with sinister-sounding growls -- all the more amusing because while awake, Jethro is about as threatening as your average plush toy.

So my thought, naturally, was to wonder what he was dreaming about.  Which got me thinking about my own dreams, and recalling some recent ones.  I remembered some images, but mostly what came to mind were narratives -- first I did this, then the slimy tentacled monster did that.

That's when the blindside happened.  Because Jethro, clearly dreaming, was doing all that without language.

How would thinking occur without language?  For almost all humans, our thought processes are intimately tied to words.  In fact, the experience of having a thought that isn't describable using words is so unusual that we have a word for it -- ineffable.

Mostly, though, our lives are completely, um, effable.  So much so that trying to imagine how a dog (or any other animal) experiences the world without language is, for me at least, nearly impossible.

What's interesting is how powerful this drive toward language is.  There have been studies of pairs of "feral children" who grew up together but with virtually no interaction with adults, and in several cases those children invented spoken languages with which to communicate -- each complete with its own syntax, morphology, and phonetic structure.

A fascinating study that came out in the Proceedings of the National Academy of Sciences, detailing research by Manuel Bohn, Gregor Kachel, and Michael Tomasello of the Max Planck Institute for Evolutionary Anthropology, showed that you don't even need the extreme conditions of feral children to induce the invention of a new mode of symbolic communication.  The researchers set up Skype conversations between monolingual English-speaking children in the United States and monolingual German-speaking children in Germany, but simulated a computer malfunction where the sound didn't work.  They then instructed the children to communicate as best they could anyhow, and gave them some words/concepts to try to get across.

They started out with some easy ones.  "Eating" resulted in the child miming eating from a plate, unsurprisingly.  But they moved to harder ones -- like "white."  How do you communicate the absence of color?  One girl came up with an idea -- she was wearing a polka-dotted t-shirt, and pointed to a white dot, and got the idea across.

But here's the interesting part.  When the other child later in the game had to get the concept of "white" across to his partner, he didn't have access to anything white to point to.  He simply pointed to the same spot on his shirt that the girl had pointed to earlier -- and she got it immediately.

Language is defined as arbitrary symbolic communication.  Arbitrary, because with the exception of a few cases like onomatopoeic words (bang, pow, ping, etc.), there is no logical connection between the sound of a word and its referent.  Well, here we have a beautiful case of the origin of an arbitrary symbol -- in this case, a gesture -- that gained meaning only because the recipient of the gesture understood the context.

I'd like to know if such a gesture-language could gain another characteristic of true language -- transmissibility.  "It would be very interesting to see how the newly invented communication systems change over time, for example when they are passed on to new 'generations' of users," said study lead author Manuel Bohn, in an interview with Science Daily.  "There is evidence that language becomes more systematic when passed on."

In time, might you end up with a language that was so heavily symbolic and culturally dependent that understanding it would be impossible for someone who didn't know the cultural context -- like the Tamarians' language in the brilliant, poignant, and justifiably famous Star Trek: The Next Generation episode "Darmok"?

"Sokath, his eyes uncovered!"

It's through cultural context, after all, that languages start developing some of the peculiarities (also seemingly arbitrary) that led Edward Sapir and Benjamin Whorf to develop the hypothesis that now bears their names -- that the language we speak alters our brains and changes how we understand abstract concepts.  In K. David Harrison's brilliant book The Last Speakers, he tells us about a conversation with some members of a nomadic tribe in Siberia who always described positions of objects relative to the four cardinal directions -- so at the moment my coffee cup wouldn't be on my right, it would be south of me.  When Harrison tried to explain to his Siberian friends how we describe positions, at first he was greeted with outright bafflement.

Then, they all erupted in laughter.  How arrogant, they told him, that you see everything as relative to your own body position -- as if when you turn around, suddenly the entire universe changes shape to compensate for your movement!



Another interesting example of this was the subject of a 2017 study by linguists Emanuel Bylund and Panos Athanasopoulos, this one focused not on our experience of space but of time.  And they found something downright fascinating.  Some languages (like English) are "future-in-front," meaning we think of the future as lying ahead of us and the past behind us, turning time into something very much like a spatial dimension.  Other languages retain the spatial aspect but reverse the direction -- such as Aymara, spoken in the Andes.  For Aymara speakers, the past is in front, because you can remember it, just as you can see what's in front of you.  The future is behind you -- and therefore invisible.

Mandarin takes the spatial axis and turns it on its head -- the future is down, the past is up (so the literal translation of the Mandarin expression for "next week" is "down week").  Asked to order photographs of someone in childhood, adolescence, adulthood, and old age, Mandarin speakers will tend to arrange them vertically, with the youngest on top.  English and Swedish speakers tend to think of time as a line running from left (past) to right (future); Spanish and Greek speakers tend to picture time as a volume, as if it were something filling a container (so emptier = past, fuller = future).

All of which underlines how fundamental language is to our thinking -- and leaves me all the more baffled when I try to imagine how other animals think.  Because whatever Jethro was imagining in his dream, he was clearly understanding and interacting with it, even if he didn't know to attach the word "squirrel" to the concept.

****************************************