Skeptophilia (skep-to-fil-i-a) (n.) - the love of logical thought, skepticism, and thinking critically. Being an exploration of the applications of skeptical thinking to the world at large, with periodic excursions into linguistics, music, politics, cryptozoology, and why people keep seeing the face of Jesus on grilled cheese sandwiches.

Friday, May 31, 2024

The migrants

Most people know of at least two reasons that organisms can evolve.  The first, of course, is natural selection; members of the same species with inheritable differences can have different survival or reproductive rates, leading to overall shifts in the genetic makeup of the population.  The second is catastrophe; a major external event, such as the eruption of the Siberian Traps or the Chicxulub impact, can completely destabilize what had been a thriving ecosystem, and send the selective pressures off in a completely different direction.  (The two I mentioned were the dominant factors in the Permian-Triassic and Cretaceous-Tertiary extinctions, respectively.)

Less well-known is the role that plate tectonics can play.  When two land masses split apart, the organisms then go their separate ways evolutionarily, especially once the two pieces drift far enough away from each other to experience significantly different climates.  This is what happened to Australia, which most recently was connected to Antarctica; once they diverged, Australia moved northward and Antarctica southward, resulting in just about everything in Antarctica becoming extinct as the temperatures dropped, and leaving Australia with its unique assemblage of species.

The opposite can happen when two continents run into each other.  This occurred when India broke away from the southern supercontinent (Gondwana) and collided with Asia about fifty million years ago, introducing southern species to the northern supercontinent (Laurasia).  But an even more striking example occurred when North and South America got close enough that a bit of the seafloor was pushed above water, creating the Isthmus of Panama.

When this happened, on the order of three million years ago, it opened up an easy avenue of two-way migration between the two continents.  This reconnected land masses that had been separated since the breakup of Pangaea around the end of the Triassic Period, on the order of two hundred million years ago.  That's a long time for species assemblages to evolve in their own directions, and the result was two entirely different floras and faunas.  Those began to move back and forth across the gap as soon as the isthmus formed.

What is curious -- and still largely unexplained -- is why the survival rates of the northward and southward migrants were so drastically different.  Species went both directions; that much is clear from the fossil record.  Just looking at mammals, South America gained (and still has) various species of cats, wolves, foxes, peccaries, deer, skunks, bears, and mice from North America, to name only a few of the groups that moved in and thrived.  But going the other direction?

There were only three survivors.  The opossum, the armadillo, and the porcupine are the only mammalian South American imports we still have around today.  Others that attempted the northward trek, including ground sloths, glyptodonts, "terror birds," sparassodonts, notoungulates, and litopterns, struggled along for a while but eventually became extinct.

[Image is in the Public Domain]

The surmise is that moving from wet forests where it's warm year-round into cooler, drier temperate deciduous forests or grasslands is harder than the reverse, just from the perspective of resources.  Whatever the reason, though, it altered the ecosystems of South America forever, as the North American species proved to be better competitors (and predators), driving entire families of South American mammals extinct.  Some groups continued to thrive and diversify, of course.  Hummingbirds come to mind; they're a distinctly South American group, increasing in diversity as you head south.  Where I live, there's a grand total of one species of hummingbird (the Ruby-throated Hummingbird).

The little country of Ecuador has 132.

The reason all this comes up is the discovery of the complete skeleton of an extinct species of porcupine in Florida, dating to 2.5 million years ago -- and therefore, one of those early migrants northward from its ancestral homeland.  It's related to the modern North American species, but definitely not the same; the extinct species, for example, had a prehensile tail, similar to modern South American species (and which our North American porcupines lack).  It's still unknown, however, if the Florida species is ancestral to our current North American porcupines, or if they're cousins; further study of the skeleton may help to resolve that question.

It's fascinating, though, to see the fingerprints of this mass migration that so radically changed two different continents.  The process of plate movement continues; Australia will eventually collide with Asia, for example, with similar results, mixing together two sets of species that have been isolated for millions of years.  Change is inevitable in the natural world; it can happen quickly or slowly, and sometimes occurs in ways we're just beginning to understand.


Thursday, May 30, 2024

The tale of a troublemaker

One of the things that resonates about the best fiction is its ability to point us in the direction of truths that somehow transcend the mundane factual reality that surrounds us every day.  I know that there are books that have changed my life and my worldview permanently, twisting my perception around and leaving me fundamentally altered afterward.  The Bridge of San Luis Rey by Thornton Wilder.  A Wrinkle in Time by Madeleine L'Engle.  Foucault's Pendulum by Umberto Eco.  The Lathe of Heaven by Ursula K. Le Guin.  1Q84 by Haruki Murakami.

These kinds of books may not come along often, but when they do, they can leave you reeling.  As science fiction writer Samuel R. Delany put it, "Fiction isn't just thinking about the world out there.  It's also thinking about how that world might be -- a particularly important exercise for those who are oppressed, because if they're going to change the world we live in, they -- and all of us -- have to be able to think about a world that works differently."

This quote immediately came to mind when I read Known Order Girls, the new book by Andrew Butters, which I was privileged to receive a copy of prior to release.

The story's protagonist is Katherine Webb, a teenage girl who has grown up as part of the "Known Order" -- a programmed society where everything is run by a sentient AI called Commander.  Commander is the ostensibly benevolent dictator that keeps everything stable, making sure the trains are on time and the economy hums along -- and that each man, woman, and child knows exactly what their place is.

And stays there.

But Katherine is too smart for her own good, and sees that the rules that keep the society stable are also a straitjacket to creativity and growth and individuality.  So she starts to rebel -- in small ways, at first.  The penalties for breaking the Known Order are dramatic and terrifying.  But soon she finds out that the price for compliance might be higher still.

I can honestly say that I have seldom met a protagonist whom I was so invested in, whom I so deeply wanted to win the day.  I won't spoil the story by giving you any details other than a suggestion that there are points where you'll want to have plenty of tissues handy.  Stories with teenage main characters are usually targeted toward the Young Adult market, but this is a novel that can (and should) be read by all ages.

In an interesting synchronicity, while I was making dinner yesterday evening, I had my iTunes going, and the wonderful song "I Was Born" by Hanson popped up.

The lyrics immediately put me in mind of Katherine Webb's fight against the monolithic control of Commander.  Sometimes there are people who are born to go places no one's ever gone, do something no one's ever done, and be someone no one's ever been; after reading Known Order Girls, I think you'll agree that Katherine is one of those.

This story is one of those rare deeply moving, wildly inspiring tales, reminding us that one determined, defiant troublemaker can indeed change the world for the better.

Do yourself a favor.  Get yourself a copy of Known Order Girls by Andrew Butters.  I promise you won't regret it.

Better still, buy a copy for every teenager you know.  There are features of our own Known Order that could use some defiance right about now.


Wednesday, May 29, 2024

Ghost shortage

I sometimes get grief from readers because of my tendency to dismiss claims of the paranormal.

In my own defense, I am convincible.  It just takes more than personal anecdote and eyewitness accounts to do it.  Our memories and sensory-perceptive apparatus are simply not accurate enough recording devices to be relied on for anything requiring scientific rigor.  I find myself agreeing with the hard-nosed skeptic MacPhee in C. S. Lewis's novel That Hideous Strength:
"My uncle, Dr. Duncanson," said MacPhee, "whose name may be familiar to you — he was Moderator of the General Assembly over the water, in Scotland — used to say, 'Show it to me in the word of God.'  And then he’d slap the big Bible on the table.  It was a way he had of shutting up people that came to him blathering about religious experiences.  And granting his premises, he was quite right.  I don’t hold his views, Mrs. Studdock, you understand, but I work on the same principles.  If anything wants Andrew MacPhee to believe in its existence, I’ll be obliged if it will present itself in full daylight, with a sufficient number of witnesses present, and not get shy if you hold up a camera or a thermometer."

So it's not that I'm rejecting anything out of hand, nor saying that your story of seeing your Great Aunt Mildred's ghost fluttering about in your attic last week isn't true.  What I'm saying is that thus far, I personally don't have enough evidence to support a belief in ghosts.  Neither the attempts at rigorous study I've seen, nor my own individual experience, would be at all convincing to someone who didn't already have their mind made up.

And, if you believe an article I just ran across yesterday, any opportunities I might have for changing my opinion are waning fast.

According to paranormal researcher/nuclear physicist Paul Lee, the United Kingdom is "running out of ghosts."  Lee, author of The Ghosts of King's Lynn and West Norfolk, has been tracking paranormal activity in Britain since January 2020, and has seen a marked decline in reports.  "I've been contacting all the reportedly haunted locations on my app, and asking if the residents, owners or staff have experienced any unexplained activity," Lee said.  "So far I've had almost eight hundred replies, and even some supposedly highly haunted places like Conisbrough Castle in South Yorkshire, the Ettington Park Hotel in Stratford -- said to be one of the most haunted hotels in the UK -- and Fortnum and Mason in Piccadilly, say they haven't experienced anything in the last few years."

[Image licensed under the Creative Commons Gallowglass, Medieval ghost, CC BY-SA 3.0]

As far as what's happened to all these spirits, Lee says they may have "moved on."  I guess, like in The Good Place, anything gets boring after a while, and after a few centuries of scaring the shit out of tourists, the ghosts are probably eager for a change of venue.  On the other hand, Lee cautions, just because a particular ghost hasn't been heard from in a while doesn't mean it's gone permanently.  "It may be that ghosts can be recharged," he said.  "You sometimes hear stories of ghosts suddenly reappearing again after many years' absence."

So it could be that this is just a temporary lull, and the ghosts will all come back at some point.  Maybe when the Tories get voted out.

But you have to wonder, of course, if there's something more rational going on here, like the fact that people are wising up to how easy it is to slip into superstition and credulity, and attribute every creaking floorboard to the tread of a spectral foot.  While there are groups that approach these sorts of phenomena the right way (the Society for Psychical Research comes to mind), there are so many more that look at claims of hauntings as a way of turning a quick buck that maybe people are just getting fed up.  Shows like Ghost Hunters can't have helped; week after week, they go to supposedly haunted sites, wander around brushing aside cobwebs and waving their flashlights about in an atmospheric fashion, and, like Monty Python's Camel Spotters, every week find conclusive evidence of nearly one ghost.  Despite a zero percent success rate, they always high-five each other for a job well done at the end of the episode, counting on the fact that viewers will already have forgotten that they'd just spent forty-five minutes watching nothing happen.

So maybe there are fewer ghost reports because people are getting smarter about what actually constitutes something worth investigating.  Wouldn't that be nice?

Anyhow, I wish Paul Lee the best of luck.  If the sightings don't pick up, he'll have to go back to nuclear physics to make ends meet, and that would be a damn shame.  And to reiterate my first point, it's not that I'm saying what he claims is impossible; no one would be happier than me if there turned out to be an afterlife, preferably on the beach and involving hammocks, sunshine, the minimum legally-allowable amount of clothing, and drinks with cheerful little paper umbrellas.

In the interim, however, I'll keep looking for hard evidence.  And if tonight I get visited by the spirit of your Great Aunt Mildred and she gives me a stern talking-to, I guess it will serve me right.


Tuesday, May 28, 2024

Flocking together

One of the most mesmerizing sights in nature is the collective motion of large groups of animals.

I remember watching films by Jacques Cousteau as a kid, and being fascinated by his underwater footage of schools of fish swimming along and then turning as one, the light flickering from their silvery sides as if each were a reflective scale on a single giant organism.  Murmurations of starlings barely even look real; the flocks swirl and flow like some kind of weird, airborne fluid.  But the most astonishing example of collective motion I've ever seen was when Carol and I visited Bosque del Apache Wildlife Refuge, in central New Mexico, a few years ago, during the migration of snow geese through the region.

"Get there early," we were told.  "At least a half-hour before sunrise.  You'll be glad you did."

We arrived just as the light was growing in the eastern sky.  The wetland was full of tens of thousands of snow geese, all moving around in a relaxed sort of fashion, calling softly to each other.  The brightness in the sky grew, and then -- without any warning at all...

... BOOM.

They all exploded into the air, seemingly simultaneously.  We have wondered many times since what the signal was; there was nothing we could discern, no handful of birds that launched first, no change in the vocalizations that a human would interpret as, "Now!"  One moment everything was calm; the next, the air was a hurricane of flapping wings.  They whirled around, circling higher and higher, and within ten minutes they were all gone, coursing through the sky toward their next destination.

How animals manage such feats, moving as a unit without colliding or leaving members behind -- and seemingly without any central coordination -- has long fascinated zoologists.  Way back in 1987, computer simulation expert Craig Reynolds showed (using software called "Boids") that with only a handful of simple rules -- stay within so many wing-lengths of your nearest neighbors but not close enough to touch, match the speed of your neighbors within ten percent either way, steer toward the average heading of your nearest neighbors, give other members a chance to be in any given position in the group -- he was able to create simulated flocking behavior that looked absolutely convincing.  
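Reynolds's rule set is simple enough to sketch in a few lines of code.  Here's a minimal, illustrative version of a single flocking update -- not his original Boids program, and the weights, radii, and speed cap are made-up values for demonstration:

```python
import numpy as np

def boids_step(pos, vel, dt=0.1, view=2.0, min_dist=0.5,
               w_sep=1.5, w_align=0.05, w_coh=0.01, max_speed=1.0):
    """One update of Reynolds-style flocking: separation, alignment, cohesion.

    pos and vel are (n, 2) arrays of positions and velocities.
    """
    n = len(pos)
    new_vel = vel.copy()
    for i in range(n):
        offsets = pos - pos[i]                    # vectors from boid i to all boids
        dists = np.linalg.norm(offsets, axis=1)
        neighbors = (dists < view) & (dists > 0)  # everyone nearby except itself
        if neighbors.any():
            # Separation: steer away from neighbors that are too close
            too_close = neighbors & (dists < min_dist)
            sep = -offsets[too_close].sum(axis=0) if too_close.any() else np.zeros(2)
            # Alignment: nudge velocity toward the neighbors' average heading
            align = vel[neighbors].mean(axis=0) - vel[i]
            # Cohesion: drift toward the neighbors' center of mass
            coh = pos[neighbors].mean(axis=0) - pos[i]
            new_vel[i] = vel[i] + dt * (w_sep * sep + w_align * align + w_coh * coh)
        # Cap the speed, a crude stand-in for "match your neighbors' speed"
        speed = np.linalg.norm(new_vel[i])
        if speed > max_speed:
            new_vel[i] *= max_speed / speed
    return pos + dt * new_vel, new_vel
```

Iterate boids_step on random starting positions and velocities and you get the characteristic coalescing, wheeling flock; tuning the three weights changes how tight and how orderly it is.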

Last week, a paper out of the Max Planck Gesellschaft showed there's another factor that's important in modeling collective motion, and this has to do with the fact that flying or swimming animals have a rhythm.  Look, for example, at a single fish swimming in an aquarium; its motion forward isn't like a car moving at a steady speed down a highway, but an oscillating swim-glide-swim-glide, giving it a pattern a little like a Slinky moving down a staircase.

Biologist Guy Amichay, who led the research, found that this gives schools of fish a pulse; he compares it to the way we alternate moving our legs while walking.  "Fish are coordinating the timing of their movements with that of their neighbor, and vice versa," Amichay said.  "This two-way rhythmic coupling is an important, but overlooked, force that binds animals in motion.  There's more rhythm to animal movement than you might expect.  In the real world most fish don't swim at fixed speeds, they oscillate."

The key in simulating this behavior is that unlike the factors that Reynolds identified, getting the oscillating movement right depends on neighboring fish doing the opposite of what their nearest neighbors are doing.  The swim-glide pattern in one fish triggers a glide-swim pattern in its friends; put another way, each swim pulse creates a delay in the swim pulse of the school members around it.  
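That "you glide while I swim" turn-taking can be captured with a toy model: two phase oscillators that each hold back when the other is pulsing.  This is just an illustrative sketch with arbitrary constants, not the model from the paper:

```python
import math

def antiphase_pair(omega=2.0, k=0.5, dt=0.01, steps=5000):
    """Two coupled phase oscillators with negative (anti-phase) coupling.

    The stable state is a phase difference of pi: perfect turn-taking,
    with each oscillator's pulse falling in the other's gap.
    """
    theta1, theta2 = 0.0, 0.1    # start almost in phase
    for _ in range(steps):
        d1 = omega - k * math.sin(theta2 - theta1)
        d2 = omega - k * math.sin(theta1 - theta2)
        theta1 += dt * d1
        theta2 += dt * d2
    return (theta2 - theta1) % (2 * math.pi)

print(antiphase_pair())   # converges to ~3.14159 (pi): alternating pulses
```

The minus sign on the coupling term is what does the work; flip it to a plus and the two oscillators lock in phase instead, the way fireflies do.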

"It's fascinating to see that reciprocity is driving this turn-taking behavior in swimming fish, because it's not always the case in biological oscillators," said study co-author Máté Nagy.  "Fireflies, for example, will synchronize even in one-way interactions.  But for humans, reciprocity comes into play in almost anything we do in pairs, be it dance, or sport, or conversation,"

"We used to think that in a busy group, a fish could be influenced by any other member that it can see," said co-author Iain Couzin. "Now, we see that the most salient bonds could be between partners that choose to rhythmically synchronize."

So zoologists have taken another step toward comprehending one of the most fascinating phenomena in nature: the ability of animals to move together, a feat of rapid and sophisticated coordination we are only now beginning to understand.  Something to think about next time you see a school of fish or a flock of birds in flight.


Monday, May 27, 2024

The enduring mystery of the Huns

In 376 C.E., an enormous group of Germanic-speaking Goths, primarily from the Tervingi and Greuthungi tribes, showed up along the Danube River, which had long stood as an uneasy boundary between the Germanic peoples and the Roman Empire.

Most people are aware that the Roman Empire -- especially the western half of it -- would, for all intents and purposes, collapse completely less than a hundred years after that.  What's less well-known is that up to this point, it was doing pretty well; no one, in 375 C.E., would have looked around them and thought, "Wow, these people are doomed."  British historian Peter Heather analyzed all the usual factors cited in Rome's fourth-century troubles, including an uncontrolled and rebellious army, restive peasantry, food shortages from a drop in agricultural production, and conflicts with Persia on the eastern border.  None appear to be sufficient to account for what was about to happen.  Rome had stood for almost a thousand years -- to put that in perspective, four times longer than the United States has been a nation -- and had survived much worse, including the chaotic "Year of Five Emperors" (193 C.E.), which started with the murder of the paranoid and megalomaniacal emperor Commodus, made famous in the movie Gladiator.

The Roman Empire had dealt with border conflicts pretty much during its entire history.  Given its expansionist agenda, it was directly responsible for a good many of them.  But this time, things would be different.  No one at the time seems to have seen it coming, but the end result would write finis to the Pax Romana.

The difference was a group of people called the Huns.

Reconstruction of a Hunnic warrior [Image licensed under the Creative Commons: Attila the Hun on horseback by George S. Stuart, photographed by Peter d'Aprix & Dee Finning, owned by the Museum of Ventura County, CC BY-SA 3.0]

The Huns are a historical enigma.  For a group so widely known -- every schoolkid has heard of Attila the Hun -- their origins are pretty much a complete mystery.  (For what it's worth, they did not give their name to the nation of Hungary; the name "Hungary" comes from the Oghur-Turkic word onogur, meaning "the ten tribes of the Oghurs."  And the Magyars, the Finno-Ugric ethnic group that makes up the majority of the ancestry in modern Hungary, didn't even come into the region until the ninth century C.E.)

As far as the Huns go, we don't even know much about what language they spoke, because they left no written records.  There are a handful of words recorded in documents from the fourth and fifth centuries, and some personal names, but the evidence is so thin that linguists haven't even been able to determine what language family Hunnic belonged to -- there are arguments that it was Turkic, Iranian, Yeniseian, Mongolian, Uralic, or Indo-European, or perhaps a linguistic isolate -- but the fact is, we simply don't know.

So basically, the Huns swept into eastern Europe from we-don't-know-where.  Certainly they at least passed through the central Asian steppe, but whether that's where they originated is a matter of pure conjecture.  There's even a contention they might have come from as far away as what is now northern China, and that they're allied to the Xiongnu culture, but the evidence for that is slim at best.

Roman chronicler Ammianus Marcellinus, who witnessed many of the events of the late fourth century that were to lead to the downfall of the Roman Empire, was grudgingly impressed by what he saw of the Huns:

The people called Huns exceed every measure of savagery.  They are aflame with an inhuman desire for plundering others' property...  They enter battle drawn up in wedge-shaped masses.  And as they are lightly-equipped, for a swift motion, and unexpected in action, they purposely divide suddenly into scattered bands and attack, rushing about in disorder here and there, dealing terrific slaughter...  They fight from a distance with missiles having sharp bone, instead of their usual points, joined to the shafts with wonderful skill; then they gallop over the intervening spaces and fight hand-to-hand with swords.

Ammianus, though, didn't know any better than anyone else where the Huns had originated; his best guess was that they'd lived on "the shores of the ice-bound ocean," but never provided any reason why he thought that.

When they did explode onto the scene, though -- wherever they'd come from -- the effects were catastrophic.  The Goths, Alans, and Sarmatians of what are now the Balkan countries of eastern Europe were shoved farther and farther west, and all of a sudden, the Roman Empire had a serious problem on its hands.  The emperor at the time, Valens, was ill-equipped to deal with a hundred thousand refugees, mostly from Germanic-speaking tribes who had long been considered little more than barbarians.  (To be fair, it's hard to imagine how anyone would be well-equipped to deal with this.)  His decision to treat the Goths as enemies, rather than joining forces with them against the greater threat of the Huns, led to the Battle of Adrianople in 378.

Valens lost both the battle and his life.

While there was some attempt to come to terms with the Goths (or even turn them into allies) by Valens's successor Theodosius I, the stage was set.  The domino effect of Huns shoving the Goths and the Goths shoving the Romans continued, chipping away at the Western Roman Empire, ultimately leading to the Gothic leader Alaric sacking Rome itself in 410.  The Huns made their way into Gaul, and even into Italy, under Attila.  This forward motion continued until the Battle of the Catalaunian Plains, fought in 451 near what is now the town of Châlons, France, at which a combined force of Romans and Goths finally defeated the Huns and forced them back.

Perhaps the most curious thing about the Huns was that after that battle, they began to fall apart themselves with a speed that was just this side of bizarre.  Attila died in 453 -- from what appears to have been an esophageal hemorrhage -- and none of his many sons proved capable as a leader.  They fractured into various factions which rapidly succumbed to internecine squabbling, and their power waned as fast as it had waxed seventy years earlier.  What happened to them after that is just as much of a mystery as everything else about them; most historians believe that what was left of the Huns were absorbed into other ethnic groups in what are now Serbia, Bulgaria, and Romania, and they more or less ceased to exist as an independent culture.

So we're left with a puzzle.  One of the most familiar, instantly recognizable civilizations in history is of unknown origin and had an unknown fate, arising from obscurity and fading back into it as quickly.  But what's certain is that after they surged through Europe, the western half of the Roman Empire never recovered.  The last Emperor of the Western Roman Empire, Romulus Augustulus, abdicated in 476.  The western half of Europe fragmented into petty kingdoms ruled by various Germanic chieftains, and the power center shifted to Constantinople, where it would remain until Charlemagne came to the throne three hundred years later.

Historical mysteries never fail to fascinate, and this is a baffling one -- a mysterious people who swept into Europe, smashed an empire that had stood for a thousand years, and then vanished, all within the span of a single century.  Perhaps one day historians will figure out who the Huns were, but for now, all we have is scanty records, the awed and fearful accounts of the people who witnessed them, and a plethora of questions.


Saturday, May 25, 2024

The cotton-candy planet

There's a general pattern you see in astrophysics, which arises from the fact that gravity is both (1) always attractive, never repulsive, and (2) extremely weak.

It's hard to overstate the "extremely weak" bit.  Electromagnetism, for example, is 36 orders of magnitude stronger; that is, the electromagnetic force is 1,000,000,000,000,000,000,000,000,000,000,000,000 times more powerful than gravity.  This may seem odd and counterintuitive, since the gravitational pull on your body seems pretty damn strong (especially when you're tired).  But think about it this way: if you use a refrigerator magnet to pick up a paper clip, that little magnet is able to overcome the force of the entire Earth pulling on the clip in the opposite direction.
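You can check that figure with a back-of-the-envelope calculation: take two protons and compare the electrostatic repulsion between them to their gravitational attraction.  Since both forces fall off as the inverse square of distance, the separation cancels out of the ratio:

```python
# Ratio of the Coulomb force to the gravitational force between two protons.
# The 1/r^2 in both laws cancels, so no separation distance is needed.
k_e = 8.9875e9      # Coulomb constant, N m^2 / C^2
G   = 6.6743e-11    # gravitational constant, N m^2 / kg^2
e   = 1.6022e-19    # proton charge, C
m_p = 1.6726e-27    # proton mass, kg

ratio = (k_e * e**2) / (G * m_p**2)
print(f"{ratio:.2e}")   # about 1.24e36 -- thirty-six orders of magnitude
```

The exact number depends on which particles you compare (electrons give an even bigger ratio), but the scale of the mismatch is the same either way.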

The practical result of these two features of gravity is that at small scales and low masses, the effects of gravity are essentially zero.  If I'm picking up a book, I don't have to adjust for the negligible gravitational attraction between myself and the book, only the attraction between the book and the enormous mass of the Earth.  On the largest scales, too, the effects of gravity more or less even out; this is called the flatness problem, and is something I dealt with in more detail in a recent post.  (Plus, on these cosmic scales, the force of expansion of spacetime itself -- something that's been nicknamed dark energy -- takes over.)

It's at mid-range scales that gravity becomes seriously important -- objects the size of planets, stars, and galaxies.  And there, the other feature of gravity kicks in; that it always attracts and never repels.  (Whatever Lost in Space may have had to say about anti-gravity, there's never been evidence of any such thing.)  So for objects between the size of planets and galaxies, gravity always wins unless there is some other force opposing it.

This, in fact, is how stars work; the pull of gravity from their mass causes the matter to collapse inward, heating them up until the fusion of hydrogen starts in the core.  This generates heat and radiation pressure, a balancing force keeping the star in equilibrium.  Once the fuel runs out, though, and that outward force diminishes, gravitational collapse resumes -- and the result is a white dwarf, a neutron star, or a black hole, depending on how big the star is.

All of this is just a long-winded way of saying that if you've got a mass big enough to form something on the order of a planet or star, it tends to fall inward and compress until some other force stops it.  That's why the insides of planets and stars are denser than the outsides.

Well, that's how we thought it worked.

The latest wrench in the mechanism came from the discovery of a planet called WASP-193b orbiting a Sun-like star about 1,200 light years away.  At first glance, WASP-193b looks like a gas giant; its diameter is fifty percent larger than Jupiter's.  So far, nothing that odd; exoplanet studies have found lots of gas giants out there.

But... the mass of WASP-193b is only one-seventh that of Jupiter, giving it the overall density of cotton candy.

So I guess in a sense it is a gas giant, but not as we know it, Jim.  At an average density of 0.059 grams per cubic centimeter, WASP-193b would float on water if you could find an ocean big enough.  Plus, there's the problem of what is keeping it from collapsing.  A mass one-seventh that of Jupiter is still an impressive amount of matter; its gravitational pull should cause it to pull together, decreasing the volume and raising the density into something like that of the planets in our own Solar System.  So there must be something, some force that's pushing all this gas outward, keeping it... fluffy.  For want of a better word.  
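That density figure follows directly from the reported mass and radius.  Here's the back-of-the-envelope version, using rounded values for Jupiter's mass and radius (the paper's quoted 0.059 g/cm³ comes from the precise measured figures, so this lands slightly off):

```python
import math

M_JUP = 1.898e27    # Jupiter's mass, kg
R_JUP = 6.9911e7    # Jupiter's mean radius, m

mass   = M_JUP / 7          # roughly one-seventh of Jupiter's mass
radius = 1.5 * R_JUP        # diameter fifty percent larger than Jupiter's
volume = (4 / 3) * math.pi * radius**3

density = mass / volume                  # kg / m^3
print(f"{density / 1000:.3f} g/cm^3")    # about 0.056 -- cotton-candy territory
```

For comparison, water is 1 g/cm³ and Saturn, the fluffiest planet in our Solar System, is about 0.7 g/cm³; WASP-193b is more than ten times less dense than that.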

But what that force might be is still unknown.

"The planet is so light that it's difficult to think of an analogous, solid-state material," said Julien de Wit of MIT, who co-authored the study, in an interview with ScienceDaily.

[Image licensed under the Creative Commons NOIRLab/NSF/AURA/J. da Silva/Spaceengine/M. Zamani, Artist impression of ultra fluffy gas giant planet orbiting a red dwarf star, CC BY 4.0]

"WASP-193b is the second least dense planet discovered to date, after Kepler-51d, which is much smaller," said Khalid Barkaoui, of the Université de Liège's EXOTIC Laboratory and first author of the paper, which was published in Nature Astronomy last week.  "Its extremely low density makes it a real anomaly among the more than five thousand exoplanets discovered to date.  This extremely-low-density cannot be reproduced by standard models of irradiated gas giants, even under the unrealistic assumption of a coreless structure."

In short, the astrophysicists still don't know what's going on.  Twelve hundred light years from here is what amounts to a planet-sized blob of cotton candy orbiting a Sun-like star.  I'm sure that like the disappearing star from my post two days ago, the theorists will be all over this trying to explain how it could possibly happen, but thus far all we have is a puzzle -- a massive cloud of matter that is somehow managing to defy gravity.

As Shakespeare famously observed, there apparently are more things in heaven and earth than are dreamt of in our philosophy.



Friday, May 24, 2024

Raw deal

A friend of mine, a veterinarian in Scotland, has proven to be a wonderful source of topics for Skeptophilia, mostly on health-related issues.  Her skeptical, evidence-based approach has given her a keen eye for nonsense -- and man, in this field, there's a lot of nonsense to choose from.

Her latest contribution was so over-the-top that at first I thought it was a parody.  Sadly, it's not.  So, dear readers, allow me to introduce you to:

The Raw Meat Carnivore Diet.

Once I ruled this out as an example of Poe's Law, my next guess was that it was the creation of someone like Andrew Tate to prove to us once and for all that he's the alpha-est alpha that ever alpha-ed, but again, this seems not to be the case.  Apparently, this is being seriously suggested as a healthy way to eat.  And it's exactly what it sounds like; on this diet, you're to eat only raw meat from ruminants (beef, bison, lamb, elk, etc.), salt, and water.

[Image licensed under the Creative Commons Jellaluna, Raw beef steak, 2011, CC BY 2.0]

At the risk of stating what is (I devoutly hope) the obvious, this is a really really REALLY bad idea.  Cooking your food remains the easiest and best way to sterilize it, killing pathogens like E. coli, Salmonella, Shigella, Campylobacter, and Staphylococcus aureus, as well as other special offers like various parasitic worms I'd prefer not to even think about.  The writer of the article, one Liam McAuliffe, assures us that the acidity in our stomach is perfectly capable of killing all of the above pathogens -- which leads to the question of why, then, anyone ever becomes ill from them.

Then there's a passage about an experiment back in the 1930s showing that cats fed a raw meat diet were generally healthier, which may be true, but ignores the fact that cats are damn close to obligate carnivores, and we're not.  To convince yourself that cats and humans have evolved to thrive on different diets, all you have to do is look at the teeth.  Cats have what are called carnassial molars; narrow, with sharp shearing edges, designed to cut meat up into chunks.  Our molars are flat, with cusps, typical of -- you guessed it -- an omnivore.  Citing the cat experiment as a reason we should all eat raw meat is a little like observing that cows thrive when allowed to graze in verdant fields, and deciding that henceforth humans should eat nothing but grass.

This brings up something else that Mr. McAuliffe conveniently neglects to mention; to have our digestive systems function properly, humans (and other omnivores) need to have a good bit of plant-derived cellulose in our diets -- what dietitians call "roughage" or "fiber."  Without it, our intestines clog up like a bad drain.  Eliminating all the vegetables from your diet is a good way to end up with terminal constipation.

What a way to go.  Or not go, as the case may be.

Then, there's a bit about how cooking meat reduces the amount of nutrients it contains -- specifically the B vitamins thiamine, riboflavin, and niacin.  Once again, this may well be true; but even if it is, the next question is, how many of us are deficient enough in these nutrients that the loss from cooking is actually a problem?  Let me put it this way; how many people do you know who have had beriberi (thiamine deficiency) or pellagra (niacin deficiency)?  (Riboflavin deficiency is so rare it doesn't even have a name.)  The fact is, if you're eating a normal diet, you are almost certainly getting more of these vitamins than you need, and the small amount of loss from cooking your t-bone steak is far offset by the benefit of not dying from an E. coli infection.

Not to beat the point unto death, but McAuliffe's contention -- that we are, in his words, "hypercarnivorous apex predators" -- is nonsense.  Our closest relatives, chimps and bonobos, are thoroughgoing omnivores, who will certainly eat meat when they can get it but also love fruit, and will chow down on starch-rich roots and stems without any apparent hesitation.  What's optimal for human health, and which has been demonstrated experimentally over and over, is a varied diet including meat (or an equivalent protein source), vegetables, and fruits -- just like our jungle-dwelling cousins.

So.  Yeah.  Go easy on the moose tartare.  I'm of the opinion that a steak with a glass of fine red wine is a nice treat, but let's avoid eating it raw, okay?


Thursday, May 23, 2024

Vanishing act

In Madeleine L'Engle's seminal young-adult fantasy novel A Wind in the Door, there's something that is making the stars go out.

Not just stop shining, but disappear entirely.  Here's the scene where the protagonist, Meg Murry, first witnesses it happening:
The warm rose and lavender of sunset faded, dimmed, was extinguished.  The sky was drenched with green at the horizon, muting upwards into a deep, purply blue through which stars began to appear in totally unfamiliar constellations.

Meg asked, "Where are we?"

"Never mind where.  Watch."

She stood beside him, looking at the brilliance of the stars.  Then came a sound, a violent, silent, electrical report, which made her press her hands in pain against her ears.  Across the sky, where the stars were clustered as thickly as in the Milky Way, a crack shivered, slivered, became a line of nothingness.

Within that crack, every star that had been there only a moment ago winked out of existence.

A central point in the story is that according to the laws of physics, this isn't supposed to happen.  Stars don't just vanish.  When they end their lives, they do so in an obvious and violent fashion -- even small-mass stars like the Sun swell into a red giant, and eventually shed their outer atmospheres, creating a planetary nebula and leaving behind a white dwarf.

The Cat's Eye Nebula [Image is in the Public Domain courtesy of NASA/JPL and the ESO]

Larger stars end their lives even more dramatically, as supernovas which lead to the formation of a neutron star or a black hole depending on how much matter is left over once the star blows up.

Well, that's what we thought always happened.

A study out of the University of Copenhagen has found that, as in A Wind in the Door, sometimes stars simply... vanish.  A team of astrophysicists has found that instead of the usual progression of Main Sequence > Giant or Supergiant > BOOM! > White Dwarf, Neutron Star, or Black Hole, there are stars that undergo what the astrophysicists are (accurately if uncreatively) calling "complete collapse."  In a complete collapse, the gravitational pull is so high that even considering the power of a supernova, there's just not enough energy available for the outer atmosphere to achieve escape velocity.  So instead of exploding, it just kind of goes...

... pfft.

Unlike what Meg Murry witnessed, though, the matter that formed those stars is still there somewhere; the Law of Conservation of Matter and Energy is strictly enforced in all jurisdictions.  The star that was the focus of the study, VFTS 243, is part of a binary system -- and its companion star continued in its original orbit around their mutual center of mass without so much as a flutter, so the mass of its now-invisible partner is still there.  But the expected cataclysmic blast that usually precedes black hole formation never happened.
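The companion's undisturbed orbit is what lets astronomers "weigh" the invisible object: Kepler's third law ties the orbital period and separation to the total mass of the system, so if the visible star's orbit hasn't changed, the total mass hasn't either.  Here's a sketch of that reasoning, using rough illustrative figures for VFTS 243 (a period of about 10.4 days, and component masses of roughly 25 and 10 solar masses) rather than the paper's precise values:

```python
import math

# Kepler's third law:  a^3 = G * (M1 + M2) * P^2 / (4 * pi^2)
# If the visible star's period and separation are unchanged, the
# total mass -- visible star plus unseen companion -- is unchanged too.
# Numbers below are rough illustrative values for VFTS 243.
G = 6.674e-11        # gravitational constant, m^3 kg^-1 s^-2
M_SUN = 1.989e30     # solar mass, kg
AU = 1.496e11        # astronomical unit, m

period_s = 10.4 * 86400.0            # ~10.4-day orbital period
total_mass = (25.0 + 10.0) * M_SUN   # visible star + unseen companion

# Solve Kepler's third law for the separation a.
a_m = (G * total_mass * period_s ** 2 / (4.0 * math.pi ** 2)) ** (1.0 / 3.0)
print(f"separation ≈ {a_m / AU:.2f} AU")   # a tight orbit, well under 1 AU
```

Run the logic in reverse and you get the actual measurement: the orbit's period and size pin down the total mass, and subtracting the visible star's mass leaves about ten solar masses of... something... that emits no light at all.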

"We believe that the core of a star can collapse under its own weight, as happens to massive stars in the final phase of their lives," said Alejandro Vigna-Gómez, who co-authored the study.  "But instead of the contraction culminating into a bright supernova explosion that would outshine its own galaxy, expected for stars more than eight times as massive as the Sun, the collapse continues until the star becomes a black hole.  Were one to stand gazing up at a visible star going through a total collapse, it might, just at the right time, be like watching a star suddenly extinguish and disappear from the heavens.  The collapse is so complete that no explosion occurs, nothing escapes and one wouldn't see any bright supernova in the night sky.  Astronomers have actually observed the sudden disappearance of brightly shining stars in recent times.  We cannot be sure of a connection, but the results we have obtained from analyzing VFTS 243 has brought us much closer to a credible explanation."

You can see why I was immediately reminded of the scene in L'Engle's book.  And while I'm sure the answer isn't evil beings called Echthroi who are trying to extinguish all the light in the universe, the actual phenomenon is still a little on the unsettling side.

Once again showing that we are very far from understanding everything there is out there.  This sort of vanishing act has been high on the list of Things That Aren't Supposed To Happen.  It'll be interesting to see what the theorists come up with once they've had a shot at analyzing the situation, and whether they can find some sort of factor that determines whether a massive star detonates -- or simply disappears.


Wednesday, May 22, 2024


If yesterday's post -- about creating pseudo-interactive online avatars for dead people -- didn't make you question where our use of artificial intelligence is heading, today we have a study out of Purdue University, which found that when ChatGPT was used to answer programming and coding questions, half of its answers contained incorrect information -- and 39% of the people who received those answers didn't recognize them as incorrect.

The problem of an AI system basically just making shit up is called a "hallucination," and it's proven to be extremely difficult to eradicate.  This is at least partly because the answers are still generated using real data, so they can sound plausible; it's the software version of a student who only paid attention half the time and then has to take a test, and answers the questions by taking whatever vocabulary words he happens to remember and gluing them together with bullshit.  Google's Bard chatbot, for example, claimed that the James Webb Space Telescope had captured the first photograph of a planet outside the Solar System (a believable claim, but a false one).  Meta's AI Galactica was asked to draft a paper on the software for creating avatars, and cited a fictitious paper by a real author who works in the field.  Data scientist Teresa Kubacka was testing ChatGPT and decided to throw in a reference to a fictional device -- the "cycloidal inverted electromagnon" -- just to see what the AI would do with it, and it came up with a description of the thing so detailed (with dozens of citations) that Kubacka found herself compelled to check and see if she'd accidentally used the name of something obscure but real.

It gets worse than that.  A study of AI-powered mushroom-identification software found that it got the answer right only fifty percent of the time -- and, frighteningly, provided cooking instructions when presented with a photograph of a deadly Amanita mushroom.  Fall for that little "hallucination" and three days later at your autopsy they'll have to pour your liver out of your abdomen.  Maybe the AI was trained on Terry Pratchett's line that "All mushrooms are edible.  Some are only edible once."

[Image licensed under the Creative Commons Marketcomlabo, Image-chatgpt, CC BY-SA 4.0]

Apparently, in inventing AI, we've accidentally imbued it with the very human capacity for lying.

I have to admit that when the first AI became widely available, it was very tempting to play with it -- especially the photo modification software of the "see what you'd look like as a Tolkien Elf" type.  Better sense prevailed, so alas, I'll never find out how handsome Gordofindel is.  (A pity, because human Gordon could definitely use an upgrade.)  Here, of course, the problem isn't veracity; the problem is that the model is trained using art work and photography that is (not to put too fine a point on it) stolen.  There have been AI-generated works of "art" that contained the still-legible signature of the artist whose pieces were used to train the software -- and of course, neither that artist nor the millions of others whose images were "scrubbed" from the internet by the software received a penny's worth of compensation for their time, effort, and skill.

It doesn't end there.  Recently, actress Scarlett Johansson announced that she had to threaten legal action against Sam Altman, CEO of OpenAI, to get him to discontinue the use of a synthesized version of her voice that was so accurate it fooled her family and friends.

Fortunately for Ms. Johansson, she's got the resources to sue Altman, but most creatives simply don't.  If we even find out that our work has been lifted, we really don't have any recourse to fight the AI techbros' claims that it's "fair use." 

The problem is, the system is set up so that it's already damn near impossible for writers, artists, and musicians to make a living.  I've got over twenty books in print through two different publishers, plus a handful that are self-published, and I have never made more than five hundred dollars a year.  My wife, Carol Bloomgarden, is an astonishingly talented visual artist who shows all over the northeastern United States, and in any given show it's a good day when she sells enough to pay for her booth fees, lodging, travel expenses, and food.

So throw a bunch of AI-insta-generated pretty-looking crap into the mix, and what happens -- especially when the "artist" can sell it for one-tenth of the price and still turn a profit? 

I'll end with a plea I've made before; until lawmakers can put the brakes on AI to protect safety, security, and intellectual property rights, we all need to stop using it.  Period.  This is not out of any fundamental anti-tech Luddite-ism; it's simply from the absolute certainty that the techbros are not going to police themselves, not when there's a profit to be made, and the only leverage we have is our own use of the technology.  So stop posting and sharing AI-generated photographs.  I don't care how "beautiful" or "precious" they are.  (And if you don't know the source of an image with enough certainty to cite an actual artist or photographer's name or Creative Commons handle, don't share it.  It's that simple.)

As a friend of mine put it, "As usual, it's not the technology that's the problem, it's the users."  Which is true enough; there are myriad potentially wonderful uses for AI, especially once they figure out how to debug it.  But at the moment, it's being promoted by people who have zero regard for the rights of human creatives, and are willing to steal their writing, art, music, and even their voices without batting an eyelash.  They are shrugging their shoulders at their systems "hallucinating" incorrect information, including information that could potentially harm or kill you.

So just... stop.  Ultimately, we are in control here, but only if we choose to exert the power we have.

Otherwise, the tech companies will continue to stomp on the accelerator, authenticity, fairness, and truth be damned.


Tuesday, May 21, 2024

Memento mori

In this week's episode of the current season of Doctor Who, entitled "Boom," the body of a soldier killed in battle is converted into a rather creepy-looking cylinder that has the capacity for producing a moving, speaking hologram of the dead man, which has enough of his memory and personality imprinted on it that his friends and family can interact with it as if he were still alive.

I suspect I'm not alone in having found this scene rather disturbing, especially when his daughter has a chat with the hologram and seems completely unperturbed that her dad had just been brutally killed.  

Lest you think this is just another wild trope dreamed up by Steven Moffat and Russell T. Davies, there are already (at least) two companies that do exactly this -- Silicon Intelligence and Super Brain.  Both have models that use generative AI to scour your photos, videos, and written communication to produce a convincing online version of you, which can then interact with your family and friends in (presumably) much the same fashion as you did when you were alive.

I'm not the only one who is having an "okay, just hold on a minute" reaction to this.  Ethicists Katarzyna Nowaczyk-Basińska and Tomasz Hollanek, both of Cambridge University, considered the implications of "griefbots" in a paper last week in the journal Philosophy & Technology, and were interviewed this week in Science News, and they raise some serious objections to the practice.

The stance of the researchers is that at the very least there should be some kind of safeguard to protect the young from accessing this technology (since, just as in Doctor Who, there's the concern that children wouldn't be able to recognize that they weren't talking to their actual loved one, with serious psychological repercussions), and that it be clear to all users that they're communicating with an AI.  But they bring up a problem I hadn't even thought of; what's to stop companies from monetizing griefbots by including canny advertisements for paying sponsors?  "Our concern," said Nowaczyk-Basińska, "is that griefbots might become a new space for a very sneaky product placement, encroaching upon the dignity of the deceased and disrespecting their memory."

Ah, capitalism.  There isn't anything so sacred that it can't be hijacked to make money.

But as far as griefbots in general go, my sense is that the entire thing crosses some kind of ethical line.  I'm not entirely sure why, other than the "it just ain't right" arguments that devolve pretty quickly into the naturalistic fallacy.  Especially given my atheism, and my hunch that after I die there'll be nothing left of my consciousness, why would I care if my wife made an interactive computer model of me to talk to?  If it gives her solace, what's the harm?

I think one consideration is that by doing so, we're not really cheating death.  To put it bluntly, it's deriving comfort from a lie.  The virtual-reality model inside the computer isn't me, any more than a photograph or a video clip is.  But suppose we really go off the deep end, here, and consider what it would be like if someone could actually emulate the human brain in a machine -- and not just a random brain, but yours?

There's at least a theoretical possibility that you could have a computerized personality that would be completely authentic, with your thoughts, memories, sense of humor, and emotions.  (The current ones are a long way from that -- but even so, they're still scarily convincing.)  Notwithstanding my opinions on the topic of religion and the existence of the soul, there's a part of me that simply rebels at this idea.  Such a creation might look and act like me, but it wouldn't be me.  It might be a convincing facsimile, but that's about it.

But what about the Turing test?  Devised by Alan Turing, the idea of the Turing test for artificial intelligence is that because we don't have direct access to what any other sentient being is experiencing -- each of us is locked inside his/her own skull -- the only way to evaluate whether something is intelligent is the way it acts.  The sensory experience of the brain is a black box.  So if scientists made a Virtual Gordon, who acted on the computer screen in a completely authentic Gordonesque manner, would it not only be intelligent and alive, but... me?

In that way, some form of you might achieve immortality, as long as there was a computer there to host you.

This is moving into some seriously sketchy territory for most of us.  It's not that I'm eager to die; I tend to agree with my dad, who when he was asked what he wanted written on his gravestone, responded, "He's not here yet."  But as hard as it is to lose someone you love, this strikes me as a cheat, a way to deny reality, closing your eyes to part of what it means to be human.

So when I die, let me go.  Give me a Viking funeral -- put me on my canoe, set it on fire, and launch it out into the ocean.  Then my friends and family need to throw a huge party in my honor, with lots of music and dancing and good red wine and drunken debauchery.  And I think I want my epitaph to be the one I created for one of my fictional characters, also a science nerd and a staunch atheist: "Onward into the next great mystery."

For me, that will be enough.