Skeptophilia (skep-to-fil-i-a) (n.) - the love of logical thought, skepticism, and thinking critically. Being an exploration of the applications of skeptical thinking to the world at large, with periodic excursions into linguistics, music, politics, cryptozoology, and why people keep seeing the face of Jesus on grilled cheese sandwiches.

Monday, February 10, 2025

Executive orders, task forces, and paranoia

In further evidence that we're living in the Upside Down, a man who once publicly said "I have no reason to ask God for forgiveness when I have never made any mistakes," and who says his favorite book of the Bible is "Two Corinthians," has once again somehow convinced evangelical Christians that he is the Lord's Anointed One, despite his most striking claim to fame being embodying all Seven Deadly Sins in one individual.

The latest stunt by Donald Trump is the creation of a "task force to eliminate anti-Christian bias" from the United States.  This plays right into the evangelicals' all-time favorite hobby, which is looking around for stuff to be outraged about.  To listen to their preachers and televangelists and whatnot, you'd swear that being a Christian in the United States meant risking being dragged into the Superdome, Roman-Colosseum-style, and fed to the lions.  Unsurprisingly -- to people who have at least some glancing connection to reality -- the opposite is true.  Just shy of ninety percent of the members of Congress identify as Christian; amongst Republican members, the figure rises to 98%.  In some parts of the country you couldn't be elected as Village Roadkill Collector unless you were a Christian.

For Trump, of course, this move is not because he actually believes that Christians are being persecuted, or would particularly care if they were.  As far as I've seen, Trump's beliefs can be summed up as "I'm in support of whatever gets me praise, power, and money."  This is all about cozying up to evangelical power brokers like John Hagee and Mike Huckabee, and through them, to their rabid MAGA supporters.  As far as the "anti-Christian bias" they're trying to eliminate, it's mostly regarding issues like requiring the Bible be taught as factual in public school classrooms, the Ten Commandments being in every governmental office building, and eliminating evil stuff like admitting we queer people actually exist and deserve rights.

The thing is, the clownish attempts by Trump and people like Representative Nancy Mace of South Carolina (who recently accomplished the astonishing feat of edging out both Lauren Boebert and Marjorie Taylor Greene as the stupidest person in Congress) are only a smokescreen for a far darker and more insidious push toward turning the United States into a Christofascist theocracy.  Trump may not have the first clue about actual Christian theology, but you can bet that people like Pete Hegseth, Russell Vought, and J. D. Vance do.  Those three, and others like them, are deadly serious; given free rein, they'd look very much like the American version of the Taliban.

[Image is in the Public Domain]

Fortunately for those of us who like the idea of separation of church and state, Trump has one saving grace; he has the attention span of a disordered toddler.  As Representative Alexandria Ocasio-Cortez put it, "Yes, this administration is dangerous and cruel, but they are also shockingly dim and incompetent."  As further evidence of this, Trump has now (by executive order, of course) created a "White House Faith Office" with televangelist and certifiable lunatic Paula White-Cain in charge.  White-Cain, you might recall, has been something of a frequent flyer here at Skeptophilia, most recently because of a claim that she'd had a vision wherein "God came to me last night and showed me a vision of Trump riding alongside Jesus on a horse made of gold and jewels.  This means he will play a critical role in Armageddon as the United States stands alongside Israel in the battle against Islam," and that because of this the faithful should donate their entire January salary to her and she'll make sure to pass the cash along to Jesus just as soon as she gets around to it.

In choosing White-Cain, however, Trump hasn't pleased everyone.  Illustrating the general rule that for every evangelical there's an equal and opposite evangelical, some prominent Christian leaders have objected to White-Cain's prominence, one even going so far as to call her a "heretic and known false teacher who has no regard for the Gospel of Jesus Christ."  Scott Ross, a Texas-based "Christian leadership coach," said, "Paula White, head of Trump’s White House Faith Office, is no Christian leader.  She preaches the heresies of Word of Faith & Prosperity Gospel, both utterly opposed to authentic Christianity.  Worse, she has lived a life of scandal, with multiple husbands, twisting the Gospel for profit.  Arguably, this is the worst and most dangerous thing President Trump has done—putting a false teacher at the helm of faith outreach.  Lord, have mercy on our country and this administration."

Even so, it's doubtful this will be enough to change many people's minds.  All Trump and White-Cain will have to do is to start snarling about the evil anti-religious libs and us hellbound LGBTQ+ people running around clamoring for equal rights (if you can even imagine), and the MAGA types will pull right back together into a nice, orderly herd again.

It'll take more than this minor internal squabbling to rid the Religious Right of its paranoia.

In one way, of course, the Christians are right to be freaking out.  Church attendance has been dropping steadily for twenty-five years; in 2018, for the first time ever, the number of people who state that they attend church weekly dropped below the number who say they never attend.   Estimates are that Christian church attendance has been decreasing by around twelve percent yearly for the past fifteen years, and there's no sign of that changing -- regardless of any mandates via executive order.

Funny how when religious leaders embrace hate, intolerance, and bigotry, use their religion to impose their will on others, and champion a president who is a narcissistic, vengeful, spiteful serial adulterer and compulsive liar, a lot of people decide it's time to find better things to do with their Sunday mornings.

I'll add here something I've said many times; it's not that I have anything against Christianity per se.  I have a lot of Christian friends of various denominations, and by and large, we get along fine.  My staunchly-held opinion is that we all come to an understanding of the universe and our place within it, and the big questions like the existence of God (or gods), the role of spirituality, and the meaning of life, in our own way and time.

But if you start using your religion as a weapon, either to force your own particular subset of beliefs on others or to deny rights to people you don't happen to like, I -- and many of my friends, of both the believing and nonbelieving varieties -- are gonna object.  Strenuously.

And if that makes you feel "persecuted" -- well, that sounds like a "you problem" to me.

****************************************

Saturday, February 8, 2025

The bellringer

Between December 16, 1811 and February 7, 1812, a series of four earthquakes -- each estimated to be above magnitude 7, with the first and last perhaps at magnitude 8 -- hit what you might think is one of the most unlikely places on Earth; southeastern Missouri.

The centers of continents are ordinarily thought to be tectonically stable, as they are generally far from any of the three typical sorts of faults -- divergences, or rifts, where plates are moving apart (e.g. the East African Rift Zone); convergences, or thrust faults, where plates are moving together (e.g. the Cascadia Subduction Zone); and strike-slip faults, where plates slide past each other parallel to the fault (e.g. the San Andreas Fault).  The Midwest is located in the middle of the North American Craton, an enormous block of what should, according to the conventional wisdom, be old, stable, geologically inactive rock.

But the 1811-1812 earthquake series happened anyhow.  If they'd occurred today, they would likely have flattened the nearby city of Memphis, Tennessee.

So much for conventional wisdom.

The fault responsible was named the New Madrid Seismic Zone for the county right in the center of it, and its capacity for huge temblors is staggering.  The biggest (and final) earthquake of the four was powerful enough that it was felt thousands of kilometers away, and rang church bells in Charleston, South Carolina.  The shift in terrain changed the course of the Mississippi River, cutting off a meander and creating horseshoe-shaped Reelfoot Lake.

So what created a seismic zone where one shouldn't be?

[Image is in the Public Domain courtesy of the USGS]

The topic comes up because I just finished reading seismologist Susan Elizabeth Hough's excellent book Earthshaking Science: What We Know (and Don't Know) About Earthquakes, which is one of the best laypersons' introductions to plate tectonics and seismicity I've come across.  She devotes a good bit of space to the New Madrid earthquakes, and -- ultimately -- admits that the answer to this particular question is, "We're still not sure."  The problem is, the fault is deeply buried under layers of sediments; current estimates are that the hypocenter (the point directly underneath the epicenter where the fault rupture occurred) is between fifteen and thirty kilometers beneath the surface.  And since the quakes in question happened before seismometers were invented, we're going off inferences from written records, and whatever traces were left on the surface (such as "sand blows," where compression forces subsurface sand upward through cracks in the stratum, and it explodes through the surface).

As far as the cause, Hough has a plausible explanation; the New Madrid Seismic Zone is an example of a failed rift, where a mantle plume (or hotspot) tried to crack the continent in half, but didn't succeed.  This stretched the plate and created a weak point -- called the Reelfoot Rift -- where any subsequent stresses were likely to trigger a rupture.  Since that time, the North American Plate has been continuously pushed by seafloor spreading at the Mid-Atlantic Ridge, which compresses the entire plate from east to west; those stresses cause buckling at vulnerable points, and may well have been the origin of the New Madrid earthquakes.

One puzzle, though, is what happened to the hotspot since then.  This is still a matter of speculation.  Some geologists think that friction with the rigid and (relatively) cold underside of the plate damped down the mantle plume and ultimately shut down convection.  Others think that as the North American Plate moved, it simply slid off the hotspot, making the plume appear to move eastward (when in actuality, the plate itself was moving westward).  This may be why another anomalous mid-plate earthquake zone is in coastal South Carolina, and it might also be the cause of the Bermuda Rise.

That point is still being debated.

Another open question is the current risk of the fault failing again.  There's paleoseismic data suggesting major earthquake sequences from the Reelfoot Rift/New Madrid Seismic Zone in around 900 and 1400 C.E., which points to a recurrence interval of about four to five hundred years.  But those dates are themselves estimates, and I probably don't need to tell you that earthquake prediction is still far from precise.  Faults don't fail on a schedule -- which is why it annoys me every time I see someone say that an area is "overdue for an earthquake," as if they were on some kind of calendar.
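
To see why "overdue" is misleading, here's a toy memoryless-recurrence calculation in Python.  This is purely illustrative (real seismic hazard models are far more sophisticated), and the 450-year mean interval is just an assumption based on the rough spacing of the known events:

```python
import math

MEAN_INTERVAL = 450.0  # assumed mean recurrence interval, in years

def prob_event_within(years):
    """P(at least one quake in the next `years`), under a memoryless
    (exponential) model where events arrive at a constant average rate."""
    rate = 1.0 / MEAN_INTERVAL
    return 1.0 - math.exp(-rate * years)

# In a memoryless model this probability is the same whether the last
# big quake was 50 years ago or 500 years ago -- there's no "overdue":
print(f"P(quake within 50 years)  = {prob_event_within(50):.3f}")
print(f"P(quake within 200 years) = {prob_event_within(200):.3f}")
```

The point of the model isn't that it's right -- real faults accumulate strain, so recurrence isn't truly memoryless -- but that "time since the last one" by itself tells you much less than the "overdue" framing implies.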

Still, I can say with at least moderate confidence that it's unlikely to generate another big earthquake soon, which is kind of a relief.

So that's our geological curiosity of the day.  I have a curious family connection to the area; my wandering ancestor Sarah (Handsberry) Overby-Biles-Rulong (she married three times, had nine children, and outlived all three husbands) lived in the town of New Madrid in 1800, after traveling there from her home near Philadelphia as a single woman in the last decade of the eighteenth century.  I've never been able to discover what impelled her to leave her home and, with a group of relative strangers, cross what was then trackless wilderness to a remote outpost, and I've often wondered if she might have been either running away from something, or perhaps might have been a prostitute.  I'm not trying to malign her memory; it bears mention that a good eighty percent of my forebears were rogues, ne'er-do-wells, miscreants, and petty criminals, so it would hardly be a surprise to add prostitution to the mix.  And whatever else you can say about my family members, they were interesting.  I've often wished I could magically get a hold of Sarah's diary.

In any case, Sarah was in Lafayette, Louisiana by 1801, so she missed the New Madrid earthquakes by ten years.  But kind of interesting that she lived for a time in the little village that was about to be the epicenter of one of the biggest earthquakes ever to hit the continental United States, one that rang bells thousands of kilometers away, and which created a geological mystery the scientists are still trying to work out.

****************************************

Friday, February 7, 2025

To dye for

The history of dyes is actually way more interesting than it sounds.

People have been coloring cloth (and pottery, and cave walls, and their own bodies) for a very long time, but not all colors turn out to be equally accessible to the palette.  Red, for example, is fairly easy, especially if you don't care if it's not screaming scarlet and has a slight brownish tint (what we'd call "brick red"), because that's the color of iron oxide, better known as rust.  Iron oxide is plentiful, and I know from messing around with pottery glazes that it's got two properties: (1) mixed with other minerals and/or heated in the absence of oxygen, it can give you a variety of other colors, from black to dark blue to green; and (2) it sticks to everything.  I have brushes I use in the glazing process that I used once or twice to apply an iron-based glaze, and now they're permanently stained red.

Other colors, however, aren't so easy.  Some of the more notoriously difficult ones are true blues and purples; our appending the word "royal" to royal blue and royal purple is an indicator of the fact that back then, only the really rich could afford blue or purple-dyed cloth.  Blue can be achieved using small amounts of cobalt, or finely powdered lapis lazuli, but neither is common and although they have other uses (cobalt in pottery pigments, lapis in paints) neither works well for dyeing cloth.  Lapis, in fact, was used to produce the finest rich blue pigment for oil paints, which got named ultramarine because the mineral was imported from what is now Afghanistan -- a place that was ultramarinus ("beyond the sea") to the people in Italy and France who were using it.

But dyeing cloth was another matter.  One solution was, bizarrely enough, a secretion of a sea snail of the genus Murex.  These snails' hypobranchial glands produce a gunk that when purified produces a rich purple dye that is "color fast" on cloth.

How anyone thought of doing this is an open question.  Maybe they just smeared slime from various animals on cloth until they found one that worked, I dunno.

Be that as it may, the color of the dye was called φοῖνιξ (phoinix) by the ancient Greeks, and the sea traders who cornered the market on producing and selling the dye were called the Φοίνικες (Phoinikes).  We anglicized the word to Phoenicians -- so Phoenician means, literally, "people of the purple."

The reason all of this colorful stuff comes up is a paper in Science Advances that describes how a group of chemists in Portugal successfully determined the origin of a purple to blue (depending on how it's prepared) watercolor pigment called folium that was used in medieval watercolors.  It is a gorgeous color, but all previous attempts either to replicate it or to determine its source had been unsuccessful.  The difficulty with trying to figure out things like this is that there was no standardized naming system for plants (or anything else) back then, so the name in one place could (and probably did) vary from the name in another place.  Reading manuscripts about natural dyes from that time period, about all we can figure out is "it's made by boiling this plant we found" or "it's made from special snail slime," which doesn't really tell us much in the way of details.

Samples of medieval folium on cloth [Image courtesy of Paula Nabais/NOVA University]

In the case of folium, it was known that it came from a weedy plant of some sort, but there was no certainty about which plant it was or where it grew.  But now some Portuguese chemists have identified the source of folium as the seedpods of a roadside weed in the genus Chrozophora, an unassuming little plant in the Euphorbia family that likes dry, sunny, rocky hillsides, and whose ground-up seedpods yield a knock-your-socks-off purple dye.  The dye was applied to cloth; when an artist was ready to make a natural watercolor paint, they took small bits of the cloth and soaked them in water.

The scientists were able to determine the chemical structure of the dye itself, which is pretty astonishing.  But even finding the plant was a remarkable accomplishment.  "We found it, guided by biologist Adelaide Clemente, in a very beautiful territory in Portugal [called] Granja, near a very beautiful small town Monsaraz -- a magical place, still preserved in time," said study co-author Maria João Melo, in an interview with CNN.  "Nobody in the small village of Granja knew [anything] about this little plant.  It may look like a weed, yet it is so elegant with its silvery stellate hairs that combine so well with the greyish green, and what a story there is behind it."

I'm always impressed with how intrepid our forebears were at using the resources around them to their fullest, but as with the snail slime, I'm mystified as to where that knowledge came from.  Some of it was probably by happy accident -- I think fermented milk products like yogurt and cheese probably were discovered because of milk that spoiled in just the right way, for example.  But bread has always mystified me.  Who first thought, "Let's take these seeds, and grind 'em up, and add this fungus powder to it with water until it gets all bubbly and smells funny, then stick it in the fire!  That'll be delicious with jam spread on it!"

And here -- grinding up the seedpods of a random weed ended up producing one of the rarest and prettiest dyes ever discovered.  Undoubtedly the brainstorm of some medieval artist or botanist (or both) who happened to get lucky.  Makes you wonder what other plants are out there that could have odd artistic, medicinal, or culinary uses -- especially in places of enormous biodiversity like the Amazonian rainforest, where there are probably as many plant species that have not been identified as there are ones that have been.

So if you needed another good reason to preserve biodiversity, there it is.

****************************************

Thursday, February 6, 2025

Wretched hives of scum and villainy

Being a fiction writer, I think about villains a lot.

Of course, the proper word is "antagonist," but "villain" is a lot more evocative, bringing to mind such characters as the dastardly Snidely Whiplash from the brilliant Adventures of Dudley Doright of the Canadian Mounties.

Left to right: Snidely Whiplash, Dudley Doright, Fair Nell Fenwick, and Dudley's horse, who is named... Horse.  They just don't write comedy like that any more.

One of the things that I've always tried to do with the villains in my own novels is to make them three-dimensional.  I don't like stories where the villains are just evil because they're evil (unless it's for comedic effect, like Mr. Whiplash).  My college creative writing teacher, Dr. Bernice Webb (one of the formative influences on my writing) told us, "Every villain is the hero of his own story," and that has stuck with me.  Even with the most awful antagonists I've written -- Lydia Moreton in In the Midst of Lions comes to mind -- I hope my readers come away with at least understanding why they acted as they did.

Of course, understanding their motivation, whether it be money, sex, power, revenge, or whatever,  doesn't mean you need to sympathize with it.  I wrote a while back about the character of Carol Last from Alice Oseman's amazing novel Radio Silence, who I find to be one of the most deeply repulsive characters I've ever come across, because what motivates her is pure sadism (all the while wearing a smug smile).

Oseman's story works because we've all known people like her, who use their power to hurt people simply because they can, who take pleasure in making their subordinates' lives miserable.  What's worse is that, because of that twist in their personality, a frightening number of them become parents, bosses, teachers, and -- as we're currently finding out here in the United States -- political leaders.

The reason this whole villainous topic comes up is a paper published in the journal Psychological Science called "Can Bad Be Good?  The Attraction of a Darker Self," by Rebecca Krause and Derek Rucker, both of Northwestern University.  In a fascinating study of the responses of over 235,000 test subjects to fictional characters, Krause and Rucker found that people are sometimes attracted to villains -- and the attraction is stronger if the villain embodies positive characteristics they themselves share.

For example, Emperor Palpatine is ruthless and cruel, but he also is intelligent and ambitious -- character traits that in a better person might be considered virtuous.  The Joker is an essentially amoral character who has no problem killing people, but his daring, his spontaneity, his quirkiness, and his sense of humor are all attractive characteristics.  Professor Moriarty is an out-and-out lunatic -- especially as played by Andrew Scott in the series Sherlock -- but he's brilliant, clever, inventive, and fearless.

And what Krause and Rucker found was that spontaneous and quirky people (as measured by personality assessments) tended to like characters like The Joker, but not characters like the humorless Palpatine.  Despite his being essentially evil, Moriarty appealed to people who like puzzles and intellectual games -- but those same people weren't so taken with the more ham-handed approach of a character like Darth Vader.

"Given the common finding that people are uncomfortable with and tend to avoid people who are similar to them and bad in some way, the fact that people actually prefer similar villains over dissimilar villains was surprising to us," said study co-author Rucker, in an interview in the Bulletin for the Association of Psychological Science.  "Honestly, going into the research, we both were aware of the possibility that we might find the opposite."

What seems to be going on here is that we can admire or appreciate a villain who is similar to us in positive ways -- but since the character is fictional, it doesn't damage our own self-image as it would if the villain was a real person harming other real people, or (worse) if we shared the villain's negative traits as well.

"Our research suggests that stories and fictional worlds can offer a ‘safe haven’ for comparison to a villainous character that reminds us of ourselves," said study lead author Rebecca Krause.  "When people feel protected by the veil of fiction, they may show greater interest in learning about dark and sinister characters who resemble them."

Which makes me wonder about myself, because my all-time favorite villain is Missy from Doctor Who.


Okay, she does some really awful things, is erratic and unpredictable and has very little concern about human life -- but she's brilliant, and has a wild sense of humor, deep curiosity about all the craziness that she's immersed in, and poignant grief over the loss of her home on Gallifrey.  Played by the stupendous Michelle Gomez, Missy is a complex and compelling character I just love to hate.

What that says about me, I'll leave as an exercise for the reader.

On the other hand, I still fucking loathe Carol Last.  I would have loved to see her tied to the railroad tracks, Dudley Doright-style, at the end of the book.

But I guess you can't have everything.

****************************************

Wednesday, February 5, 2025

Revising Drake

Most of you probably know about the Drake Equation, a way to estimate the number of intelligent civilizations in the universe.  The Equation is one of those curiosities that is looked upon as valid science by some and as pointless speculation by others.  Here's what it looks like:

N = R* × fp × ne × fl × fi × fc × L
Math-phobes, fear not; it's not as hard as it looks.  The idea, which was dreamed up by astronomer Frank Drake back in 1961, is that you can estimate the number of civilizations in the universe with whom communication might be possible (N) by multiplying together seven independent factors, to wit:
R* = the average rate of star formation in our galaxy
fp = the fraction of those stars that have planets
ne = the average number of planets, per star with planets, that lie in the habitable zone
fl = the fraction of planets in the habitable zone that develop life
fi = the fraction of those planets which eventually develop intelligent life
fc = the fraction of those planets with intelligent life whose inhabitants develop the capability of communicating over interstellar distances
L = the average lifetime of those civilizations
Some of those (such as R*) are considered to be understood well enough that we can make a fairly sure estimate of their magnitudes.  Others -- such as fp and ne -- were complete guesses in Drake's time.  How many stars have planets?  Seemed like it could have been nearly all of them, or perhaps the Solar System was some incredibly fortunate fluke, and we're one of the only planetary systems in existence.
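
Since the Equation is just a product of those seven factors, playing with the numbers is easy.  Here's a minimal Python sketch of the calculation; every parameter value below is an illustrative guess, not a measurement, and nudging the more speculative fractions swings the answer by orders of magnitude:

```python
# Illustrative Drake Equation calculation -- every parameter value
# here is an assumption chosen for demonstration, not a measurement.

def drake(R_star, f_p, n_e, f_l, f_i, f_c, L):
    """Multiply the seven Drake factors to estimate N, the number of
    communicating civilizations in the galaxy."""
    return R_star * f_p * n_e * f_l * f_i * f_c * L

N = drake(
    R_star=1.5,   # stars formed per year in the Milky Way
    f_p=0.9,      # fraction of stars with planets
    n_e=0.2,      # habitable-zone planets per star with planets
    f_l=0.1,      # fraction of those that develop life
    f_i=0.01,     # fraction of those that develop intelligence
    f_c=0.1,      # fraction of those that develop interstellar communication
    L=10_000,     # average lifetime of such a civilization, in years
)
print(f"Estimated communicating civilizations: {N:.2f}")
```

Try doubling f_l or L and watch the estimate scale linearly -- which is exactly why revising any one parameter upward, as the exoplanet surveys have done, moves the whole answer.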

The encouraging thing, at least for people like me who would love nothing better than to find we lived in a Star Trek universe where there's intelligent life wherever you look, is that just about all of these parameters have been revised upward since Drake first put his equation together.  Exoplanets, including ones in the so-called "Goldilocks zone," have turned out to be pretty much everywhere; not having planets turns out to be a much rarer situation.  There are over a hundred billion stars in the Milky Way alone; the number of planets in our galaxy is almost certainly in the trillions.

As far as developing life... well, that one is still open to question, given that thus far we have a sample size of one to draw inferences from.  But that parameter -- fl -- just got a significant boost from a study done collaboratively by Hokkaido University and NASA of samples brought back from the asteroid Bennu by NASA's OSIRIS-REx mission, which found significant traces of all five nitrogenous bases that make up the genetic material in every living thing known (adenine, cytosine, guanine, thymine, and uracil).

Not only that, but they found the organic compounds xanthine and hypoxanthine (precursors of many bioactive compounds, including caffeine and theobromine), and nicotinic acid (vitamin B3).

This is an absolutely astonishing result.

"In previous research, uracil and nicotinic acid were detected in the samples from asteroid Ryugu, but the other four nucleobases were absent," said Toshiki Koga, who co-authored the paper, which appeared last week in Nature Astronomy.  "The difference in abundance and complexity of N-heterocycles between Bennu and Ryugu could reflect the differences in the environment to which these asteroids have been exposed in space."

What it brings to mind for me, though, is that if these five critical compounds can form on an airless, icy rubble pile (which is what Bennu honestly is), they've got to be pretty much everywhere in the universe that isn't so hot they fall apart.  And in case I haven't made the case strenuously enough, they are the basis of the genetic information shared by all life on Earth.

I think N -- the all-important Drake Equation estimate of the number of technological civilizations in the universe -- just got revised upward again.

Of course, even with my excited leaping about, I have to admit there's still a great deal we don't know, especially about the parameters that are lower on the list.  How many planets that do develop life end up with intelligent, technological life?  A while back I did a post about the rather terrifying idea of the Great Filter, which looks at the roadblocks that might prevent technological civilizations from forming or persisting.  Because the fact remains that when we look out there, we don't see signals from other civilizations -- something called the "Fermi Paradox" after the great physicist Enrico Fermi, who after listening to all the arguments for extraterrestrial life, famously quipped, "Then where is everybody?"

And we still have no idea about the scary parameter L -- how long, on average, technological civilizations last.  Given recent horrific developments in U.S. politics, I rather think I'm revising my own estimate of this one in the downward direction.  Maybe a benevolent alien will come and fix the mess we're in.  I know who I'm hoping for:


But even so, the Bennu study is exciting, and gives me hope that we might still one day find extraterrestrial life.  Perhaps even from the recently-launched Europa Clipper mission, which in April 2030 will begin flybys of Jupiter's moon Europa -- widely considered to be our best shot at a place hosting extraterrestrial life in our own Solar System -- in the hopes of picking up biosignatures.

So we continue to wait, and wonder, and learn.  And -- as astronomer Neil deGrasse Tyson always says, at the end of his talks -- "Keep looking up!"

****************************************

Tuesday, February 4, 2025

The riddle of the sun stones

When you think about it, it's unsurprising that our ancestors invented "the gods" as an explanation for anything they didn't understand.

They were constantly bombarded by stuff that was outside of the science of their time.  Diseases caused by the unseen action of either genes or microorganisms.  Weather patterns, driven by forces that even in the twenty-first century we are only beginning to understand deeply, and which controlled the all-important supply of food and water.  Earthquakes and volcanoes, whose root cause only began to come clear sixty years ago.

Back then, everything must have seemed as mysterious as it was precarious.  For most of our history, we've been at the mercy of forces we didn't understand and couldn't control, one bad harvest or failed rainy season or sudden plague away from dying en masse.

No wonder they attributed it all to gods and sub-gods -- and devils and demons and witches and evil spirits.

As much as we raise an eyebrow at the superstition and seeming credulity of the ancients, it's important to recognize that they were no less intelligent, on average, than we are.  They were trying to make sense of their world with the information they had at the time, just like we do.  That we have a greater knowledge base to draw upon -- and most importantly, the scientific method as a protocol -- is why we've been more successful.  But honestly, it's no wonder that they landed on supernatural, unscientific explanations; the natural and scientific ones were out of their reach.

The reason this comes up is a recent discovery that lies at the intersection of archaeology and geology, which (as regular readers of Skeptophilia know) are two enduring fascinations for me.  Researchers excavating sites at Vasagård and Rispebjerg, on the island of Bornholm, Denmark, have uncovered hundreds of flat stone disks with intricate patterns of engraving, dating from something on the order of five thousand years ago.  Because many of the disks have designs of circles with branching radial rays extending outward, they've been nicknamed "sun stones."  Why, in around 2,900 B.C.E., people were suddenly motivated to create, and then bury, hundreds of these stones, has been a mystery.

Until now.

[Image credit: John Lee, Nationalmuseet, Copenhagen, Denmark]

Data from Greenland ice cores has shown a sudden spike in sulfates and in dust and ash from right around the time the sun stones were buried -- both hallmarks of a massive volcanic eruption.  The location of the volcano has yet to be determined, but what is clear is that it would have had an enormous effect on the climate.  "It was a major eruption of a great magnitude, comparable to the well-documented eruption of Alaska’s Okmok volcano in 43 B.C.E. that cooled the climate by about seven degrees Celsius," said study lead author Rune Iversen, of the Saxo Institute at the University of Copenhagen.  "The climate event must have been devastating for them."

The idea that the volcanic eruption in 2900 B.C.E. altered the climate worldwide got a substantial boost with the analysis of tree rings from wood in Europe and North America.  Right around the time of the sulfate spike in the Greenland ice cores, there's a series of narrow tree rings -- indicative of short growing seasons and cool temperatures.  Wherever this eruption took place, it wreaked havoc with the weather, with all the consequences that has for human survival.

While the connection between the eruption and the sun stones is an inference, it certainly makes sense.  How else would you expect a pre-technological culture to respond to a sudden, seemingly inexplicable dimming of the sun, cooler summers and bitter winters with the resulting probable crop failures, and even wildly fiery sunrises and sunsets (a well-known signature of volcanic aerosols in the upper atmosphere)?  It bears keeping in mind that our own usual fallback of "there must be a scientific explanation even if I don't know what it is" is a relatively recent development.

So while burying engraved rocks might seem like a strange response to a climatic change, it is understandable that the ancients looked to a supernatural solution for what must have been a mystifying natural disaster.  And we're perhaps not so very much further along, ourselves, given the way a substantial fraction of people in the United States are still denying climate change, even though the models have been predicting it for decades, and the evidence is right in front of our faces.  We still have plenty of areas we don't understand, and are saddled with unavoidable cognitive biases even if we do our best to fight them.  As the eminent science historian James Burke put it, in his brilliant and provocative essay "Worlds Without End":

Science produces a cosmogony as a general structure to explain the major questions of existence.  So do the Edda and Gilgamesh epics, and the belief in Creation and the garden of Eden.  Myths provide structures which give cause-and-effect reasons for the existence of phenomena.  So does science.  Rituals use secret languages known only to the initiates who have passed ritual tests and who follow the strictest rules of procedure which are essential if the magic is to work.  Science operates in the same way.  Myths confer stability and certainty because they explain why things happen or fail to happen, as does science.  The aim of the myth is to explain existence, to provide a means of control over nature, and to give to us all comfort and a sense of place in the apparent chaos of the universe.  This is precisely the aim of science.

Science, therefore, for all the reasons above, is not what it appears to be.  It is not objectively impartial, since every observation it makes of nature is impregnated with theory.  Nature is so complex, and sometimes so seemingly random, that it can only be approached with a systematic tool that presupposes certain facts about it.  Without such a pattern it would be impossible to find an answer to questions even as simple as "What am I looking at?"
****************************************

Monday, February 3, 2025

Riding on a light beam

Some of you have probably read about a project called Breakthrough Starshot, begun perhaps eight years ago (and championed by none other than Stephen Hawking), which proposed sending small remote-controlled cameras to nearby star systems, powered by lasers that could propel them up to twenty percent of the speed of light.

If something like this were launched today, it would mean we could be getting photographs back from Proxima Centauri in twenty years.

[Image licensed under the Creative Commons ESO/M. Kornmesser, Artist's impression of the planet orbiting Proxima Centauri, CC BY 4.0]

It's an ambitious project, and it faces significant hurdles.  Even radio commands -- which, being light, travel at the speed thereof -- take time to arrive, so navigation becomes increasingly difficult the farther away the craft gets.  Just at the distance of Pluto, our intrepid little spacecraft would be 4.5 light-hours from Earth, meaning that if we tried to beam it instructions to dodge around an incoming meteoroid, it would be 4.5 hours until the command arrived, at which point all that would be left is intrepid scrap metal.  And Proxima Centauri is 4.3 light years away.
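The light-travel-time arithmetic behind those numbers is easy to check.  Here's a minimal sketch (the distances are approximate assumptions -- Pluto's distance from Earth actually varies between roughly 30 and 50 AU, so the figure of ~33 AU below is just one representative value):

```python
# Rough one-way signal delays to a distant spacecraft.
C = 299_792_458                        # speed of light, m/s
AU = 1.495978707e11                    # astronomical unit, m
LIGHT_YEAR = C * 365.25 * 24 * 3600    # meters in one light year

def one_way_delay_seconds(distance_m):
    """Time for a radio command to cover the given distance."""
    return distance_m / C

# Pluto at roughly 33 AU from Earth:
pluto_hours = one_way_delay_seconds(33 * AU) / 3600
print(f"Pluto: {pluto_hours:.1f} hours each way")      # ≈ 4.6 hours

# Proxima Centauri, at about 4.25 light years:
proxima_years = one_way_delay_seconds(4.25 * LIGHT_YEAR) / (365.25 * 24 * 3600)
print(f"Proxima: {proxima_years:.2f} years each way")  # ≈ 4.25 years
```

Round-trip communication -- command out, acknowledgment back -- doubles those figures, which is why remote piloting from Earth is a non-starter.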

You see the problem.  The Starshot spacecraft would have to be able, on some level, to think for itself, because there simply wouldn't be time for Mission Control to steer it to avoid danger.

There are other obstacles, though.  Besides the obvious difficulties of being in the cold vacuum of interstellar space, contending with cosmic rays and the like, there's the problem engendered by its speed.  Assuming the estimate of a maximum velocity of twenty percent of light speed is correct, even tiny particles of dust would become formidable projectiles, so Starshot is going to require some heavy-duty shielding, increasing its mass (and thus the amount of energy needed to make it go).
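To put a number on the dust problem, here's a back-of-the-envelope sketch; the one-microgram grain mass is an illustrative assumption, not a figure from the Starshot design studies:

```python
# Kinetic energy of a dust grain striking a craft at a fraction of c.
C = 299_792_458  # speed of light, m/s

def impact_energy_joules(mass_kg, fraction_of_c):
    """Classical kinetic energy 0.5*m*v^2; at 0.2 c the relativistic
    correction is only a few percent, so this is a fair estimate."""
    v = fraction_of_c * C
    return 0.5 * mass_kg * v * v

# A one-microgram grain (1e-9 kg) at twenty percent of light speed:
energy = impact_energy_joules(1e-9, 0.2)
print(f"{energy:.2e} J")  # ~1.8e6 J -- comparable to ~0.4 kg of TNT
```

A speck of dust you couldn't see without a microscope hits with the energy of a small explosive charge, which is why the shielding question is so serious.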

Three years ago we got an encouraging proof of concept, when the group working on the mission -- Russian entrepreneur Yuri Milner's Breakthrough Foundation -- launched a test of the Starshot craft.  It was a tiny little thing, small enough to fit in your hand and weighing about the same as a stick of gum, designed and built by engineers at the University of California, Santa Barbara.  In the test flight it achieved an altitude of nineteen miles, all the while functioning flawlessly, returning four thousand images of the Earth taken from aloft.

And just last week, a paper in Nature Photonics described further research on how to overcome the weight/propulsion issue: the creation of a fifty-nanometer-thick membrane of silicon nitride, used to measure the actual thrust a laser could exert on something that lightweight -- a measurement that had never been made before.  The miniature sail passed with flying colors.

"There are numerous challenges involved in developing a membrane that could ultimately be used as lightsail," said Harry Atwater of Caltech, who led the study.  "It needs to withstand heat, hold its shape under pressure, and ride stably along the axis of a laser beam.  But before we can begin building such a sail, we need to understand how the materials respond to radiation pressure from lasers.  We wanted to know if we could determine the force being exerted on a membrane just by measuring its movements.  It turns out we can."

The most significant remaining hurdle is to design the laser system to make Starshot move -- lasers that are extremely powerful yet so finely collimated that they can still strike a ten-centimeter craft square-on from several light years away.  The engineering director for Breakthrough, Peter Klupar, is designing a 100,000 gigawatt laser -- to be located, he says, in Chile -- that could be the answer.  Of course, such a powerful device is not without its dangers.  Reflected off a mirror in space, Klupar says, such a laser could "ignite an entire city in minutes."
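For a sense of scale: a perfectly reflective sail intercepting a beam of power P feels a radiation-pressure force of 2P/c.  Here's a hedged sketch -- the 100-gigawatt beam power and one-gram craft mass are illustrative assumptions for the arithmetic, not the actual Breakthrough design figures:

```python
# Radiation-pressure thrust on a reflective sail: F = (1 + R) * P / c.
C = 299_792_458  # speed of light, m/s

def sail_thrust_newtons(power_watts, reflectivity=1.0):
    """Force on a sail intercepting the full beam; reflectivity=1 means
    a perfect mirror (photons bounce back, doubling the momentum kick)."""
    return (1 + reflectivity) * power_watts / C

# An assumed 100-gigawatt beam on an assumed one-gram craft (sail + chip):
thrust = sail_thrust_newtons(100e9)   # ~667 N
accel = thrust / 0.001                # m/s^2
seconds_to_cruise = 0.2 * C / accel   # time to reach 0.2 c
print(f"{accel:.2e} m/s^2, {seconds_to_cruise:.0f} s to reach 0.2 c")
```

Under those assumptions the craft would hit a fifth of light speed in about a minute and a half, which is exactly why the beam has to be so absurdly powerful: all the acceleration happens while the sail is still close enough for the laser to hit it.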

Not that there's a mirror out there.  So you shouldn't worry at all about that.

"You would think that this is all impossible, but we have folks at Caltech and the University of Southampton and Exeter University working on about fifty contracts on making all [of] this happen," Klupar said.  "No one has come up with a deal-breaker that we can find yet.  It all seems real."

All of which may seem like science fiction, but it's phenomenal how fast things go from the realm of Star Trek to reality.  Klupar compares his light sails to CubeSats, tiny (ten by ten centimeters, weighing a little over a kilogram) orbiting telemetry devices that are now common.  "It feels a lot like the way CubeSats felt twenty years ago," he said.  "People were saying, 'Those are toys, they're never going to develop into anything, there's no way I can see that ever working.'  And today, look at them: hundreds of millions of dollars are being spent on them."

So keep your eye on this project.  If there's a chance at a remote visit to another star system, I think this is our best bet.  The Breakthrough Foundation estimates an actual, honest-to-goodness launch toward a nearby star as early as 2030.  Meaning perhaps we could get our first photographs of planets around another star by 2050.

I'll be ninety years old at that point, but if that's what I'm waiting for, I can make it till then.

****************************************

Saturday, February 1, 2025

Remembrance of things past

"The human brain is rife with all sorts of ways of getting it wrong."

This quote is from a talk by eminent astrophysicist Neil deGrasse Tyson, and is just about spot on.  Oh, sure, our brains work well enough, most of the time; but how many times have you heard people say things like "I remember that like it was yesterday!" or "Of course it happened that way, I saw it with my own eyes"?

Anyone who knows something about neuroscience should immediately turn their skepto-sensors up to 11 as soon as they hear either of those phrases.

fMRI scan of a human brain during working memory tasks [Image is in the Public Domain courtesy of the Walter Reed National Military Medical Center]

Our memories and sensory-perceptual systems are selective, inaccurate, heavily dependent on what we're doing at the time, and affected by whether we're tired or distracted or overworked or (even mildly) inebriated.  Sure, what you remember might have happened that way, but -- well, let's just say it's not as much of a given as we'd like to think.  An experiment back in 2005 out of the University of Portsmouth looked at memories of the Tavistock Square (London) bus bombing, and found that a full forty percent of the people questioned had "memories" of the event that were demonstrably false -- including a number of people who said they recalled details from CCTV footage of the explosion, down to what people were wearing, who showed up to help the injured, when police arrived, and so on.

Oddly enough, there is no CCTV footage of the explosion.  It doesn't exist and has never existed.

Funny thing that eyewitness testimony is considered some of the most reliable evidence in courts of law, isn't it?

There are a number of ways our brains can steer us wrong, and the worst part of it all is that they can do so while leaving us convinced that we're remembering things with cut-crystal clarity.  Here are a few interesting memory glitches that commonly occur in otherwise mentally healthy people, and that you might not have heard of:

  • Cryptomnesia.  Cryptomnesia occurs when something from the past recurs in your brain, or arises in your external environment, and you're unaware that you've already experienced it.  This has resulted in several probably unjustified accusations of plagiarism; the author in question undoubtedly saw the text they were accused of plagiarizing some time earlier, but honestly didn't remember they'd read it and thought that what they'd come up with was entirely original.  It can also result in some funnier situations -- while the members of Aerosmith were taking a break from recording their album Done With Mirrors, they had a radio going, and the song "You See Me Crying" came on.  Steven Tyler said he thought that was a pretty cool song, and maybe they should record a cover of it.  Joe Perry turned to him in incredulity and said, "That's us, you fuckhead."
  • Semantic satiation.  This is when a word you know suddenly looks unfamiliar to you, often because you've seen it repeatedly over a fairly short time.  Psychologist Chris Moulin of Leeds University did an experiment where he had test subjects write the word door over and over, and found that after a minute of this 68% of the subjects began to feel distinctly uneasy, with a number of them saying they were doubting that "door" was a real word.  I remember being in high school writing an exam in an English class, and staring at the word were for some time because I was convinced that it was spelled wrong (but couldn't, of course, remember how it was "actually" spelled).
  • Confabulation.  This is the recollection of events that never happened -- along with a certainty that you're remembering correctly.  (The people who claimed false memories of the Tavistock Square bombing were suffering from confabulation.)  The problem with this is twofold: the more often you think about the false memory or tell your friends and family about it, the more sure you are of it; and often, even when presented with concrete evidence that you're recalling incorrectly, somehow you still can't quite believe it.  A friend of mine tells the story of trying to help her teenage son find his car keys, and that she was absolutely certain that she'd seen them that day lying on a blue surface -- a chair, tablecloth, book, she wasn't sure which, but it was definitely blue.  They turned the house upside down, looking at every blue object they could find, and no luck.  Finally he decided to walk down to the bus stop and take the bus instead, and went to the garage to get his stuff out of the car -- and the keys were hanging from the ignition, where he'd left them the previous evening.  "Even after telling me this," my friend said, "I couldn't accept it.  I'd seen those keys sitting on a blue surface earlier that day, and remembered it as clearly as if they were in front of my face."
  • Declinism.  This is the tendency to remember the past as more positive than it actually was, and is responsible both for the "kids these days!" thing and "Make America Great Again."  There's a strong tendency for us to recall our own past as rosy and pleasant as compared to the shitshow we're currently immersed in, irrespective of the fact that violence, bigotry, crime, and general human ugliness are hardly new inventions.  (A darker aspect of this is that some of us -- including a great many MAGA types -- are actively longing to return to the time when straight White Christian men were in charge of everything; whether this is itself a mental aberration I'll leave you to decide.)  A more benign example is what I've noticed about travel -- that after you're home, the bad memories of discomfort and inconveniences and delays and questionable food fade quickly, leaving behind only the happy feeling of how much you enjoyed the experience.
  • The illusion of explanatory depth.  This is a dangerous one; it's the certainty that you understand deeply how something works, when in reality you don't.  This effect was first noted back in 2002 by psychologists Leonid Rozenblit and Frank Keil, who asked test subjects to rank from zero to ten their understanding of how common devices worked, including zippers, bicycles, electric motors, toasters, and microwave ovens, and found that hardly anyone gave themselves a score lower than five on anything.  Interestingly, the effect vanished when Rozenblit and Keil asked the volunteers actually to explain how the devices worked; after trying to describe in writing how a zipper works, for example, most of the test subjects sheepishly realized they actually had no idea.  This suggests an interesting strategy for dealing with self-styled experts on topics like climate change -- don't argue, ask questions, and let them demonstrate their ignorance on their own.
  • Presque vu.  Better known as the "tip-of-the-tongue" phenomenon -- the French name means "almost seen" -- this is when you know you know something, but simply can't recall it.  It's usually accompanied by a highly frustrating sense that it's right there, just beyond reach.  Back in the days before The Google, I spent an annoyingly long time trying to recall the name of the Third Musketeer (Athos, Porthos, and... who???).  I knew the memory was in there somewhere, but I couldn't access it.  It was only after I gave up and said "to hell with it" that -- seemingly out of nowhere -- the answer (Aramis) popped into my head.  Interestingly, neuroscientists are still baffled as to why this happens, and why turning your attention to something else often makes the memory reappear.

So be a little careful about how vehemently you argue with someone over whether your recollection of the past or theirs is correct.  Your version might be right, or theirs -- or it could easily be that both of you are remembering things incompletely or incorrectly.  I'll end with a further quote from Neil deGrasse Tyson: "We tend to have great confidence in our own brains, when in fact we should not.  It's not that eyewitness testimony by experts or people in uniform is better than that of the rest of us; it's all bad....  It's why we scientists put great faith in our instruments.  They don't care if they've had their morning coffee, or whether they got into an argument with their spouse -- they get it right every time."

****************************************

Friday, January 31, 2025

Unleashing the tsunami

Today I have for you two news stories that are interesting primarily in juxtaposition.

The first is a press release about a study out of Stanford University that found LGBTQ+ people have, across the board, a higher rate of mental health disorders involving stress, anxiety, and depression than straight people do.  Here's the relevant quote:

New research looking at health data of more than a quarter of a million Americans shows that LGBTQ+ people in the US have a higher rate of many commonly diagnosed mental health conditions compared with their cisgender and straight peers, and that these links are reflective of wider societal stigma and stress.  For example, cisgender women who are a sexual minority, such as bisexual or lesbian, had higher rates of all 10 mental health conditions studied compared to straight cisgender women.  Gender diverse people, regardless of their sex assigned at birth, and cisgender sexual minority men had higher rates of almost all conditions studied compared to straight cisgender men, with schizophrenia being the one exception.  A separate commentary says these differences are not inevitable, and could likely be eliminated through legal protections, social support, and additional training for teachers and healthcare professionals.

The second is from Newsbreak and is entitled, "Trump Signs Sweeping Executive Orders That Overhaul U.S. Education System."  The orders, as it turns out, have nothing to do with education per se, and everything to do with appeasing his homophobic Christofascist friends who are determined to remove every protection from queer young people against discrimination.  Once again, here's the quote:

The executive order titled Ending Radical Indoctrination in K-12 Schools threatens to withhold federal funding for "illegal and discriminatory treatment and indoctrination in K-12 schools," including based on gender ideology and the undefined and vague "discriminatory equity ideology."

The order calls for schools to provide students with an education that instills "a patriotic admiration" for the United States, while claiming the education system currently indoctrinates them in "radical, anti-American ideologies while deliberately blocking parental oversight."...

"These practices not only erode critical thinking but also sow division, confusion and distrust, which undermine the very foundations of personal identity and family unity," the order states.

So how is it surprising to anyone that we queer people have a higher rate of depression, stress, and anxiety?  Funny how that happens when elected officials not only claim we exist because of "radical indoctrination," but are doing their damnedest to erase us from the face of the Earth.

If you think I'm exaggerating, take a look at this:


It's a good thing I retired in 2019 (after 32 years in the classroom), because if anyone -- from a school administrator all the way up to the president himself -- told me I couldn't call a trans kid by their desired name or pronouns, or had to take down the sticker I had on my classroom door that had a Pride flag and the caption "Everyone Is Safe Here," my response would have been:

FUCK.  YOU.

And I'd probably have added a single-finger salute, for good measure.

Mr. Trump, you do not get to legislate us out of existence.  You do not get to tell us who we can be kind to, who we can treat humanely, whose rights we can honor, who we can help to feel safe and secure and accepted for who they are.  I lost four damn decades of my life hiding in the closet out of fear and shame because of the kind of thinking you are now trying to cast into law, and I will never stand silent and watch that happen to anyone else.

So maybe your yes-men and yes-women -- your hand-picked loyalist cronies who do your bidding without question and line up to kiss your ass even before you ask it -- are jumping up and down in excitement over enacting this latest outrage, but you (and they) can threaten us all you want.

I'm not complying.  I will never comply.  And I know plenty of high school teachers and administrators who feel exactly the same way I do.  You may think you've picked an easy target, but what you are doing has unveiled how deeply, thoroughly cruel your motives are -- and it will unleash a tsunami of resistance.

LGBTQ+ people and minorities and the other groups you get your jollies by bullying will always be safe with me.  And if you think any stupid fucking command from on high will change that, you'd better think again.

To put it in a way even someone of your obviously limited intellectual capacity can understand: you can take your executive order and stick it up your bloated ass.

Sideways.

****************************************