Skeptophilia
Fighting Gullibility with Sarcasm, 6 days a week
Saturday, February 8, 2025
The bellringer
The fault responsible was named the New Madrid Seismic Zone for the county right in the center of it, and its capacity for huge temblors is staggering. The biggest (and final) earthquake of the four was powerful enough that it was felt thousands of kilometers away, and rang church bells in Charleston, South Carolina. The shift in terrain changed the course of the Mississippi River, cutting off a meander and creating horseshoe-shaped Reelfoot Lake.
Friday, February 7, 2025
To dye for
People have been coloring cloth (and pottery, and cave walls, and their own bodies) for a very long time, but not all colors turn out to be equally accessible to the palette. Red, for example, is fairly easy, especially if you don't mind that it's not screaming scarlet and has a slight brownish tint (what we'd call "brick red"), because that's the color of iron oxide, better known as rust. Iron oxide is plentiful, and I know from messing around with pottery glazes that it's got two properties: (1) mixed with other minerals and/or heated in the absence of oxygen, it can give you a variety of other colors, from black to dark blue to green; and (2) it sticks to everything. I have brushes I use in the glazing process that I used once or twice to apply an iron-based glaze, and now they're permanently stained red.
Other colors, however, aren't so easy. Some of the more notoriously difficult ones are true blues and purples; our appending the word "royal" to royal blue and royal purple is an indicator of the fact that back then, only the really rich could afford blue or purple-dyed cloth. Blue can be achieved using small amounts of cobalt, or finely powdered lapis lazuli, but neither is common and although they have other uses (cobalt in pottery pigments, lapis in paints) neither works well for dyeing cloth. Lapis, in fact, was used to produce the finest rich blue pigment for oil paints, which got named ultramarine because the mineral was imported from what is now Afghanistan -- a place that was ultramarinus ("beyond the sea") to the people in Italy and France who were using it.
But dyeing cloth was another matter. One solution was, bizarrely enough, a secretion of a sea snail of the genus Murex. These snails' hypobranchial glands produce a gunk that when purified produces a rich purple dye that is "color fast" on cloth.
How anyone thought of doing this is an open question. Maybe they just smeared slime from various animals on cloth until they found one that worked, I dunno.
Be that as it may, the color of the dye was called φοῖνιξ (phoinix) by the ancient Greeks, and the sea traders who cornered the market on producing and selling the dye were called the Φοίνικες (Phoinikes). We anglicized the word to Phoenicians -- so Phoenician means, literally, "people of the purple."
The reason all of this colorful stuff comes up is a paper in Science Advances that describes how a group of chemists in Portugal successfully determined the origin of a purple to blue (depending on how it's prepared) watercolor pigment called folium that was used in medieval watercolors. It is a gorgeous color, but all previous attempts either to replicate it or to determine its source had been unsuccessful. The difficulty with trying to figure out things like this is that there was no standardized naming system for plants (or anything else) back then, so the name in one place could (and probably did) vary from the name in another place. Reading manuscripts about natural dyes from that time period, about all we can figure out is "it's made by boiling this plant we found" or "it's made from special snail slime," which doesn't really tell us much in the way of details.
In the case of folium, it was known that it came from a weedy plant of some sort, but there was no certainty about which plant it was or where it grew. But now some Portuguese chemists have identified the source of folium as the seedpods of a roadside weed in the genus Chrozophora, a little unassuming plant in the Euphorbia family that likes dry, sunny, rocky hillsides -- and whose ground-up seedpods yield a knock-your-socks-off purple dye. The dye was applied to cloth; when you were ready to make a natural watercolor paint, you soaked small bits of the dyed cloth in water.
The scientists were able to determine the chemical structure of the dye itself, which is pretty astonishing. But even finding the plant was a remarkable accomplishment. "We found it, guided by biologist Adelaide Clemente, in a very beautiful territory in Portugal [called] Granja, near a very beautiful small town Monsaraz -- a magical place, still preserved in time," said study co-author Maria João Melo, in an interview with CNN. "Nobody in the small village of Granja knew [anything] about this little plant. It may look like a weed, yet it is so elegant with its silvery stellate hairs that combine so well with the greyish green, and what a story there is behind it."
I'm always impressed with how intrepid our forebears were at using the resources around them to their fullest, but as with the snail slime, I'm mystified as to where that knowledge came from. Some of it was probably by happy accident -- I think fermented milk products like yogurt and cheese probably were discovered because of milk that spoiled in just the right way, for example. But bread has always mystified me. Who first thought, "Let's take these seeds, and grind 'em up, and add this fungus powder to it with water until it gets all bubbly and smells funny, then stick it in the fire! That'll be delicious with jam spread on it!"
And here -- grinding up the seedpods of a random weed ended up producing one of the rarest and prettiest dyes ever discovered. Undoubtedly the brainstorm of some medieval artist or botanist (or both) who happened to get lucky. Makes you wonder what other plants are out there that could have odd artistic, medicinal, or culinary uses -- especially in places of enormous biodiversity like the Amazonian rainforest, where there are probably as many plant species that have not been identified as there are ones that have been.
So if you needed another good reason to preserve biodiversity, there it is.
Thursday, February 6, 2025
Wretched hives of scum and villainy
Of course, the proper word is "antagonist," but "villain" is a lot more evocative, bringing to mind such characters as the dastardly Snidely Whiplash from the brilliant Adventures of Dudley Doright of the Canadian Mounties.
One of the things that I've always tried to do with the villains in my own novels is to make them three-dimensional. I don't like stories where the villains are just evil because they're evil (unless it's for comedic effect, like Mr. Whiplash). My college creative writing teacher, Dr. Bernice Webb (one of the formative influences on my writing) told us, "Every villain is the hero of his own story," and that has stuck with me. Even with the most awful antagonists I've written -- Lydia Moreton in In the Midst of Lions comes to mind -- I hope my readers come away with at least understanding why they acted as they did.
Of course, understanding their motivation, whether it be money, sex, power, revenge, or whatever, doesn't mean you need to sympathize with it. I wrote a while back about the character of Carol Last from Alice Oseman's amazing novel Radio Silence, who I find to be one of the most deeply repulsive characters I've ever come across, because what motivates her is pure sadism (all the while wearing a smug smile).
Oseman's story works because we've all known people like her, who use their power to hurt people simply because they can, who take pleasure in making their subordinates' lives miserable. What's worse is because of that twist in their personality, a frightening number of them become parents, bosses, teachers, and -- as we're currently finding out here in the United States -- political leaders.
The reason this whole villainous topic comes up is because of a paper published in the journal Psychological Science called "Can Bad Be Good? The Attraction of a Darker Self," by Rebecca Krause and Derek Rucker, both of Northwestern University. In a fascinating study of the responses of over 235,000 test subjects to fictional characters, Krause and Rucker found that people are sometimes attracted to villains -- and the attraction is stronger if the villain embodies positive characteristics they themselves share.
For example, Emperor Palpatine is ruthless and cruel, but he also is intelligent and ambitious -- character traits that in a better person might be considered virtuous. The Joker is an essentially amoral character who has no problem killing people, but his daring, his spontaneity, his quirkiness, and his sense of humor are all attractive characteristics. Professor Moriarty is an out-and-out lunatic -- especially as played by Andrew Scott in the series Sherlock -- but he's brilliant, clever, inventive, and fearless.
And what Krause and Rucker found was that spontaneous and quirky people (as measured by personality assessments) tended to like characters like The Joker, but not characters like the humorless Palpatine. Despite his being essentially evil, Moriarty appealed to people who like puzzles and intellectual games -- but those same people weren't so taken with the more ham-handed approach of a character like Darth Vader.
"Given the common finding that people are uncomfortable with and tend to avoid people who are similar to them and bad in some way, the fact that people actually prefer similar villains over dissimilar villains was surprising to us," said study co-author Rucker, in an interview in the Bulletin for the Association of Psychological Science. "Honestly, going into the research, we both were aware of the possibility that we might find the opposite."
What seems to be going on here is that we can admire or appreciate a villain who is similar to us in positive ways -- but since the character is fictional, it doesn't damage our own self-image as it would if the villain was a real person harming other real people, or (worse) if we shared the villain's negative traits as well.
"Our research suggests that stories and fictional worlds can offer a ‘safe haven’ for comparison to a villainous character that reminds us of ourselves," said study lead author Rebecca Krause. "When people feel protected by the veil of fiction, they may show greater interest in learning about dark and sinister characters who resemble them."
Which makes me wonder about myself, because my all-time favorite villain is Missy from Doctor Who.
Okay, she does some really awful things, is erratic and unpredictable and has very little concern about human life -- but she's brilliant, and has a wild sense of humor, deep curiosity about all the craziness that she's immersed in, and poignant grief over the loss of her home on Gallifrey. Played by the stupendous Michelle Gomez, Missy is a complex and compelling character I just love to hate.
What that says about me, I'll leave as an exercise for the reader.
On the other hand, I still fucking loathe Carol Last. I would have loved to see her tied to the railroad tracks, Dudley Doright-style, at the end of the book.
Wednesday, February 5, 2025
Revising Drake
Math-phobes, fear not; it's not as hard as it looks. The idea, which was dreamed up by astronomer Frank Drake back in 1961, is that you can estimate the number of civilizations in the universe with whom communication might be possible (N) by multiplying together seven independent variables, to wit:
R* = the average rate of star formation in our galaxy
fp = the fraction of those stars that have planets
ne = the fraction of those stars with planets whose planets are in the habitable zone
fl = the fraction of planets in the habitable zone that develop life
fi = the fraction of those planets which eventually develop intelligent life
fc = the fraction of those planets with intelligent life whose inhabitants develop the capability of communicating over interstellar distances
L = the average lifetime of those civilizations
Some of those (such as R*) are considered to be understood well enough that we can make a fairly sure estimate of their magnitudes. Others -- such as fp and ne -- were complete guesses in Drake's time. How many stars have planets? It seemed like it could have been nearly all of them -- or perhaps the Solar System was some incredibly fortunate fluke, and we're one of the only planetary systems in existence.
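The multiplication itself really is as simple as it sounds. Here's a quick sketch -- the input values below are purely illustrative guesses of my own, not measured quantities:

```python
# The Drake equation: N = R* x fp x ne x fl x fi x fc x L
# Every numeric value below is an illustrative guess, not a measurement.

def drake(r_star, f_p, n_e, f_l, f_i, f_c, lifetime):
    """Estimated number of communicating civilizations in the galaxy."""
    return r_star * f_p * n_e * f_l * f_i * f_c * lifetime

n = drake(
    r_star=1.5,     # stars formed per year in the Milky Way
    f_p=1.0,        # fraction of stars with planets
    n_e=0.2,        # fraction of those with a planet in the habitable zone
    f_l=0.1,        # fraction of those that develop life
    f_i=0.01,       # fraction of those that develop intelligent life
    f_c=0.1,        # fraction of those that develop interstellar communication
    lifetime=1000,  # average lifetime of such a civilization, in years
)
print(f"N = {n:g}")  # with these guesses, N works out to about 0.03
```

The point isn't the output number -- swap in different guesses and N swings wildly -- it's that the whole estimate hinges on how well we can pin down each factor.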
Tuesday, February 4, 2025
The riddle of the sun stones
When you think about it, it's unsurprising that our ancestors invented "the gods" as an explanation for anything they didn't understand.
They were constantly bombarded by stuff that was outside of the science of their time. Diseases caused by the unseen action of either genes or microorganisms. Weather patterns, driven by forces that even in the twenty-first century we are only beginning to understand deeply, and which controlled the all-important supply of food and water. Earthquakes and volcanoes, whose root cause only began to come clear sixty years ago.
Back then, everything must have seemed as mysterious as it was precarious. For most of our history, we've been at the mercy of forces we didn't understand and couldn't control, where we were one bad harvest or failed rainy season or sudden plague away from dying en masse.
No wonder they attributed it all to gods and sub-gods -- and devils and demons and witches and evil spirits.
As much as we raise an eyebrow at the superstition and seeming credulity of the ancients, it's important to recognize that they were no less intelligent, on average, than we are. They were trying to make sense of their world with the information they had at the time, just like we do. That we have a greater knowledge base to draw upon -- and most importantly, the scientific method as a protocol -- is why we've been more successful. But honestly, it's no wonder that they landed on supernatural, unscientific explanations; the natural and scientific ones were out of their reach.
The reason this comes up is a recent discovery that lies at the intersection of archaeology and geology, which (as regular readers of Skeptophilia know) are two enduring fascinations for me. Researchers excavating sites at Vasagård and Rispebjerg, on the island of Bornholm, Denmark, have uncovered hundreds of flat stone disks with intricate patterns of engraving, dating from something on the order of five thousand years ago. Because many of the disks have designs of circles with branching radial rays extending outward, they've been nicknamed "sun stones." Why, in around 2,900 B.C.E., people were suddenly motivated to create, and then bury, hundreds of these stones, has been a mystery.
Until now.
Data from Greenland ice cores has shown a sudden spike in sulfates and in dust and ash from right around the time the sun stones were buried -- both hallmarks of a massive volcanic eruption. The location of the volcano has yet to be determined, but what is clear is that it would have had an enormous effect on the climate. "It was a major eruption of a great magnitude, comparable to the well-documented eruption of Alaska’s Okmok volcano in 43 B.C.E. that cooled the climate by about seven degrees Celsius," said study lead author Rune Iversen, of the Saxo Institute at the University of Copenhagen. "The climate event must have been devastating for them."
The idea that the volcanic eruption in 2,900 B.C.E. altered the climate worldwide got a substantial boost with the analysis of tree rings from wood in Europe and North America. Right around the time of the sulfate spike in the Greenland ice cores, there's a series of narrow tree rings -- indicative of short growing seasons and cool temperatures. Wherever this eruption took place, it wrought havoc with the weather, with all of the results that has on human survival.
While the connection between the eruption and the sun stones is an inference, it certainly has some sense to it. How else would you expect a pre-technological culture to respond to a sudden, seemingly inexplicable dimming of the sun, cooler summers and bitter winters with resultant probable crop failures, and even the onset of wildly fiery sunrises and sunsets? It bears keeping in mind that our own usual fallback of "there must be a scientific explanation even if I don't know what it is" is a relatively recent development.
So while burying engraved rocks might seem like a strange response to a climatic change, it is understandable that the ancients looked to a supernatural solution for what must have been a mystifying natural disaster. And we're perhaps not so very much further along, ourselves, given the way a substantial fraction of people in the United States are responding to climate change even though the models have been predicting this for decades, and the evidence is right in front of our faces. We still have plenty of areas we don't understand, and are saddled with unavoidable cognitive biases even if we do our best to fight them. As the eminent science historian James Burke put it, in his brilliant and provocative essay "Worlds Without End":
Science produces a cosmogony as a general structure to explain the major questions of existence. So do the Edda and Gilgamesh epics, and the belief in Creation and the garden of Eden. Myths provide structures which give cause-and-effect reasons for the existence of phenomena. So does science. Rituals use secret languages known only to the initiates who have passed ritual tests and who follow the strictest rules of procedure which are essential if the magic is to work. Science operates in the same way. Myths confer stability and certainty because they explain why things happen or fail to happen, as does science. The aim of the myth is to explain existence, to provide a means of control over nature, and to give to us all comfort and a sense of place in the apparent chaos of the universe. This is precisely the aim of science.
Science, therefore for all the reasons above, is not what it appears to be. It is not objectively impartial, since every observation it makes of nature is impregnated with theory. Nature is so complex, and sometimes so seemingly random, that it can only be approached with a systematic tool that presupposes certain facts about it. Without such a pattern it would be impossible to find an answer to questions even as simple as "What am I looking at?"
Monday, February 3, 2025
Riding on a light beam
If something like this were launched today, it would mean we could be getting photographs back from Proxima Centauri in twenty years.
It's an ambitious project, and faces significant hurdles. Even if propelled by lasers -- which, being light, travel at the speed thereof -- navigation becomes increasingly difficult the farther away it gets. Just at the distance of Pluto, our intrepid little spacecraft would be 4.5 light-hours from Earth, meaning if we tried to beam it instructions to dodge around an incoming meteor, it would be 4.5 hours until the command arrived, at which point all that would be left is intrepid scrap metal. And Proxima Centauri is 4.3 light years away.
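The delays involved are easy enough to work out for yourself. Here's a quick back-of-the-envelope calculation (the distances are approximate round numbers, and the twenty-percent-of-light-speed cruise figure is the target mentioned above):

```python
# Back-of-the-envelope signal delays for a laser-sail interstellar probe.
# Distances are approximate; Pluto's distance from the Sun varies widely.

LIGHT_HOURS_PER_AU = 8.317 / 60   # one AU is roughly 8.317 light-minutes

pluto_au = 34                     # rough distance to Pluto, in AU
delay_hours = pluto_au * LIGHT_HOURS_PER_AU
print(f"One-way signal delay at Pluto: {delay_hours:.1f} hours")

proxima_ly = 4.25                 # distance to Proxima Centauri, light years
cruise_speed = 0.2                # fraction of light speed
travel_years = proxima_ly / cruise_speed
print(f"Travel time at 0.2c: {travel_years:.1f} years")
print(f"Pictures arrive back on Earth ~{travel_years + proxima_ly:.0f} years after launch")
```

Note that the photographs have to make the return trip at light speed, which tacks another four-plus years onto the wait after the probe arrives.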
You see the problem. The Starshot spacecraft would have to be able, on some level, to think for itself, because there simply wouldn't be time for Mission Control to steer it to avoid danger.
There are other obstacles, though. Besides the obvious difficulties of being in the cold vacuum of interstellar space, contending with cosmic rays and the like, there's the problem engendered by its speed. Assuming the estimate of a maximum velocity of twenty percent of light speed is correct, even tiny particles of dust would become formidable projectiles, so Starshot is going to require some heavy-duty shielding, increasing its mass (and thus the amount of energy needed to make it go).
Three years ago we got an encouraging proof of concept, when the group working on the mission -- Russian entrepreneur Yuri Milner's Breakthrough Foundation -- launched a test of the Starshot craft. It was a tiny little thing, small enough to fit in your hand and weighing about the same as a stick of gum, designed and built by engineers at the University of California - Santa Barbara. In the test flight it achieved an altitude of nineteen miles, all the while functioning flawlessly, returning four thousand images of the Earth taken from aloft.
The most significant remaining hurdle is to design the laser system to make Starshot move -- lasers that are extremely powerful yet so finely collimated that they can still strike a ten-centimeter craft square-on from several light years away. The engineering director for Breakthrough, Peter Klupar, is designing a 100,000 gigawatt laser -- to be located, he says, in Chile -- that could be the answer. Of course, such a powerful device is not without its dangers. Reflected off a mirror in space, Klupar says, such a laser could "ignite an entire city in minutes."
Not that there's a mirror out there. So you shouldn't worry at all about that.
"You would think that this is all impossible, but we have folks at Caltech and the University of Southampton and Exeter University working on about fifty contracts on making all [of] this happen," Klupar said. "No one has come up with a deal-breaker that we can find yet. It all seems real."
All of which may seem like science fiction, but it's phenomenal how fast things go from the realm of Star Trek to reality. Klupar compares his light sails to CubeSats, tiny (ten by ten centimeters, weighing a little over a kilogram) orbiting telemetry devices that are now common. "It feels a lot like the way CubeSats felt twenty years ago," he said. "People were saying, 'Those are toys, they're never going to develop into anything, there's no way I can see that ever working.' And today, look at them: hundreds of millions of dollars is being spent on them."
So keep your eye on this project. If there's a chance at a remote visit to another star system, I think this is our best bet. The Breakthrough Foundation estimates an actual, honest-to-goodness launch toward a nearby star as early as 2030. Meaning perhaps we could get our first photographs of planets around another star by 2050.
I'll be ninety years old at that point, but if that's what I'm waiting for, I can make it till then.
Saturday, February 1, 2025
Remembrance of things past
"The human brain is rife with all sorts of ways of getting it wrong."
This quote is from a talk by eminent astrophysicist Neil deGrasse Tyson, and is just about spot on. Oh, sure, our brains work well enough, most of the time; but how many times have you heard people say things like "I remember that like it was yesterday!" or "Of course it happened that way, I saw it with my own eyes"?
Anyone who knows something about neuroscience should immediately turn their skepto-sensors up to 11 as soon as they hear either of those phrases.
Our memories and sensory-perceptual systems are selective, inaccurate, heavily dependent on what we're doing at the time, and affected by whether we're tired or distracted or overworked or (even mildly) inebriated. Sure, what you remember might have happened that way, but -- well, let's just say it's not as much of a given as we'd like to think. An experiment back in 2005 out of the University of Portsmouth looked at memories of the Tavistock Square (London) bus bombing, and found that a full forty percent of the people questioned had "memories" of the event that were demonstrably false -- including a number of people who said they recalled details from CCTV footage of the explosion, down to what people were wearing, who showed up to help the injured, when police arrived, and so on.
Oddly enough, there is no CCTV footage of the explosion. It doesn't exist and has never existed.
Funny thing that eyewitness testimony is considered some of the most reliable evidence in courts of law, isn't it?
There are a number of ways our brains can steer us wrong, and the worst part of it all is that they leave us simultaneously convinced that we're remembering things with cut-crystal clarity. Here are a few interesting memory glitches that commonly occur in otherwise mentally healthy people, that you might not have heard of:
- Cryptomnesia. Cryptomnesia occurs when something from the past recurs in your brain, or arises in your external environment, and you're unaware that you've already experienced it. This has resulted in several probably unjustified accusations of plagiarism; the author in question undoubtedly saw the text they were accused of plagiarizing some time earlier, but honestly didn't remember they'd read it and thought that what they'd come up with was entirely original. It can also result in some funnier situations -- while the members of Aerosmith were taking a break from recording their album Done With Mirrors, they had a radio going, and the song "You See Me Crying" came on. Steven Tyler said he thought that was a pretty cool song, and maybe they should record a cover of it. Joe Perry turned to him in incredulity and said, "That's us, you fuckhead."
- Semantic satiation. This is when a word you know suddenly looks unfamiliar to you, often because you've seen it repeatedly over a fairly short time. Psychologist Chris Moulin of Leeds University did an experiment where he had test subjects write the word door over and over, and found that after a minute of this 68% of the subjects began to feel distinctly uneasy, with a number of them saying they were doubting that "door" was a real word. I remember being in high school writing an exam in an English class, and staring at the word were for some time because I was convinced that it was spelled wrong (but couldn't, of course, remember how it was "actually" spelled).
- Confabulation. This is the recollection of events that never happened -- along with a certainty that you're remembering correctly. (The people who claimed false memories of the Tavistock Square bombing were suffering from confabulation.) The problem with this is twofold; the more often you think about the false memory or tell your friends and family about it, the more sure you are of it; and often, even when presented with concrete evidence that you're recalling incorrectly, somehow you still can't quite believe it. A friend of mine tells the story of trying to help her teenage son find his car keys, and that she was absolutely certain that she'd seen them that day lying on a blue surface -- a chair, tablecloth, book, she wasn't sure which, but it was definitely blue. They turned the house upside down, looking at every blue object they could find, and no luck. Finally he decided to walk down to the bus stop and take the bus instead, and went to the garage to get his stuff out of the car -- and the keys were hanging from the ignition, where he'd left them the previous evening. "Even after telling me this," my friend said, "I couldn't accept it. I'd seen those keys sitting on a blue surface earlier that day, and remembered it as clearly as if they were in front of my face."
- Declinism. This is the tendency to remember the past as more positive than it actually was, and is responsible both for the "kids these days!" thing and "Make America Great Again." There's a strong tendency for us to recall our own past as rosy and pleasant as compared to the shitshow we're currently immersed in, irrespective of the fact that violence, bigotry, crime, and general human ugliness are hardly new inventions. (A darker aspect of this is that some of us -- including a great many MAGA types -- are actively longing to return to the time when straight White Christian men were in charge of everything; whether this is itself a mental aberration I'll leave you to decide.) A more benign example is what I've noticed about travel -- that after you're home, the bad memories of discomfort and inconveniences and delays and questionable food fade quickly, leaving behind only the happy feeling of how much you enjoyed the experience.
- The illusion of explanatory depth. This is a dangerous one; it's the certainty that you understand deeply how something works, when in reality you don't. This effect was first noted back in 2002 by psychologists Leonid Rozenblit and Frank Keil, who took test subjects and asked them to rank from zero to ten their understanding of how common devices worked, including zippers, bicycles, electric motors, toasters, and microwave ovens, and found that hardly anyone gave themselves a score lower than five on anything. Interestingly, the effect vanished when Rozenblit and Keil asked the volunteers actually to explain how the devices worked; after trying to describe in writing how a zipper works, for example, most of the test subjects sheepishly realized they actually had no idea. This suggests an interesting strategy for dealing with self-styled experts on topics like climate change -- don't argue, ask questions, and let them demonstrate their ignorance on their own.
- Presque vu. Better known as the "tip-of-the-tongue" phenomenon -- the French name means "almost seen" -- this is when you know you know something, but simply can't recall it. It's usually accompanied by a highly frustrating sense that it's right there, just beyond reach. Back in the days before The Google, I spent an annoyingly long time trying to recall the name of the Third Musketeer (Athos, Porthos, and... who???). I knew the memory was in there somewhere, but I couldn't access it. It was only after I gave up and said "to hell with it" that -- seemingly out of nowhere -- the answer (Aramis) popped into my head. Interestingly, neuroscientists are still baffled as to why this happens, and why turning your attention to something else often makes the memory reappear.
So be a little careful about how vehemently you argue with someone over whether your recollection of the past or theirs is correct. Your version might be right, or theirs -- or it could easily be that both of you are remembering things incompletely or incorrectly. I'll end with a further quote from Neil deGrasse Tyson: "We tend to have great confidence in our own brains, when in fact we should not. It's not that eyewitness testimony by experts or people in uniform is better than that of the rest of us; it's all bad.... It's why we scientists put great faith in our instruments. They don't care if they've had their morning coffee, or whether they got into an argument with their spouse -- they get it right every time."