Skeptophilia (skep-to-fil-i-a) (n.) - the love of logical thought, skepticism, and thinking critically. Being an exploration of the applications of skeptical thinking to the world at large, with periodic excursions into linguistics, music, politics, cryptozoology, and why people keep seeing the face of Jesus on grilled cheese sandwiches.

Wednesday, May 31, 2023

Analysis of a partnership

You probably recall from biology class the word symbiosis -- when two organisms share living space.  This sort of relationship can result in a fused life form where even so, the two participants retain a discernible separateness.  (Remember the Trill from Star Trek?)  The melding can go deeper, though; lichens, commonly seen growing on rocks and tree trunks in damp areas, are an example of such a composite, in this case between one or more types of fungus and photosynthetic cyanobacteria.  Deeper still are mitochondria -- the organelles in all eukaryotic cells that conduct cellular respiration and provide the majority of the energy required by the organism -- which are the descendants of single-celled aerobic bacteria that billions of years ago formed a partnership with their host cells so mutually beneficial that now, neither can live without the other.

Symbiosis is usually broken down into three broad classes.  The distinction is how the participating organisms fare.  That one of them benefits in some way is a given; if both were harmed, the relationship would be strongly selected against and probably wouldn't persist very long.  It's what happens to the other that determines what kind of symbiosis it is:
  • parasitism -- one organism benefits, the other is harmed (an example is disease-causing bacteria)
  • commensalism  -- one organism benefits, the other breaks even (such as the bacteria passively riding on our skin)
  • mutualism -- both organisms benefit (such as a good many of the bacteria in our gut, which have increasingly been found to be absolutely essential for health)
The trouble is, nothing in biology is clear-cut.  Our commensal skin bacteria occupy niches that, if they were eradicated, might be taken over by pathogenic species.  (Thus the adjuration by doctors not to overuse topical antibiotics and hand sanitizers.)  So are they actually mutualistic?  Then there are the species that help in some ways and harm in others -- or, perhaps, help one species and harm another.

This, in fact, is why the whole topic comes up today.  Scientists in New Zealand have been working to preserve endangered species on the islands.  There are quite a few, owing to the country's geological (and thus biological) isolation -- it's developed a singular group of endemic species that are uniquely vulnerable to loss of habitat from agriculture and from the introduction of exotic species like cats, pigs, and the ubiquitous sheep.  One such species is the rare Cooper's black orchid (Gastrodia cooperae), which is nearly invisible for most of the year -- the only above-ground part is a long, creeping stem -- and puts on a flower stalk once during the growing season.

[Image licensed under the Creative Commons Kathy Warburton/iNaturalist (CC BY 4.0)]

Orchids are notorious for being difficult to grow from seed.  The seeds are minute, and most orchid species are extreme specialists, able to survive only in a very narrow range of conditions.  The result is that conservation efforts are fraught with difficulty.  Trying to germinate the seeds in the lab requires knowing exactly what that particular species needs, which can mean a lot of trial-and-error, and the potential loss of batches of seeds when the efforts fail.

The Cooper's black orchid is no exception.  It's so rare it was only identified in 2016, and is known to live in only three sites in New Zealand.  Fortunately for this species, there is a related orchid species, Gastrodia sesamoides, that is quite common and appears to need many of the same conditions that the Cooper's black does, so scientists have been trying to identify what those conditions are so they can be replicated in the lab.

And it turns out that one of the conditions is the presence of a symbiotic fungus -- Resinicium bicolor.  The fungus infiltrates the roots of the orchid, creating a greater surface area for nutrient and water uptake, much like the mycorrhizae familiar to organic gardeners that can increase crop yields without the addition of inorganic fertilizers.

Where it gets interesting is that Resinicium bicolor was already known to botanists -- as a plant pathogen.  It's a deadly parasite on Douglas firs, an introduced tree in New Zealand that is much used for lumber, causing "white-rot disease."

So is Resinicium a mutualist or a parasite?  The question is, "with respect to what?"  It's lethal to Douglas firs, but essential to the Cooper's black orchid (and, presumably, other native orchid species).

Biology, as I mentioned before, isn't simple.

That, of course, is why it's so endlessly fascinating.  The more we look into the complexity of the natural world, the more it brings home the truth of the quote from Albert Einstein: "Life is a great tapestry.  The individual is only an insignificant thread in an immense and miraculous pattern."


Tuesday, May 30, 2023

Fingerprint of a catastrophe

Ever heard of the Bruneau-Jarbidge event?

If not, it's unsurprising; neither had I.  After all, it happened twelve million years ago, during the mid-Miocene Epoch.  It was a supervolcano eruption of the Yellowstone Hotspot, which at the time lay under what is now southwestern Idaho.  Between then and now, the hotspot has stayed pretty much where it was, but the North American Plate has moved, resulting in the hotspot's current location underneath northwestern Wyoming.

The Bruneau-Jarbidge event was enormous.  It created monstrous pyroclastic flows that traveled 150 kilometers from the caldera, incinerating everything in their path.  The winds at the time of the eruption were from the west; we know this because the ash produced by the eruption traveled at least 1,600 kilometers to the east, creating meters-thick layers, including the ones at the amazing Ashfall Fossil Beds in northeastern Nebraska.

In fact, it's the Ashfall Fossil Beds -- now an official National Natural Landmark and State Historical Park -- that are the reason the topic comes up.  A friend and frequent contributor of topics for Skeptophilia sent me a photograph of the site, and asked me if I'd heard of it:

[Image licensed under the Creative Commons Carl Malamud, Ashfall fossil beds - Baby rhino "T. L.", CC BY 2.0]

I hadn't, so naturally I had to look into it.

The whole thing is staggering, if grim.  Ashfall contains the skeletons of thousands of animals killed, more or less simultaneously, by the Bruneau-Jarbidge ash cloud.  The remains of the rhinoceros species Teleoceras are so common there that one part of the fossil bed has been nicknamed "the Rhino Barn."  But there are lots of other species represented as well: five different kinds of prehistoric horses, including both three-toed and one-toed; three species of camels; two canids, the fox-sized Leptocyon and the wolf-sized Cynarctus; a saber-toothed (!) deer species, Longirostromeryx; three species of turtles; and three species of birds -- a crane, a hawk, and a vulture.

Despite the size of the eruption and resulting ash cloud, not everything in the area died during the ashfall.  Some of the bones show signs of scavenging, and some have breaks and tooth marks consistent with the dentition of the hyena-like canid Aelurodon.  So even a horrific catastrophe like Bruneau-Jarbidge didn't extinguish life completely; there were still scavengers around to chow down on the victims.

When looking at this sort of event, the question inevitably comes up of whether it could happen again.  The facile answer is: of course it could.  The Earth is still very much tectonically active, and more specifically, the Yellowstone Hotspot is a live volcano, as the frequent earthquakes and boiling-hot geysers and lakes should indicate.  It's likely to erupt again -- whether it will be a monumental cataclysm like Bruneau-Jarbidge or something smaller isn't certain.

But despite the prevalence of clickbait-y YouTube videos about how "Yellowstone is about to erupt!" and "Scientists fear the Earth will crack wide open!" (both direct quotes from video titles), there is no imminent danger from the Yellowstone Hotspot.  What the geologists are actually saying is that a major eruption is likely some time in the next hundred thousand years, which puts it well outside the realm of what most of us should be worried about.

However, there's no doubt that the Ashfall Fossil Beds are a sobering reminder of what the Earth is capable of.  They're the fingerprint of a twelve-million-year-old catastrophe that makes any recent eruption look like a wet firecracker.  But as horrible as it was for the Miocene animals in the path of the ash cloud, it's provided us with a snapshot of what life was like back then, when Nebraska had a climate more like modern Kenya -- and the Great Plains were home to rhinos, camels, horses, and wild dogs.


Monday, May 29, 2023

Going up

Well, it's happened again; a reader has sent me a weird superstition (this one almost amounts to an urban legend) that I'd never heard of before.

You've all heard about the goofy children's game "Bloody Mary," wherein you're supposed to stare into a mirror at night and chant "Bloody Mary" a bunch of times (even those in the know vary the requirement greatly; I've seen everything from twenty to a hundred), and then nothing happens.

So it's a pretty exciting game, as you will no doubt agree.

What's supposed to happen is that the blood-drenched visage of a female ghost will appear in the mirror instead of your own face.  She's supposedly the restless spirit of a woman who killed children.  Which I can sort of sympathize with.  If I was yanked around and forced to appear in mirrors over and over all night long by kids at sleepovers chanting my name, I'd probably want to throttle the little brats, too.

Be that as it may, we have a tale out of South Korea that is similar in spirit (rimshot), if not in detail, to the Bloody Mary legend.  This one is called "Elevator to Another World," and gives you instructions for using an elevator to access some hitherto unreachable and mysterious place.

[Image licensed under the Creative Commons Joe Mabel, Hotel Vancouver elevators 01, CC BY-SA 3.0]

Here's what you're supposed to do:
  1. Find a building that's at least ten stories tall.  (Nota bene: Through all of the remaining steps except the last one, you're supposed to stay in the elevator.)
  2. Go to the tenth floor.
  3. Go to the fourth floor.
  4. Go to the sixth floor.
  5. Go back to the tenth floor.  If you hear voices at this point, don't answer 'em.
  6. Go to the fifth floor.  When the door opens, if a woman gets on, don't talk to her.  Which sounds like good advice re: people on elevators in most cases.
  7. Press the button for the first floor.  If the elevator goes down, you did something wrong.  What should happen is that the elevator should go back up to the tenth floor.  The woman may shriek at you at this point, but you're supposed to ignore her, even if she shrieks what I would, which would be, "Will you stop playing with the fucking elevator and let me go to my floor?"
  8. When the door opens on the tenth floor, get out.  You're in another world.  What you're supposed to do about the woman, I don't know.
  9. So after having a nice look-see in the alternate universe, to get back, return to the elevator (it has to be the same one you used for steps #1-8), and do the steps again in that order.  When you press the button for the first floor in step #7 and the elevator begins to ascend, find the "stop" button and halt the elevator, then press the first floor button again.  You should return safely to the first floor, and must exit the building immediately.
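If you're of a programming bent, the ritual above amounts to a tiny state machine.  Here's a tongue-in-cheek sketch in Python -- the floor sequence comes from the steps above, but the function name, return strings, and everything else are invented for illustration:

```python
# Tongue-in-cheek sketch of the legend as a state machine.  The floor
# sequence comes from the steps of the ritual; everything else is invented.
RITUAL_SEQUENCE = [10, 4, 6, 10, 5, 1]  # buttons pressed, in order

def ride(buttons_pressed):
    """Return where you end up, per the legend."""
    if buttons_pressed != RITUAL_SEQUENCE:
        return "first floor, as usual"
    # Per step 7, pressing 1 at the end sends the elevator *up* instead
    return "tenth floor: another world (don't talk to the woman)"

print(ride([10, 4, 6, 10, 5, 1]))   # the full ritual
print(ride([10, 4, 6, 2, 5, 1]))    # one wrong floor: nothing happens
```

Detecting whether you've actually arrived in the Other World is, obviously, left as an exercise for the reader.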
What is this "Other World" like, you might be wondering?  From the account linked above, the two most common characteristics reported are that the Other World is (1) dark, and (2) empty.  Which makes it sound rather unappealing.  If I'm going to expend a lot of time and effort, I want to at least end up somewhere sunny, featuring drinks with little umbrellas.  But none of that, apparently.  Some people have mentioned seeing a "red cross" in the distance, but the author of the article says that "it may not be a cross."

Whatever that means.

This all puts me in mind of a wonderful book by Haruki Murakami called Dance Dance Dance, wherein a guy in a Japanese hotel takes an elevator and stumbles on a mysterious floor that is somehow sandwiched in between two other ordinary floors, and therein he meets a weird character called the Sheep Man.  It's weird, surreal fun, and is written with Murakami's signature lucid, simple style -- he has a way of making the oddest things seem as if they're absolutely normal.

I'm not sure if the Korean urban legend inspired Murakami's book, which would be nice because then it'd actually have accomplished something other than making gullible people waste time going up and down on an elevator.  On the other hand, if you want to give it a try, I encourage you to do so and post your results here.

Other than building security telling you to stop playing with the elevator.


Saturday, May 27, 2023

Clothes make the monster

In new developments in cryptozoology, today we consider: when Bigfoot wears clothes.

This comes up because of an article by the ever-entertaining Nick Redfern over at Mysterious Universe, which has the title "Further Accounts of Clothed Monsters."  My first reaction was, "Further?  I didn't know that was a thing in the first place."

But it turns out that this isn't the first time Redfern has considered the possibility, and he references an article he wrote a year and a half ago called "When Bigfoot Gets Stylish," which begins thusly:
Without doubt, one of the most bizarre aspects of the Bigfoot phenomenon is that relative to nothing less than clothed Bigfoot!  It’s one thing to encounter such a creature.  It’s quite another, however, to see it fashionably attired in pants and shirts...  Cryptozoologist Loren Coleman says: “In the 1960s and 1970s, reports from the American West would occasionally surface of hairy bipedal Bigfoot being seen with tattered plaid shirts and ragged shorts on their bodies.  In some research, there were intriguing attempts to relate these to files of paranormal encounters with sightings of upright entities said to be wearing ‘checkered shirts.’  (Within parapsychology, there is a subfield of study regarding ‘checkered shirted ghosts.’)  Investigators generally did not know what to make of these Sasquatch wearing plaid shirts, but dutifully catalogued and filed them away, nevertheless.”
I have three questions about this:
  1. Where does Bigfoot get his clothes?  I mean, I can accept spotting Bigfoots wearing shirts and pants, but you very rarely ever see them in the clothing department at Macy's.  Maybe they order them online or something.
  2. There's a "subfield" of paranormal studies specializing in ghosts in checkered shirts?  That seems like kind of a narrow field of study, as if a psychologist decided only to use test subjects who were wearing argyle socks.  You'd think it'd limit your access to data pretty considerably.
  3. So Bigfoots like plaid, eh?  No pinstripes or paisley or hoodies or NFL jerseys or anything?  Someone really needs to work with them on their fashion sense.  Not that I have anything against plaid (or, honestly, have that much room to criticize), but if that's all you wear it becomes a little monotonous.
The more recent article, though, gives us some additional examples, such as a family in Colorado whose car was attacked by "a hairy man or hairy animal... (who) had on a blue-and-white checkered shirt and long pants," a woman in Barnstaple, England who saw a "large black dog... (that) walked on its hind legs... and was covered in a cloak and a monk's hood," and a woman in Kent, England who saw a "hulking figure... (who) had a loincloth around its waist and furred boots."

So that's kind of alarming.  Not that monsters are adopting clothes, but that given the choice, they're deciding to wear blue-and-white check, monk's hoods, loincloths, and furry boots.  I mean, it's not that I'm expecting them to wear Armani suits, but even by my own dubious standards of sartorial elegance, this seems a little odd.

It also occurs to me, apropos of the plaid-wearing Bigfoots, that we might be talking about... people.  I say this from personal experience, given that my mom's family comes from the bayou country of southeastern Louisiana.  You know those folks on the This No Longer Has Anything To Do With History Channel, on the show Swamp People?  Yeah, those folks are all cousins of mine.  Seriously.  I have a photograph of my great-grandfather, along with his wife and ten children, wherein he could easily be mistaken for a Sasquatch in overalls.  My family might be weird as fuck, but they definitely have no problem growing hair.

In any case, the whole thing throws us back into the realm of "the plural of anecdote is not data."  Unfortunately.  Because it adds a certain je ne sais quoi to the field of cryptozoology.  It's also nice to think that in a harsh winter, the Sasquatches have some woolens to keep themselves warm, when their pelts, loincloths, cloaks, and furry boots aren't enough.


Friday, May 26, 2023

The Silpho Moor mystery

Pieces of one of the most enduring mysteries in UFO lore have allegedly been discovered in the National Archives of London.

Called the "Silpho Moor Crash," the incident occurred in November of 1957, when two men who were hiking on Silpho Moor in North Yorkshire, England, saw "a red light falling from the sky" and went to investigate, despite the fact that every time someone does this in a science fiction movie, they end up being messily devoured by evil aliens.  Fortunately for the two men, this did not happen. Instead, they found a saucer-shaped object made of metal, eighteen inches in diameter, which upon opening was found to contain thin copper sheets covered with "unidentifiable hieroglyphics."

The Silpho Moor artifacts, including the "hieroglyphic sheets" (lower right)

The objects were much talked about, and eventually (sources indicate in 1963) they were sent to the London Science Museum for expert analysis.

After that, they were "lost to history."

It's kind of weird how often this happens.  Somebody gets amazing evidence of some hitherto-unproven and unexpected apparition -- UFOs, ghosts, Bigfoot, Ron DeSantis's conscience -- and then after a little bit of buzz and maybe a few blurry photographs, it mysteriously disappears.  The conspiracy theorists waggle their eyebrows suggestively about this, and say that of course the evidence disappears, because the powers-that-be don't want ordinary slobs like you and me to have proof of any of this stuff.

Why the powers-that-be would care if we proved the existence of alien intelligence (for example), I have no idea.  As far as I've seen, the powers-that-be are much more interested in destroying the evil, cunning environmental scientists' conspiracy to defeat a beleaguered but plucky band of heroic corporate billionaires.  I can't imagine they give a rat's ass whether UFOs exist, except insofar as these would really be undocumented aliens.

Be that as it may, the Silpho Moor artifacts were lost -- until now.  Maybe.  Because some people digging around in the London National Archives found, hiding in an old cigarette tin, some shards that are supposedly from the Silpho Moor Crash.

What seems odd to me is that every photograph from the actual crash shows an intact object that looks like an almost comically stereotypical flying saucer, and everything in this latest discovery is just a bunch of broken-up metal.  I suppose the scientists back in 1963 could have hacked the thing apart, but isn't it funny that there's no record of that?

Anyhow, the objects were discovered by an exhibit developer named Khalil Thirlaway, who brought them to the attention of Dr. David Clarke, a journalism professor at Sheffield Hallam University.

"He [Thirlaway] opened the tin box and took out the pieces, it was an amazing revelation -- it had just been sitting there for half a century," Clarke said.  "There must be a lot of it still out there, sitting in someone's attic, or maybe these are the last remaining pieces... I thought it was a prank, but the question remains -- who went to all that trouble at great expense and what did they gain from it?  It has been described several times as Britain's answer to Roswell, and I don't think that's too great an exaggeration."

Well, yes, in the sense that it's a sketchy set of evidence for an incident that no one is sure has anything to do with alien intelligence anyway.  But at least now the fragments are out in the light of day, and with luck some scientists will get involved and analyze them.

Still, I wonder what they'd find that could prove it one way or the other.  Metal fragments are metal fragments, whether they come from outer space or not.  Despite what Geordi LaForge would have you believe, an extraterrestrial spaceship would not be composed of the rare elements whatsisium and thingamajite, because the periodic table is kind of full-up with elements we already know well.  So I don't see any way to differentiate between an alloy from Earth and one from the Klingon Home World.

But that's something we can worry about later.  At least the objects were relocated.  Myself, I'm all for submitting hard evidence for study, whether or not it turns up anything significant.  Otherwise, you're back at the level of personal anecdote -- which is the worst form of evidence there is.


Thursday, May 25, 2023

Facing the impostor

I'll be honest with you. I've felt like an impostor for most of my life.

My job for over thirty years was teaching science in public schools, mostly biology (and other life-science-related classes).  However, I have neither a bachelor's nor a master's degree in biology.  My bachelor's degree is in physics -- and I was a lackluster physics student at best -- and my master's degree is in linguistics, of all things.  Along the way I started a master's program in oceanography, but I was kind of lousy at that, too, and got out of research science entirely.  I've taken enough classes in biology for a teaching license (obviously), but frankly, I learned most of the biology I know by the seat of my pants.

Even in my two favorite avocations -- writing and music -- I didn't get where I am by any kind of legitimate, credentialed pathway.  I wasn't in band in school, having been told that I was no good at it by a 6th grade band director, and taught myself the flute and piano.  I was lucky enough to study flute with a wonderful teacher, Margaret Vitus, when I was in my twenties, but that is the sum total of my formal musical background.

I don't even have that in writing.  I took two creative writing classes, one in high school, one in college.  The end.

So I've got a striking lack of framed certificates in Latin to hang on my wall.  When I think about it rationally, it doesn't bother me.  I know I'm competent enough at what I do (in all three realms) that I don't have anything to apologize for.  But that visceral voice isn't so kind, which is one of the reasons I feel uncomfortable and outclassed when I'm around academics, people who are in my mind "true intellectuals."

Impostor syndrome is all too common.  It was first studied way back in the 1970s, in women; in interviews with 150 highly successful professional women, the vast majority reported no internal sense of accomplishment, and were constantly afraid that they'd be "found out" as having poorer abilities, knowledge, and qualifications than their bosses and coworkers thought.

[Image licensed under the Creative Commons Mark J Sebastian, Jackie Martinez with a mask, CC BY-SA 2.0]

Recently a team of psychologists took a closer look at this phenomenon -- and found it's more ubiquitous than anyone thought.  In "Are All Impostors Created Equal?  Exploring Gender Differences in the Impostor Phenomenon-Performance Link," by Rebecca L. Badawy, Brooke A. Gazdag, Jeffrey R. Bentley, and Robyn L. Brouer, of Youngstown State University, Ludwig-Maximilians-Universität München, California State University, and Canisius College, respectively, the researchers found that males and females both experience impostor syndrome -- they just respond to it differently.

The research, which appeared in the journal Personality and Individual Differences, looked at over 250 people in professional careers, and found some interesting correlations.  First, they did not see a link between feeling like an impostor and actual work performance.  Put more simply: self-styled impostors and people who feel like they deserve to be where they are have about the same levels of competency at work.

What is even more interesting, however, is the difference in reaction between males and females.  In the first experiment, a group was given five problems from the GRE (Graduate Record Examination), which is used in graduate school admissions.  After working on the problems, they were given feedback on how they did -- but some of the test subjects were told (incorrectly) that they'd gotten all five wrong.

Comparing the responses to this harsh feedback, the male "impostors" responded to subsequent tasks with higher anxiety, less effort, and poorer performance, while the female "impostors'" reactions were nearly the opposite -- they were anxious regardless of whether the feedback was positive or negative, but they responded by increasing their effort, and their performance went up, too.

In a second experiment, the subjects were told their answers would be shown to a college professor -- placing them in a high-stress, high-accountability context.  Once again, the men who scored high on impostor syndrome responded with an increase in anxiety and a decrease in both effort and performance, while the women's results were unchanged from the low-stress, low-accountability situation.  The researchers suggested that the change in the men's responses may have come about because exerting lower effort in high-stress situations gives them an "out" to explain poor performance -- but that's only speculation.

As the researchers put it, "Assuming that traditional gender norms hold, males [with impostor syndrome] may have exhibited stronger negative reactions because they believe that society at large values males who demonstrate high competence and at the same time, do not believe that they can fulfill this standard."

Whatever the reason for all this, it's kind of sad, don't you think?  The fact that so many of us can't take honest pleasure in our accomplishments, and feel the need to devalue what we do based on inaccurate standards of who we should be or how we attained our position in our workplace, is a tragedy.  The problem is, these feelings are not rational; I know from experience that all the logical arguments in the world haven't eliminated my sense that I've arrived where I am by illegitimate means.

But I wish -- both for myself and for my fellow impostors -- that it were that easy to eliminate.


Wednesday, May 24, 2023

Nerds FTW

There's a stereotype that science nerds, and especially science fiction nerds, are hopeless in the romance department.

I'd sort of accepted this without question despite being one myself, and happily married to a wonderful woman.  Of course, truth be told, said wonderful woman pretty much had to tackle me to get me to realize she was, in fact, interested in me, because I'm just that clueless when someone is flirting with me.  But still.  Eventually the light bulb appeared over my head, and we've been a couple ever since.

Good thing for me, because not only am I a science nerd and a science fiction nerd, I write science fiction.  Which has to rank me even higher on the romantically-challenged scale.

Or so I thought, till I read a study by Stephanie C. Stern, Brianne Robbins, Jessica E. Black, and Jennifer L. Barnes that appeared in the journal Psychology of Aesthetics, Creativity, and the Arts, entitled, "What You Read and What You Believe: Genre Exposure and Beliefs About Relationships."  And therein we find a surprising result.

Exactly the opposite is true.  We sci-fi/fantasy nerds make better lovers.

Who knew?  Not me, for sure, because I still think I'm kind of clueless, frankly.  But here's what the authors have to say:
Research has shown that exposure to specific fiction genres is associated with theory of mind and attitudes toward gender roles and sexual behavior; however, relatively little research has investigated the relationship between exposure to written fiction and beliefs about relationships, a variable known to relate to relationship quality in the real world.  Here, participants were asked to complete both the Genre Familiarity Test, an author recognition test that assesses prior exposure to seven different written fiction genres, and the Relationship Belief Inventory, a measure that assesses the degree to which participants hold five unrealistic and destructive beliefs about the way that romantic relationships should work.  After controlling for personality, gender, age, and exposure to other genres, three genres were found to be significantly correlated with different relationship beliefs. Individuals who scored higher on exposure to classics were less likely to believe that disagreement is destructive.  Science fiction/fantasy readers were also less likely to support the belief that disagreement is destructive, as well as the belief that partners cannot change, the belief that sexes are different, and the belief that mindreading is expected in relationships.  In contrast, prior exposure to the romance genre was positively correlated with the belief that the sexes are different, but not with any other subscale of the Relationships Belief Inventory.
Get that?  Of the genres tested, the sci-fi/fantasy readers score the best on metrics that predict good relationship quality.  So yeah: go nerds.

As Tom Jacobs wrote about the research in The Pacific Standard, "[T]he cliché of fans of these genres being lonely geeks is clearly mistaken.  No doubt they have difficulties with relationships like everyone else.  But it apparently helps to have J. R. R. Tolkien or George R. R. Martin as your unofficial couples counselor."

Tolkien?  Okay.  Aragorn and Arwen, Celeborn and Galadriel, even Sam Gamgee and Rose Cotton -- all romances to warm the heart.  But George R. R. Martin?  Not so sure if I want the guy who crafted Joffrey Baratheon's family tree to give me advice about who to hook up with.

One other thing I've always wondered, though, is how book covers affect our expectations.  I mean, look at your typical romance, which shows a gorgeous woman wearing a dress from the Merciful-Heavens-How-Does-That-Stay-Up school of haute couture, being seduced by a gorgeous shirtless guy with a smoldering expression who exudes so much testosterone that small children go through puberty just by walking past him.  Now, I don't know about you, but no one I know actually looks like that.  I mean, I think the people I know are nice enough looking, but Sir Trevor Hotbody and Lady Viola de Cleevauge they're not.

Of course, high fantasy isn't much better.  There, the hero always has abs you could crack a walnut against, and is raising the Magic Sword of Wizardry aloft with arms that give you the impression he works out by bench pressing Subarus.  The female protagonists usually are equally well-endowed, sometimes hiding the fact that they have bodily proportions that are anatomically impossible by being portrayed with pointed ears and slanted eyes, informing us that they're actually Elves, so all bets are off, extreme-sexiness-wise.

And being chased by a horde of Amazon Space Women in Togas isn't exactly realistic, either.  [Image is in the Public Domain]

So even if we sci-fi nerds have a better grasp on reality as it pertains to relationships in general, you have to wonder how all this affects our body image.  Like we need more to feel bad about in that regard; between Victoria's Secret and Abercrombie & Fitch, it's a wonder that any of us are willing to go to the mall without wearing a burqa.

But anyhow, that's the news from the world of psychology.  Me, I find it fairly encouraging that the scientifically-minded are successful at romance.  It means we have a higher likelihood of procreating, and heaven knows we need more smart people in the world these days.  It's also nice to see a stereotype shattered.  After all, as Oliver Wendell Holmes said, "No generalization is worth a damn.  Including this one."


Tuesday, May 23, 2023

Discarded genius

Way back in 1952, British mathematician and computer scientist Alan Turing proposed a mathematical model to account for pattern formation that results in (seemingly) random patches -- something observed in manifestations as disparate as leopard spots and the growth patterns of desert plants.

Proving that this model accurately reflected what was going on, however, was more difficult.  It wasn't until three months ago that an elegant experiment using thinly-spread chia seeds on a moisture-poor growth medium showed that Turing's model predicted the patterns perfectly.

"In previous studies," said study co-author Brendan D'Aquino, who presented the research at the March meeting of the American Physical Society, "people kind of retroactively fit models to observed Turing patterns that they found in the world.  But here we were actually able to show that changing the relevant parameters in the model produces experimental results that we would expect."
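If you're curious about the mechanics, the heart of Turing's idea -- two substances diffusing at different rates while reacting with each other -- fits in a few lines of Python.  What follows is a toy Gray-Scott-style reaction-diffusion simulation, not the specific model the chia-seed team fit to their data; the parameter values are illustrative ones commonly used to produce spotty patterns:

```python
import numpy as np

def laplacian(a):
    # Discrete Laplacian via periodic shifts: how much each cell
    # differs from the sum of its four neighbors.
    return (np.roll(a, 1, 0) + np.roll(a, -1, 0) +
            np.roll(a, 1, 1) + np.roll(a, -1, 1) - 4 * a)

def turing_step(u, v, du=0.16, dv=0.08, f=0.035, k=0.065):
    # One Gray-Scott update: u is the "fed" substrate, v consumes it.
    # Unequal diffusion rates (du > dv) are what drive the patterning.
    uvv = u * v * v
    u_next = u + du * laplacian(u) - uvv + f * (1 - u)
    v_next = v + dv * laplacian(v) + uvv - (f + k) * v
    return u_next, v_next

# Start near the uniform state, with one small perturbed patch.
rng = np.random.default_rng(0)
n = 64
u = np.ones((n, n))
v = np.zeros((n, n))
v[28:36, 28:36] = 0.5 + 0.1 * rng.random((8, 8))
u[28:36, 28:36] = 0.5

for _ in range(2000):
    u, v = turing_step(u, v)
# After enough steps, v is no longer uniform: structure has emerged
# from nothing but diffusion and local reaction.
```

Render v with something like matplotlib's imshow after a few thousand steps and you should see blotchy spots self-organize from a nearly uniform starting state -- exactly the kind of order-from-noise Turing's model predicts.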

Honestly, it shouldn't have been surprising.  Turing's genius was unparalleled; the "Turing pattern" model is hardly the only brainchild of his that is still bearing fruit, almost seventy years after his death.  His research on the halting problem -- figuring out if it is possible to determine ahead of time whether a computer program designed to prove the truth or falsity of mathematical theorems will reach a conclusion in a finite number of steps -- generated an answer of "no" and a paper that mathematician Avi Wigderson called "easily the most influential math paper in history."  Turing's work in cryptography is nothing short of mind-blowing; he led the research that allowed the deciphering of the incredibly complex code produced by Nazi Germany's Enigma machine, a feat that was a major contribution to Germany's defeat in 1945.
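The core of that "no" is a diagonal argument compact enough to sketch in a dozen lines of code.  Suppose, hypothetically, that someone hands you a Python function claiming to decide whether any given program halts; you can always build a "contrarian" program that asks the decider about itself and then does the opposite.  (The names below are mine, purely for illustration, and nothing here actually runs forever -- the point is to show why no correct decider can exist.)

```python
def make_contrarian(claimed_halts):
    """Given any purported halting decider, build a program it must get wrong."""
    def contrarian():
        if claimed_halts(contrarian):
            while True:   # decider predicted "halts," so loop forever
                pass
        # decider predicted "loops forever," so halt immediately
    return contrarian

def decider_fails(claimed_halts):
    # We never actually call contrarian(); we just compare the decider's
    # prediction with the behavior contrarian is constructed to have.
    contrarian = make_contrarian(claimed_halts)
    prediction = claimed_halts(contrarian)   # True means "it halts"
    actual = not prediction                  # contrarian does the opposite
    return prediction != actual              # always True: the prediction is wrong

# Whatever a candidate decider answers, it's wrong about its own contrarian:
assert decider_fails(lambda program: True)    # an "always halts" decider fails
assert decider_fails(lambda program: False)   # an "always loops" decider fails
```

Since this construction works against any claimed decider, no program can correctly solve the halting problem in general -- which is, very loosely, the shape of Turing's 1936 proof.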

A monument to Alan Turing at Bletchley Park, where the cryptographic team worked during World War II [Image licensed under the Creative Commons Antoine Taveneaux, Turing-statue-Bletchley 14, CC BY-SA 3.0]

Turing's colleague, mathematician and cryptographer Peter Hilton, wrote the following about him:
It is a rare experience to meet an authentic genius.  Those of us privileged to inhabit the world of scholarship are familiar with the intellectual stimulation furnished by talented colleagues.  We can admire the ideas they share with us and are usually able to understand their source; we may even often believe that we ourselves could have created such concepts and originated such thoughts.  However, the experience of sharing the intellectual life of a genius is entirely different; one realizes that one is in the presence of an intelligence, a sensibility of such profundity and originality that one is filled with wonder and excitement.  Alan Turing was such a genius, and those, like myself, who had the astonishing and unexpected opportunity, created by the strange exigencies of the Second World War, to be able to count Turing as colleague and friend will never forget that experience, nor can we ever lose its immense benefit to us.

Hilton's words are all the more darkly ironic when you find out that two years after the research into pattern formation, Turing committed suicide at the age of 41.

His slide into depression started in January 1952, when his house was burgled.  The police, while investigating the burglary, found evidence that Turing was in a relationship with another man, something that was illegal in the United Kingdom at the time.  In short order Turing and his lover were both arrested and charged with gross indecency.  After a short trial in which Turing refused to argue against the charges, he was found guilty; he avoided jail time only by agreeing to a hormonal treatment nicknamed "chemical castration," designed to destroy his libido.

It worked.  It also destroyed his spirit.  The "authentic genius" who helped Britain win the Second World War, whose contributions to mathematics and computer science are still the subject of fruitful research today, poisoned himself to death in June of 1954 because of the actions taken against him by his own government.

How little we've progressed in seven decades.

Here in the United States, state after state is passing laws discriminating against queer people, denying gender-affirming care to trans people, legislating what is and is not allowable based not upon any real concrete harm done, but on thinly-veiled biblical moralism.  The result is yet another generation growing up having to hide who they are lest they face the same kind of soul-killing consequences Alan Turing did back in the early 1950s.

People like Florida governor Ron DeSantis and Texas governor Greg Abbott, who have championed this sort of legislation, seem blind to the consequences.  Or, more likely, they know the consequences and simply don't give a damn how many lives this will cost.  Worse, some of their allies actually embrace the potential death toll.  At the Conservative Political Action Conference in March, Daily Wire host Michael Knowles said, "For the good of society… transgenderism must be eradicated from public life entirely.  The whole preposterous ideology, at every level."

No, Michael, there is no "ism" here.  It's not an "ideology"; it's not a political belief or a religion.  What you are saying is "eradicate transgender people."  You are advocating genocide, pure and simple.

And so, tacitly, are the other people who are pushing anti-LGBTQ+ laws.  Not as blatantly, perhaps, but that's the underlying message.  They don't want queer people to be quiet; they want us erased.

I can speak first-hand to how devastating it is to be terrified to have anyone discover who you are.  I was in the closet for four decades out of shame, not to mention fear of the consequences of being out.  When I was 54 I finally said "fuck it" and came out to friends and family; I came out publicly -- here at Skeptophilia, in fact -- five years after that.  

I'm one of the lucky ones.  I had nearly uniform positive responses.

But if I lived in Florida or Texas?  Or in my home state of Louisiana?  I doubt very much whether I'd have had the courage to speak my truth.  The possibility of dire consequences would have very likely kept me silent.  In Florida, especially -- I honestly don't know how any queer people or allies are still willing to live there.  I get that upping stakes and moving simply isn't possible for a lot of people, and that even if they could all relocate, that's tantamount to surrender.  But still.  Given the direction things are going, it's a monumental act of courage simply to stay there and continue to fight.

It's sickening that we are still facing these same battles.  Haven't we learned anything from the example of a country that discarded the very genius who helped them to defeat the Nazis, in the name of some warped puritanical moralism? 

This is no time to give up out of exhaustion, however, tempting though it is.  Remember Turing, and others like him who suffered (and are still suffering) simply because of who they are.  Keep speaking up, keep voting, and keep fighting.  And remember the quote -- of uncertain origin, though often misattributed to Edmund Burke -- "All that is necessary for the triumph of evil is that good people do nothing."


Monday, May 22, 2023

Dawn life

Currently I'm working my way through Mark McMenamin's book The Garden of Ediacara, an analysis of the fossil evidence from the Vendian Period, the last bit of the Precambrian (650-543 million years ago).

The subject of McMenamin's book is undeniably fascinating -- more about that in a moment -- but it's uneven reading.  Part of it is a travelogue of his work in Namibia, Mexico, and Australia, places where there are significant outcrops of late Precambrian sedimentary rocks, but it's obvious from page one that most of what he does is write papers for scholarly journals.  As a result, it's halfway between an introduction to the topic for laypeople and an extended academic paper, and I've been glad as I worked my way through it that I have at least a passing background in paleontology.

Something that struck me right away, however, was that I'd been laboring under a serious misunderstanding of the Ediacaran biota: I'd thought it overlapped significantly with the Cambrian explosion fauna, the bizarre creatures like Anomalocaris and Opabinia and the aptly-named Hallucigenia.  In reality, there was almost no overlap; the Ediacaran organisms such as Cloudina and Dickinsonia were almost certainly driven to extinction and replaced by the large predatory forms of the early Cambrian.

A fossil of Dickinsonia costata from Australia [Image licensed under the Creative Commons Verisimilus at English Wikipedia, DickinsoniaCostata, CC BY-SA 3.0]

While the early Cambrians (best known from the Burgess Shale formation of British Columbia) are clearly animals, the bizarre Ediacarans are of completely uncertain affinities.  When McMenamin wrote his book (1998) there was considerable contention about what they were, with various paleontologists arguing vehemently that they were early animals, fungi, algae, or even giant protists (or protist colonies).  Despite the passage of twenty-five years, the issue is still far from settled.  Some make persuasive arguments that the Vendian biota doesn't belong to any of the five modern kingdoms of life (animals, plants, fungi, bacteria, and archaea), but are representatives of a completely different lineage, or more than one, that left no descendants at all.

So I'm grateful to McMenamin and his book for clearing up something I'd misunderstood for years.

I was in the middle of reading The Garden of Ediacara when, coincidentally, a friend and frequent contributor of topics for Skeptophilia sent me a link to an article in Smithsonian magazine about the evolutionary origin of animals.  Another point of contention amongst biologists is determining, out of the entire kingdom Animalia, which group branched off first.  (This is sometimes phrased as which is the "oldest" or "most primitive" -- both terminology I don't like, because every living animal on Earth has an exactly equal length of evolutionary history.  It's just that during that time, some branches have changed a great deal faster than others, and some groups share more recent common ancestry than others do.)

In any case, the argument is about which group of modern animals is the outgroup -- the one that split off first, and therefore is the most distantly related to all other animals.  When I took zoology (many, many years ago) the conventional wisdom was that it was sponges (Phylum Porifera).  And there's certainly a good case to be made there; sponges are weird animals, with no differentiated organs, skeletons made of either protein fibers, bits of calcium carbonate, or slivers of glass, and no nerves, muscles, or digestive tracts.  But genetic analysis has shown unequivocally that there's an even more distantly-related group -- the comb jellies (Phylum Ctenophora).

They look superficially like jellyfish, and that similarity led scientists to put them on the same branch as Phylum Cnidaria (which not only contains jellyfish, but sea anemones and corals).  The genetic studies, though, show that there's only a distant relationship between comb jellies and jellyfish.  The comb jellies, in fact, show more of a genetic similarity to certain species of protists than they do to other animals.

"That was the smoking gun," said Daniel Rokhsar, of the University of California, Berkeley, who co-authored the paper.

So this goes to show that there's a lot we still have to learn about the earliest life on our planet.  And I'm sure that as definitive as this study seems to be, it won't be the last word.  As more evidence surfaces, expect the arrangement to change.  This, after all, is how science works; it has a built-in mechanism for self-correction.  And contrary to the reaction I've seen some people have -- that a shifting understanding means "it could all be proven wrong tomorrow" -- that capacity for change is science's main strength.

After all, isn't it a good thing to have your model shift to accommodate new information?  Seems like standing firm on what you believe despite strong evidence to the contrary is the cause of a lot of the problems in the world.


Saturday, May 20, 2023

Raw nonsense

Despite the fact that our modern lifestyle has increased our life expectancy to longer than it's ever been in the history of humanity, romanticizing the practices of the past is still ridiculously widespread.

People who claim that "everything causes cancer" conveniently ignore two things: first, that a good many forms of cancer would decline dramatically if we'd do things doctors recommend, like cutting out tobacco and getting vaccinated against HPV; and second, that one of the reasons cancer rates have climbed is that we're no longer dying of other stuff, like diphtheria, typhoid, measles, and smallpox.

But that kind of thinking seldom makes any inroads into the minds of people committed to anti-vaxx (or completely anti-medical) propaganda.  The levels of irrationality some of this thinking reaches are truly staggering.  I had one person comment on one of my posts -- in all apparent seriousness -- "my great-grandma never got vaccinated against anything, and she survived."

Well, of course she did.  If she'd died at age three of diphtheria, she wouldn't have been your great-grandma, now would she?

How about asking great-grandma how many of her siblings and cousins died of childhood infectious diseases -- like my grandfather's two oldest sisters, Marie-Aimée and Anne-Désée, who died of measles five days apart at the ages of 22 and 16.

The person who posted that comment should win some sort of award for compressing the greatest number of fallacies into the shortest possible space.  Confirmation bias, cherry-picking, anecdotal evidence, and the post hoc fallacy, all in nine words.  Kind of impressive, actually.

Despite all this, there are huge numbers of people who want to return to what our distant ancestors did, claiming that it's "healthier" or "more natural," conveniently neglecting the fact that back then, as Thomas Hobbes so trenchantly put it, "life was solitary, poor, nasty, brutish, and short."

The result is the kind of thing I ran into in an article in Ars Technica last week about a trend I hadn't heard of: drinking "raw water."  "Raw water," as you might guess from the name, is water that hasn't been filtered or treated, but is collected (or even bottled and sold) right from a spring or river or whatnot.  And predictably, nineteen people fell ill with a diarrheal disease (caused by the bacterium Campylobacter jejuni) when their trendy "natural spring water" turned out to be just ordinary runoff from a creek drainage that had been contaminated by bacteria from bird nests.

The amount of pseudoscience you run into with this stuff is astonishing.  In researching this topic, I found people who claim that "industrially-processed water" (i.e. most tap water) has "mind-control drugs" in it, designed to turn us all into Koolaid-drinkin' sheeple, and even one that said treatment plants deliberately "alter the molecular structure of water, turning it into a toxin."

Making me wonder how, or if, these people passed high school chemistry.

I spent the summers during my twenties and thirties back-country camping in the Cascades and Olympics, and I know how careful you have to be.  The clearest bubbling mountain brook can be contaminated with nasty stuff like Giardia and Salmonella, two pathogens that should be high on the list of germs you never want to have inside you.  I used iodine sterilizing tablets for all the water I drank -- and I never got sick.  But I knew people who did, and as one of them vividly described it, "Having Giardia means that for three weeks you're going to be on a first-name basis with your toilet."

Which is funny until you find out that in the process, he lost twenty pounds and spent three days in the hospital hooked up to an IV so he could stay hydrated.

Look, I know our high-tech world isn't perfect.  I know about pesticides and herbicides and industrial contamination and coverups and food additives with dubious health effects.  My wife and I try as hard as we can to eat locally-sourced organic meat and produce, not to mention growing our own vegetables.  But the admittedly true statement that technology and the pharmaceuticals industry have created some problems does not equate to "therefore we should jettison everything they provide and return to the Stone Age."

Speaking of fallacies, there's another one for you: the package-deal fallacy.  You get into this stuff, it reads like the "what not to do" section of a critical thinking textbook.

So if you're inclined to switch over to "raw water," just don't.  Drinking water is treated for a reason.  Our Stone Age ancestors didn't have such great lives, and idealizing that era as some kind of idyllic Garden of Eden is complete horse shit.

Horse shit ironically being one of the things that might well be in your "raw water."


Friday, May 19, 2023

Mapping our world

My novel The Scattering Winds is the second of a trilogy, of which the first book (In the Midst of Lions) is scheduled to be out this summer.  The setting of the trilogy is the Pacific Northwest.  In the first book, there's a worldwide collapse of civilization.  In the second, set six hundred years later, what's left of humanity has reverted to a new Dark Ages, mostly non-literate and non-technological.  In the third (The Chains of the Pleiades), six hundred years after that, technology and space flight have been re-invented -- along with all the problems that brings.

The main character of the second book, Kallian Dorn, comes from a people who have lost the knowledge of reading, committing all of their culture's memory to the mind of one person, called the Guardian of the Word.  But when they find a girl from a distant town, a refugee, who knows the rudiments of reading and writing, they recognize what's been lost, and struggle, slowly, to reclaim it.  Kallian undertakes a voyage, on foot, to the girl's home town -- and finds there a mostly-intact library from what he calls "the Before Times."

The following takes place when Kallian, who by this time has learned the basics of how to read, discovers a room full of maps in the library:

He went into the first room he encountered. It was labeled “Maps.”  Holding the lamp aloft, he passed into a room filled with odd cabinets, most of which had very wide, shallow drawers.  The nearest one said, “North America,” and he set the lamp down to open the top drawer.

Sitting on top was a yellowed piece of paper, about an arm’s length wide and tall, with a drawing of… what was it?  He peered closer, and read the inscription at the top, written in an ornate, curly script he could barely decipher.  It said, “United States of America, The Year of Our Lord 1882.”  There were names written in smaller, but equally frilly, lettering, which gave him enough information to conclude that it was a drawing of a land, as if seen from above.  The faded blue bits were bodies of water: Lake Ontario.  The Caribbean Sea.  The Atlantic Ocean.  The green parts—well, they were only green in splotches, mostly they had faded to a yellowish-brown—were land.  He saw features like “Appalachian Mountains” and “Great Plains” and “Mississippi Delta.”  The land was divided by oddly artificial-looking black lines, some dead straight, others following natural features such as the course of rivers.  Each of the blocks thus delineated had a strange and unfamiliar name: Massachusetts.  New York.  Georgia.  Kentucky.

Had these been kingdoms of the Before Time?

1882—if he was correct about what the date-numbers signified, this would have been about a century and a half before the collapse, before the floods and plagues that had ended the old world.  And a full 750 years before now.

But where was this United States of America, with its bizarrely-named mountains and lakes and kingdoms?  Without a referent, without having an arrow on the map saying “You are here,” he had no way to know if it was a day’s march away or on the other side of the world.

He flipped through the maps in those and other cabinets, handling them carefully to keep the age-worn paper from crumbling in his hands.  His mind was overwhelmed with how many different lands there were—whole cabinets devoted to maps from places called Europe, Africa, Asia, Australia.  But even looking at them, as fascinating as it was, was not like reading the books he’d found, where meaning provided an anchor to keep him fastened to reality as he knew it.  Without a key, the maps gave him no way to tell scale or location of anything.  Learning to read had unlocked one type of cipher; here was an entirely different kind, one where even though he could read the words, they didn’t make sense.

I was reminded of this scene when I read an article yesterday in Science News about archaeologists who believe they've discovered the oldest-ever aerial-view scale drawings -- in other words, maps.  There are structures in the Middle East nicknamed "kites" that were huge stone-walled enclosures used to trap animals like gazelles, funneling their movements toward waiting hunters.  And a team of archaeologists working in Jordan and Saudi Arabia have found nine-thousand-year-old engravings on stones that appear to be maps of nearby kites -- perhaps made by people strategizing how best to use them in their game-harvesting efforts.

Map-making, when you think about it, is kind of an amazing accomplishment.  It requires changing your perspective, picturing what some thing -- a city, a body of water, a country, an entire continent -- would look like from above.  And even if old maps look pretty inaccurate to our modern eyes, now that we can see what things actually look like from the air, it's important to remember that the mapmakers did it all by surveying from ground (or sea) level.

And given that, they did pretty damn well, I think.

A map of the world, ca. 1565 [Image is in the Public Domain]

The fact that we were doing this nine thousand years ago is kind of astonishing.  Intrepid folks, our ancestors.

So many of the things we do today, and consider "modern," have far deeper roots than we realize.  And this ability to shift perspective, to consider what things would look like from another angle, is something we've had for a very long time -- even if to someone like Kallian Dorn, the results look very like magic.