Skeptophilia (skep-to-fil-i-a) (n.) - the love of logical thought, skepticism, and thinking critically. Being an exploration of the applications of skeptical thinking to the world at large, with periodic excursions into linguistics, music, politics, cryptozoology, and why people keep seeing the face of Jesus on grilled cheese sandwiches.

Saturday, March 18, 2023

It's the end of the world, if you notice

I have commented more than once about my incredulity with regard to end-of-the-world predictions.  Despite the fact that to date they have had a 100% failure rate, people of various stripes (usually of either the ultra-religious persuasion or the woo-woo conspiracy one) continue to say that not only is the world doomed, they know exactly when, how, and why.  (If you don't believe me, take a look at the Wikipedia page for apocalyptic predictions; they've occurred so often that the list has to be broken down by century.)

As far as why this occurs -- why repeated failure doesn't make the true believers say, "Well, I guess that claim was a bunch of bullshit, then" -- there are a variety of reasons.  One is a sort of specialized version of the backfire effect, in which evidence against a claim you believe strongly leaves you believing it even more strongly.  Way back in 1954, psychologists Leon Festinger, Henry Riecken, and Stanley Schachter infiltrated a doomsday cult, and in fact Festinger was with the cult on the day they'd claimed the world was going to end.  When 11:30 PM rolled around and nothing much was happening, the leader of the cult went into seclusion.  A little after midnight she returned with the joyous news that the cult's devotion and prayers had averted the disaster, and god had decided to spare the world, solely because of their fidelity.

Hallelujah!  We better keep praying, then!

(Nota bene: The whole incident, and the analysis of the phenomenon by Festinger et al., is the subject of the fascinating book When Prophecy Fails.)

Despite this, the repeated failure of an apocalyptic prophecy can eventually cause a prophet's followers to lose faith, as evangelical preacher Harold Camping found out.  So the people who believe this stuff often have to engage in some fancy footwork after the appointed day and hour arrive and nothing happens other than the usual nonsense.

Take, for example, the much-publicized "Mayan apocalypse" on December 21, 2012 that allegedly was predicted by ancient Mayan texts (it wasn't) and was going to herald worldwide natural disasters (it didn't).  The True Believers mostly retreated in disarray when December 22 dawned, as well they should have.  My wife and I threw a "Welcoming In The Apocalypse" costume party on the evening of December 21 (I went as a zombie, which I felt was fitting given the theme), and I have to admit to some disappointment when the hour of midnight struck and we were all still there.  But it turns out that not all of the Mayan apocalyptoids disappeared after the prediction failed; one of them, a fellow named Nick Hinton, claims the end of the world actually did happen, as advertised...

... but no one noticed.

Hinton's argument, such as it is, starts with a bit of puzzling over why you never hear people talking about the 2012 apocalypse any more.  (Apparently "it didn't happen" isn't a sufficient reason.)  Hinton finds this highly peculiar, and points out that 2012 was the year CERN's Large Hadron Collider (which had actually been running since 2008) turned up the Higgs boson, and that this can't possibly be a coincidence.  He wonders if this event destroyed the universe and/or created a black hole, and then "sucked us in" without our being aware of it.

[Image licensed under the Creative Commons Lucas Taylor / CERN, CMS Higgs-event, CC BY-SA 3.0]

Me, I think I'd notice if I got sucked into a black hole.  They're kind of violent places, as I described in a recent post about Sagittarius A* and the unpleasant process called "spaghettification."  But Hinton isn't nearly done with his explanation.  He writes:
There's the old cliché argument that "nothing has felt right" since 2012.  I agree with this...  [E]ver since then the world seems to descend more and more into chaos each day.  Time even feels faster.  There's some sort of calamity happening almost daily.  Mass shootings only stay in the headlines for like 12 hours now.  Did we all die and go to Hell?...  Like I've said, I think we live in a series of simulations.  Perhaps the universe was destroyed by CERN and our collective consciousness was moved into a parallel universe next door.  It would be *almost* identical.
Of course, this is a brilliant opportunity to bring out the Mandela effect, about which I've written before.  The idea of the Mandela effect is that large numbers of people remember things differently from how they actually were (whether Nelson Mandela died in prison, whether it's "Looney Tunes" or "Loony Tunes" and "The Berenstein Bears" or "The Berenstain Bears," and so forth), and the reason for this is not that people's memories in general suck, but that there are alternate universes where these different versions occur and people slip back and forth between them.

All of which makes me want to take Ockham's Razor and slit my wrists with it.

What I find intriguing about Hinton's explanation is not all the stuff about CERN, though; it's his insistence that the prediction didn't fail because he was wrong -- the world really did end, and seven-billion-plus people simply didn't notice.  Having written here at Skeptophilia for over twelve years, I'm under no illusions about the general intelligence level of humanity, but for fuck's sake, we're not that unobservant.  And even if somehow CERN did create an alternate universe, why would it affect almost nothing except for things like the spelling of Saturday morning cartoon titles?

So this is taking the backfire effect and raising it to the level of performance art.  This is saying that it is more likely that the entire population of the Earth was unaware of a universe-ending catastrophe than it is that you're simply wrong.

Which is so hubristic that it's kind of impressive.

But I better wind this up, because I've got to prepare myself for the next end of the world, which (according to Messiah Foundation International, which I have to admit sounds pretty impressive) is going to occur in January of 2026.  This only gives us all a bit shy of three years to get ready, so I really should get cracking on my next novel.  And if that apocalypse doesn't pan out, evangelical Christian lunatic Kent Hovind says not to worry, the Rapture is happening in 2028, we're sure this time, cross our hearts and hope to be assumed bodily into heaven.

So many apocalypses, so little time.

****************************************



Friday, March 17, 2023

The heart of the world

One of the biggest mysteries in science lies literally beneath our feet: the structure and composition of the interior of the Earth.

We have direct access only to the barest fraction of it.  The deepest borehole ever created is the Kola Superdeep Borehole, on the Kola Peninsula in Russia near the border of Norway.  It's 12.26 kilometers deep, which is pretty impressive, but when you realize that the mean radius of the Earth is just under 6,400 kilometers, it kind of puts things in perspective.
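Just how bare a fraction are we talking?  A quick back-of-the-envelope calculation (a minimal sketch in Python, using the figures quoted above):

# How far toward the center of the Earth have we actually drilled?
borehole_depth_km = 12.26   # Kola Superdeep Borehole
earth_radius_km = 6371.0    # mean radius of the Earth

fraction = borehole_depth_km / earth_radius_km
print(f"The deepest borehole reaches {fraction:.2%} of the way to the center.")
# -> The deepest borehole reaches 0.19% of the way to the center.

Less than a fifth of one percent; everything below that is inference.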

What we know is that the crust is principally silicate rock -- lower-density felsic rocks (like granite) forming the majority of the continental crust, and denser mafic rocks (like basalt) comprising the thinner oceanic crust.  Beneath that is the semisolid mantle, which makes up two-thirds of the Earth's mass.  Inside that is the outer core, thought (primarily from estimates of density) to be made up of liquid iron and nickel, and within that the inner core, a solid ball of red-hot iron and nickel.

At least that's what we thought.  All of this was determined through inference from evidence like the relative speed of different kinds of seismic waves; despite what Jules Verne would have you believe, no one has been to the center of the Earth (nor is likely to).  But figuring all this out is important not just from the standpoint of adding to our knowledge of the planet we live on, but for comprehending phenomena like magnetic field reversals -- phenomena that would have obvious impacts on our own lives, and that are still poorly understood at best.

We just got another piece of the puzzle in the form of a paper last week in Nature Communications that suggests our picture of the Earth's inner core as a homogeneous ball of solid iron and nickel may not be right.  Using data from seismic waves, scientists at the Australian National University in Canberra have concluded that the inner core itself has two layers.  The exact difference between the two isn't certain -- as I said before, we're limited by what information we can get long-distance -- but the best guess is that it's a difference in crystal structure, probably caused by the immense pressures at the center.

[Image courtesy of Drew Whitehouse, Hrvoje Tkalčić, and Thanh-Son Phạm]

In general, whenever a wave crosses a boundary from one medium to another, it refracts (changes angle); this is why a straw leaning in a glass of water looks like it's bent.  If the angle is shallow enough, some of the wave's energy can also reflect off the interface.  When that happens to seismic waves inside the Earth, those reflected waves bounce around inside the core; when they finally make it back out and are measured by scientists on the Earth's surface, features such as their energy, wavelength, and angle can provide a lot of information about the materials they passed through on their journey.
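To make the geometry concrete, here's a minimal sketch of the refraction calculation, using Snell's law; the wave speeds are rough, illustrative values for P-waves near the inner-core boundary, not numbers taken from the study:

import math

# Snell's law: sin(theta1) / v1 = sin(theta2) / v2
v1 = 10.3   # km/s, approximate P-wave speed in the liquid outer core
v2 = 11.0   # km/s, approximate P-wave speed in the solid inner core

theta1 = math.radians(30.0)              # angle of incidence, from the normal
sin_theta2 = math.sin(theta1) * v2 / v1

if sin_theta2 < 1.0:
    theta2 = math.degrees(math.asin(sin_theta2))
    print(f"Wave refracts to {theta2:.1f} degrees from the normal")
else:
    # Past the critical angle the wave reflects instead -- these are the
    # reverberations that eventually carry information back to the surface.
    print("Total reflection at the boundary")

Run it with a grazing incidence (say theta1 = 75 degrees) and sin_theta2 exceeds 1, so the wave reflects -- exactly the bouncing-around-inside-the-core behavior the researchers exploited.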

The authors write:
Earth’s inner core (IC), which accounts for less than 1% of the Earth’s volume, is a time capsule of our planet’s history.  As the IC grows, the latent heat and light elements released by the solidification process drive the convection of the liquid outer core, which, in turn, maintains the geodynamo.  Although the geomagnetic field might have preceded the IC’s birth, detectable changes in the IC’s structures with depth could signify shifts in the geomagnetic field’s operation, which could have profoundly influenced the Earth’s evolution and its eco-system.  Therefore, probing the innermost part of the IC is critical to further disentangling the time capsule and understanding Earth’s evolution in the distant past.

The discovery of this hitherto-unknown innermost layer brings us a step closer to answering one of the most fundamental questions in geology: what the inside of our planet is actually like.  We still have a very long way to go, of course.  As I said, even understanding how exactly the core generates the Earth's protective magnetic field is far from achieved.  But the new research gives us a deeper comprehension of the structure of the inner core -- the red-hot heart hidden beneath the deceptively tranquil surface of our home planet.

****************************************



Thursday, March 16, 2023

The reanimators

An announcement a few weeks ago by microbiologist Jean-Michel Claverie of Aix-Marseille University that he and his team had successfully resuscitated a 48,500-year-old virus from the Siberian permafrost brought horrified comments like, "I read this book, and it didn't end well" and "wasn't this an episode of The X Files?  And didn't just about everyone die?"  It didn't help when Claverie's team mentioned that the particular virus they brought back to life belonged to a group called (I shit you not) "pandoraviruses," and the media started referring to them by the nickname "zombie viruses."

Claverie's pandoravirus [Image courtesy of Chantal Abergel and Jean-Michel Claverie]

The team hastened to reassure everyone that the virus they found is a parasite on amoebas, and poses no threat to humans.  This did little to calm everyone down, because (1) not that many laypeople understand viral host specificity, and (2) shows like The Last of Us, in which a parasitic fungus in insects jumps to human hosts and pretty much wipes out humanity, have a fuckload more resonance in people's minds than some dry scientific paper.

What's scary about Claverie's study, though, isn't what you might think.  First, the good news.  Not only is the virus they found harmless to humans, the team is made up of trained microbiologists who are working under highly controlled sterile conditions.  Despite what the "lab leak" proponents of the origins of COVID-19 would have you believe, an accidental release of a pathogen from a lab like this one is extremely unlikely.  (The overwhelming consensus of scientists is that COVID is zoonotic in origin, and didn't come from a lab leak, accidental or deliberate.)  So the obvious "oh my god what are we doing?" reaction, stemming from a sense that we shouldn't "wake up" a frozen virus because it could get out and wreak havoc, is pretty well unfounded.

What worries me is the reason Claverie and his team are doing the research in the first place.

Permafrost covers almost a quarter of the land mass of the Northern Hemisphere.  A 2021 study found that every gram of Arctic permafrost soil contains between a hundred and a thousand different kinds of microbes, some of which -- like Claverie's pandoravirus -- have been frozen for millennia.  A three-degree Celsius increase in global average temperature could thaw over thirty percent of the upper layers of Arctic permafrost.

So potentially, what Claverie's team did under controlled, isolated conditions could happen out in the open with nothing to keep it in check.

Concern over this isn't just hype.  In 2016, thawing permafrost in Siberia exposed the carcass of a reindeer that had died of anthrax.  Once thawed, the spores were still viable, and by the time the incident had been contained, dozens of people had been hospitalized, one had died, and over two thousand reindeer had been infected.  Anthrax isn't some prehistoric microbe that scientists know nothing about, which actually acted in our favor; once it was identified, doctors knew how to treat it and prevent its further spread.

But what if the thawing frost released something we haven't had exposure to for tens of thousands of years, and that was unknown to science?

"We really don’t know what’s buried up there," said Birgitta Evengård, a microbiologist at Umeå University in Sweden, which in a few words says something that is absolutely terrifying.

So the hysteria over Claverie's reawakening of the "zombie virus" focused on the wrong thing.  The reanimators we should be worried about aren't Claverie and his team; they're us.  There were already myriad excellent reasons to curb fossil fuel use (hard) and try to rein in climate change, but this study just gave us another one.

As always, the problem isn't the scientists; the scientists are the ones trying to figure all this out in time to prevent a catastrophe.  (And, if I haven't made this point stridently enough already, the scientists have been trying to warn us about the effects of climate change for decades.)  The problem is the fact that politicians, and the voters who elect them, have steadfastly refused to do a damn thing about a problem that we could have addressed years ago and that has so many potential horrible outcomes you'd think any one of them would be sufficient justification for acting.  

So how about we stop worrying about the wrong thing and face the fact that we're the ones who need to change what we're doing?

****************************************



Wednesday, March 15, 2023

Life in the shadows

In Michael Ray Taylor's brilliant 1999 book Dark Life, the author looks at some of the strangest forms of life on Earth -- extremophiles, organisms (mainly bacteria) that thrive in places where nothing else does: surrounding hydrothermal vents under crushing pressures and temperatures over 100 C, buried underground below the deepest mines, frozen in Antarctic ice, floating in boiling, acidic hot springs.  Taylor himself is a veteran spelunker and got interested in the topic after running into the aptly-named snottites -- biofilms found in caves that hang downward from the ceiling and are the consistency of, well, snot.

The brilliant colors of Grand Prismatic Spring in Yellowstone National Park are due, in part, to extremophilic bacteria [Image is in the Public Domain]

Taylor's contention -- that such bizarre creatures are so numerous that they outnumber all other life forms on Earth put together -- got a boost from a piece of research published in the Geomicrobiology Journal.  Written by a team from the University of Toronto -- Garnet S. Lollar, Oliver Warr, Jon Telling, Magdalena R. Osburn, and Barbara Sherwood Lollar -- it describes the discovery of a thriving ecosystem of microbes 2.4 kilometers (7,900 feet) underground, in a mine about 560 kilometers north of Toronto.

The life forms are odd in a number of respects.  The first is that they're anaerobic -- they don't need oxygen to survive.  The second is that they metabolize sulfur, primarily in the form of iron sulfide, better known as pyrite or fool's gold.  It's a food chain completely unhooked from light -- for nearly every other organism on Earth, the energy it contains and utilizes can ultimately be traced back to sunlight.  Here, if you follow the energy backwards, you arrive at the geothermal heat from the mantle of the Earth producing reduced (high-energy) compounds that can support a food web, similar to what you see in deep-sea hydrothermal vents.

"It's a fascinating system where the organisms are literally eating fool's gold to survive," team member Barbara Sherwood Lollar said in an interview with NBC News.  "What we are finding is so exciting — like ‘being a kid again’ level exciting."  The ecosystem is in the Laurentian Shield, one of the oldest and most geologically-stable places on Earth, so it's likely that this thriving community deep underground has been there for a billion years or more.  "The number of systems we've looked at so far really is limited, but they probably had a single origin at some point in life’s four-billion-year history."  As far as their discovery, she added, "We see only what we look for.  If we don't look for something, we miss it."

And it's a lot to miss.  The current research springboards off a 2018 report, sponsored by the Deep Carbon Observatory and conducted by a team led by Cara Magnabosco, a geobiologist at the Swiss technical university ETH Zurich, which estimated that some 5 x 10^29 cells live in the deep Earth.

For those of you who don't like scientific notation, that's five hundred thousand trillion trillion organisms.  Put succinctly, it's a really freakin' huge number.

Considering the (to us) inhospitable conditions a lot of these organisms live under, it raises hopes of finding life in other, perhaps unexpected, places in the universe.  Astronomers talk about the "Goldilocks zone" -- the region around a star where temperatures allow liquid water -- and tend to assume that to host life a planet would have to have a mass similar to Earth's and orbit a star relatively similar to the Sun.  The University of Toronto research suggests that may be placing unnecessary and inaccurate strictures on where life can exist, and that we may have to rethink our definition of what we mean by "hospitable conditions."

"We're finding we really don't understand the limits to life," Sherwood Lollar said.

Which also raises the question of whether we'd recognize alien life if we saw it.  Star Trek may have been prescient; its writers expanded the boundaries of what we think of as life by featuring aliens that were gaseous, crystalline, thrived at searing temperatures, could tolerate the chill dark vacuum of space, or were composed of pure energy.  While some of these -- at least at first glance -- seem pretty far-fetched, what the current research suggests is that we shouldn't be too hasty to say, "Okay, that's out of the question."

"We've literally only scratched the surface of the deep biosphere," said Robert Hazen, mineralogist at the Carnegie Institution’s Geophysical Laboratory in Washington, and co-founder of Deep Carbon Observatory.  "Might there be entire domains that are not dependent on the DNA, RNA and protein basis of life as we know it?  Perhaps we just haven’t found them yet."

****************************************



Tuesday, March 14, 2023

Genes, lost and found

There's a famous anecdote about British biologist J. B. S. Haldane.  Haldane was a brilliant geneticist and evolutionary biologist, but was also notorious for being an outspoken atheist -- something that during his lifetime (1892-1964) was seriously frowned upon.  The result was that religious types frequently showed up at his talks, whether or not the topic was religion, simply to heckle him.

At one such presentation, there was a question-and-answer period at the end, and a woman stood up and asked, "Professor Haldane, I was wondering -- what have your studies of biology told you about the nature of God?"

Without missing a beat, Haldane said, "All I can say, ma'am, is that he must have an inordinate fondness for beetles."

There's some justification for the statement.  Beetles, insects of the order Coleoptera, are the most diverse order in Kingdom Animalia, with over four hundred thousand different species known.  (This accounts for twenty-five percent of known animal species, in a single order of insects.)  The common ancestor of all modern species of beetles was the subject of an extensive genetic study in 2018 by Zhang et al., which found that the first beetles lived in the early Permian Period, on the order of three hundred million years ago.  They survived the catastrophic bottleneck at the end of the Permian and went on to diversify more than any other animal group.

One striking-looking family in Coleoptera is Buprestidae, better known as "jewel beetles" because of their metallic, iridescent colors.  Most of them are wood-borers; a good many dig into dying or dead branches, but a few (like the notorious emerald ash borer, currently ripping its way through forests in the northern United States and Canada) are serious pests.

A few of them have colors that barely look real:

An Australian jewel beetle, Temognatha alternata [Image licensed under the Creative Commons John Hill at the English-language Wikipedia]

What's curious about this particular color pattern is that beetles apparently had a gene loss some time around the last common ancestor three hundred million years ago that knocked out the ability of the entire group to see in the blue region of the spectrum.  This kind of thing happens all the time; every species studied has pseudogenes, genetic relics left behind as non-functional copies of once-working genes that suffered mutations either to the promoter or coding regions.  However, it's odd that animals would have colors they themselves can't see, given that bright coloration is very often a signal to potential mates.

That's not the only reason for bright coloration, of course; there is also aposematic coloration (also known as warning coloration), in which flashy pigmentation is a signal that an animal is toxic or otherwise dangerous.  There, of course, it's not important to be seen by other members of your own species; all that counts is that you're visible to potential predators.  But jewel beetles aren't toxic, so their bright colors don't appear to be aposematic.

The puzzle was solved in a paper in Molecular Biology and Evolution that came out last week, in which a genetic study of jewel beetles found that unlike other beetles, they can see in the blue region of the spectrum -- and in fact, have unusually good vision in the orange and ultraviolet regions, too.  What appears to have happened is that a gene coding for a UV-sensitive protein in the eye was duplicated a couple of times (another common genetic phenomenon), and those additional copies of the gene were then free to accrue mutations and take off down their own separate evolutionary paths.  One of them gained mutations that shifted the peak sensitivity of the protein into the blue region of the spectrum; the other did the same for the orange region.

The result is that jewel beetles became tetrachromats; their eyes have acuity peaks in four different regions of the spectrum.  (Other than a few people -- who themselves have an unusual mutation -- humans are trichromats, with peaks in the red, green, and blue regions.)

What this shows is that lost genes can be recreated.  The gene loss that took out beetles' blue-light sensitivity was replaced by a duplication and subsequent mutation of a pre-existing gene.  It highlights the fundamental misunderstanding inherent in the creationists' mantra that "mutations can't create new information;" if that's not exactly what this is, there's something seriously amiss with their definition of the word "information."  (Of course, I'm sure any creationists in the studio audience -- not that there are likely to be many left -- would vehemently disagree with this.  But since willfully misunderstanding scientific research is kind of their raison d'être, that should come as no surprise to anyone.)

Anyhow, the jewel beetle study is a beautiful and elegant piece of research.  It showcases the deep link between genetics and evolution, and reminds me of the quote from Ukrainian-American biologist Theodosius Dobzhansky, which seems a fitting place to end: "Nothing in biology makes sense except in the light of evolution."

****************************************



Monday, March 13, 2023

Lord of frenzy

I'm sure most of you have heard of the Norse god Odin, at least from his appearance in the Marvel universe.  My first exposure to this bit of mythology came from my near-obsession with D'Aulaires' Book of Norse Myths, which I checked out from my elementary school library approximately 538 times.  This, in fact, is why to this day when I think of the trickster god Loki, I picture this:


And not this:

Be that as it may, the Norse pantheon is a fascinating bunch, and unusual amongst the gods of myth and legend in being mortal.  In fact, one of the most famous parts of the mythos is the tale of Ragnarök -- literally, "the doom of the gods" -- in which Loki unleashes chaos and destruction by causing the death of Baldr, the beloved god of light and joy.  The whole thing is described in brilliant detail in the anonymous Poetic Edda and in the Prose Edda of the thirteenth-century Icelandic scholar Snorri Sturluson, to whom we owe much of what we know about the beliefs of pre-Christian Scandinavia.

Odin (or Wōden, as he was called in Saxon England; this form of his name is the origin of the word Wednesday), the "All-Father," was one of the principal figures in the Germanic pantheon.  His name comes from a reconstructed Proto-Germanic root *Wōðanaz, which means "lord of frenzy."  There are dozens of curious stories about him -- that he hanged himself from Yggdrasil, the "World Tree," in order to gain the knowledge of the runes and writing; that he created the first man and woman from an ash and a birch tree, respectively; that he gave one of his eyes in order to drink from the well of wisdom; and that he rode upon an eight-legged horse called Sleipnir, that was the offspring of the stallion Svaðilfari and Loki, who had taken the form of a mare.

Odin on Sleipnir (from Den ældre Eddas Gudesange by Lorenz Frølich, 1895) [Image is in the Public Domain]

What I didn't know, though, was that the earliest actual attestation of Odin from any written record is comparatively recent.  A friend and loyal reader of Skeptophilia sent me a link about a study of a gold disk from Denmark that contains the first certain reference to Odin, and I was surprised to see that it dates to only the fifth century C.E.  The disk is called the Vindelev bracteate -- it was found near the town of Vindelev, and a bracteate is a flat pendant.  It states, in runic lettering, "He is Odin's man," presumably referring to some unknown chieftain or leader.

Given the complexity of the legends surrounding Odin and the other Norse gods, presumably their worship goes a lot further back; but I honestly didn't realize how much less we have in the way of early attestations of the Norse pantheon as compared to (for example) the Greek, Roman, Babylonian, Indian, and Chinese assemblages of deities.  Just about everything we know comes from the eighth century and later, the point at which the Vikings kind of exploded out of Scandinavia and did their best to take over all of northern Europe.  They did a damn good job; not only was all of eastern England under Danish control for a time, so were the Hebrides and Orkneys, Normandy (the name itself means "northman-land"), and a good part of what is now western Russia, Ukraine, and Belarus.  (Perhaps you know that the name Russia itself comes from the Rus, a group of Norse traders who ruled the entire region for a while, with their capital at Kyiv.)

So the dating of the Vindelev bracteate to the fifth century certainly doesn't mean that's when the worship of Odin began, only that this is the first certain example of anyone writing about it.  His influence on the beliefs of the pre-Christian Germanic world is immense.  As an Old English runic poem from the ninth century put it:
Wōden is the origin of all language
wisdom's foundation and wise man's comfort
and to every hero blessing and hope.

Perhaps the All-Father would not be upset that this is the way he's remembered, that his association with frenzy and battle was superseded by wisdom and hope, just as the people who once worshiped him settled down to become some of the most peaceful, progressive, and prosperous nations in the world. 

****************************************



Saturday, March 11, 2023

Parallel problem solving

One of the many fascinating aspects of evolution is how nature happens upon the same solutions to environmental problems, over and over.

Two of the best examples of this are eyes and wings.  True eyes evolved from simple photoreceptive spots at least four times: the vertebrate eye, with its complex lens-and-retina system; the pinhole-camera eye of the chambered nautilus and the lensed camera eyes of other cephalopods; the compound eyes of insects; and the rows of separate spherical eyes in clams and scallops.  Wings, on the other hand, evolved independently no fewer than six times: bats, birds, insects, pterosaurs, flying squirrels, and colugos (the last two count if you include gliding along with true powered flight).

The reason is simple.  There are a handful of problems animals have to overcome (perception/sensation, nutrition, reproduction, locomotion, avoiding environmental dangers, and avoiding predation) and a limited number of ways to accomplish them.  Once (for example) photoreceptive eyespots develop in an animal, natural selection for improving the sensitivity of those spots takes over, but how exactly you do that can differ.  The result is you end up with vision evolving over and over, and each time, the organ is structured differently, but accomplishes the same thing.

Evolution, it seems, is the law of whatever works.

This has interesting implications for what extraterrestrial life might look like.  I very much believe that certain features will turn out to be constrained in any conceivable species -- the presence of locomotor organs, organs sensitive to sound, light, heat, and touch, and so on -- but also that the way those organs are arranged and configured could be very different from anything we have on Earth.

This "multiple solutions to the same problems" idea is what immediately came to mind when my friend and fellow writer Gil Miller, whose inquisitive mind and insatiable curiosity have provided me with many a topic here at Skeptophilia, sent me a link from Phys.org about hollow bones in dinosaurs.  Endoskeletons such as our own exist in an interesting tension.  They have to be solid enough to support our weight, but the better they are at weight-bearing, the heavier they themselves are.  The mass of an animal in general increases much faster than its linear dimensions do; double a mouse's height, keeping its other proportions the same, and it will weigh about eight times as much.  This is why in order for the whole system to work, the proportions have to change as species increase in size.  A mouse's little matchstick legs would never work if you scaled it up to be as big as a dog; at the extreme end, consider the diameter of an elephant's legs in relation to its size.  Anything narrower simply wouldn't support its weight.

[Nota bene: this is why if you were traumatized when young by bad black-and-white horror movies about enormous insects wreaking havoc, you have nothing to worry about.  If you took, for example, an ant, and made it three meters long, its proportionally tiny little legs would never be able to lift it.  The worst it could wreak would be to lie there on the ground, helpless, rather than eating Tokyo, which is what the horror movie monsters always did.  One got the impression the inhabitants of Tokyo spent ten percent of their time working, relaxing, and raising families, and the other ninety percent being messily devoured by giant radioactive bugs.]

But back to the Phys.org article.  A detailed analysis of the bone structure of three different dinosaur lineages -- ornithischians, sauropodomorphs, and herrerasaurids -- found that while all three had landed on the idea of internal air sacs as a way of reducing the mass of their large bones, the structures of each are different enough to suggest all three evolved the feature independently.  Once again, we have an analogous situation to eyes and wings: identical problem, parallel solutions.  The problem here is that large body size requires heavy bones that take a lot of energy to move around, and the solution is to lighten those bones by hollowing them out (while leaving the internal walls connected enough that the bones are still structurally sound).  And three different clades of dinosaurs each happened upon slightly different ways to do this.

Herrerasaurus ischigualastensis [Image licensed under the Creative Commons Eva Kröcher, CC-BY-SA]

It's fascinating to see how many ways living things happen upon similar solutions to the problems of survival.  Evolution is both constrained and also infinitely creative; it's no wonder we are so often in awe when we look around us at the natural world.  The "endless forms most beautiful and most wonderful" Darwin spoke of in the moving final words of The Origin of Species never fail to astonish -- especially since the brains we use to comprehend them are just one of the end products of those very same processes.

****************************************



Friday, March 10, 2023

Mudslinging

I've been writing here at Skeptophilia for twelve years, something that I find a little mind-boggling.

Even more astonishing is that despite the amount of time I've spent debunking crazy ideas, I still run into ones I'd never heard of before.  Such as the phenomenally loopy claim I bumped into yesterday, about the "Tartaria mud flood."

First, a little background.

The Tatars are a collection of Turkic ethnic groups that now live mainly in Russia, Ukraine, Kazakhstan, and Turkey.  They were the predominant force in the "Golden Horde" that swept across Central Asia in the thirteenth century C.E., establishing a khanate there that would last for four centuries.  The Europeans -- as usual, not particularly concerned with accuracy in describing people they considered inferior -- picked up this name, and started calling pretty much anyone from Central Asia and eastern Siberia "Tatars" (commonly misspelled as "Tartars").  And the entire region appears on old maps as "Tartary."

An English map from 1806 showing "Tartary" (note that they even include Japan under this name!) [Image is in the Public Domain]

Not to beat the point unto death, but the whole European concept of Tartary was wrong right from the get-go; it was lumping together dozens of groups of people who were not only not Tatars, but weren't even Turkic, and it was pretending that the whole lot of them were under some kind of unified central government.

So we're on shaky ground from the start, but it gets worse.

In 2016, a guy named Philipp Druzhinin started posting videos and articles claiming that not only was Tartary (which he called "Tartaria") real, it had been ascendant until the 1800s -- at which point, something catastrophic happened.  Some time in the early nineteenth century, there had been a worldwide "mud flood" that had buried Tartarian cities and effectively ended the theretofore thriving country of Tartaria.  At first, his videos got little notice, but then something happened in 2019 -- it's not entirely apparent what -- that made them suddenly gain traction.

A lot of traction.  And, as you'll see, they started getting entangled with something a lot darker.

But first, with regards to the claim itself, I have several questions.

First, what evidence is there that anything like this ever happened?

The most accurate answer is "almost none."  The main argument seems to be that in a lot of cities there are catacombs and underground passageways, which in Druzhinin's pretend world were the actual original street levels before all the mud came in and buried stuff.  (Amusingly, he includes the Seattle Underground City in this, despite the fact that (1) Seattle is on the other side of the world from "Tartaria," and (2) the Underground City was created from a thoroughly-documented reconstruction project designed to raise street levels after the Great Seattle Fire of 1889.)

Second, why doesn't this show up in any reputable history books?

Well, Druzhinin knows the answer to that.  The knowledge was suppressed.  Because of course it was.  The evil, scheming historians went and destroyed any record of the mud flood, cackling and rubbing their hands together the entire time.  Notwithstanding the impossibility of erasing every account of a supposedly worldwide event that only happened two centuries ago.  Historians are just that diabolical, apparently.  Why they did this is unclear.  Maybe just being eeeeee-vill is enough.

Third, where did all the mud come from?

Druzhinin is a little thin on this point.  (Truthfully, he's a little thin on every point.)  Considering that even a good-sized volcano can only cover a few square miles in lava during an eruption, it's hard to imagine any process that could produce enough mud to generate a mud flood worldwide.  But, hey... Noah's ark and everything, amirite?  So q.e.d., apparently.

The Tartarian mud flood claim is so patently ridiculous that you'd think an average middle schooler would recognize it as such, and yet -- since its first appearance seven years ago -- it has gained tremendous traction.  YouTube videos about it have been watched and downloaded millions of times.  Worse still, the whole thing has gotten tangled up in other, nastier conspiracy theories -- QAnon, the Illuminati, various antisemitic ideologies, all the One World Government nonsense, microchip implantation schemes, even climate change denialism -- because, as I've pointed out before, once you've abandoned hard evidence as the touchstone for understanding, you'll fall for damn near anything.

Or perhaps for everything, all at the same time.

What would be hilarious if it weren't so disturbing is that a big part of this crazy conglomeration of claims states that the Powers-That-Be want to silence all dissent and stop anyone from finding out about their nefarious dealings, and yet some tinfoil-hat-wearing twenty-something living in his mom's basement can make and upload hours of YouTube videos on the topic, and the response from the Powers-That-Be is: *crickets*

Almost drives you to the awkward conclusion that the whole lot of it is unadulterated horse waste, doesn't it?

And of course, the purveyors of this nonsense love it when people like me write stuff like this, because there's nothing for their sense of self-righteousness like also feeling persecuted.  Laughing at them just increases their certainty they're right, because otherwise... why would we be laughing?  It reminds me of the quote from Carl Sagan: "[T]he fact that some geniuses were laughed at does not imply that all who are laughed at are geniuses.  They laughed at Columbus, they laughed at Fulton, they laughed at the Wright brothers.  But they also laughed at Bozo the Clown."

Anyhow, keep an eye out for this.  One of the most recent additions to the long, ugly list of conspiracy theories.  Dating from when it really took off, the whole thing is only about four years old, and astonishingly -- considering the logical leaps you have to make to believe any of it -- is still gaining serious traction.

Which just pisses me off.  I work my ass off to get views here at Skeptophilia, and some wingnut claims that a magical mud flood wiped out a non-existent country two centuries ago, and it somehow gains wings.  It reminds me of the quote from Charles Haddon Spurgeon -- "A lie can go all the way around the world while truth is still lacing up its boots."

****************************************



Thursday, March 9, 2023

Pitch perfect

I've been a music lover since I was little.  My mom used to tell the story of my being around four years old and begging her to let me put records on the record player.  At first, she was reluctant, but for once my persistence won the day, and she finally relented.  To my credit, despite my youth I was exceedingly careful and never damaged a record; the privilege was too important to me to risk revocation.  There were certain records I played over and over, such as Rimsky-Korsakov's Scheherazade (a piece I love to this day).

I've always been fascinated with the question of whether musicality is inborn or learned.  My parents, while they had a decent record collection, weren't musical themselves; they certainly didn't have anything like the passion for it I experienced.  While the capacity for appreciating music is still poorly understood, today I'd like to tell you about some research probing which parts of the way our brains interpret tone structure are inborn -- and which are not.

First, a little background.

While it may appear on first glance that the major key scale -- to take the simplest iteration of tone structure as an example -- must be arbitrary, there's an interesting relationship between the frequencies of the notes.  Middle C, for example, has a frequency of about 260 hertz (depending on how your piano is tuned), and the C above middle C (usually written C') has exactly twice that frequency, 520 hertz; every note has half the frequency of the note one octave above it.  G above middle C (which musicians would say is "a fifth above") has a frequency 3/2 that of the root note, or tonic (middle C itself): 390 hertz.  The E above middle C (a third above) has a frequency 5/4 that of middle C, or 325 hertz.  Together, these three make up the "major triad" -- a C major chord.  (The other notes in the major scale also have simple fractional values relative to the frequency of the tonic.)

[Nota bene: Music theoretical types are probably bouncing up and down right now and yelling that this is only true if the scale is in just temperament, and that a lot of Western orchestral instruments are tuned instead in equal temperament, where each note's frequency is an integer power of the half-tone ratio (the twelfth root of two) times the tonic.  My response is: (1) yes, I know, and (2) what I just told you is about all I understand of the difference, and (3) the technical details aren't really germane to the research I'm about to reference.  So you must forgive my oversimplifications.]
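For the numerically inclined, here's a minimal sketch of the two tunings side by side, starting from the 260-hertz middle C used above:

# Frequencies of the C major triad, two ways, from C = 260 Hz.
C = 260.0

# Just intonation: simple whole-number ratios to the tonic.
just = {"C": C, "E": C * 5/4, "G": C * 3/2, "C'": C * 2}

# Equal temperament: each half-tone multiplies frequency by 2**(1/12).
# E sits 4 half-tones above C, G sits 7, and the octave sits 12.
equal = {name: C * 2 ** (semitones / 12)
         for name, semitones in [("C", 0), ("E", 4), ("G", 7), ("C'", 12)]}

for note in just:
    print(f"{note:>2}: just {just[note]:6.2f} Hz   equal {equal[note]:6.2f} Hz")

The octave comes out at 520 hertz in both systems; the third and fifth differ by a couple of hertz (325 versus about 327.6, and 390 versus about 389.6) -- which is the entire substance of the dispute I'm waving away.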

Because there are such natural relationships between the notes in a scale, it's entirely possible that our ability to perceive them is hard-wired.  It takes no training, for example, to recognize the relationship between a string that is vibrating at a frequency of f (the lower wave on the diagram) and one that is vibrating at a frequency of 2f (the upper wave on the diagram).  There are exactly twice as many peaks and troughs in the higher-frequency wave as there are in the lower-frequency wave.


Still, being able to see a relationship and hear an analogous one is not a given.  It seems pretty instinctive; if I asked you (assuming you're not tone deaf) to sing a note an octave up or down from one I played on the piano, you probably could do it, as long as it was in your singing range.

But is this ability learned because of our early exposure to music that uses that chord structure as its basis?  To test this, it would require comparing a Western person's ability to match pitch and jump octaves (or other intervals) with someone who had no exposure to music with that structure -- and that's not easy, because most of the world's music has octaves, thirds, and fifths somewhere, even if there are other differences, such as the use of quarter-tones in a lot of Middle Eastern music.

This brings us to a paper in the journal Current Biology called "Universal and Non-universal Features of Musical Pitch Perception Revealed by Singing," by Nori Jacoby (of the Max Planck Institute and Columbia University), Eduardo A. Undurraga, Joaquín Valdés, and Tomás Ossandón (of the Pontificia Universidad Católica de Chile), and Malinda J. McPherson and Josh H. McDermott (of MIT).  And what this team discovered is something startling: there's a tribe in the Amazon which has had no exposure to Western music, and while its members are fairly good at mimicking the relationships between pairs of notes, they seemed unaware when they were singing entirely different notes (as an example, if the researchers played a C and a G -- a fifth apart -- the test subjects might well sing back an A and an E -- also a fifth apart, but entirely different notes unrelated to the first two).

The authors write:
Musical pitch perception is argued to result from nonmusical biological constraints and thus to have similar characteristics across cultures, but its universality remains unclear.  We probed pitch representations in residents of the Bolivian Amazon—the Tsimane', who live in relative isolation from Western culture—as well as US musicians and non-musicians.  Participants sang back tone sequences presented in different frequency ranges.  Sung responses of Amazonian and US participants approximately replicated heard intervals on a logarithmic scale, even for tones outside the singing range.  Moreover, Amazonian and US reproductions both deteriorated for high-frequency tones even though they were fully audible.  But whereas US participants tended to reproduce notes an integer number of octaves above or below the heard tones, Amazonians did not, ignoring the note “chroma” (C, D, etc.)...  The results suggest the cross-cultural presence of logarithmic scales for pitch, and biological constraints on the limits of pitch, but indicate that octave equivalence may be culturally contingent, plausibly dependent on pitch representations that develop from experience with particular musical systems.
Which is a very curious result.

It makes me wonder if our understanding of a particular kind of chord structure isn't hardwired, but is learned very early from exposure -- explaining why so much of pop music has a familiar four-chord structure (hilariously lampooned by the Axis of Awesome in this video, which you must watch).  I've heard a bit of the aforementioned Middle Eastern quarter-tone music, and while I can appreciate the artistry, there's something about it that "doesn't make sense to my ears."

Of course, to be fair, I feel the same way about jazz.

In any case, I thought this was a fascinating study, and like all good science, it opens up a variety of other angles of inquiry.  Myself, I'm fascinated with rhythm more than pitch or chord structure, ever since becoming enthralled by Balkan music about thirty years ago.  Its odd rhythmic patterns and time signatures -- 5/8, 7/8, 11/16, 13/16, and, no lie, 25/16 -- take a good bit of getting used to, especially for people used to good old Western threes and fours.

So to conclude, here's one example -- a lovely performance of a dance tune called "Gankino," a kopanica in 11/16.  See what sense you can make of it.  Enjoy!

****************************************



Wednesday, March 8, 2023

The registry of dissent

I wonder if you've heard about the latest attempt to turn the state of Florida into an autonomous authoritarian oligarchy.

No, I'm not talking about Governor Ron DeSantis's virtual takeover of Disney, although for a party that is supposedly staunchly pro-corporation, it seems like a hypocritical thing to do.  "We're staunchly pro-corporation as long as the corporation toes the far-right line" is nearer the mark.

The particular move I'm thinking of today struck closer to the bone for me, because it's targeted specifically at bloggers.  A bill called "Information Dissemination," proposed by Senator Jason Brodeur, would, if passed, require bloggers who post anything critical of Governor DeSantis or other elected officials to sign onto a state registry -- or face fines of up to $2,500.  It's unclear from the wording of the bill whether this would apply to out-of-state bloggers who criticize Florida officials; that certainly isn't overtly excluded, and if it's intended, it raises serious issues of jurisdiction.

The bill tries to dodge First Amendment concerns by limiting itself to bloggers who are financially compensated for their writing -- ostensibly to restrict people from taking money from lobbyists and engaging in criticism-for-pay -- but just about all bloggers get compensated in some way, even if it's just through ad monetization.  So the fact is, this bill is meant to do only one thing: stifle dissent.  

The spirit, and even the wording, of the bill have drawn speculation that it was inspired by a similar law passed by the authoritarian régime of President Viktor Orbán of Hungary in 2010.  This may sound far-fetched, but Orbán is a revered figure amongst the far right, and the elected leaders of Florida have praised him before.  Right-wing commentator Rod Dreher, who is currently living in Budapest, described in an interview a conversation with a reporter who had "talked to the press secretary of Governor Ron DeSantis of Florida and she said, 'Oh yeah, we were watching the Hungarians, so yay Hungary.'"  Steve Bannon calls Orbán "one of the great moral leaders of our time."  It's not certain if Brodeur's bill is a case of imitation or just parallel processes from like minds -- but either way, it's horrifying.

[Image licensed under the Creative Commons Madelgarius, Freedom of speech (3), CC BY-SA 4.0]

Even some GOP members seem to realize Brodeur's bill is a case of serious governmental overreach.  In a statement that would be funny if it weren't so appalling, none other than Newt Gingrich tweeted, "The idea that bloggers criticizing a politician should register with the government is insane.  It is an embarrassment that it is a Republican state legislator in Florida who introduced a bill to that effect.  He should withdraw it immediately."  Which brought to mind the trenchant quote from Stephen King: "Conservatives who for years sowed the dragon's teeth of partisan politics are horrified to discover they have grown an actual dragon."  Gingrich, perhaps more than any other single individual, is the architect of the far right; the fact that the careening juggernaut he created has lurched into authoritarian neo-fascism should come as no surprise to him, or to anyone else.  The subtext has always been "We're the party of small hands-off government until we want big intrusive government;" bills like Brodeur's, and (even more strikingly) the current tsunami of anti-trans legislation being passed in red states across the country, just pull the mask off the ugly agenda that was there from the very beginning.

The optimists say that even if Brodeur's bill passes, it'll be struck down on First Amendment grounds almost immediately.  Me, I wonder.  DeSantis and his ilk are in the ascendancy, and I'm perhaps to be excused if I suspect it's not so certain as all that.  Here I sit, in upstate New York, far away from the epicenter; but I hope my writer colleagues in Florida will not be cowed into silence.  Believe me, if I did live in Florida, I'd be criticizing Brodeur, DeSantis, and the proposed legislation for all I'm worth.  I'm not usually a "come at me, bro" type, but we can't keep quiet about it and hope that the First Amendment will shield us.  If this bill passes -- and I think it probably will -- it will act as a template for other state legislatures intent on crushing dissenting voices.

If you think this kind of thing can't spread like a contagion, I have only to refer you to the history of Germany in the 1930s for a counterexample.

Whatever the legality of extending this law to apply to out-of-state bloggers criticizing Florida legislators, allow me to go on record as stating that this is me, criticizing the absolute shit out of the whole lot of them.  And as far as my ever signing onto a registry for doing so, I am also going on record as stating that Brodeur can take his blogger registry and stick it up his ass.

Sideways.

****************************************