Skeptophilia (skep-to-fil-i-a) (n.) - the love of logical thought, skepticism, and thinking critically. Being an exploration of the applications of skeptical thinking to the world at large, with periodic excursions into linguistics, music, politics, cryptozoology, and why people keep seeing the face of Jesus on grilled cheese sandwiches.

Thursday, March 31, 2022

Relics of the distant past

Today we'll stay in an archaeological vein, mostly because a couple of loyal readers of Skeptophilia read yesterday's post and responded with links of their own and messages which basically boil down to, "Yes, but have you seen this?"

The first one comes to us from the ever-entertaining site Mysterious Universe, but unlike their usual fare of Bigfoots and UFOs, this one is about legitimate scientific research.  Not that you could tell from the title, which is "Yorkshire's Atlantis May Have Finally Been Found."  To be fair, the appellation of "Atlantis" isn't the fault of the author, Paul Seaburn; apparently this site, Ravenser Odd, has been called that before.  But unlike Atlantis, Ravenser Odd is a real place.  It was a port city on the estuary of the River Humber, thoroughly attested in records of the time until 1362, when a storm breached the sand-based seawall and swamped it completely; the once-thriving town -- like its mythological namesake -- sank beneath the waves.

The shape of the long, narrow seawall is what gave the place its name, all the way back in Viking times, some four hundred years earlier; Ravenser Odd is a mangled version of hrafns eyrr, Old Norse for "raven's tongue."  In its time it was a busy place.  It was one of the most thriving ports in the region, and a record from 1299 describes it as containing a central marketplace, wharves, warehouses, a court, a prison, a chapel, two mills, a tannery, an annual fair, and over a thousand residences.  The coastal region near the original submerged town retained the name, and in fact it's mentioned twice in Shakespeare, where he calls it "Ravenspurgh" (Richard II, act 2, scene 1, line 298, and Henry IV Part 1, act 1, scene 3, line 245).

Despite multiple attestations in the records, no one was able to find where the original Ravenser Odd had stood -- until now.

An amateur archaeologist named Philip Mathison, who is something of an expert on Ravenser Odd, stumbled upon an 1892 document on eBay that mentioned "submarine remains" at Spurn, a tidal island north of the mouth of the Humber -- and gave directions on how to find them.  Mathison went out in a boat with an echo sounder, and found what looked like a human-made rock wall, exactly where the document had said it would be.

"People had assumed it was way out to sea, as the shape of the peninsula now is very different to how it was in the thirteenth century," Mathison said.  "This document showed a stone ledge to the east of Spurn which I believe could be the walls of a dock or quay... The ridge was most likely rock armor to protect the port, as it was under threat from erosion way before it was abandoned.  The bulk of the town's buildings were on a shingle bank called The Old Den, to the west side of Spurn, and some brickwork from them has been found in the past.  The town curved around like a fish hook and the wharves were at the other end... But it needs a proper dive to find out."

Seems like Mathison is going to get his wish -- two archaeologists from the University of Hull have already purchased scanning equipment and obtained funding for other supplies for an expedition this summer, when the weather in the North Sea improves.

Also with a Viking connection is a study done at the University of Massachusetts - Amherst that seems to upend a long-held theory about why the Norse settlements of Greenland died out in the late fourteenth century.  Previous models had attributed the collapse to the onset of the Little Ice Age, a drop in global average temperature that (among other things) caused the sea access to Greenland to freeze up year-round and made it an even more miserable place to live than it already was.  But the new study -- using two organic molecules as markers that are known to indicate, respectively, temperature and water availability -- showed that during the period of the collapse, the temperature didn't drop much, but the climate became significantly drier.

The harsh winters were one thing, but when the rain stopped falling even in the warmest months of summer, that was the kiss of death for the crops and domesticated animals at the Norse settlements, and ultimately, the Norse themselves.  

For the last story, we return to the British Isles, where a geophysical survey near the town of Aberlemno uncovered a 1.7-meter-long stone carved with designs identified with the Picts, the mysterious people who inhabited northern and eastern Scotland before the Dál Riada Scots moved in and kind of took over in the ninth century.  There aren't many Pictish records around; the Picts were Celtic, but appear to have spoken a Brythonic language related to Welsh, Cornish, and Breton, not a Goidelic language like Scottish Gaelic, Irish, and Manx.

The discovery was made quite by accident.  While the researchers were moving some surveying equipment, they noticed anomalies that seemed to indicate the buried foundation of a settlement.  They dug into the soil and hit a rock.  "I just brushed my hand, and there was a symbol," said Zack Hinckley, an archaeologist at the University of Aberdeen.  "And we had a freakout... there were genuine tears."

The Pictish stone from Aberlemno, Scotland

The difficulty is that given the paucity of Pictish records, little is known of the script, and it's currently unknown whether the symbols represent a written language or are simply decorative.  The stone has been removed to an archaeological conservation lab in Edinburgh for further study.

So there you are.  The world of archaeology has been hopping lately.  It's always amazing to me that despite the extensive research that's been done with state-of-the-art mapping and surveying tools, there are still plenty of astonishing artifacts out there to find.

Some of them, apparently, right underneath our feet.


Wednesday, March 30, 2022

Grave matters

Today we take a trip into the past with three new discoveries from the world of archaeology, sent my way by my eagle-eyed friend and fellow writer, Gil Miller.

The first one has to do with ancient fashion.  Have you ever wondered how our distant ancestors dressed?  Whether it was crudely stitched-together rags, as the peasantry are often depicted?  Leopard-skin affairs, like the Flintstones?  Or nothing but a brass jockstrap, like this guy?

Turns out it wasn't so different from what you and I are wearing.  (I'm assuming you're not naked except for a brass jockstrap.  If you are, I won't judge, but I also don't want to know about it.)  An analysis of a 3,200-year-old mummy recovered from China's Tarim Basin showed that he was wearing tightly woven, intricately made trousers, built to be durable and allow maneuverability -- a little like today's blue jeans.

The pants worn by "Turfan Man" [Image from M. Wagner et al./Archaeological Research in Asia, 2022]

The cloth is a tight twill weave -- something that was assumed to have been invented much later -- and has a triangular crotch piece that seems designed to avoid unfortunate compression of the male naughty bits while riding horseback.  The decoration, including an interlocking "T" pattern on the bands around the knees, is very similar to patterns used on pottery in the area, and as far away as Kazakhstan and Siberia.

From ancient Chinese fashion items, we travel halfway around the world for something a little more gruesome -- a burial in the Lambayeque region of Peru that seems to contain the skeleton of a surgeon, along with his surgical tools.

The burial has been dated to the Middle Period of the Sican Culture, which would have been somewhere between 900 and 1050 C. E., and was recovered from a mausoleum temple at the rich archaeological site of Las Ventanas.  The man was obviously of high standing; he was wearing a golden mask pigmented with cinnabar, a bronze pectoral, and a garment containing copper plates.  But most interesting was the bundle of tools he was buried with -- awls, needles, and several sizes and shapes of knives.  This, the researchers say, identifies him as a surgeon.

It's hard for me to fathom, but surgery was done fairly regularly back then -- up to and including brain surgery (called trepanning).  There was no such thing as general anesthesia, so it was done under local anesthesia at best, probably supplemented with any kind of sedative or painkilling drugs they had available.  Still, it was a horrible prospect.  But what is most astonishing is that a great many of the patients, even the ones who had holes drilled into their skulls, survived.  There have been many cases of skeletons found that show signs of surgery where the surgical cuts healed completely.

But still, the ordeal these poor folks went through is horrifying to think about, so let's move on to the third and final article, which comes to us from England.  An archaeologist named Ken Dark has led a team of researchers in studying 65 grave sites in the counties of Somerset and Cornwall that date back to a period of history I've always had a particular fascination for -- the Western European "Dark Ages," between the collapse of the Roman Empire as a centralized power in the fourth and fifth centuries C. E. and the reconsolidation of Europe under such leaders as Charlemagne and Alfred the Great, four hundred years later.

The "darkness" of the so-called Dark Ages isn't so much that it was lawless and anarchic (although some parts of it in some places probably were), but simply that we know next to nothing about it for sure.  There are virtually no contemporaneous records; about all we have, the best-known being Gildas's sixth-century De Excidio et Conquestu Britanniae, are accounts that mix legend with history so thoroughly it's impossible to tell which is which.  I bring up Gildas deliberately, because his is the only account written anywhere close to the time King Arthur (allegedly) lived -- though Gildas never names Arthur, he describes the Battle of Badon Hill that later tradition attached to him -- and the graves that Dark and his team are studying date from right around that pivotal time when Christianized Romano-Celtic Britain was being attacked and overrun by the pagan Angles, Saxons, and Jutes.

The burial practices of noble sixth-century Britons stand in stark contrast to Anglo-Saxon burials from the same period; the Britons, it's believed, scorned the ostentation and ornate decorations of pagan funerals, and by comparison even high-status individuals were buried without much pomp.  What distinguishes these graves from those of commoners is that they were set apart from other burials, had a fenced enclosure, and were covered with a tumulus of stones that the early Celts called a ferta -- a sign of high standing.

"The enclosed grave tradition comes straight out of late Roman burial practices," Dark said.  "And that's a good reason why we have them in Britain, but not in Ireland -- because Britain was part of the Roman empire, and Ireland wasn't...  We've got a load of burials that are all the same, and a tiny minority of those burials are marked out as being of higher status than the others.  When there are no other possible candidates, that seems to me to be a pretty good argument for these being the 'lost' royal burials."

So that's today's news from the past -- ancient blue jeans, primitive surgery, and Dark Age noble burials.  Sorry for starting your day on a grave note.  But it's always fascinating to see not only how things have changed, but how similar our distant ancestors were to ourselves.  If we were to time travel back there, I'm sure there'd be a lot of surprises, but we might be more shocked at how much like us they were back then.  To borrow a line from Robert Burns, a person's a person for a' that and a' that.


Tuesday, March 29, 2022

A terrestrial heartbeat

When I was an undergraduate at the University of Louisiana, I took a class called Introduction to Astronomy from a fellow named Daniel Whitmire.  Dr. Whitmire made a name for himself, along with a colleague named John Matese (from whom I later took a class in quantum mechanics), with something that's been nicknamed the "Planet X" hypothesis.  This isn't some crazy, Nibiru-is-headed-toward-Earth claim; Whitmire and Matese were looking at an apparent periodicity in mass extinctions, which they suggested could be the effect of a massive planet far beyond the orbit of Pluto, perhaps with an eccentric orbit, which every so often passes through a dense part of the Oort Cloud and sends comets and other debris hurtling in toward the inner Solar System.

Since I first heard about it (around 1980), the Planet X hypothesis has lost currency.  There's been no evidence whatsoever of a massive planet beyond the orbit of Pluto, and in any case, further study has indicated that the extinctions (1) don't really show that strong a periodicity, and (2) have been pretty well explained by phenomena other than impacts (with the obvious exception of the Cretaceous-Tertiary extinction).

[Image is in the Public Domain courtesy of NASA]

I found out last week, though, that Whitmire and Matese may have been on to something after all.

A curious paper that recently appeared in Geoscience Frontiers suggests that focusing solely on the extinctions may have hidden an underlying periodicity.  In "A Pulse of the Earth: A 27.5-Myr [million year] underlying cycle in coordinated geological events over the last 260 Myr," Michael Rampino and Yuhong Zhu (of New York University) and Ken Caldeira (of the Carnegie Institution for Science) did some detailed statistical analysis (the mathematics of which is beyond me) of 89 different major geological events on Earth -- marine and non-marine extinctions, major ocean-anoxic events, continental flood-basalt eruptions, sea-level fluctuations, global pulses of intraplate magmatism -- and found a striking 27.5-million-year periodicity that has yet to be explained.

What jumped out at me is that the analysis isn't just some vague it-looks-like-it-might-be-a-pattern.  The periodicity came out significant at the 96% confidence level -- loosely speaking, only about a 4% chance that it's just noise that happens to look like a rhythm.  That suggests they're on to something.  What, exactly, they're on to remains to be seen; the natural inclination is to look for some kind of tectonic process that for some reason is on a really slow cycle, but they did note one other curious possibility:
On the other hand, the main period of about 30 Myr is close to the Solar System’s ~ 32 ± 3 Myr vertical oscillation about the mid-plane of the Galaxy.  In the Galactic plane region, increased cosmic-ray flux might lead to significant climatic changes, whereas encounters with concentrations of disk-dark matter might trigger comet showers from the Oort Cloud, as well as thermal and geophysical disturbances in the inner Earth.  We note that a 26 to 37 Myr cycle has been reported in the ages of terrestrial impact craters, using various statistical techniques and sets of crater ages potentially connecting the terrestrial and extraterrestrial cycles.

Of course, figuring out the mechanism that causes the pattern comes after establishing that the pattern itself is real.  As I pointed out in my post on the Ganzfeld Experiment a couple of weeks ago, developing a model to explain a phenomenon has to wait until you've shown that there's a phenomenon there to explain.

But a 96% confidence level is enough to indicate that there's some underlying mechanism at work here that's worth further study.  Something, apparently, is causing a strange, regular pulse of catastrophes.  (To put your minds at ease -- I know this was one of the first things I wondered -- the last peak the analysis found occurred 12.1 million years ago, so we've got another fifteen-odd million years to go before the next one.  That is, if we don't manufacture a cataclysm ourselves first.)

For now, all we have is an odd, unexplained pattern in geological upheavals.  It will be fascinating to see what refinements are put on the analysis -- and whether the scientists can find out what's actually going on.  Until then, we're left with a mystery -- a 27.5 million year terrestrial heartbeat.


Monday, March 28, 2022


Astrophysicist Neil deGrasse Tyson said (apropos of UFO sightings), "The human brain and perceptual systems are rife with ways of getting it wrong."

It might be humbling, but it's nothing short of the plain truth, and doesn't just apply to seeing alien spaceships.  Even in perfectly ordinary situations, we like to think that what we're hearing and seeing is an accurate reflection of what's actually out there, but the fact is we not only entirely miss a significant fraction of what we're experiencing, we misinterpret a good chunk of the rest.

Think you're immune?  Watch the following two-minute video, and see if you can figure out who killed Lord Smythe.

I don't know about you, but I didn't do so well.

It turns out that we don't just miss things that are there, we sometimes see things that aren't there.  Take, for example, the research that appeared last week in the journal Psychological Science, which suggests that we make guesses about what we're going to see, and if those guesses don't line up with what actually happens, we "see" what we thought we were going to see rather than reality.

The experiment was simple enough.  It used a short video of three squares (call them A, B, and C, from left to right).  Square A starts to move quickly to the right and "collides" with B, which starts to move.  As you track B across the screen, it looks like it's going to collide with C and repeat what happened in the previous collision.

The problem is, square C starts to move not only before B hits it, but before B itself starts moving.  In other words, there is no way a collision with B could have been what triggered C to start moving.  But when test subjects were asked in what order the squares started moving, just about everyone said A, then B, then C.  Our expectations of cause and effect are so strong that even on multiple viewings, test subjects still didn't see C begin to move before B.

"We have a strong assumption that we know, through direct perception, the order in which events happen around us," said study co-author Christos Bechlivanidis, of University College London.  "The order of events in the world is the order of our perceptions.  The visual signal of the glass shattering follows the signal of the glass hitting the ground, and that is taken as irrefutable evidence that this is indeed how the events occurred.  Our research points to the opposite direction, namely, that it is causal perceptions or expectations that tell us in what order things happen.  If I believe that the impact is necessary for the glass to break, I perceive the shattering after the impact, even if due to some crazy coincidence, the events followed a different order.  In other words, it appears that, especially in short timescales, it is causation that tells us the time."

As I and many others have pointed out about previous research into what is now known as "inattentional blindness," this is yet another nail in the coffin of eyewitness testimony as the gold standard of evidence in a court of law.  We still rely on "I saw it with my own eyes!" as the touchstone of truth, even though experiment after experiment has shown how unreliable our sensory-perceptive systems are.  Add to that how plastic our memories are, and it's a travesty that people's fates are decided by juries based upon eyewitness accounts of what happened, sometimes in the distant past.

[Image licensed under the Creative Commons Eric Chan from Palo Alto, United States, Mock trial closing, CC BY 2.0]

To end with another quote by NdGT -- "There's no such thing as good eyewitness testimony and bad eyewitness testimony.  It's all bad."


Saturday, March 26, 2022

Siding with the tribe

Springboarding off yesterday's post about our unfortunate tendency to believe false claims if we hear them repeated often enough, today we have another discouraging bit of psychological research: our behavior is strongly influenced by group membership -- even if we know from the start that the group we're in is arbitrary, randomly chosen, and entirely meaningless.

Psychologists Marcel Montrey and Thomas Shultz of McGill University set up a fascinating experiment in which volunteers were assigned at random to one of two groups, then instructed to play a simple computer game called "Where's the Rabbit?" in which a simulated rabbit chooses between two different nest sites.  The participant gets five points if (s)he correctly guesses where the rabbit is going.  In each subsequent round, the rabbit has a 90% chance of picking the same nest again, and a 10% chance of switching to the other.

The twist comes mid-game, when the participants are offered the option of seeing the guesses of three members of either group (or a mix of the two).  They can also pay two points to use a "rabbit-finding machine" which is set up to be unreliable -- it has a two-thirds chance of getting it right and a one-third chance of getting it wrong (and the participants know this).  Given that this is (1) expensive, points-wise, and (2) less accurate than simply basing your guess on what the rabbit did in the previous round, you'd think no one would choose this option, right?

Wrong.  It turns out that when you looked at how people chose, they were far more likely to do the same thing as the people who belonged to their own group.  Next in likelihood was the wonky, inaccurate rabbit-finding machine.  Dead last was copying what was done by members of the other group.
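The payoff arithmetic behind "no one should pick the machine" is easy to check with a quick simulation.  This is a sketch of the game as described in the post, not the researchers' actual code, and the two strategies compared are my own simplification (always copy the rabbit's last nest, versus always pay for the machine):

```python
import random

random.seed(1)

def simulate(use_machine, rounds=100_000):
    """Average points per round.  The rabbit stays in the same nest with
    probability 0.9 each round; a correct guess earns 5 points, and the
    machine costs 2 points per use."""
    rabbit, score = 0, 0.0
    for _ in range(rounds):
        last_nest = rabbit
        # the rabbit moves: 90% chance it stays, 10% chance it switches
        rabbit = rabbit if random.random() < 0.9 else 1 - rabbit
        if use_machine:
            # the machine sees the true location but is wrong 1/3 of
            # the time -- and charges 2 points
            guess = rabbit if random.random() < 2 / 3 else 1 - rabbit
            score -= 2
        else:
            # free strategy: just assume the rabbit stayed where it was
            guess = last_nest
        if guess == rabbit:
            score += 5
    return score / rounds

print(f"copy last round:  {simulate(False):.2f} points/round")  # ~4.5
print(f"pay for machine:  {simulate(True):.2f} points/round")   # ~1.3
```

Copying the rabbit's last known nest earns about 4.5 points a round for free; the machine nets only about 1.3 after its fee.  Which makes the participants' fondness for it all the stranger.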

[Image licensed under the Creative Commons Sara 506, Group people icon, CC BY-SA 3.0]

Remember what I started with -- these groups were entirely arbitrary.  Group affiliation was assigned at the beginning of the experiment by the researchers, and had nothing to do with the participants' intelligence, or even with their previous success at the game.  But the volunteers were still more likely to side with the members of their own tribe.  In fact, when choosing whose decisions to observe, the test subjects decided by a two-to-one margin to consult in-group members and not even consider the decisions made by the out-group.

How much more powerful would this effect be if the group membership wasn't arbitrary, but involved an identity that we're deeply invested in?

"Researchers have known for some time that people prefer to copy members of their own social group (e.g., political affiliation, race, religion, etc.), but have often assumed that this is because group members are more familiar with or similar to each other," said study co-author Marcel Montrey, in an interview in PsyPost.  "However, our research suggests that people are more likely to copy members of their own group even when they have nothing in common.  Simply belonging to the same random group seems to be enough.  Surprisingly, we found that even people who rated their own group as less competent still preferred to copy its members."

It's easy to see how this tendency can be exploited by advertisers and politicians.  "Human social learning is a complex and multifaceted phenomenon, where many factors other than group membership play a role," Montrey said.  "For example, we know that people also prefer to copy successful, popular, or prestigious individuals, which is why companies advertise through endorsements.  How do people’s various learning biases interact, and which ones are most important?  Because these questions have only recently begun to be explored, the real-world relevance of our findings is still up in the air."

This also undoubtedly plays a role in the echo-chamber effect, about which I've written here more than once -- and which is routinely amplified by social media platforms.  "By offering such fine-grained control over whom users observe," Montrey said, "these platforms may spur the creation of homogeneous social networks, in which individuals are more inclined to copy others because they belong to the same social group."

We like to think of ourselves as modern and knowledgeable and savvy, but the truth is that we still retain a core of tribalism that it's awfully hard to overcome.  Consider how often you hear people say things like, "I'll only vote for a person if they belong to the _____ Party."  I've sometimes asked, in some bewilderment, "Even if the person in question is known to be dishonest and corrupt, and their opponent isn't?"  Appallingly, the response is often, "Yes.  I just don't trust people of the other party."

And of course, a great many of the politicians themselves encourage this kind of thinking.  If you can get a voter to eliminate out of hand half of the candidates for no other reason than party affiliation, it raises the likelihood you'll be the one who gets elected.  So the benefits are obvious.

Unfortunately, once you look at the Montrey and Shultz study, the downsides of this sort of thinking should also be frighteningly obvious.


Friday, March 25, 2022

Truth by repetition

You probably have heard the quote attributed to Nazi propaganda minister Joseph Goebbels: "If you tell a lie big enough and continue to repeat it, eventually people will come to believe it."  This has become a staple tactic in political rhetoric -- an obvious recent example being Donald Trump's oft-repeated declaration that he won the 2020 presidential election, despite bipartisan analysis across the United States demonstrating unequivocally that this is false.  (The tactic works; a huge number of Trump supporters still think the election was stolen.)

It turns out that the "illusory truth effect" or "truth-by-repetition effect," as the phenomenon is called, still works even if the claim is entirely implausible.  A study by psychologist Doris Lacassagne at the Université Catholique de Louvain (in Belgium) recently presented 232 test subjects with a variety of ridiculous statements, including "the Earth is a perfect cube," "smoking is good for the lungs," "elephants weigh less than ants," and "rugby is the sport associated with Wimbledon."  In the first phase of the experiment, they were asked to rate the statements not for plausibility, but for how "interesting" they were.  After this, the volunteers were given lists of statements to evaluate for plausibility, and were told ahead of time that some of the statements would be repeated, and that there would be statements from the first list included on the second along with completely new claims.

The results were a little alarming, and support Goebbels's approach to lying.  The false statements -- even some of the entirely ridiculous ones -- gained plausibility from repetition.  (To be fair, the average ratings still ended up on the "false" side of the spectrum; but they did shift toward the "true" end.)

The ones that showed the greatest shift were the ones that required at least a vague familiarity with science or technical matters, such as "monsoons are caused by earthquakes."  It only took a few repetitions to generate movement toward the "true" end of the rating scale, which is scary.  Not all the news was bad, though; although 53% of the participants showed a positive illusory truth effect, 28% showed a negative effect -- repetition actually caused their plausibility ratings to drop.  (I wonder if this was because people who actually know what they're talking about become increasingly pissed off by seeing the same idiotic statement over and over.  I suspect that's how I would react.)

Of course, recognizing that statements are false requires some background knowledge.  I'd be much more likely to fall for believing a false statement about (for example) economics, because I don't know much about the subject; presumably I'd be much harder to fool about biology.  It's very easy for us to see some claim about a subject we're not that familiar with and say, "Huh!  I didn't know that!" rather than checking its veracity -- especially if we see the same claim made over and over.

[Image licensed under the Creative Commons Zabou, Politics, CC BY 3.0]

I honestly have no idea what we could do about this.  The downside of the First Amendment's free-speech guarantee is that with a limited number of exceptions -- defamation, true threats, and obscenity come to mind -- people can pretty much say what they want on television.  The revocation of the FCC's Fairness Doctrine in 1987 meant that news media were no longer required to give a balanced presentation of all sides of an issue, and set us up for the morass of partisan editorializing that the nightly news has become in the last few years.  (And, as I've pointed out more than once, it's not just outright lying that is the problem; partisan media does as much damage by what it doesn't tell you as by what it does.  If a particular news channel's favorite political figure does something godawful, and the powers-that-be at the channel simply decide not to mention it, the listeners will never find out about it -- especially given that another very successful media tactic has been convincing the consumers that "everyone is lying to you except us.")

It's a quandary.  There's currently no way to compel news commentators to tell the truth, or to force them to tell their listeners parts of the news that won't sit well with them.  Unless what the commentator says causes demonstrable harm, the FCC pretty much has its hands tied.

So the Lacassagne study seems to suggest that as bad as partisan lies have gotten, we haven't nearly reached the bottom of the barrel yet.


Thursday, March 24, 2022

Sense, nonsense, and microwaves

One of the difficulties in detecting spurious claims arises when the writer (or speaker) mixes fact and real science in with spurious bits and stirs the resulting hash so thoroughly that it's hard to tell which is which.  When a claim is unadulterated bullshit, our job is easier.  Mixtures of science and pseudoscience, though, are often hard to tease apart.

A loyal reader of Skeptophilia sent me a good example of this yesterday, a link to an article on the website NaturalSociety called "Microwave Dangers - Why You Should Not Use A Microwave."  

[Image licensed under the Creative Commons Mk2010, Microwave oven (interior), CC BY-SA 3.0]

In this piece, author Mike Barrett describes the terrible things that microwave ovens do to the people who use them and to the food that's cooked in them.  Amongst the claims Barrett makes:
  1. Microwave ovens heat food by making water molecules move "at an incredible speed."  This differs from conventional ovens, which gradually transfer heat into the food "by convection."  Further, this energy transfer into the water molecules results in their being "torn apart and vigorously deformed."
  2. Microwaves are radiation.  This radiation can "cause physical alterations" even though microwaves are classified as "non-ionizing."  This radiation "accumulates over time and never goes away."
  3. Microwave exposure has a greater effect on your brain than on your other body parts, because "microwave frequencies are very similar to the frequencies of your brain," and this causes "resonance."
  4. Exposure to microwaves causes all sorts of problems, from cancer to cataracts and everything in between.
  5. Raw foods have "life energy" in the form of "biophotons," that came directly from the sun.  These "biophotons" contain "bio-information," which is why eating sun-ripened raw fruits makes you feel happy.  Microwaving food destroys the "biophotons" which makes it lose all of its nutritional value.
  6. Microwaving foods causes the conversion of many organic molecules into carcinogens.
  7. Microwave ovens were invented by the Nazis.
Okay, let's look at these claims one at a time.
  1. First, all heating of food makes the molecules move faster.  That's what an increase in temperature means.  A piece of broccoli heated to 60 C in a microwave and a piece of broccoli heated to 60 C in a steamer have equal average molecular speeds.  Ordinary ovens don't heat most foods by convection; convection heating requires bits of the food itself to move -- so, for example, heating a pot of soup on the stove creates convection, where the bottom part of the soup, in contact with the base of the pot, gets heated first, then rises, carrying its heat energy with it.  Foods in conventional ovens are heated by a combination of radiation from the heating coils, and conduction of that heat energy into the food from the outside in.  Further, heating the water molecules doesn't "tear them apart," because then you'd have hydrogen and oxygen gas, not water.
  2. Microwaves are radiation.  So is sunlight.  Sure, microwaves can cause physical alterations, which is why it's inadvisable to climb inside a microwave oven and turn it on.  But not all kinds of radiation accumulate; the microwaves themselves are gone within a millisecond of when the magnetron shuts off (absorbed and converted into heat), otherwise it wouldn't be safe to open the door.  Barrett seems to be making an unfortunately common error, which is to confuse radiation with radioactivity.  Some radioactive substances do bioaccumulate, which is why strontium-90 showed up in cows' milk following the Chernobyl disaster.  But your microwaved bowl of clam chowder is not radioactive, it's just hot.
  3. When oscillations of one body trigger oscillations of another body at the same frequency, this is called resonance.  However, your brain does not oscillate at the same frequency as microwaves -- the figure he quotes for microwaves inside a microwave oven, 2,450 megahertz (2.45 billion cycles per second), is actually correct.  Brain waves, on the other hand, top out at around a hundred hertz -- tens of millions of times too slow for any resonance -- unless you happen to be at a Metallica concert.
  4. Agreed, exposure to microwaves isn't good for you.   Thus my suggestion in (2) above not to get inside a microwave oven and turn it on.
  5. There is no such thing as a "biophoton."  You do not absorb useful energy in the form of photons in any case, for the very good reason that you are not a plant.  The only "bio-information" we have is our DNA.  Sun-ripened fruit may taste better, as it's ripened more slowly and has a longer time to develop sugars and esters (the compounds that give fruits their characteristic smell and taste), but microwaves don't destroy "life energy."  This bit is complete nonsense.
  6. Microwaving food may cause some small-scale alterations of organic molecules into carcinogens, but so does all cooking.  In fact, the prize for the highest introduction of carcinogens into food has to be awarded to grilling -- the blackened bits on a charcoal-grilled t-bone steak contain polycyclic aromatic hydrocarbons, which are known carcinogens.  The problem is, they're also very tasty carcinogens, which is why I still like grilled steaks.
  7. Microwave ovens weren't invented by the Nazis.  The first microwave oven was built by Percy Spencer, an engineer from Maine, in 1945.  The mention of the Nazis seems to have been thrown in only to give the argument a nice sauce of evil ("anything the Nazis invented must be bad").  But it's false in any case, so there you are.
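The physics behind points (1) and (2) is easy to check with a back-of-the-envelope calculation.  Here's a quick sketch (the constants are standard values; the 2.45-gigahertz figure is the one Barrett himself quotes) estimating the average speed of water molecules at 60 C, and comparing the energy of a single microwave photon to the roughly 12.6 electron-volts needed to ionize a water molecule:

```python
import math

k_B = 1.380649e-23    # Boltzmann constant, J/K
h = 6.62607015e-34    # Planck constant, J*s
eV = 1.602176634e-19  # joules per electron-volt

# Average (RMS) speed of water molecules at 60 C -- the same number
# whether the heat came from a microwave oven or a steamer.
m_water = 18.015 * 1.66054e-27  # mass of one H2O molecule, kg
T = 333.15                      # 60 C in kelvin
v_rms = math.sqrt(3 * k_B * T / m_water)
print(f"RMS speed of water molecules at 60 C: {v_rms:.0f} m/s")  # ~680 m/s

# Energy of one 2.45 GHz microwave photon, versus the ~12.6 eV it
# takes to ionize a water molecule.
f = 2.45e9             # microwave oven frequency, Hz
E_photon = h * f / eV  # photon energy in electron-volts
print(f"Microwave photon energy: {E_photon:.1e} eV")  # ~1e-5 eV
```

A microwave photon carries about a millionth of the energy needed to knock an electron loose from anything, which is exactly what "non-ionizing" means.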
So, anyhow, that's my analysis of Barrett's anti-microwave screed.  He's pretty canny, the way he mixes actual facts and correct science in with poorly-understood science, pseudoscience, and outright nonsense; the difficulty is, you have to have a pretty good background in science to tell which is which.  All of which argues for better science education, and better education in critical thinking skills.  But any effort I make in that direction will have to wait, because my coffee's getting cold, and I need to go nuke it for a couple of seconds.


Wednesday, March 23, 2022

King of the whales

For a long time, one of the biggest evolutionary mysteries was the evolution of whales and dolphins.

Even for someone steeped in the evolutionary model, it was hard to imagine how these aquatic creatures descended from terrestrial mammals.  That they did was undeniable; not only do some species have vestigial hip and hind leg bones, but inside their flippers they have exactly the same number and arrangement of arm bones as you have -- one humerus, radius, and ulna; seven carpals; five metacarpals; and fourteen phalanges.  If whales were a "special creation," it's hard to imagine why a Creator would have given them 29 articulated arm bones and then completely encased them in a flat, muscular flipper.

Skeleton of a baleen whale (drawing from Meyers Konversationslexikon, 1888) [Image is in the Public Domain]

So their relationship to terrestrial mammals was obvious, but what wasn't obvious is how they got to where they are today.

Then in 1981 a fossil bed was uncovered in the Kuldana Formation of Pakistan -- a sedimentary deposit from what was a shallow marine estuary back in the early Eocene Epoch, on the order of fifty million years ago -- that contained a treasure trove of fossilized cetaceans.  This allowed researchers to piece together the evolution of whales and dolphins, placing them in Order Artiodactyla (their closest terrestrial relatives appear to be hippos).

Skeleton of Ambulocetus, one of the amphibious species of cetaceans linking their terrestrial ancestors with today's aquatic whales [Image licensed under the Creative Commons Notafly, AmbulocetusNatansPisa, CC BY-SA 3.0]

Back in the Eocene, some of these proto-cetaceans were badass apex predators.  Take Basilosaurus -- the name is Greek for "king lizard," a misnomer, at least the "lizard" part -- which lived in the Tethys Ocean, a body of water that has since been largely erased by plate tectonics.  (The Mediterranean, Black, and Caspian Seas are about all that's left of it.)  Basilosaurus could get to twenty meters in length, and probably ate large fish like sharks and tuna.  It's Basilosaurus that got me thinking about this topic in the first place; a couple of loyal readers of Skeptophilia sent me a link to an article about a new fossil discovery in Peru.  It's hard to imagine it, but the now bone-dry Ocucaje Desert of southern Peru was once the floor of a shallow sea, an embayment of the (at that point) rapidly shrinking Tethys.  It's provided huge numbers of Eocene fossils, but the one they just found is pretty spectacular: a complete, well-preserved skull of a Basilosaurus that, when it was alive, was on the order of seventeen meters from tip to tail.

"This is an extraordinary find because of its great state of preservation," said Rodolfo Salas-Gismondi, part of the team that found the fossil.  "This animal was one of the largest predators of its time.  At that time the Peruvian sea was warm.  Thanks to this type of fossil, we can reconstruct the history of the Peruvian sea."

It's fascinating that we're still piecing together the evolution, ecology, and geology of the ancient world -- in this case, a world with carnivorous proto-whales twice as long as a school bus, equipped with big nasty pointy teeth.  Life in the seas back then must have been risky business.  If ever time travel is invented, I'd love to go back and see it for myself -- preferably from a safe distance.  And as interested as I am, I doubt I'd be donning my scuba gear to get a closer look.


Tuesday, March 22, 2022

The painted bones

It's fascinating how long into our past we've had rituals surrounding death.

There's decent evidence that our cousins the Neanderthals -- who went extinct on the order of forty thousand years ago -- buried their dead, and used ceremonial pigments like red and yellow ochre to decorate the bodies.  What I'm curious about is whether those rituals were performed purely as fond remembrance of the person who had died, or whether they had a more religious significance.  Did they believe in an afterlife?  Was the reverence shown to a dead person's body because of a belief that the person's soul still, in some sense, inhabited the remains?  Or some other reason entirely?

It's all too easy to misinterpret the tangible evidence left behind, even from the relatively recent past.  Take, for example, the practice -- most common in Scotland and England -- of placing sturdy metal cages over grave sites.  The more fancifully-minded believe it was because of a fear of vampires or zombies -- that the cages were there to protect the living from the dead.

A "mortsafe" in Cluny, Aberdeenshire, Scotland

The real reason -- which we know from the writings of the time -- was that it was actually to protect the dead from the living.  Grave robbing was common in the seventeenth and eighteenth centuries, not only to steal any valuables the person might have been buried with, but to sell the corpse itself to medical or anatomical laboratories for dissection.  (Recall the early nineteenth century Burke and Hare murders, where a pair of enterprising young men decided it was more lucrative to kill people themselves and sell their bodies than to wait for them to die; Hare turned King's evidence in exchange for immunity if he testified against Burke, which he did.  Burke was hanged -- and in a grisly but ironic twist, his body was given to an anatomical laboratory for dissection.)

So it's harder than you'd think to ascertain the motives people had for certain ritual practices in the past.  As far as the decoration of bodies by the Neanderthals, of course, at this point it's impossible to know.  But it's fascinating that our (very) distant ancestors had burial rituals not so very different from our own.

A recent find in Turkey has shown that modern humans have been doing this sort of thing for a very long time as well.  Çatalhöyük, nicknamed the "oldest city in the world," has provided fascinating archaeological finds before; the "Mother Goddess of Çatalhöyük," an eight-thousand-year-old ceramic statue probably associated with rituals of fertility (sex being the other thing people have been obsessed with for a long time) is probably the most famous artifact from the site.  (If you're wondering how Çatalhöyük is pronounced -- heaven knows I was -- I'll save you the trouble.  Near as I can get, it's something like "chot-al-hoik.")

[Image is licensed under the Creative Commons Nevit Dilmen, Museum of Anatolian Civilizations 1320259 nevit, CC BY-SA 3.0]

A new find at the site, though, is equally interesting.  A team from the University of Bern has uncovered nine-thousand-year-old bones -- so a full thousand years older than the Mother Goddess figurine -- that show evidence of having been painted.  Not only were they painted, they appear to have been unearthed more than once, and repainted.  Fascinatingly, different colors were used for different genders -- cinnabar (red) for males, copper-bearing minerals (blue and green) for females.  Not all the bones were so decorated; it may have been a mark of status, or of membership in a ruling or priestly class, but all that is speculation.  (The fact that painted bones of children have been found suggests that it wasn't mere individual status that was the deciding factor.)

There's also an association between the number of painted burials in a building, and the amount of painted decoration on the walls.  "This means when they buried someone, they also painted on the walls of the house," said study senior author Marco Milella.  "Furthermore, at Çatalhöyük, some individuals stayed in the community: their skeletal elements were retrieved and circulated for some time, before they were buried again.  This second burial of skeletal elements was also accompanied by wall paintings."

I'd like to think that the painted bones were a sign of reverence and not fear of retaliation by an angry spirit, but that too is speculation.  All we have is the artifacts to judge by.  Even so, it's fascinating to get a glimpse into the distant past of our own species.

And you have to wonder what our distant descendants will make of the artifacts left from our own society.  What will they think of the marble and granite monuments we raised over the dead?  It puts me in mind of the eerie, atmospheric rhyme I saw on a gravestone in the cemetery in Waynesburg, Pennsylvania where my great- and great-great grandparents are buried:

Remember, traveler, as you pass by,
As you are now, so once was I;
As I am now, so you will be;
Prepare for death, and follow me.


Monday, March 21, 2022

Dowsing for corpses

Back when I was teaching, I ran into students with a lot of fringe-y beliefs, or at least unscientific ones.  But if you had to pick which one students were the most reluctant to abandon, I bet you'd never guess.


Dowsing, also called water-witching, is the belief that you can use a forked stick (more modern dowsers use a pair of metal rods on a swivel) to locate stuff.  It started out being used to find underground water for a well (thus the appellation "water-witching"), but has since progressed (or regressed?  Guess it depends upon your viewpoint) to being used to find all sorts of things, including -- I kid you not -- marijuana in kids' lockers in a high school.

"But it works!" students said, when I told them there was no scientific basis for it whatsoever.  "My dad hired a guy to come tell us where to dig our well, and we hit water at only thirty feet down!"

Yeah, okay.  But this is upstate New York, one of the cloudiest, rainiest climates in the United States.  Unless you're standing on an outcropping of bedrock, there's gonna be groundwater underneath you.  In fact, only about twenty miles from here, there's a hillside with a natural artesian spring -- someone put a pipe into it, and people stop and fill up water bottles from the clear water gushing out.  So it's entirely unsurprising that you hit water where the dowsing guy indicated.  You'll hit water pretty much anywhere around here if you dig down a ways.

[Image is in the Public Domain]

What's funniest are the quasi-scientific explanations the dowsers give as to why it (allegedly) works.  An example is that you should always make your dowsing rod from a willow branch, because willows grow near water, so the wood remains attracted to it.  I've yet to see how a dead branch could respond that way.  Or any way, honestly.

Given that it's dead.

Every scientifically-valid study of dowsing has resulted in zero evidence that it works.  This doesn't mean the dowsers are deliberately cheating; they may honestly think the stick is moving on its own.  This is called the ideomotor effect, where small movements made unconsciously by the practitioner convince him/her (and the audience) that something real, and supernatural, is going on.  (The same phenomenon almost certainly accounts for spiritualist claims like Ouija board divination and table-turning.)

But despite these sorts of arguments, I fear that I convinced few students to change their beliefs.  "I saw it happen!" is a remarkably powerful mindset, even once you accept that we're all prone to biases, and that we're all easily fooled when it comes to something we want to believe.

So this is why I was unsurprised but disheartened to read an article from Mother Jones sent to me by a long-time loyal reader of Skeptophilia.  In it, we read about one Arpad Vass, a guy who believes that you can use dowsing rods...

... to find dead bodies.

This would just be another goofy belief, and heaven knows those are a dime a dozen, but he has somehow convinced the people who run the National Forensic Academy in Oak Ridge, Tennessee that his technique is scientifically sound.  He has some kind of cockeyed explanation of how it works -- that the effect is due to piezoelectricity, a phenomenon where certain crystalline substances develop a charge when they're subjected to mechanical stress.  Piezoelectricity is real enough; it's the basis of quartz watches, inkjet printing, and the piezo pickups in acoustic-electric guitars.  But even if decomposing bone could generate some net static charge, it would leak away into the soil it's buried in -- there's no mechanism by which it could exert a pull on some bent wires several meters away.  (Actually, Vass claims he's successfully found corpses this way from a hundred meters away.  If the static charge were that high, you wouldn't need a dowsing rod to detect it -- a plain old boring voltmeter would work.  Funny how that never happens.)
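There's even a standard way to estimate how quickly that charge would bleed away: the charge-relaxation time of a conductive medium is tau = epsilon / sigma.  Plugging in rough textbook values for moist soil (the permittivity and conductivity figures below are my own ballpark assumptions, not numbers from the article):

```python
# Charge relaxation time tau = epsilon / sigma: free charge embedded
# in a conductive medium decays away as exp(-t / tau).
epsilon_0 = 8.854e-12  # vacuum permittivity, F/m
eps_r = 20             # rough relative permittivity of moist soil
sigma = 1e-2           # rough conductivity of moist soil, S/m

tau = (eps_r * epsilon_0) / sigma
print(f"Charge relaxation time: {tau:.1e} s")  # tens of nanoseconds
```

Any static charge on buried bone would be gone in a few billionths of a second, not lingering around waiting to tug on someone's bent wires.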

And, of course, there's the problem that it doesn't work for everyone.  Vass has an answer for that, too.  "If people don’t have the right voltage, it’s not going to work," he says.  "Everything in the universe vibrates at a very specific frequency.  Gold has a gold frequency, silver has a silver frequency, and your DNA has your frequency."

I guess bullshit has a specific frequency, too.

The problem is that Vass isn't just playing around, or doing something that isn't a huge deal if it doesn't work (like picking a well-drilling site).  This is injecting pseudoscience into police investigation.  And recently, he's gone one step further; he claims to have invented a "quantum oscillator" that supposedly picks up a person's "frequency" from something like a hair sample or fingernail clippings, beams that frequency out so it can somehow interact with the person (or his/her corpse), and receives a signal back...

... from up to 120 kilometers away.

I was encouraged by the fact that the Mother Jones article came down fairly solidly on the side of the scientists, stating unequivocally that there is no evidence that any form of dowsing works.  They also highlighted the human side of this; Randy Shrewsberry, founder of the nonprofit Criminal Justice Training Reform Institute, was quoted as saying "Law enforcement regularly accepts the flaws of these practices despite the life-altering impacts that can occur when they’re wrong."  In one Virginia case, a man was convicted of murder even though no body of the victim was found -- in part, because of testimony from Vass that his device had found the victim's "frequency" in eight locations, indicating that her body had been dismembered.

Eric Bartelink, professor of anthropology at the University of California - Chico and former president of the American Board of Forensic Anthropology, was unequivocal.  "Vass is operating these services that are not scientifically valid.  It’s very misleading to families and law enforcement."

So at least some prominent voices in the field are speaking up to support the findings of every scientific study ever done on the practice of dowsing.  I'm still appalled that a forensic training academy has somehow been convinced to take Vass and his nonsense seriously; I guess being highly educated isn't necessarily an immunization against confirmation bias.  As for me, I'm calling bullshit on the whole practice.

Beam that into your "quantum oscillator," buddy.


Saturday, March 19, 2022

The imaginary fireball

The subject of today's post isn't anything new; it was just new to me, and, I suspect, will be to a good many of my readers, as well.  I found out about it from a long-time loyal reader of Skeptophilia, who sent me a link about it with a note saying, "Okay, this is interesting. What think you?"

The link was to a 2008 article entitled "Cuneiform Clay Tablet Translated for the First Time."   The tablet in question is called the "Sumerian planisphere," and was discovered in the ruins of Nineveh by a British archaeologist named Henry Layard in the middle of the nineteenth century.  Based on where it was found, it was dated to around 700 B.C.E., and although it was recognized that part of what it contained was maps of constellations, no one was quite sure what it was about.

The Sumerian planisphere [Image is in the Public Domain]

The researchers were puzzled by the fact that the arrangements of the stars in the constellations were close to, but not exactly the same as, the configurations they would have had at the time it was made, but then they concluded that those would have been their positions 2,400 years earlier -- and they claimed the text and maps didn't just show the stars on any old night, but on a sequence of nights chronicling the approach of a comet or asteroid.

Which, ultimately, hit the Earth.

They claim the collision site was near Köfels, Austria, and triggered a five-kilometer-wide fireball.  Why no huge crater, then?  The answer, they say, is that the steep side of the mountain gave way because of the impact, and a landslide ensued.  Organic matter trapped in the debris flow gave an approximate date, but once deciphered, the Sumerian planisphere's detailed sky maps (including the position of the Sun, the timing of sunrise, and so on) supposedly pinpointed the exact day of the impact: the 29th of June, 3123 B.C.E.

Between the planisphere and the geometry of the collision site, the researchers claimed that the comet came in at a very shallow angle -- their estimate is about six degrees -- clipped the nearby peak of Gamskogel, and exploded, creating a five-kilometer-wide moving fireball that finally slammed into Köfels head-on.

You may be wondering why Sumerian astronomers had any particular interest in an impact that occurred almost four thousand kilometers away.  They have an answer for that, too: the shallow impact angle created a sheet of superheated debris that arced away from the impact site, and right toward what is now the Middle East.  A 2014 paper by Joachim Seifert and Frank Lemke concluded that the greatest amount of damage didn't occur right at the collision site, but where all that flaming debris eventually landed -- in Mesopotamia.

"The back plume from the explosion (the mushroom cloud) would be bent over the Mediterranean Sea re-entering the atmosphere over the Levant, Sinai, and Northern Egypt," said Mark Hempsell of the University of Bristol, who is the chief proponent of the Köfels collision hypothesis.  "The ground heating though very short would be enough to ignite any flammable material - including human hair and clothes.  It is probable more people died under the plume than in the Alps due to the impact blast."

The dust and ash from the event caused a hundred-year-long "impact winter" that triggered droughts, leading to a several-centuries-long famine that ultimately caused the collapse of the Akkadian Empire.

Okay, so that's the claim.  There are, unfortunately, a host of problems with it, beginning with those pointed out by the scathing rebuttal by Jeff Medkeff in Blue Collar Scientist.  The first issue is that there is "impact glass" -- vitrified shards of debris partially melted by a collision -- in central Europe, but it dates to much longer ago (certainly more than eight thousand years ago).  There is no impact debris to be found between central Europe and the Middle East anywhere near 3,100 B.C.E., no scorched pottery shards or charred bones that would be indicative of a rain of fire.  An asteroid or comet "clipping" a mountain -- and then generating a plume of debris that was still superheated four thousand kilometers downstream -- would have sheared off the entire mountain top, and there'd be clear evidence of it today.  Last -- and most damning -- the Köfels formation has been studied by geologists and found to be not a single event, but a series of landslides, none of which show convincing evidence of having been triggered by an impact.

The scientists involved don't even seem sure of their own chronology; the article says 3123 B.C.E. (the 29th of June, to be exact), while the Seifert and Lemke paper says the impact occurred almost a thousand years later (in 2193 B.C.E.).  The latter date at least is closer to the claimed civilization-destroying effects; the Akkadian Empire fell around 2154 B.C.E.  It seems likely, though, that the collapse of the Akkadians (and various other civilizations, including the Indus Valley Civilization, the Egyptian Old Kingdom, and the Chinese Liangzhu Culture) was due to a drought called the "4.2 Kiloyear Event."  The cause of that is uncertain, but it probably wasn't an impact (again, because of the lack of clear stratigraphic evidence).  The most likely culprit was a shift in cold-water currents in the North Atlantic changing patterns of rainfall, but even that is speculative.

As far as Hempsell's even more outlandish claim -- that the Köfels impact generated the story of the destruction of Sodom and Gomorrah -- I won't even go into details except to say that there is evidence of a much smaller airburst explosion where the cities were allegedly located, but once again, it's from a different date (around 1650 B.C.E.).  As for any other evidence of the biblical "Cities on the Plain," it's slim to nonexistent.  Archaeologist Israel Finkelstein, of Tel Aviv University, called the tale of the destruction of Sodom and Gomorrah "an etiological story, that is, a legend that developed in order to explain a landmark.  In other words, people who lived in the later phase of the Iron Age, the later days of the kingdom of Judah, were familiar with the huge ruins of the Early Bronze cities and told a story of how such important places could be destroyed."

So given (1) the lack of any reasonably reliable evidence, (2) a chronology that even the researchers can't seem to keep straight, and (3) plausible alternative explanations for the supposed societal aftereffects, I'm afraid I'm gonna be in the "don't think so" column on this one.  As dramatic as it would be if the astronomers of Sumer documented the approach and ultimate collision of a comet or asteroid that showered flaming debris over the entire Middle East, I think we have to set aside the drama of an imaginary fireball for the cold light of reason.


Friday, March 18, 2022

Birds of a feather

I should probably avoid social media altogether, given what a cesspit of ugliness it can be sometimes.

Unfortunately, it's provided the simplest way of keeping in touch with dear friends I seldom see, especially during the height of the pandemic (when I kind of wasn't seeing anyone).  But to say it amplifies the echo chamber effect is an understatement.  Not only do we tend to link on social media to like-minded folks (can't tell you how many times I've heard someone say that they'd unfriended someone solely because of some opinion or another, usually political), but with the few non-like-minded social media friends we have and keep, it takes so much energy to argue that most of us just sigh heavily, shrug our shoulders, and move on, even when confronted with opinions completely antithetical to our own.

Take, for example, what I saw posted yesterday -- a meme saying, "All I'm saying is, if my dog got three rabies shots and then still got rabies, I'd begin to get suspicious."  (It took all my willpower not to respond, "Oh, how I wish that was all you were saying.")  In any case, not only does the post trumpet zero understanding about how vaccinations and immunity work, it's back to the maddening phenomenon of a layperson thinking an opinion formed from watching Fox News and doing a ten-minute read of some guy's website constitutes "research."

If that wasn't bad enough, a friend-of-the-friend -- no one I know -- responded, "It's what comes from drinking the libtard kool-aid."  So, let's take the ignorant post and make it worse by slathering on some ugly vitriol demeaning half the residents of the country.

And what did I do in response?


I just didn't have the energy to get drawn in.  Plus, there's a sense that such arguments are futile anyhow.  I seriously doubt anyone, in the history of the internet, has ever had their opinion changed by arguing a point online with a total stranger.

Only a few minutes after seeing the post, though, I stumbled on some research out of the University of Buffalo that contains at least a glimmer of hope: the screeching you hear on social media isn't necessarily reflective of the attitudes the majority of people hold, because these platforms amplify the loudest voices -- not necessarily the ones that make the best sense, or are even the most common.

In a paper in the Journal of Computer-Mediated Communication, Yini Zhang, Fan Chen, and Karl Rohe looked at our tendency to form "flocks" on social media.  By studying the posts from 193,000 Twitter accounts, and the 1.3 million accounts those accounts follow, they were able to uncover patterns of tweets and retweets, and found that the most strongly-worded opinions were the ones that got liked and retweeted the most.  They called this phenomenon murmuration -- the term comes from the flocking behavior of starlings -- capturing the idea that online expression of opinions forms and shifts not based on actual changes in the information available, but on who is saying what, and how stridently.

"By identifying different flocks and examining the intensity, temporal pattern and content of their expression, we can gain deeper insights far beyond where liberals and conservatives stand on a certain issue," said study lead author Yini Zhang, in an interview in Science Daily.  "These flocks are segments of the population, defined not by demographic variables of questionable salience, like white women aged 18-29, but by their online connections and response to events.  As such, we can observe opinion variations within an ideological camp and opinions of people that might not be typically assumed to have an opinion on certain issues.  We see the flocks as naturally occurring, responding to things as they happen, in ways that take a conversational element into consideration."

The fact that the social media flocking doesn't mirror the range of opinion out there is heartening, to say the least.  "[S]ocial media public opinion is twice removed from the general public opinion measured by surveys," Zhang said.  "First, not everyone uses social media.  Second, among those who do, only a subset of them actually express opinions on social media.  They tend to be strongly opinionated and thus more willing to express their views publicly."
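That selection effect is easy to demonstrate with a toy simulation.  This is purely my own illustrative sketch, not anything from the Zhang et al. paper: give a population normally-distributed opinions, let the probability of posting rise with how extreme an opinion is, and compare the resulting "feed" to what the population actually thinks.

```python
import random
import statistics

random.seed(42)

# Opinions on some issue, scaled from -1 (strongly anti) to +1
# (strongly pro), with the population centered just slightly pro.
population = [max(-1.0, min(1.0, random.gauss(0.1, 0.3)))
              for _ in range(100_000)]

# Probability of posting grows with the square of extremity:
# moderates mostly stay silent, the strident post constantly.
posts = [op for op in population if random.random() < op * op]

print(f"True mean opinion:   {statistics.mean(population):+.3f}")
print(f"Mean posted opinion: {statistics.mean(posts):+.3f}")
print(f"Fraction who posted: {len(posts) / len(population):.1%}")
```

In this toy world only about a tenth of the population posts at all, and the "feed" looks roughly three times as opinionated as the population really is -- Zhang's "twice removed" point in miniature.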

It's not just political discourse that can be volatile.  A friend of mine just got blasted on Facebook a couple of days ago, out of the blue, because she posts stuff intended to be inspirational or uplifting, and one of her Facebook friends accused her of being "self-righteous," and went on to lambaste her for her alleged holier-than-thou attitude.  The individual in question doesn't have a self-righteous bone in her whole body -- she might be the only person I know who has more of a tendency to anxious self-doubt than I do -- so it was a ridiculous accusation.  But it does exemplify the sad fact that a lot of us feel freer to be unkind to people online than we ever would face-to-face.  

The important point here is that it's easy to see the nastiness and foolishness on social media and conclude that this is the way the majority of the public believes and acts, but the Zhang et al. study suggests that the majority of the opinions of this sort are generated by a few strident people.  Only afterward do those posts act like a magnet to the like-minded followers they already had.

So as hard as it is to keep in mind sometimes, I maintain that the majority of people are actually quite nice, and want the same things we want -- safety, security, the basic necessities, health and happiness for our friends and family.  The ugly invective from people like the guy who made the "libtard" comment is far from a majority opinion, and shouldn't feed into a despairing sense that everyone is horrible.

The flocks, apparently, aren't led by the smartest birds, just the ones who squawk the loudest.  A lot of the rest are tagging along for the ride.  There's a broader population at the center, opinion-wise, than you'd think, judging by what you see on social media.  And when the birds step away from social media, most of them turn out to be ordinary tweeters just trying to stay with the flock-mates they feel the most comfortable with.


Thursday, March 17, 2022

Alchemy class

Frequently, when I'm asked why I'm opposed to science teachers being required to teach "alternate explanations" along with teaching evolution, I respond, "It's interesting that no one is asking teachers to present 'alternate explanations' in other areas of science.  No one, for example, expects chemistry teachers to advocate alchemy as an 'alternate explanation.'"

By now, you'd think I'd know better than to use the phrase "no one" in a statement about belief in some crackpot idea.

Meet Jay Weidner, film director responsible for such masterpieces as Timewave 2013, Infinity: The Ultimate Trip, and (most significantly, for our purposes) The Secrets of Alchemy: The Great Cross and the End of Time.  On his website, Weidner outlines his three laws of the universe, which are poised to oust Newton's Laws as fundamental rules governing nature:
  • Weidner's First Law: "Whatever ideas are the most suppressed are the most likely to be the closest to the truth."
  • Weidner's Second Law: "If a picture is worth a thousand words, then a symbol is worth a thousand pictures."
  • Weidner's Third Law: "The only people who call conspiracies 'theories' are the conspirators."
The First Law would seem to suggest that we should go back to the Four Humors Theory of Medicine (all illnesses are caused by an imbalance between the Four Bodily Humors -- blood, phlegm, yellow bile, and black bile), as that was suppressed back when they noticed that patients treated according to the recommendations of this theory usually died.  The Second Law means -- never mind, I don't know what the fuck the Second Law means.  But the Third Law would seem to indicate that I'm a conspirator.  I guess, given that I call most conspiracies "theories," and worse still, ridicule them frequently in my blog, I'm not only a conspirator, but I'm really high up in the hierarchy of the conspiracy because I'm so determined to convince everyone that it isn't real.

How about that?  I'm in such a high echelon in a top-secret conspiracy that the fact was secret even from me.  Now that's what I call a secret conspiracy.

In any case, Weidner is a big believer in alchemy, especially as it pertains to the production of the Philosopher's Stone, a substance that can give eternal life.  I thought that Dumbledore had destroyed the Philosopher's Stone way back in the first book of Harry Potter, but Weidner disagrees; he said he has discovered a book that shows you how to produce it, using "materials costing less than a thousand dollars," and he illustrates this on his website using a picture of Aquarius, symbolized by a guy with a Fabio hairstyle, huge pecs and biceps, a six-pack, and almost no clothes, pouring water out of a jar, wearing an expression that seems to say, "Hey, baby, you wanna partake of my Elixir of Life?"

Now there's a symbol that's worth a thousand pictures.

Anyway, the book that describes the process for making the Philosopher's Stone is available for free here.  Weidner cautions us all to download the book before the Evil Conspirators find out that it's available and "hit the internet kill switch."  Because we all know how much the people who run the internet care about the presence of wacky, absurd ideas out there online.  We can't have that.

Curious, I took a look at the book (The Book of Aquarius), since it's free.  When you go to the "Read Online" page, you get a set of chapter headings, and not wanting to slog through the pages of quasi-metaphysical bullshit, I decided to cut to the chase, and skipped to Chapter 14: What Is It Made Of?  And I found out that, to my great shock, the Philosopher's Stone is only made from one ingredient.  And that ingredient is...

... wait for it...

... urine.


Yes, you read that right.  I know, because I had to read it several times before I was convinced that I was reading it correctly myself.  And I thought, "Well, at least Weidner was right when he said that you can get the ingredients for less than a thousand dollars."  Here's the relevant passage from the book:
I must explain that the Stone could in theory be made from anything, since everything contains the life-energy to some degree, which is the active ingredient of the Stone.  Urine contains this life-energy in high concentration, due to the fact that it has just come out of you, and you, as a living animal, are full of life-energy...  From the urine we will need to extract a distillate (water) and a salt.  The life-energy is in the water, and since the life-energy is so volatile it will remain with the water even when the water is distilled (evaporated and condensed).  Our bodies do not want to reject the life-energy in the urine, but have no choice since the life-energy is attached to the water.  Secondly, urine is the perfect ingredient because it is as of yet undetermined.  That is, it has been well filtered, broken down and purified.  It contains all kinds of different minerals, but in minute particles not yet assigned to any purpose.
At this point, I had to stop reading, mostly because it's hard to read the computer screen when your forehead is on your desk.

Anyhow, I encourage you to peruse Weidner's site (I especially recommend the stuff about Stanley Kubrick faking the moon landing) and The Book of Aquarius.  But if you succeed in making the Philosopher's Stone, please don't tell me about it.  I don't want to know.  For one thing, it will mean that you'll have been playing around with your own urine, or, god forbid, someone else's, and that's just nasty.  For another, at that point you'll have discovered the Secret of Eternal Life, and being that I'm one of the Conspirators, I'd be duty-bound to kill you.  That'd just be unfortunate for a variety of reasons, the most important one of which is that I need all the readers I can get, and if I went around killing them it might discourage people from following my blog.


Wednesday, March 16, 2022

Thy fearful symmetry

Everyone knows that most living things are symmetrical, and the vast majority of them bilaterally symmetrical (i.e. a single plane down the midsection divides the organism into two mirror-image pieces).  A few are radial -- where any plane through the central axis divides the organism in half -- such as jellyfish and sea anemones.  Even symmetrical organisms like ourselves aren't perfectly so; our hearts and spleens are displaced from the midline toward the left, the appendix to the right, and so forth.  But by and large, we -- and the vast majority of living things -- have some kind of overall symmetry.

True asymmetry is so unusual that when you see it, it really stands out as weird.  Consider the bizarre-looking flounder:

[Image licensed under the Creative Commons Peter van der Sluijs, Large flounder caught in Holland on a white background, CC BY-SA 3.0]

Flounders start out their lives as ordinary little fish, upright with symmetrically-placed eyes, fins, and so on.  But as they mature, their skulls twist and flatten, and they end up with both eyes on the same side of the head -- a great adaptation for a fish that spends its life lying flat on the seabed, and who otherwise would constantly have one eye pointing downward into the mud.

A question I've asked here before has to do with the constraints on evolution; which of the features of life on Earth are so powerfully selected for that we might expect to see them in life on other planets?  (An example of one that I suspect is strongly constrained is the placement of the sensory organs and brain near the front end of the animal, pointing in the direction it's probably moving.)  But what about symmetry?  There's no obvious reason why bilateral symmetry would be constrained, and it seems as if it might just be a holdover from the fact that our earliest ancestors happened to be bilateral, so we (with a few stand-out exceptions) have inherited it down through the eons from them.

What about symmetry in general, however?  If we went to another life-bearing planet, would we find symmetrical organisms, even if they differ in the type of symmetry from ours?

The answer, judging from a paper that appeared this week in Proceedings of the National Academy of Sciences, by a team led by Iain Johnston of the University of Bergen, appears to be yes.

What Johnston and his team did was analyze the concept of symmetry from the perspective of information theory -- not looking at functional advantages of symmetry, but at how much information it takes to encode it.  There are certainly some advantages -- one that comes to mind is that symmetrically-placed eyes allow for depth perception and binocular vision -- but it's hard to imagine that's a powerful enough evolutionary driver to account for symmetry in general.  The Johnston et al. research, however, takes a different approach; what if the ubiquity of symmetry is caused by the fact that it's much easier to program into the genetics?

The authors write:

Engineers routinely design systems to be modular and symmetric in order to increase robustness to perturbations and to facilitate alterations at a later date.  Biological structures also frequently exhibit modularity and symmetry, but the origin of such trends is much less well understood.  It can be tempting to assume—by analogy to engineering design—that symmetry and modularity arise from natural selection.  However, evolution, unlike engineers, cannot plan ahead, and so these traits must also afford some immediate selective advantage which is hard to reconcile with the breadth of systems where symmetry is observed.  Here we introduce an alternative nonadaptive hypothesis based on an algorithmic picture of evolution.  It suggests that symmetric structures preferentially arise not just due to natural selection but also because they require less specific information to encode and are therefore much more likely to appear as phenotypic variation through random mutations.  Arguments from algorithmic information theory can formalize this intuition, leading to the prediction that many genotype–phenotype maps are exponentially biased toward phenotypes with low descriptional complexity.

Which is a fascinating idea.  It's also one with some analogous features in other realms of physiology.  Why, for example, do men have nipples?  They're completely non-functional other than as chest adornments.  If you buy intelligent design, it's hard to see what an intelligent designer was thinking here.  But it makes perfect sense from the standpoint of coding simplicity.  It's far easier to have a genetic code that takes the same embryonic tissue, regardless of gender, and modifies it in one direction (toward functional breasts and nipples) in females and another (toward non-functional nipples) in males.  It would take a great deal more information-containing code to have a completely separate set of instructions for males and females.  (The same is true for the reproductive organs -- males and females start out with identical tissue, which under the influence of hormones diverges as development proceeds, resulting in pairs of very different organs that came from the same original tissue -- clitoris and penis, ovaries and testicles, labia and scrotum, and so on.)
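You can see the authors' bias-toward-simplicity argument in miniature with a toy genotype-phenotype map.  (This is a hypothetical sketch I've invented for illustration -- it is not the model from the Johnston et al. paper.)  The idea: if a developmental rule can build a body by copying and mirroring half a blueprint, then symmetric body plans need fewer bits to specify, so random mutations stumble onto each symmetric plan far more often than onto any particular asymmetric one.

```python
import random
from collections import Counter

random.seed(1)

def develop(genotype):
    """Toy genotype -> phenotype map (invented for illustration).

    First bit picks a developmental mode; the remaining 8 bits are
    body-plan data.  Mirror mode reuses 4 bits twice, so a symmetric
    body needs only half as much specific information to encode."""
    mode, data = genotype[0], genotype[1:]
    if mode == '1':               # mirror mode: build half, reflect it
        half = data[:4]
        return half + half[::-1]
    return data                   # direct mode: spell out all 8 cells

# Sample many random "mutations" (random genotypes) and count how
# often each distinct phenotype shows up.
counts = Counter()
for _ in range(100_000):
    g = ''.join(random.choice('01') for _ in range(9))
    counts[develop(g)] += 1

def is_symmetric(p):
    return p == p[::-1]

sym = [n for p, n in counts.items() if is_symmetric(p)]
asym = [n for p, n in counts.items() if not is_symmetric(p)]

# Each symmetric phenotype is reachable by far more genotypes, so it
# turns up roughly an order of magnitude more often per phenotype.
print(f"mean hits per symmetric phenotype:  {sum(sym)/len(sym):.0f}")
print(f"mean hits per asymmetric phenotype: {sum(asym)/len(asym):.0f}")
```

The bias is exactly the "exponential" kind the paper describes: there are only 16 possible mirror-built phenotypes versus 256 spelled-out ones, so before natural selection even gets a vote, random variation is already flooded with symmetric forms.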

So symmetry in general seems to have a significant enough advantage that we'd be likely to find it on other worlds.  Now, whether our own bilateral symmetry has some advantage of its own isn't clear; if we landed on the planets orbiting Proxima Centauri, would we find human-ish creatures like the aliens on Star Trek, who all looked like people wearing rubber masks (because they were)?  Or is it possible that we'd find something like H. P. Lovecraft's "Elder Things," which had five-way symmetry?

And note that even though the rest of its body has five-way symmetry, the artist drew it with bilateral wings. We're so used to bilateral symmetry that it's hard to imagine an animal with a different sort. [Image licensed under the Creative Commons: Представник_Старців ("A depiction of the Elder Things" -- fan drawing)]

So that's our fascinating bit of research for today; coding simplicity as an evolutionary driver.  It's a compelling idea, isn't it?  Perhaps life out there in the universe is way more similar to living things down here on Earth than we might have thought.  Think of that next time you're looking up at the stars -- maybe someone not so very different from you is looking back in this direction and thinking, "I wonder who might live on the planets orbiting that little star."