Skeptophilia (skep-to-fil-i-a) (n.) - the love of logical thought, skepticism, and thinking critically. Being an exploration of the applications of skeptical thinking to the world at large, with periodic excursions into linguistics, music, politics, cryptozoology, and why people keep seeing the face of Jesus on grilled cheese sandwiches.

Wednesday, June 12, 2024

Cloud collision

Contrary to what the medieval church wanted you to believe, the Earth is in constant motion.

They went to enormous lengths to stand by the principle that we're the center of the universe, motionless, while everything revolves around us in perfect circles.  Arrogant attitude, that.  Also wildly wrong.  Not only is the apparent motion of the stars at night caused by our own rotation, but the stars also aren't in quite the same positions at a given time from one night to the next, because we're revolving around the Sun.

Then along came Kepler, who showed that even the "perfect circles" part was wrong; the planets and their moons orbit in ellipses, not circles, some of them quite eccentric (eccentricity being the mathematicians' word for the degree to which an ellipse deviates from a circle).

It's even worse than that.  The Earth's axis precesses, wobbling like a spinning top, drawing out a circle in the sky once every twenty-six thousand years.  So Polaris hasn't always been the pole star, and at some point won't be any longer.  This fact was discovered by the Greek astronomer Hipparchus, but although the aforementioned church fathers loved the Greek philosophers -- they were especially fond of Aristotle -- they were also excellent at ignoring evidence that challenged their own worldviews, so Hipparchus's studies of axial precession were brushed aside.  The thirteenth-century Persian polymath Nasir al-Din al-Tusi studied astronomical records and came up with a value very close to our currently accepted precession rate, but the church fathers didn't much listen to the Muslims, either, so it wasn't until the eighteenth-century French mathematician Jean le Rond d'Alembert said, "No, really, guys, this precession thing is real" that people in the western world started to accept it.

The path of apparent precession of the pole star. The bright star at the bottom is Vega, which was the pole star twelve thousand years ago (and will be again in fourteen thousand years). [Image licensed under the Creative Commons Tauʻolunga, Precession N, CC BY-SA 2.5]

But even that's not the end of it, because the Sun (along with the rest of the Solar System) sits near the edge of one of the spiral arms of the Milky Way, traveling at about 230 kilometers per second in orbit around the galactic core.  This is a good clip -- it's only a bit under a thousandth of the speed of light -- but even so, the galaxy is so enormous it will take about 225,000,000 years to complete one orbit.  Put another way, the last time the Solar System was in this spot was the early Triassic Period -- right at the beginning of the "Age of Dinosaurs."
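Those numbers are easy to sanity-check.  Here's a minimal back-of-the-envelope sketch in Python; the Sun's orbital radius of roughly 26,000 light years is my own assumption (the figure isn't stated above), and the point is just that 230 km/s really is a bit under a thousandth of the speed of light, and really does imply an orbital period in the ballpark of 225 million years.

```python
import math

C_KM_S = 299_792.458      # speed of light, km/s
LY_KM = 9.4607e12         # kilometers in one light year
SEC_PER_YEAR = 3.156e7    # seconds in a year

v_sun = 230.0             # Sun's speed around the galactic core, km/s (from the post)
r_sun_ly = 26_000         # assumed orbital radius in light years (~8 kpc; not from the post)

print(f"Fraction of light speed: {v_sun / C_KM_S:.2e}")           # ~7.7e-04

circumference_km = 2 * math.pi * r_sun_ly * LY_KM
period_years = circumference_km / v_sun / SEC_PER_YEAR
print(f"Orbital period: {period_years / 1e6:.0f} million years")  # roughly 210-220 million
```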

It's this last motion that brings the topic up today, because a team led by Boston University astronomer Merav Opher has just found that the motion of the Sun and planets around the galactic center swept the Solar System through two successive clouds of cold gas and dust, hitting one about seven million years ago and another a little over two million years ago.  The clouds, which from our current perspective are in the constellation of Lynx, provided enough resistance that the heliosphere -- the region of space dominated by the outward pressure of material thrown off by the Sun -- shrank to the point that the planets were exposed to the dust of the interstellar medium.  This caused a spike of supernova-generated isotopes like iron-60 and plutonium-244 in cosmic dust trapped in sediments and ice layers here on Earth.

Opher's team found this cosmic dust in deposits of those ages everywhere they looked.  It was the fingerprint of a collision -- between the Solar System and a pair of clouds.

It's an open question what effect that had on the Earth.  The collisions happened just as our hominid ancestors were making their way out of the African savanna, so any additional flux of cosmic rays from being outside the heliopause doesn't seem to have done us any harm.  But it's a cool reminder that although we feel like the Earth is solid and unmoving beneath our feet, it's actually being spun around the universe like a little kid on a Tilt-A-Whirl.

But finally, there's even another layer on top of all the above, because the Milky Way and the entire Local Group are moving toward something called the "Great Attractor" at six hundred kilometers per second, over twice as fast as the Solar System's orbital velocity around the galactic center.  Presumably this is because of some sort of gravitational effect, but what sucks is that although we know the general direction where the Great Attractor is located, we don't even know what's there because it's directly on the opposite side of the center of our own galaxy.  In other words, we can't see where we're headed because the Milky Way is in the way.  

What it's in the way of remains to be seen.

So yeah.  The medieval church fathers were kind of spectacularly wrong.  The more we've learned, the weirder the universe gets, and the farther from the center of anything we appear to be.  It's better this way, though, because it gives us constant reminders of how grand and magnificent the universe is -- even if the inevitable consequence is a reminder of how tiny we are by comparison.

****************************************



Tuesday, June 11, 2024

Atmospheric rivers

If I asked you to name the deadliest single-event natural disaster to strike the western half of the United States in recorded history, what would you answer?

If I had to hazard a guess, I'd say most people would suggest the 1906 San Francisco earthquake.  This was a bad one, no doubt about it; an estimated three thousand people died, and most of the city was destroyed by the quake and the fires that followed it.  Another one that might come to mind is the eruption of Mount Saint Helens in 1980, but that one comes in far behind, at fifty-seven deaths.

The worst natural disaster in the western United States -- by a significant margin -- is one a lot of people haven't heard of.  In the winter of 1861-1862, an atmospheric river event turned the entire Central Valley of California into an enormous lake, submerging once-dry land under as much as ten meters of water.  Over a period of forty-five days, a hard-even-to-imagine three meters of rain fell in the Sierra Nevada Mountains and the surrounding area, draining into the lowlands far faster than it could run off.  Rivers overflowed their banks; some simply vanished under the expanding lake.  Although the middle part of the state bore the worst of it, devastating floods were recorded that year from northern Oregon all the way down to Los Angeles.

The exact death toll will probably never be known, but it's well over four thousand.  That's about one percent of the entire population of the state at the time.

A man named John Carr, writing in his memoir thirty years later, had this to say:

From November until the latter part of March there was a succession of storms and floods... The ground was covered with snow a foot deep, and on the mountains much deeper...  The water in the river ... seemed like some mighty uncontrollable monster of destruction broken away from its bonds, rushing uncontrollably on, and everywhere carrying ruin and destruction in its course.  When rising, the river seemed highest in the middle...  From the head settlement to the mouth of the Trinity River, for a distance of one hundred and fifty miles, everything was swept to destruction.  Not a bridge was left, or a mining-wheel or a sluice-box.  Parts of ranches and miners cabins met the same fate.  The labor of hundreds of men, and their savings of years, invested in bridges, mines and ranches, were all swept away.  In forty-eight hours the valley of the Trinity was left desolate.  The county never recovered from that disastrous flood.  Many of the mining-wheels and bridges were never rebuilt.

Many of the smaller towns never were, either.

Lithograph of K Street, Sacramento, California, in January of 1862 [Image is in the Public Domain]

What seems to have happened is that in rapid succession, a series of narrow plumes of moist tropical air were carried in off the Pacific.  These "atmospheric rivers" can carry an astonishing amount of water -- some of them have a greater flow rate than the Amazon River.  When they cross over land, sometimes they dissipate, raining out over a wide geographical area.  But the West Coast's odd geography -- two mountain ranges, the Coast Range/Cascades and the Sierra Nevada Mountains, running parallel to each other with a broad valley in between -- meant that as those plumes of moisture moved inland, they were forced upward in altitude (twice).  The drop in pressure and temperature as the air rose caused the water to condense, triggering a month-and-a-half-long rain event that drowned nearly the entire middle of the state.

The reason I bring this up is that the geological record indicates the Great Flood of 1861-62 was not a one-off.  Floods like this hit the region roughly once every century or so.

Only now, the Central Valley is home to 6.5 million people.  And one of the predictions of our best models of climate change is that the warm-up will make atmospheric river events more common.

When people think of deadly disasters, they usually come up with obvious and violent ones like earthquakes and volcanoes.  Certainly, those can be horrific; the 1976 earthquake in Tangshan, China killed an estimated three hundred thousand people.  But the two most dangerous kinds of natural disaster, both in terms of human lives lost and property damage, are floods and droughts -- opposite sides of the climatic coin, both of which are predicted to get dramatically worse if we don't somehow get a handle on the scale of fossil fuel burning.

I saw a quip making its way around social media a while back, that every disaster movie and horror flick starts with someone in charge ignoring a scientist.  There's some truth to that.  Unfortunately, we've not been very good at taking that message to heart.  We need to start listening -- and fast -- and learning from the lessons of the past.  Disasters like the Great California Flood will happen again, and now that we've stomped on the climatic accelerator, it will likely be sooner rather than later.

Let's hope we don't close our eyes to the potential for a catastrophe that will dwarf the one of more than 160 years ago by several orders of magnitude.

****************************************



Monday, June 10, 2024

Mirror image

One of the hallmarks of science is its falsifiability.  Models should generate predictions that are testable, allowing you to see if they conform to what we observe and measure of the real universe.  It's why science works as well as it does; ultimately, nature has the last word.

The problem is that there are certain realms of science that don't lend themselves all that well to experiment.  Paleontology, for example -- we're dependent on the fossils that happen to have survived and that we happen to find, and the genetic evidence from the descendants of those long-gone species, to piece together what the ancient world was like.  It's a little difficult to run an experiment on a triceratops.

An even more difficult one is cosmology -- the study of the origins and evolution of the universe as a whole.  After all, we only have the one universe to study, and are limited to the bits of it we can observe from here.  Not only that, but the farther out in space we look, the less clear the picture becomes.  By the time light gets here from a source ten billion light years away, it's attenuated by the inverse-square law and dramatically red-shifted by all the expanding space it traveled through on the way, which is why it takes the light-collecting capacity of the world's most powerful telescopes even to see it.
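As a minimal illustration of the inverse-square part of that problem, here's a short Python sketch comparing two sources of equal intrinsic luminosity, one nearby and one ten billion light years away.  The distances are illustrative numbers of my own, and the calculation deliberately ignores redshift and the expansion of space, which make the real dimming worse still.

```python
import math

def flux(luminosity, distance_ly):
    """Inverse-square law: received flux = L / (4 * pi * d^2)."""
    return luminosity / (4 * math.pi * distance_ly**2)

near = flux(1.0, 10)      # a source ten light years away
far = flux(1.0, 1e10)     # an identical source ten billion light years away

print(f"The distant source appears {near / far:.0e} times fainter")  # 1e+18
```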

None of this is meant as a criticism of cosmology, nor of cosmologists.  But the fact remains that they're trying to piece together the whole universe from a data set that makes what the paleontologists have look like an embarrassment of riches.

The result is that we're left with some massive mysteries, one of the most vexing of which is dark energy.  This is a placeholder name for the root cause of the runaway expansion of the universe, which (according to current models) accounts for 68% of the mass/energy content of the universe.  (Baryonic, or ordinary, matter is a mere 5%.)  And presently, we have no idea what it is.  Attempts either to detect dark energy directly, or to create a model that will account for observations without invoking its existence have, by and large, been unsuccessful. 

But that hasn't stopped the theorists from trying.  And the latest attempt to solve the puzzle is a curious one: that dark energy isn't necessary if you assume our universe has a partner universe that is a reflection of our own.  In that universe, three properties would be mirror images of the corresponding properties in ours: positive and negative charges would flip, spatial "handedness" (what physicists call parity) would be reversed, and time would run backwards.

Couldn't help but think of this, of course.


The idea is intriguing.  Naman Kumar, who authored the paper on the model, is enthusiastic about its potential for explaining the expansion of the universe.  "The results indicate that accelerated expansion is natural for a universe created in pairs," Kumar writes.  "Moreover, studying causal horizons can deepen our understanding of the universe.  The beauty of this idea lies in its simplicity and naturalness, setting it apart from existing explanations."

Which may well be true.  The difficulty, however, is that the partner universe isn't reachable (or even directly detectable) from our own, Lost in Space notwithstanding.  It makes me wonder how this will ever be more than just an interesting possibility -- an idea that is, in Wolfgang Pauli's often-quoted phrase, "not even wrong," because there's no way to test whether it accounts for the data any better than the other, less "natural" models do.

In any case, that's the latest from the cosmologists.  Mirror-image universes created in pairs may obviate the need for dark energy.  We'll see what smarter people than myself have to say about whether the claim holds water; or, maybe, just wait for Evil Major West With A Beard to show up and settle the matter once and for all.

****************************************



Saturday, June 8, 2024

Spin doctor

Yesterday's post, which featured a guy who claims he has revolutionized physics with a model starting from the axiom that 1 x 1 is actually equal to 2, prompted a long-time loyal reader of Skeptophilia to send me a link accompanied by the note, "Yeah, okay, Gordon, but what about this, huh?  What about this?"

The link was to the page of a guy named John Mandlbaur, a South African "investor and successful businessman" who at least admits up front that he is "not an academic."  This, when you start looking into his claims, is putting it mildly, because he is also claiming to have revolutionized physics, this time starting from the statement that the Law of Conservation of Angular Momentum is wrong.

At least he, unlike the guy in yesterday's post, is denying something that you learn in high school, not in third grade.

This, however, doesn't make his claim any more sensible.  Angular momentum, you may recall from physics class, is in the simplest case (like whirling a weight tied to a lightweight string in a circle above your head) the product of three quantities: the mass, the tangential velocity, and the radius.  And what the conservation law says is that in a closed system, that quantity doesn't change.  (Remember the "closed system" part, because that'll become important in a moment.)

The most common example of the Law of Conservation of Angular Momentum is the way a figure skater's rotation rate increases when they pull their arms in.  Bringing the arms in reduces the effective radius, so the velocity has to increase in proportion to keep the aforementioned product constant.  Most kids have seen this in effect, too; whirl a weight on a string, and if you pull the string to decrease the radius of the circle, the weight spins faster.
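To make the "product stays constant" point concrete, here's a minimal Python sketch -- the masses, speeds, and radii are made-up illustrative numbers, not anything from Mandlbaur's pages: hold L = m x v x r fixed, change the radius, and see what speed that forces.

```python
def new_speed(m, v1, r1, r2):
    """With angular momentum L = m * v * r conserved, return the tangential
    speed required after the radius changes from r1 to r2."""
    L = m * v1 * r1
    return L / (m * r2)

# A 0.5 kg weight moving at 2 m/s on a 1 m string:
print(new_speed(0.5, 2.0, 1.0, 0.5))   # halve the radius  -> 4.0 m/s (speed doubles)
print(new_speed(0.5, 2.0, 1.0, 2.0))   # double the radius -> 1.0 m/s (speed halves)
```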

[Image licensed under the Creative Commons Gyroskop, CC BY-SA 3.0]

This is where Mandlbaur starts leaping about making excited little squeaking noises, and telling us that this can't be true.  He goes through a "thought experiment" wherein he argues that angular momentum isn't conserved, because if you reduce the radius (pull on the string to decrease it to, say, one percent of its original length) the rotational velocity would have to increase by a ridiculous amount.  Because we never see that happen, the Law of Conservation of Angular Momentum must be wrong.

What this ignores is the "closed system" part I mentioned above.  Angular momentum is conserved only if there is no external torque -- and a mass moving that fast through the air damn well experiences one, courtesy of air resistance.  Plus, there's the little issue of the centripetal force -- put simply, how hard you'd have to pull on the string.  Centripetal force is given by the formula F = mass x velocity^2 / radius, so as the velocity rises and the radius decreases, both changes make it progressively harder and harder to hang on to the string.  Since this force is transmitted to the weight through the tension in the string, eventually the string breaks, and the weight goes flying off in a direction tangent to the circle until it meets an opposing force, like the windshield of your neighbor's car.
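And here's the follow-on, again with illustrative numbers of my own: if you insist on conserving L while shrinking the radius, the tension you'd need (F = m x v^2 / r) grows as 1/r^3, which is exactly why the string -- or your grip, or the "closed system" assumption itself -- gives out long before the speed gets ridiculous.

```python
def centripetal_force(m, v, r):
    """F = m * v**2 / r for a mass m on a circle of radius r at speed v."""
    return m * v**2 / r

m, v1, r1 = 0.5, 2.0, 1.0            # illustrative starting values (kg, m/s, m)
L = m * v1 * r1                      # angular momentum to be conserved

for fraction in (1.0, 0.1, 0.01):    # pull the string in to 100%, 10%, 1% of its length
    r = r1 * fraction
    v = L / (m * r)                  # speed demanded by conservation of L
    F = centripetal_force(m, v, r)
    print(f"r = {r:5.2f} m   v = {v:7.1f} m/s   F = {F:12.1f} N")

# F goes from 2 N to 2,000 N to 2,000,000 N -- a factor of a million at 1% of the
# radius -- so in practice the string breaks, or air drag (an external torque)
# spoils the "closed system" assumption, well before that point.
```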

A physicist named Val Rousseau did a much more thorough takedown of Mandlbaur's claims, and I won't steal his thunder by repeating all of his careful debunking.  Suffice it to say that Mandlbaur doesn't stop with trashing the Law of Conservation of Angular Momentum; he also says that Newton's Second Law, the Work-Energy Principle, both theories of relativity, and all of quantum mechanics are wrong.

Oh, and light actually has mass.

What's interesting about Mandlbaur is how combative he is.  Anyone who criticizes his work is "childish" and is engaging in "character assassination" or a "blatant ad hominem."  Sorry, dude, saying "you are wrong" is not an ad hominem when you are, in fact, wrong.  To say the experimental evidence lined up behind all of the laws he's happy to jettison is "mountainous" is the understatement of the year.

And yet... he has fervent followers.  He's a "maverick," they say, a courageous knight taking on the dragons of the hidebound scientific establishment.

I've never understood the compulsion people have to follow someone simply because they're anti-establishment.  Surely, it matters more if they're right.  Right?  By itself, being anti-establishment doesn't make you a knight, it makes you Don Quixote, tilting at windmills because you've decided they're monsters.

And if what you're claiming could be refuted in a high school physics classroom, I'm afraid you don't have a lot of cause to brag about how fearless you're being.

In any case, I urge you to take a look at Rousseau's site.  I'm deliberately not linking Mandlbaur's webpage because I'd prefer not to give him any additional traffic; you can find it if you're so inclined.  And if any of you are getting ready to @ me about how "the scientists have been wrong before!", don't waste your time.  Sure, they have, no question about that.  They were wrong about continental drift -- until the plate tectonics model was proposed.  They were wrong about the luminiferous aether -- until Einstein came along.  They were wrong about what caused malaria, cholera, and typhoid -- until the Germ Theory of Disease.  They were wrong about how inheritance worked -- until Mendel published his work on statistical genetics, and eventually a whole group of scientists uncovered the roles of DNA and RNA.

Get my point?  Sure, the scientists have been wrong sometimes, but they fixed it by coming up with a better theory.  Science works as well as it does because it self-corrects.  If your model doesn't fit the facts, it's superseded by one that does.  On the other hand, if you want to claim the current model is wrong, you damn well better be able to show that what you're proposing to replace it with fits the experimental data better than the one you're planning to trash.

So once again, we have a blowhard crank (okay, maybe that was an ad hominem... oh well) who thinks he knows better than all of the physicists from the last four hundred years.  I'm guessing if he finds out I wrote this, I'm going to get a stinger of a response, but I'm ready.

Just about every physicist from Newton on down has my back.

****************************************



Friday, June 7, 2024

The flood of nonsense

I'm going to say this straight up, in as unambiguous a fashion as I can manage:

Given the widespread availability of fact-checking websites, there is absolutely no excuse for passing along misinformation.

The topic comes up today because I recently ran into three claims online, which I present here in increasing order of ridiculousness, and in almost no cases were they accompanied by anyone saying, "But I don't think this is true."  I'm hoping that by highlighting these, I can accomplish two things -- putting a small dent in the number of people posting these claims on social media, and instilling at least a flicker of an intention to do better with what you choose to post in the future.

The first one I've seen mostly from my fellow Northeasterners, and it has to do with a spider.  Here's the most common post I've seen about this:


This statement -- which is almost verbatim the headline used by a number of supposedly reputable news sources -- is wildly misleading.  When you look into it, you find that the species in question is the joro spider (Trichonephila clavata), and while they are pretty big for a spider (the leg span can be around ten centimeters), nothing else about them is dangerous.  They're native to China and Japan, where people live around them in apparent harmony; while they do have venom, like all spiders, it's of low toxicity.  They're actually rather docile and reluctant to bite, and if they do, it's no worse than a bee sting.

And, for fuck's sake, they can't fly.  Flying requires wings, and if you'll look closely at the above photograph, you will see they don't have any.  Their tiny young do what is called "ballooning" (again, something many spider species do), creating a few silk threads and then catching a breeze to travel to a new locale.  So while they're definitely an invasive exotic species, and ecologists are concerned about their potential for out-competing native spider species, they pose about as close to zero threat to humans as you could get.

So put away the goddamn flamethrowers.

The second claim has to do with the information you can supposedly get from the color of the caps on your bottled water.  The idea here is that bottled water distributors have coded the caps -- blue caps are used for spring water, black caps for alkaline water, green caps for flavored water, and white caps for "processed water."

It's the last one that gave me a chuckle.  I damn sure hope the water you're drinking has been processed, and that Aquafina isn't just filling water bottles from the nearest river, screwing the caps on, and calling it good.  Apparently the impetus for the claim is that because consuming "highly-processed" food has been associated with some health issues, anything "processed" is bad for you, so you should avoid those bottles with white tops.

The whole thing, though, is complete nonsense.  There's no correlation between bottle top color and... anything.  All bottled water has been filtered and sterilized (and thus "processed").  And if you need a particular bottle top color to tell if you're drinking flavored water, there are some other issues you might want to address, preferably with your doctor.

The third, and most idiotic, of the claims came to me from my friend, the wonderful writer Andrew Butters.  Like me, Andrew is a thoroughgoing science nerd, and frequently finds himself doing facepalms over some of the stupid stuff people fall for.  He sent me a link to a video by theoretical physicist Sabine Hossenfelder about an actor named Terrence Howard, who recently wrote a book about his new model for physics, one that supposedly proves pretty much everything we thought we knew is wrong.  The basis of his model -- I swear I am not making this up -- starts from the proposition that 1 x 1 is actually equal to 2.

So Howard clearly (1) failed third grade math class, and (2) apparently has been doing sit-ups underneath parked cars.  And his "theory" (it makes me cringe even to use the word) would have vanished into the great murky morass of claims by unqualified laypeople to revolutionize all of science if it hadn't been for Joe Rogan, who gave the guy a platform and treated him as if he was the next Einstein.

Hossenfelder's takedown of Howard (and Rogan) is brilliantly acerbic, and is well worth watching in its entirety.  One line, though, stands out: "Joe Rogan isn't stupid, but he thinks his audience is."  Rogan's take on things is that Howard's ideas haven't caught on in the scientific community because the scientists are acting as gatekeepers -- rejecting ideas out of hand if they come from someone who is not In The Club.  This, of course, is nonsense; they aren't ignoring Howard's book because he's not a scientist, they're ignoring it because his claims are ridiculous.  This is not scientists acting as unfair gatekeepers; they simply know what the hell they're talking about because they've spent their entire careers studying it.

I had decided not to address Howard's claims, feeling that Hossenfelder did a masterful enough job by herself of knocking him and Rogan down simultaneously, and that anything I could add would be superfluous.  And, of course, given that Hossenfelder is a physicist, she is vastly more qualified than I am to address the physics end of it.  But since Andrew sent me the link, I've now seen Howard's claims pop up three more times, always along with some commentary about the Mean Nasty Scientists refusing to listen to an outsider, and this is why we don't trust the scientists, see?

Which, of course, made me see red, and is why you're reading about it here.  There's no grand conspiracy amongst the scientific establishment to silence amateurs; as we've seen here at Skeptophilia more than once, dedicated amateurs have made significant contributions to science.  No scientist would refuse to look at a revolutionary idea if it had merit.  Terrence Howard might well have mental problems, and be more to be pitied than censured, but Joe Rogan needs to just shut the hell up.

And, for the love of Gauss, 1 x 1 = 1 can be derived in one step from one of the fundamental axioms of arithmetic -- the multiplicative identity, which says that a x 1 = a for any number a.
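For the pedants in the audience, here's that one step written out formally -- a minimal sketch in Lean 4, assuming the core library's Nat.mul_one lemma (its statement of the identity axiom), instantiated at n = 1:

```lean
-- One step: the multiplicative identity, Nat.mul_one : ∀ n, n * 1 = n,
-- applied at n = 1, gives 1 * 1 = 1.
example : (1 : Nat) * 1 = 1 := Nat.mul_one 1
```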

So.  Anyhow.  I need to finish this up and go have a nice cup of tea and calm down.  But do me a favor, Gentle Readers.  If you see this kind of nonsense online, please please puhleeeez don't forward it.  If you feel comfortable doing so, tell the original poster "this is incorrect, and here's why."  And if you run into any odd claims online, do a two-minute fact check before you post them yourself.  Snopes and FactCheck.org remain two of the best places to find out if claims are true; there's no excuse for not using them.

Let's all do what we can to stem the tide of misinformation, before we all drown in it.

****************************************



Thursday, June 6, 2024

Altered flow

John McPhee's wonderful book The Control of Nature describes three attempts to alter naturally occurring geological processes: the shift of the course of the Mississippi River into the Atchafalaya River (which would leave New Orleans without a port); the lava flow from the 1973 eruption of Eldfell Volcano on the Icelandic island of Heimaey, which threatened to seal off the main town's only harbor; and the ongoing problem with landslides in the San Gabriel Mountains of California, which have been exacerbated by people's insistence on building multi-million-dollar homes in steep-sided canyons.

Of the three, only the Icelanders had a success story.  They halted the lava flow by pumping cold seawater onto it, and stopped it before it closed off the harbor completely; the tongue of solidified rock actually created a useful seawall.  The other two were, and still are, drastic failures.  The levee/spillway system in Louisiana, intended to keep the Mississippi in its channel and prevent it from switching over to the Atchafalaya's shorter and more direct path to the Gulf of Mexico, has caused more silting of the channel and subsidence of the land, both of which were direct contributors to the severity of the Hurricane Katrina disaster in 2005.  California still deals with landslides, despite their best efforts to contain them with various slope stabilization devices -- and rich people are still building their mansions right in harm's way.

A 33% success rate is not great, but it's pretty reflective of our attempts to control natural processes.  It's not that I'm saying what we do has no effect; the unfortunate part is that most of what we've tried hasn't worked, or has actually made the situation more dire.  The obvious case -- anthropogenic climate change -- is only one of many times we've messed around with things and come off very much the worse.

Although we're unique in the animal world in being able to control our environments to some extent, we're still very much at the mercy of the natural world.  Big, sudden cataclysms -- events like major earthquakes, volcanic eruptions, hurricanes, tornadoes, or floods -- are the most obvious examples, but sometimes slow, gradual processes can alter the course of history just as profoundly.  The fall of the Roman Empire, about which I've written a couple of times recently, may well have been triggered by a climatic shift causing freezing drought in the central Asian steppes, inducing the Huns to migrate west and starting a domino effect of invasions.  Certainly the rising and lowering of sea level as ice ages came and went altered migration patterns; both Australia and the Americas were colonized during periods when the areas now at the bottom of (respectively) the Gulf of Carpentaria and the Bering Sea were dry land.

The idea that climate has been a major driver for history has gone out of vogue, and is sneeringly referred to as "climate determinism" despite the fact that (1) there's no denying the vagaries of climate have had obvious and dramatic effects, and (2) no one has ever claimed that climate was the only thing affecting the course of events.  Consider, for example, some new research out of the University of Southampton that came out in Nature Geoscience this week.

Life in Egypt has always been dicey -- the valley of the Nile is thickly inhabited, but go more than a few miles east or west from it and you're in marginally habitable desert.  We all learned in elementary school how the ancient Egyptians survived by learning to manage what are always called the Nile's "life-giving floods" through irrigation channels and catchment basins, but the truth is, all it took was a dry year or two and the entire civilization was in deep trouble.

The situation changed -- for once, for the better -- about four thousand years ago, when the Nile shifted course and created the floodplain around Luxor.

The reason was the same as what John McPhee describes for the Mississippi, but with a happier outcome.  As rivers flow, they pick up sediment, and when they reach the sea and the water velocity slows down, that sediment is deposited on the river bottom.  This raises the bed, creating an impediment to water flow, slowing the water further and making it drop more sediment, and so on and so forth.  Eventually the delta becomes impassable, and the water is forced into another channel (unless people step in and try to stop it, as is happening with dubious success in Louisiana).

In southern Egypt, though, the switch in paths brought the flow of the Nile out over a broad, flat plain that prior to that had been high and dry.  The outflow into the Mediterranean moved east as well, and the outgoing river broke up into dozens of outflow channels.  This proved extraordinarily beneficial to the people living all along the river's northern half.  "The expansion of the floodplain greatly enlarged the area of arable land in the Nile Valley near Luxor (ancient Thebes) and improved the fertility of the soil by regularly depositing fertile silts," said Benjamin Pennington, who co-authored the paper.  "The Egyptian Nile we see today looks very different from how it would have been throughout much of the last 11,500 years.  For most of this time, the Nile was made up of a network of interwoven channels that frequently changed their course.  Around four thousand years ago, the Nile abruptly shifted and there was rapid floodplain aggradation, where the river began depositing large amounts of sediment, building up the valley floor.  This created a more expansive and stable floodplain."

The result was that the New Kingdom -- which included the reigns of famous pharaohs such as Ahmose I, Hatshepsut, Thutmose III, Amenhotep III, Akhenaten, and Tutankhamun -- had the resources to become one of the significant political powers of the region.

[Image licensed under the Creative Commons Mohammed Moussa, Ramses II in Luxor Temple, CC BY-SA 3.0]

Like McPhee's one-out-of-three success rate for humans trying to control nature, however, it bears keeping in mind that for every example of a natural event benefitting humans, there's one that didn't turn out so well for us.  The collapse of classical Mayan civilization in the eighth and ninth centuries C.E. was largely triggered by a prolonged drought; the onset of the Little Ice Age in the fourteenth created a perfect storm of conditions that fed into the Black Death killing one-third of the population of Europe.

However confident we are in our comfortable high-tech world keeping us safe, it's always good to remember how tenuous it is -- and the fact that in the long haul, Mother Nature is still very much in charge.

****************************************



Wednesday, June 5, 2024

Lingua franca

Here's a question I wonder if you've ever pondered:

Why do the Spanish and French speak Romance languages and not Germanic ones?

It's not as weird a consideration as it might appear at first.  By the time the Western Roman Empire collapsed in the last part of the fifth century C.E., the entire western part of Europe had been completely overrun by Germanic tribes -- the Franks, the Burgundians, and especially the Visigoths.  This latter group ended up controlling pretty much all of southern France and nearly the entirety of Spain, and their king, Euric, ruled the whole territory from his capital at Toulouse.  It was another Germanic leader, Odoacer, who deposed the last Western Roman emperor, poor little Romulus Augustulus, in 476 -- but, showing unusual mercy, sent him off to a (very) early retirement at a villa in Campania, where he spent the rest of his life.  That nobody felt the need to execute the kid is a good indicator of how solidly the Germanic newcomers -- Euric and the Visigoths very much among them -- were in control.

So the Germanic-speaking Goths more or less took over, and not long after that the (also Germanic) Franks and Burgundians came into northern France and established their own territories there.  The country of France is even named after the Franks; but their language, Franconian, never really took hold inside its borders.

Contrast this to what happened in England.  The Celtic natives, who spoke a variety of Brythonic dialects related to Welsh and Cornish, were invaded during the reign of the Emperor Claudius in the year 43 C.E., and eventually Rome controlled Britain north to Hadrian's Wall.  But when all hell broke loose in the fifth century, and the Roman legions said, "Sorry, y'all'll have to deal with these Saxons on your own" and hauled ass back home, the invaders' Germanic language became the lingua franca (pun intended) of the southern half of the island, with the exception of the aforementioned Welsh and Cornish holdouts.

All three places had been Roman colonies.  So why did France and Spain end up speaking Romance languages, and England a Germanic one?

The easier question is the last bit.  Britain never was as thoroughly Romanized as the rest of western Europe; it always was kind of a wild-west frontier outpost, and a great many of the Celtic tribes the Romans tried to pacify rebelled again and again.  When the Roman troops withdrew, there weren't a lot of speakers of Latin left -- the exceptions being monasteries and churches.  Most of the locals had retained their original languages, and when the British Celts told the troops "Romani ite domum" (more or less), they just picked up where they'd left off.


The problem was, when the Angles and Saxons started arriving in huge numbers over the next two centuries, there wasn't a single dominant language there to stand up against them -- just a bunch of various dialects spoken by tribes that never were all that numerous, and didn't get along very well with each other anyhow.  So the West Germanic language the invaders spoke became the common language, eventually evolving into Old English.

The situation was different in France and Spain.  By the fifth century, both had been solidly Roman for more than four hundred years.  The Celtic/Gaulish natives were by this time thoroughly subjugated, and many had even thrown their lot in with the conquerors, rising to become important figures.  (One example is the first-century-B.C.E. writer and polymath Gnaeus Pompeius Trogus, who despite his Roman name was from the Celtic Vocontii tribe in the western foothills of the Alps.)  Business, record-keeping, and administration were all conducted in Latin; most of the cities were predominantly Latin-speaking.

The Germanic tribes who swept through western Europe in the fourth and fifth centuries had an interesting attitude.  They didn't want to destroy everything the Romans had built; they just wanted to control it, and have access to all the wealth and land.  They didn't even care if the Roman town-dwellers stayed put, as long as they acknowledged the Goths' overlordship.  (Which almost all of them did, given that there were no other options.  Practical folks, the Romans.)

The invading Visigoths, Franks, and Burgundians had essentially no written tradition of their own (the main exception being Wulfila's fourth-century Gothic translation of the Bible), so when they settled in to rule the place -- and most importantly, to do business with the local landowners -- their only real option was to learn Latin.  Latin became the prestige language, the language you learned if you wanted to go places, much the way English is now in many parts of the world.

The result was that Latin-derived Old French and Old Spanish were eventually adopted by the Germanic-descended ruling class, ultimately being spoken throughout the region, while the opposite pattern had happened across the Channel in England.  Interesting that the Franks gave their name to the country of France and its language, but the only modern language descended from Franconian is one spoken two countries northeast of there -- Dutch.

It's always fascinating to me to see how chance events alter the course of history.  You can easily see how it could have gone the other way -- the Visigoths might have been more determined to eradicate every trace of Romanness, the way so many conquerors have done.  Instead, they saw the value in leaving it substantially intact.  Not because they had such deep respect for other cultures -- they weren't so forward thinking as all that -- but because they recognized that they could use the Roman knowledge, language, and infrastructure for their own gain.  The result is that my Celto-Germanic ancestors spoke a language derived from Latin, even though by that time it was about the only Roman thing about them.

****************************************



Tuesday, June 4, 2024

The oasis

I've always thought it was astonishing that anything short of extremely cold-adapted species could make it through an ice age.

During the last major glacial period, which peaked about twenty-one thousand years ago, the spot where I'm sitting right now was under a thirty-meter-thick layer of ice.  In fact, the hills about fifty kilometers south of us -- the Elmira Moraine -- mark the terminus of the glacier, where rocks, gravel, and soil that had been pushed forward by the advancing ice sheet got left behind as it melted.  During this period, the average global temperature was 6 C colder than it is now, and so much water was locked up as ice that the sea level was over a hundred meters lower than it is today.

My picture of how species survived (excluding the aforementioned cold-lovers) was that everything shifted range toward lower latitudes as the temperature cooled and the ice advanced, then reversed the process as the glacial period ended and the ice receded.  Species that couldn't shift quickly enough, or for which the climatic changes happened too fast to adapt, became extinct.  But according to a paper last week in Science Advances, the picture may not have been quite so simple.

One clue that our understanding was incomplete had to do with genetic diversity.  For a lot of species, we have a pretty good understanding of how quickly genetic mutations accrue, so looking at the genetic makeup of various populations within a species gives you an estimate of how long ago they had a common ancestor.  (It also tells you how closely each of those populations is related to the others.)  And in Europe, the populations of warmth-loving tree species like oaks suggested strongly that modern individuals weren't all descended from southern survivors that gradually expanded their ranges back northward as the glacial period ended twenty-odd thousand years ago.  Their genetic diversity was too high for that to be plausible -- and some of the northern populations of modern oaks seemed to be a genetic cluster only distantly related to their southern cousins.
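The underlying arithmetic of that "molecular clock" reasoning is simple enough to sketch.  Here's a minimal Python version -- the numbers are purely illustrative, not from the oak studies or the Science Advances paper -- based on the standard approximation that two lineages accumulate mutations independently after they split.

```python
def divergence_time_years(diffs_per_site, subs_per_site_per_year):
    """Rough molecular-clock estimate: after two populations split, each lineage
    accumulates mutations on its own, so the time back to their common ancestor
    is roughly (observed divergence) / (2 * substitution rate)."""
    return diffs_per_site / (2.0 * subs_per_site_per_year)

# Illustrative only: 0.2% sequence divergence at a rate of 1e-9 substitutions
# per site per year puts the common ancestor about a million years back.
print(f"{divergence_time_years(0.002, 1e-9):,.0f} years")   # 1,000,000
```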

Fossils from the Czech Republic strongly suggest that what happened was that patches of the original forest were able to survive, clustered around hot springs that even at the height of the glacial period never froze over.  Geologist Jan Hošek of the Czech Geological Survey, who was lead author of the paper, found fossils of warmth-loving tree species preserved in geyserite -- a sedimentary rock produced by hot water dissolving and then depositing layers of opaline silica on exposed surfaces.  The hot springs created an oasis covering an estimated fifty square kilometers.  Not huge, but enough that a population of oaks and other temperate woodland plants (and presumably the animals they hosted) were able to survive the worst of the cold.

Artist's conception of the hot spring refugium [Image credit: artist Jiří Svoboda]

Being a warmth-lover myself, I always find it astonishing that species made it through some of these climatic extremes.  Not only the cold ones, of course; episodes like the Paleocene-Eocene Thermal Maximum, when the global temperature was 8 C above what it is now, can't have been pleasant, either.  But the recent discoveries show that given even a small refuge, living things will hang on despite all odds.

As Ian Malcolm famously put it, "Life, uh, finds a way."

****************************************



Monday, June 3, 2024

Inside the bubble

A couple of nights ago, my wife and I watched the latest episode in the current series of Doctor Who, "Dot and Bubble."  [Nota bene: this post will contain spoilers -- if you intend to watch it, you should do so first, then come back and read this afterward.]

All I'd heard about it before watching is that it is "really disturbing."  That's putting it mildly.  Mind you, there's no gore; even the monsters are no worse than the usual Doctor Who fare.  But the social commentary it makes puts it up there with episodes like "Midnight," "Cold Blood," and "The Almost People" for leaving you shaken and a little sick inside.

The story focuses on the character of Lindy, brilliantly played by Callie Cooke, who is one of the residents of "Finetime."  Finetime is basically a gated summer camp for spoiled rich kids, where they do some nominal work for two hours a day and spend the rest of the time playing.  Each of the residents is surrounded, just about every waking moment, by a virtual-reality shell showing all their online friends -- the "bubble" of the title -- and the "work" each of them does is mostly to keep their bubbles fully charged so they don't miss anything.


The tension starts to ramp up when the Doctor and his companion, Ruby Sunday, show up unannounced in Lindy's bubble, warning her that people in Finetime are disappearing.  At first she doesn't believe it, but when forced to look people up, she finds that an abnormal number of them are offline -- something she hadn't noticed, because the only people she ever sees are the ones who are online, so she had no idea how many in her bubble had vanished.  She's dismissive of Ruby and downright rude to the Doctor, but eventually is driven to the realization that there are monsters eating the inhabitants of Finetime one by one.

Reluctantly accepting guidance from the Doctor, she runs for one of the conduits that pass under the city, which will give her a way out of the boundaries into the "Wild Wood," the untamed forests outside the barrier.  Along the way, though, we begin to see that Lindy isn't quite the vapid innocent we took her for at first.  She coldly and unhesitatingly sacrifices the life of a young man who had tried to help her in order to save her own; when she finds out that the monsters had already killed everyone in her home world, including her own mother, she basically shrugs her shoulders, concluding that since they were in a "happier place" it was all just hunky-dory.

It was the end, though, that was a sucker punch I never saw coming.  When she finally meets up with the Doctor and Ruby in person, and the Doctor tells her (and a few other survivors) that they have zero chance of surviving in the Wild Wood without his help, she blithely rejects his offer.

"We can't travel with you," she says, looking at him as if he were subhuman.  "You, sir, are not one of us.  You were kind -- although it was your duty to save me.  Screen-to-screen contact is just about acceptable.  But in person?  That's impossible."

In forty-five minutes, a character who started out seeming simply spoiled, empty-headed, and shallow moved into the territory of "amoral" and finally into outright evil.  That this transformation was so convincing is, once again, due to Callie Cooke's amazing portrayal.

What has stuck with me, though, and the reason I'm writing about it today, is that the morning after I watched it, I took a look at a few online reviews of the episode.  They were pretty uniformly positive (and just about everyone agreed that it was disturbing as hell), but what is fascinating -- and more than a little disturbing in its own right -- is the difference between the reactions of the reviewers who are White and the ones who are Black.

Across the board, the White reviewers thought the take-home message of "Dot and Bubble" is "social media = bad."  Or, at least, social media addiction = bad.  If so, the moral to the story is (to quote Seán Ferrick of the YouTube channel WhoCulture) "as subtle as a brick to the face."  The racism implicit in Lindy's rejection of the Doctor was a shocking twist at the end, adding another layer of yuck to an already awful character.

The Black reviewers?  They were unanimous that the main theme throughout the story is racism (even though race was never once mentioned explicitly by any of the characters).  In the very first scene, it was blatantly obvious to them that every last one of Lindy's online friends is White -- many of them almost stereotypically so.  Unlike the White reviewers, the Black reviewers saw the ending coming from a mile off.  Many of them spoke of having dealt all their lives with sneering, race-based microaggressions -- like Lindy's being willing at least to talk to Ruby (who is White) while rejecting the Doctor (who is Black) out of hand.

When considering "Dot and Bubble," it's easy to stop at it being a rather ham-handed commentary on social media, but really, it's about echo chambers.  Surround yourself for long enough with people who think like you, act like you, and look like you, and you start to believe the people who don't share those characteristics are less than you.

What disturbs me the most is that I didn't see the obvious clues that writer Russell T. Davies left us, either.  When Lindy listens to Ruby and rejects the Doctor, it honestly didn't occur to me that the reason could be the color of his skin.  I didn't even notice that all Lindy's friends were White.  As a result, the ending completely caught me off guard.  As for the subtle (and not-so-subtle) racist overtones of the characters in the episode, I wasn't even aware of them except in retrospect.

But that's one of the hallmarks of privilege, isn't it?  You're not aware of it because you don't have to be.  As a White male, there are issues of safety, security, and acceptance I never even have to think about.  So I guess like Lindy and the other residents of Finetime, I also live in my own bubble, surrounded by people who (mostly) think like I do, never having to stretch myself to consider, "What would it be like if I was standing where they are?"

And what makes the character of Lindy so horrific is that even offered the opportunity to do that -- to step outside of her bubble and broaden her mind a little -- she rejects it.  Even if it means losing the aid of the one person who is able to help her, and without whose assistance she is very likely not to survive.

For myself, my initial blindness to what "Dot and Bubble" was saying was a chilling reminder to keep pushing my own boundaries.  In the end, all I can do is what poet Maya Angelou tells us: "Do the best you can until you know better.  Then, when you know better, do better."

****************************************