Skeptophilia (skep-to-fil-i-a) (n.) - the love of logical thought, skepticism, and thinking critically. Being an exploration of the applications of skeptical thinking to the world at large, with periodic excursions into linguistics, music, politics, cryptozoology, and why people keep seeing the face of Jesus on grilled cheese sandwiches.

Tuesday, February 20, 2024

Dream a little dream of me

In one of my favorite novels, The Lathe of Heaven by Ursula K. Le Guin, the main character -- an unassuming man named George Orr -- figures out that when he dreams, his dreams change reality.  The problem is that each change also alters everyone else's memories of what had happened, so he's the only one who realizes anything has changed.

At first, of course, he doesn't believe it.  He must be remembering wrong.  Then, once he becomes convinced it's actually happening, he starts taking drugs to try to stop himself from dreaming, but they don't work.  As a last resort, he seeks help from a psychologist...

... but the psychologist realizes how powerful this ability could be, and starts guiding George into dreams that will shape the world into what he wants it to be.

It's a powerful cautionary tale about what happens when an unscrupulous person gains control over someone with a valuable talent.  Power corrupts, as the oft-quoted line from John Dalberg-Acton goes, and absolute power corrupts absolutely.

I couldn't help thinking about The Lathe of Heaven when I read about some new research on lucid dreaming taking place at REMSpace, a California startup, which will soon be featured in a paper in the International Journal of Dream Research (a preprint is available at the link provided).  A lucid dream is one in which you are aware that you're dreaming while you're dreaming, and often have some degree of control over what happens.  Around twenty percent of people report regular lucid dreaming, and there is some research suggesting that many of us can learn to do it.

Dickens's Dream by Robert W. Buss (1875) [Image is in the Public Domain]

At this point, I'll interject that despite a long history of very vivid dreams, I've never had a lucid dream.  I did have an almost-lucid dream, once; it was a weird and involved story about being a groomsman in a wedding in a big cathedral, and when the priest said the whole "does anyone have any objections?" thing, a gaudily-dressed old lady in the front row stood up and started shouting about what an asshole the groom was and how the bride could do way better.  And I'm standing there, feeling horrified and uncomfortable, and I thought, "This is bizarre!  How could this be happening?  Is this a dream?"  So I kind of looked around, then patted myself to reassure myself that I was solid, and thought, "Nope.  I guess this is real."

So the one time I actually considered the question of whether I was dreaming, I got the wrong answer.

But I digress.

Anyhow, the researchers at REMSpace took a group of test subjects who all reported being able to lucid dream, hooked them up to electromyography and electroencephalography sensors -- which measure, respectively, the electrical discharge from voluntary muscle contractions and neural firing in the brain -- and gave them the pre-sleep suggestion that they would dream about driving a car.  Using the output from the sensors, the researchers created a virtual avatar of each person on a computer screen, and found that the dreamers were able to use tiny motions of their hands to steer it, and even to avoid obstacles.

"Two-way interaction with a computer from dreams opens up a whole area of new technologies," said Michael Raduga, who led the experiment.  "Now, these developments are crude, but soon they will change the idea of human capabilities."

Maybe so, but it also puts the dreamer in the hands of the experimenter.  Now, I'm not saying Michael Raduga and his team are up to anything nefarious; and obviously I don't believe anyone's got the George-Orr-like ability to change reality to conform to what they dream.  But does anyone else have the feeling that "two-way interaction" into your dreams is potentially problematic?  I've heard a lot of people say things like, "hypnosis isn't dangerous, you can't be given a post-hypnotic suggestion that induces you to do something you wouldn't ordinarily do," but if there's one thing my knowledge of neuroscience has taught me, it's that the human brain is highly suggestible.

So as interested as I am in lucid dreaming, I'm not ready to sign up to have my dreams interacted with by a computer controlled by someone else.  And I hope like hell that when Raduga and his group at REMSpace start "changing the idea of human capabilities," they are extremely careful.

Anyway, that's our interesting-but-a-little-scary research for today.  Me, I'm gonna stick with my ordinary old dreams, which are peculiar enough.  And given my failure at detecting a potentially lucid dream when I had the chance, I doubt I'd be all that good at it in any case.  I'd probably drive my virtual dream car right into a telephone pole.

****************************************



Monday, February 19, 2024

The viral accelerator

It's virus season, which thus far I've managed to avoid participating in; but it seems like half the people I see are hacking and snorting and coughing, so even with caution and mask-wearing I figure it's only a matter of time.  Viruses are odd beasts: they're obligate intracellular parasites, doing their evil work by hijacking your cellular machinery and using it to make more viruses.  Furthermore, they lack virtually all of the structures that cells have, including cell membranes, cytoplasm, and organelles.  They really are more like self-replicating chemicals than like living things.

Simian Polyoma Virus 40 [Image licensed under the Creative Commons Phoebus87 at English Wikipedia, Symian virus, CC BY-SA 3.0]

What is even stranger about viruses is that while some of the more familiar ones -- colds, flu, measles -- invade the host, make them sick, and eventually (with luck) are cleared from the body, others leave behind remnants that can make their presence known later.  This behavior is what makes the herpes family of viruses so insidious.  If you've been infected once, you are infected for life, and the latent viruses hidden in your cells can cause another eruption of symptoms, sometimes decades later.

Even weirder is when those latent viral remnants cause havoc in a completely different way than the original infection did.  A viral fragment called HERV-W (human endogenous retrovirus W), left in the DNA of many of us, can if activated trigger multiple sclerosis or schizophrenia.  Another virus, Coxsackie, has an apparent connection to type-1 diabetes and Sjögren's syndrome.  The usual sense is that all viral infections, latent or not, are damaging to the host.  So it was quite a shock to me to read a piece of recent research showing that there's a viral remnant that is not only beneficial but critical for creating myelin -- the coating of our nerve cells that is essential for speeding up nerve transmission!

The paper -- which appeared last week in the journal Cell -- is by a team led by Tanay Ghosh of the Cambridge Institute of Science, and looked at a gene called RetroMyelin.  This gene is one of an estimated forty (!) percent of our genome that is made up of retrotransposons, DNA that was inserted by viruses during evolutionary history.  Or, looking at it another way, genes that made their way to us using a virus as a carrier.  Once inside our genome, transposons begin to do what they do best -- making copies of themselves and moving around.  Most retrovirus-introduced elements are deleterious; HIV and feline leukemia, after all, are caused by retroviruses.  But sometimes, the product of a retroviral gene turns out to be pretty critical, and that's what happened with RetroMyelin.

Myelin is a phospholipid/protein mixture that surrounds a great many of the nerves in vertebrates.  Not only does it act as an insulator, preventing the ion-flow changes that carry a nerve signal from "short-circuiting" into adjacent neurons; it is also the key to saltatory conduction -- the jumping of neural signals down the axon, which can increase transmission speed by a factor of fifty.  So this viral gene acted a bit like a neural accelerator, and gave the animals that had it a serious selective advantage.

"Retroviruses were required for vertebrate evolution to take off," said senior author and neuroscientist Robin Franklin, in an interview in Science Daily.  "There's been an evolutionary drive to make impulse conduction of our axons quicker because having quicker impulse conduction means you can catch things or flee from things more rapidly.  If we didn't have retroviruses sticking their sequences into the vertebrate genome, then myelination wouldn't have happened, and without myelination, the whole diversity of vertebrates as we know it would never have happened."

The only vertebrates that don't have myelin are the jawless fish, such as lampreys and hagfish -- so it's thought that the retroviral infection that gave us the myelin gene occurred around the same time that jaws evolved on our branch of the vertebrate family tree, on the order of four hundred million years ago.

So even some fundamental (and critical) traits shared by virtually all vertebrates, like the myelin sheaths that surround our neurons, are the result of viral infections.  Just proving that not all of 'em are bad.  Something to think about the next time you feel a sore throat coming on.

****************************************



Saturday, February 17, 2024

All set

How long is the coastline of Britain?

Answer: as long as you want it to be.

This is not some kind of abstruse joke, and if it sounds like it, blame the mathematicians.  This is what's known as the coastline paradox, which is not so much a paradox as a property of anything that is a fractal.  Fractals are patterns that never "smooth out" when you zoom in on them; no matter how small a piece you magnify, it has just as many bends and turns as the larger piece did.

And coastlines are like that.  Consider measuring the coastline of Britain by placing dots on the coast one hundred kilometers apart -- in other words, using a straight ruler one hundred kilometers long.  If you do this, you find that the coastline is around 2,800 kilometers long.

[Image licensed under the Creative Commons Britain-fractal-coastline-100km , CC BY-SA 3.0]

But if your ruler is only fifty kilometers long, you get about 3,400 kilometers -- not an insignificant difference.

[Image licensed under the Creative Commons Britain-fractal-coastline-50km, CC BY-SA 3.0]

The smaller your ruler, the longer your measurement of the coastline.  At some point, you're measuring the twists and turns around every tiny irregularity along the coast -- but why stop there?  Should you curve around every individual pebble and grain of sand?

At some point, the practical aspects get a little ridiculous.  The movement of the ocean makes the exact position of the coastline vague anyhow.  But with a true fractal, we get into one of the weirdest notions there is: infinity.  True fractals, such as the ones investigated by Benoit B. Mandelbrot, have an infinite length, because no matter how deeply you plunge into them, they have still finer structure.
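You can watch this happen numerically.  Here's a minimal sketch (the function names are my own) that builds a Koch curve -- a true fractal -- and then "measures" it with straight rulers of shrinking length, just like the hundred- and fifty-kilometer rulers laid along the coast of Britain:

```python
import math

def koch(p, q, depth):
    """Recursively generate the points of a Koch curve from p to q."""
    if depth == 0:
        return [p]
    (x0, y0), (x1, y1) = p, q
    dx, dy = (x1 - x0) / 3, (y1 - y0) / 3
    a = (x0 + dx, y0 + dy)          # one-third point
    b = (x1 - dx, y1 - dy)          # two-thirds point
    mx, my = (a[0] + b[0]) / 2, (a[1] + b[1]) / 2
    h = math.sqrt(3) / 2
    c = (mx - dy * h, my + dx * h)  # apex of the equilateral bump
    points = []
    for s, t in [(p, a), (a, c), (c, b), (b, q)]:
        points += koch(s, t, depth - 1)
    return points

def ruler_length(points, ruler):
    """Walk along the curve with a straight ruler of fixed length,
    counting how many ruler-lengths fit end to end."""
    total, anchor = 0.0, points[0]
    for pt in points[1:]:
        if math.dist(anchor, pt) >= ruler:
            total += ruler
            anchor = pt
    return total

curve = koch((0.0, 0.0), (1.0, 0.0), 6) + [(1.0, 0.0)]
for r in [0.3, 0.1, 0.03, 0.01]:
    print(f"ruler {r:5}: measured length {ruler_length(curve, r):.2f}")
```

The measured length keeps growing as the ruler shrinks; for an ideal Koch curve each threefold refinement of the ruler multiplies the length by 4/3, so the length diverges as the ruler goes to zero -- exactly the sense in which a true fractal coastline is "as long as you want it to be."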

Oh, by the way: do you know what the B. in "Benoit B. Mandelbrot" stands for?  It stands for "Benoit B. Mandelbrot."

Thanks, you're a great audience.  I'll be here all week.

The idea of infinity has been a thorn in the side of mathematicians for as long as anyone's considered the question, to the point that a lot of them threw their hands in the air and said, "the infinite is the realm of God," and left it at that.  Just trying to wrap your head around what it means is daunting:

Teacher: Is there a largest number?
Student: Yes. It's 10,732,210.
Teacher: What about 10,732,211?
Student: Well, I was close.

It wasn't until German mathematician Georg Cantor took a crack at refining what infinity means -- and along the way, created set theory -- that we began to see how peculiar it really is.  (Despite Cantor's genius, and the careful way he went about his proofs, a lot of mathematicians of his time dismissed his work as ridiculous.  Leopold Kronecker called Cantor not only "a scientific charlatan" and a "renegade," but "a corrupter of youth"!)

Cantor started by defining what we mean by cardinality -- the number of members of a set.  This is easy enough to figure out for a finite set, but what about an infinite one?  Cantor said two sets have the same cardinality if you can put their members into a one-to-one correspondence without leaving any out, and that this works for infinite sets as well as finite ones.  For example, Cantor showed that the number of natural numbers and the number of even numbers are the same (even though it seems like there should be twice as many natural numbers!), because you can put them into a one-to-one correspondence:

1 <-> 2
2 <-> 4
3 <-> 6
4 <-> 8
etc.
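That correspondence is just the function n → 2n.  A tiny sketch (names mine) to make the point concrete -- the map is one-to-one and hits every even number, so nothing in either set is left unpaired:

```python
def to_even(n: int) -> int:
    """Pair the natural number n with the even number 2n."""
    return 2 * n

def from_even(m: int) -> int:
    """Pair the even number m back with the natural number m/2."""
    return m // 2

for n in range(1, 5):
    print(n, "<->", to_even(n))

# The pairing round-trips perfectly: no natural number and no even
# number is ever left out, which is Cantor's criterion for the two
# sets having the same cardinality.
assert all(from_even(to_even(n)) == n for n in range(1, 10_000))
```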

Weird as it sounds, the number of fractions (rational numbers) has exactly the same cardinality as well -- there are the same number of possible fractions as there are natural numbers.  Cantor proved this as well, using an argument called Cantor's snake:


Because you can match each of them to the natural numbers, starting in the upper left and proceeding along the blue lines, and none will be left out along the way, the two sets have exactly the same cardinality.
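The snake is easy to sketch in code.  This version (names mine) walks the grid of fractions one diagonal at a time -- every fraction p/q lives on the diagonal where p + q is constant -- and skips duplicates like 2/2; the zigzag reversal of direction on alternate diagonals is omitted, since any order that eventually reaches every fraction establishes the correspondence:

```python
from fractions import Fraction

def enumerate_rationals(count):
    """Return the first `count` positive rationals, walking the grid
    of numerator/denominator pairs diagonal by diagonal and skipping
    values (like 2/2) that duplicate an earlier fraction."""
    seen, out = set(), []
    diagonal = 2  # numerator + denominator is constant along a diagonal
    while len(out) < count:
        for p in range(1, diagonal):
            f = Fraction(p, diagonal - p)
            if f not in seen:
                seen.add(f)
                out.append(f)
        diagonal += 1
    return out[:count]

print(enumerate_rationals(10))
```

Because every positive fraction p/q sits on diagonal p + q, each one is reached after finitely many steps -- which is all "countable" means.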

It was when Cantor got to the real numbers that the problems started.  The real numbers are the set of all possible decimals (including ones like π and e that never repeat and never terminate).  Let's say you thought you had a list (infinitely long, of course) of all the possible decimals, and since you believe it's a complete list, you claimed that you could match it one-to-one with the natural numbers.  Here's the beginning of your list:

7.0000000000...
0.1010101010....
3.1415926535...
1.4142135623...
2.7182818284...

Cantor used what is called the "diagonal argument" to show that any such list will always be missing members -- and therefore the set of real numbers is not countable.  His proof is clever and subtle.  Take the first digit of the first number in the list, and add one (with 9 wrapping around to 0).  Do the same for the second digit of the second number, the third digit of the third number, and so on.  (The first five digits of the new number built from the list above would be 8.2553...)  The number you've created can't be anywhere on the list, because it differs from every single number on the list in at least one digit.
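The diagonal construction itself is only a few lines.  Here's a sketch (names mine) applied to the five numbers above, with decimal points stripped so each number is just a string of digits, and with "add one" taken so that 9 wraps around to 0:

```python
def diagonal_escape(rows):
    """Given the opening rows of a claimed-complete list of decimals
    (as digit strings), build a digit string that differs from row i
    in its i-th digit -- so the number can't be anywhere on the list."""
    return "".join(str((int(row[i]) + 1) % 10)  # bump digit i; 9 wraps to 0
                   for i, row in enumerate(rows))

# The five numbers from the list above, decimal points removed:
rows = [
    "70000000000",  # 7.0000000000...
    "01010101010",  # 0.1010101010...
    "31415926535",  # 3.1415926535...
    "14142135623",  # 1.4142135623...
    "27182818284",  # 2.7182818284...
]

missing = diagonal_escape(rows)
print(missing)  # -> 82553, matching the 8.2553... worked out above

# The escape number disagrees with every row at the diagonal position.
assert all(missing[i] != row[i] for i, row in enumerate(rows))
```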

So there are at least two kinds of infinity: countable infinities, like the number of natural numbers and the number of rational numbers, and uncountable infinities, like the number of real numbers.  Cantor used the symbol aleph-null (ℵ₀) to represent a countable infinity, and the symbol c (for continuum) to represent the uncountable infinity of the reals.

Then there's the question of whether there are any types of infinity larger than ℵ₀ but smaller than c.  The claim that the answer is "no" is called the continuum hypothesis, and its fate turned out to be one of the strangest in mathematics: Kurt Gödel and, later, Paul Cohen showed that it can neither be proved nor disproved from the standard axioms of set theory.  It's a concrete cousin of the hobgoblins predicted by Gödel's Incompleteness Theorem back in 1931, which rigorously showed that a consistent mathematical system rich enough to contain arithmetic can never be complete -- there will always be true mathematical statements that cannot be proven from within the system.

So that's probably enough mind-blowing mathematics for one day.  I find it all fascinating, even though I don't have anywhere near the IQ necessary to understand it at any depth.  My brain kind of crapped out somewhere around Calculus 3, thus dooming my prospects of a career as a physicist.  But it's fun to dabble my toes in it.

Preferably somewhere along the coastline of Cornwall.  However long it actually turns out to be.

****************************************



Friday, February 16, 2024

The vanished legion

In the Doctor Who episode "The Eaters of Light," the Twelfth Doctor and his companions, Bill and Nardole, go back to second-century Scotland to settle a dispute they're having over what actually happened to the Roman Ninth Legion (the Legio IX Hispana), which was deployed in the British Isles during the first century but rather suddenly disappears from the records around 120 C.E.

Being Doctor Who, of course there are aliens involved -- a mysterious and powerful creature that feeds off of light, and which the native Picts knew how to control, but the attacks by the Romans (specifically the Ninth Legion) disrupted their ability to manage the portal behind which it was trapped, and it was in danger of getting loose and wreaking havoc.  In the end, the Doctor convinces the Picts and the Romans to set aside their hostilities and work together to deal with the bigger danger, and the Pictish leader, along with some of her warriors, and the entire legion choose to sacrifice their lives to contain the creature behind the door (which lies amongst a very atmospheric ring of standing stones out on the windswept heather), thus saving the world and also explaining why the Ninth Legion suddenly vanished.


The disappearance of the Legio IX Hispana is one of the more curious historical mysteries.  An early hypothesis, promoted by the German historian Theodor Mommsen, was that the Ninth had been wiped out in a battle with the Picts in 108 C.E., but there are a couple of problems with this claim.  First, the Romans were meticulous record-keepers, and didn't shy away from writing down what happened even when they'd lost.  If an entire legion had been destroyed in battle, it's curious that no one ever mentioned it.  Second, there's some evidence that at least a few members of the Ninth survived -- there are inscriptions that may be from them in the ruins of the Roman base at Nijmegen (now in the Netherlands) dating from the 120s.  It's possible, of course, that the artifacts -- including a silver-and-bronze military medal with "LEG HISP IX" engraved on the back -- were brought there by someone else.  After all, inscriptions about the Ninth Legion showing up at a particular time and place don't prove the Ninth Legion was there at the time.

Despite this argument, some have suggested that there were members of the Ninth at Nijmegen -- perhaps only a handful of survivors of a rout in Scotland.  Other historians go even further, believing the entire legion survived and was merely redeployed elsewhere, ultimately meeting their end in the Bar Kokhba Revolt (132-135 C.E.) or even as late as Marcus Aurelius's war against the Parthians (161-166 C.E.).

But again, we run up against the fact that although there are records of both of those battles, the Ninth Legion is never mentioned.  If they fought -- and possibly were destroyed -- in either of those conflicts, why did no one ever say so?

Most historians still subscribe to the idea that the Ninth was wiped out in Scotland, despite it leaving considerable questions about how it happened and why no one documented it.  British archaeologist Miles Russell, in his book The Celtic Kings of Roman Britain, says, "by far the most plausible answer to the question 'what happened to the Ninth' is that they fought and died in Britain, disappearing in the late 110s or early 120s when the province was in disarray."

Of course, a historical mystery like this leaves fertile ground for fiction writers to invent their own solutions, and the Doctor Who episode is far from the only fanciful one that has been proposed.  A good many of them involve time slips and transportation to an alternate reality, but none is as out there as the fate proposed in a Doc Savage novel, wherein the Ninth is transported through an interdimensional gateway and ultimately ends up in the African Congo, where their descendants survive until the 1930s.

And people say the plots of Doctor Who are ridiculous.

In any case, from a factual perspective what we're left with is a great big question mark.  An entire legion of Roman soldiers suddenly stops showing up in the records, and no one is really sure why.  The frustrating thing is that given the unlikeliness of finding any documents from that time that we don't already know about, it's doubtful we'll ever know for certain -- a highly unsatisfactory answer to our natural human curiosity.

Me, I'm voting for the light-eating alien having something to do with it.

****************************************



Thursday, February 15, 2024

The drowned wall

Humans have been modifying their own environment for a very long time.

Our capacity for building stuff is pretty extraordinary.  Birds build nests, some mammals and reptiles burrow, spiders spin webs -- but compared to what we do, it's all pretty rudimentary stuff.  No other species on Earth looks around, takes materials from nature, and says, "Hey, if I cut this to pieces and move it around, I could do something new and useful with it" to the extent that we do.

I was thinking about all this when I read a paper this week about an archaeological discovery in the Baltic Sea off the coast of Germany.  As I discussed in more depth in a post a couple of days ago, ten-thousand-odd years ago the Earth (especially the Northern Hemisphere) was coming out of a catastrophic and sudden cooling episode called the Younger Dryas event, during which sea levels were considerably lower (because so much seawater was locked up as polar and glacial ice).  In fact, much of what are now the Baltic and North Seas was dry(ish) land; you could walk from England to France across the broad, grassy valley that is now the bottom of the English Channel.

The result is that a great many of the artifacts produced during that time are now underwater, and finding them can be a matter of luck.  That was certainly the case here: scientists demonstrating a multibeam sonar system to some students aboard a research vessel ten kilometers offshore in the Bay of Mecklenburg saw something extraordinary -- the remains of a 971-meter-long rock wall that archaeologists say "may be the oldest known human-made megastructure in Europe."

Nicknamed the "Blinkerwall," it is made of some fourteen hundred smaller stones connecting three hundred boulders that were pretty clearly too large to move.  What the wall's builders apparently did was use the smaller stones to link the large ones into a zigzagging line.

Part of the Blinkerwall [Image credit: Philipp Hoy]

The purpose of the wall, of course, can only be guessed at, but the researchers suspect it was used for hunting.  "When you chase animals [such as reindeer], they follow these structures, they don’t attempt to jump over them," said Jacob Geersen, of the Leibniz Institute for Baltic Sea Research in Warnemünde, a German port town on the Baltic coast.  "The idea would be to create an artificial bottleneck with a second wall or with the lake shore."

Once the animals were funneled into the narrow strip of land between the two barriers, they would be more vulnerable to spear-wielding hunters lying in wait.

The next step in the research, Geersen said, is to send divers down to the base of the wall, now submerged under twenty-one meters of water, to try and bolster this explanation by finding spearheads or other hunting implements.

You have to wonder what else might be down there.  Our intrepid ancestors have been finding new ways to make stuff for tens of thousands of years.  Wherever that impulse came from, there's no denying that it's served us well.  It always makes me wonder what traces will be left of our culture in ten thousand, or a hundred thousand, or a million years -- and what deductions our descendants will make about our habits, practices, and lifestyles.

****************************************



Wednesday, February 14, 2024

Music and the mind

In September, I started taking piano lessons.

I've played the piano off and on for years (more off than on, I'm afraid), but was entirely self-taught.  To say my formal music background is thin is an understatement; I had a lousy experience with elementary school band, said "to hell with it," and that was the end of my music education in public schools.  However, I was (and am) deeply in love with music, so I picked up the flute at age sixteen, and taught myself how to play it.  I took four years of lessons with a wonderful flutist and teacher named Margaret Vitus when I was in my twenties, but until last fall those accounted for the sum total of my instruction in music of any sort.

My experience as a student -- both with Margaret forty-odd years ago, and with J. P. (my piano teacher) now -- has been interesting from a number of standpoints.  In both cases I profited greatly by having someone tell me what bad habits are holding me back, and (more importantly) what I can do to remediate them.  But my spotty background has resulted in some unique challenges.  On the upside, I have an extraordinary ear and memory for melodies and rhythms, to the point that my wife calls it my "superpower."  I once heard a piece of Serbian music in a Balkan dance class when I was in my twenties, and heard it again thirty years later, and immediately knew it was the same tune even though I hadn't heard it or played it during that time.  

The downside, though, is that my lack of formal training means there are great gaping holes in my knowledge.  I'm currently working on a charming and whimsical piece by Claude Debussy, "Dr. Gradus ad Parnassum," which like much of Debussy's music twists around our sense of keys and harmonies. 

"How do you get to Carnegie Hall?  Every day, practice, practice, practice."

So J. P. -- for whom music is about as natural as breathing -- will look at some passage, and say something that sounds like, "Oh, that's a B-flat Minor Seven Demented chord."  Once I analyze what he told me using paper, pencil, and a slide rule, after three or four hours of study I can usually say, "Oh, okay, I guess I get it," but it definitely isn't anything close to intuitively obvious.  Like, ever.  So I'm still at the point of having to read each note slowly and painstakingly, and although I think the piece is lovely (well, when someone else plays it), I don't have any real comprehension of its structure.

If you're curious, here's how it's supposed to sound:


Fortunately, J. P. is an extraordinary teacher and gets my struggles, and is working to help me fill in the gaps in my knowledge.  It's slow going, but I guess that's no different from anyone learning a musical instrument.

The reason this comes up today is a study by a team from the University of L'Aquila and the University of Teramo that discovered an interesting correlation: people who have studied music seriously have better working memory -- the ability to retrieve information and load it into their "attentional stream."  Stronger and faster working memory is positively associated with a greater capacity for divergent thinking, and thus with creativity.

The authors write:

Musical practices have recently attracted the attention of research focusing on their creative properties and the creative potential of musicians.  Indeed, a typical cliché of musicians is that they are considered predominantly artistic individuals, meaning that they are creative and original.  Practicing music is certainly an intense and multisensory experience that requires the acquisition and maintenance of a range of cognitive and motor skills throughout a musician’s life.  Indeed, music practice increases a wide range of cognitive abilities, such as visuospatial reasoning, processing speed, and [working memory], from the early stages of life.  For this reason, musicians are considered an excellent human model for the study of behavioral, cognitive, and brain effects in the acquisition, practice, maintenance, and integration of sensory, cognitive, and motor skills...

[E]xperience in the music field enhances [divergent thinking] in terms of fluency, flexibility, and originality.  Strengthening the associative modes of processing, which facilitate the retrieval of information from long-term memory, and improving the working memory competences, which facilitate the online recombination of information, might explain the relationship between musical practice and [divergent thinking].

All of which bolsters something I've been saying for years; we need to be actively supporting art and music in schools.  Sadly, school boards much more often have the opposite mentality -- the esteemed "STEM" subjects (science, technology, engineering, and math) are emphasized and thus funded, and the arts (sometimes derisively called "extras") are on the chopping block when money gets tight.

Which, of course, is all the time.  But wouldn't it be nice if the educational powers-that-be actually read the research, and acknowledged that music and art are every bit as important as STEM?

In any case, it's good to know that my struggling to learn piano might provide some other benefits besides making Debussy turn over in his grave.  Hell, at age 63, I'm thrilled to have any boosts to my cognition I can get.  And even if I'll never be able to play "Dr. Gradus ad Parnassum" like Lang Lang does, maybe the skills I learn from my piano lessons will spill over into other creative realms.  

After all, as Maya Angelou said, "You can't use up creativity.  The more you use, the more you have."

****************************************



Tuesday, February 13, 2024

Cutting off the circulation

Around 12,900 years ago, the world was warming up after the last major ice age.  Climatologists call it the "Late Glacial Interstadial," a natural warm-up due to the combined effects of the slow, gradual alterations in the Earth's orbit and precession cycle.  But then...

... something happened, and within only a few decades, the Northern Hemisphere -- especially what are now North America and western Europe -- was plunged back into the deep freeze.

The episode is called the "Younger Dryas" event because the way scientists figured out it had happened was by finding, in lake-sediment cores, pollen from Dryas octopetala -- a plant found only in cold, dry, windswept habitats.  Areas that had been progressing toward boreal forest, or even temperate hardwood forest, suddenly reverted to tundra.

[Image licensed under the Creative Commons Steinsplitter, Weiße Silberwurz (Dryas octopetala) 2, CC BY-SA 3.0]

There are a number of curious features of the Younger Dryas event.  First, its speed -- climate shifts ordinarily take place on the scale of centuries or millennia, not decades.  Second, its effects were huge (the average temperature in Greenland dropped by something on the order of 7°C) but limited in range; in fact, the Southern Hemisphere appears to have continued warming.  And third, after the initial plunge, the system righted itself over the next twelve hundred years -- by 11,700 years ago, the Northern Hemisphere was back on its warming track, and had caught up with the rest of the world.

What could have caused such a strange, sudden, and catastrophic event is still a matter of some debate, but the leading candidate is that something halted the Atlantic Meridional Overturning Circulation, sometimes nicknamed "the Atlantic conveyor."  This is the massive ocean current of which the Gulf Stream is only a small part, and which is powered by warm water evaporating and cooling as it moves north, finally becoming cold and salty (and thus dense) enough to sink, somewhere south of Iceland.  This draws more warm water up from near the equator.  But as the Earth was warming during the Interstadial, the ice in the north was melting, eventually making the water in the North Atlantic too fresh to sink, and thus halting the entire circulation.  Some researchers think the process was sent into overdrive by the collapse of an ice dam holding back a massive glacial meltwater lake called Lake Agassiz (which covered parts of what are now Manitoba, Saskatchewan, Ontario, Minnesota, and North Dakota, and held more water than all the modern Great Lakes combined), causing it to drain down the Saint Lawrence Valley and into the North Atlantic, stopping the AMOC dead in its tracks.  (This point is still being debated.)

What's certain is that the AMOC stopped, suddenly, and took over a thousand years to get started again, plunging the Northern Hemisphere back into an ice age.

Why does this come up today?

Because a new study out of the University of Utrecht has found that our out-of-control fossil fuel use, and consequent boosting of the global temperature and melting of polar ice, is hurtling the AMOC toward the same situation it faced 12,900 years ago.  One of the consequences of anthropogenic global warming might be sending eastern Canada, the northeastern United States, and western Europe into the freezer.

One of the most alarming findings of the study is that climatologists have been systematically overestimating the stability of the AMOC.  It's an easy mistake to make; the current is absolutely enormous, amounting to a hundred million cubic meters of water per second, which is nearly a hundred times the combined flow of all the rivers in the world.  The idea of anything perturbing something that massive is a little hard to imagine.

But that's what happened during the Younger Dryas, and it happened fast.  The new study suggests that if the AMOC does collapse, within twenty years Great Britain, Scandinavia, and the rest of northern Europe could see winter temperatures ten to thirty degrees Celsius colder than they are now, which would completely alter the ecosystems of the region (including agriculture).  It would also change precipitation patterns drastically, and in ways we are currently unable to predict.

If you're not already alarmed enough, here's how climatologist Stefan Rahmstorf put it, writing for the site RealClimate:
Given the impacts, the risk of an AMOC collapse is something to be avoided at all cost.  As I’ve said before: the issue is not whether we’re sure this is going to happen.  The issue is that we need to rule this out at 99.9 % probability.  Once we have a definite warning signal it will be too late to do anything about it, given the inertia in the system...  We will continue to ignore this risk at our peril.

The problem is that last bit -- we don't have a very good history of addressing problems ahead of time.  We're much more prone to waiting until things are really awful, at which point they're harder (if not impossible) to fix.  We've let corporate interests and short-term expediency drive policy for too long; it's increasingly looking like we're close to the now-or-never point.

We need to start electing candidates who take this whole thing seriously.  It is the most important issue of our time.  I try not to be a one-issue voter, but if someone's answer to "What do you intend to do to remediate climate change?" is "Nothing" -- or, worse, "Climate change isn't real" -- they've lost my vote.

And they should lose yours, too.  For the good of the planet.

****************************************