Skeptophilia (skep-to-fil-i-a) (n.) - the love of logical thought, skepticism, and thinking critically. Being an exploration of the applications of skeptical thinking to the world at large, with periodic excursions into linguistics, music, politics, cryptozoology, and why people keep seeing the face of Jesus on grilled cheese sandwiches.

Monday, February 13, 2023

The Scottish queen's code

Despite the fact that she's been dead for over four hundred years, Mary, Queen of Scots remains a controversial and divisive figure amongst historians.

The only surviving child of King James V of Scotland and Mary of Guise, the younger Mary started off her reign much the same way her father had.  James's father, King James IV, died in 1513 at the disastrous (for the Scots, at least) Battle of Flodden Field, making James V king at the age of only seventeen months.  This put the kingdom in the hands of regents throughout the early years of the reign, which is seldom a recipe for stability.  After much jockeying about by regents and councillors eager to wield control, James V finally was able to throw off the shackles of the regency in 1528 following the Battle of Linlithgow Bridge.

His adult reign was turbulent.  James himself has been characterized as paranoid (unsurprising, really, considering that he'd been a virtual prisoner to his regents as a child), more interested in reading and playing the lute than in administering a kingdom.  He did have a great concern for the common folk, however, and actually spent time wandering amongst them in disguise, earning him the nickname of "the Gudeman of Ballengeich" ("gudeman" is Scots dialect for "smallholder;" Ballengeich was one of his favorite haunts, near Stirling Castle).  Interestingly, there's a sweet Scottish country dance tune called "The Geud Man of Ballengigh" which I've known for years -- knew the tune, in fact, long before I ever knew the story behind it.

James V died in 1542, at the age of only thirty, probably of cholera -- only six days after the birth of his daughter Mary.  Mary was crowned, just like her father had been, as an infant.  This once again left Scotland in the hands of regents, headed by her mother, the smart, powerful Mary of Guise, who (unlike the regents James had endured) was determined to hang onto the throne on behalf of her daughter.  Mary was sent off to France to be raised and educated, something that was to work against her later, as culturally she was seen as far more French than Scottish.  This was amplified in 1558 when she was married to the Dauphin of France, who became King Francis II the following year -- but the marriage was ended by Francis's death in 1560 at age sixteen (whether of an infection or because he was poisoned is uncertain).  The marriage produced no children; some believe it was never consummated.

Then Mary of Guise died in 1560, at which point the younger Mary -- now widowed and old enough to rule Scotland in her own right -- returned to her home country, which she'd barely seen in her eighteen years.  But as with her father, she found that the powerful men who had run the country in her absence weren't eager to give up control.  Mary then showed signs of the recklessness that was to characterize the rest of her life.  She first married the wildly unpopular Henry Stuart, Lord Darnley, who was actually Mary's half first cousin (Mary's paternal grandmother, Margaret Tudor, daughter of English King Henry VII, had married twice -- first to King James IV of Scotland, and second to Archibald Douglas, Sixth Earl of Angus; those two marriages produced Mary's father and Darnley's mother, respectively).  The marriage, by all accounts, was miserable.  Despite being handsome and superficially charming, Darnley turned out to be a vain, arrogant, violent drunkard.  In fact, when Darnley was murdered in 1567 by a group of noblemen led by James Hepburn, Fourth Earl of Bothwell -- only a year after the birth of Mary's and Darnley's only son, James (eventually King James VI of Scotland and James I of England) -- Mary turned around and married Hepburn three months later.

This outrage was the final straw.  Darnley had been unpopular, but the queen marrying his murderer was just too much.  There was a massive uprising, and Mary was forced to abdicate in favor of her infant son (making for three infant successions to the throne of Scotland in a row).  She fled to England, asking for asylum from her cousin, Queen Elizabeth I (Elizabeth's father, King Henry VIII, was Mary's paternal grandmother's brother).  Elizabeth reluctantly agreed, but recognizing the fact that Mary was a direct descendant of King Henry VII and thus in line for the throne, she had her put under rather genteel house arrest.

Her caution is understandable.  Elizabeth's own path to the throne had been fraught, and for a while it looked likely that she herself was going to spend her life in close confinement (if not worse).  But when her two half-siblings, King Edward VI and Queen Mary I, both died without heirs, she succeeded to the throne for what would be one of the longest and most successful reigns of any monarch of England.

Mary, though, wasn't content to relax into what was honestly a fairly comfortable situation and give up her aspirations to rule.  In fact, she was of the opinion that Elizabeth's own reign wasn't valid; the marriage between Henry VIII and Elizabeth's mother, the unfortunate Anne Boleyn, had been annulled shortly before Anne lost her head on Tower Green, making Elizabeth effectively an illegitimate child.  So Mary, ever the schemer, started writing letters to perceived supporters, trying to garner support to overthrow Elizabeth and put Mary on the throne of a combined England and Scotland.

It's those letters that bring the topic up today; while some were written (unfortunately for Mary, as it turned out) in plain English, French, or Italian, some were written in code -- and until now, they'd been undeciphered.  But a team made up of George Lasry of Israel, Norbert Biermann of Germany, and Satoshi Tomokiyo of Japan has finally cracked Mary's cipher and allowed us to discover more about her plotting to do in her cousin -- a plot that, as I'm sure you know, ultimately failed spectacularly.

A portion of Mary's cipher

"[This marks] the most important new find on Mary Stuart, Queen of Scots, for a hundred years," said historian John Guy.  "The letters show definitively that Mary, during the years of her captivity in England... closely observed and actively involved herself in political affairs in Scotland, England and France, and was in regular contact, either directly, or indirectly through de Castelnau [Michel de Castelnau Mauvissière, the French ambassador to England], with many of the leading political figures at Elizabeth I's court...  They prove that Mary was a shrewd and attentive analyst of international affairs."

"With our new decipherments, we provide evidence that such a secret channel was already in place as early as May 1578," the authors write.  "Also, while some details were already known, our new decipherments provide further insights into how this channel was operated, and on the people involved...  From time to time, she suggests enticing various people with financial rewards so that they would switch sides, or soften their attitude toward her.  She also asks for Castelnau's assistance in recruiting new spies and couriers, while sometimes she warns him – rightly – that some people working for her might be Walsingham's [Francis Walsingham, Elizabeth's spymaster] agents."

So the new letters add to the picture of Mary as a schemer -- and someone who gives new meaning to the word "reckless."  Ultimately, of course, Elizabeth got fed up with this nonsense, and after finding serious evidence that Mary was plotting to have her assassinated, signed the death warrant.  Mary, Queen of Scots was beheaded at Fotheringhay Castle in 1587 at the age of forty-four -- and despite her moniker, had spent the vast majority of her life outside of Scotland, and had only been the queen in fact for seven years.

It's a story filled with intrigue and twists and turns, and further raises the question in my mind of why in the hell anyone in their right minds wanted to be in power back then.  It'd be interesting to see, just in the histories of England and Scotland alone, what percentage of the people who were kings, queens, heirs, counselors, and nobles came to bad ends.  My highly unscientific assessment is "must be really high."  While I wouldn't have wanted to be a peasant -- that had its own, quite different set of unpleasantness -- if I were to time travel back to the sixteenth century, I'd have been perfectly happy settling down to a nice placid life as a simple merchant-class guy.  Let the royals and nobles play their human chess games; any benefits from power and wealth would simply not be worth the risk.

****************************************


Saturday, February 11, 2023

Hopes and dreams

I was listening to tunes while running yesterday afternoon, and Christina Aguilera's beautiful song "Loyal, Brave, and True" (from the movie Mulan) came up, and it got me thinking about a conversation I had a while back with a diehard cynic.

This guy hates anything Disney.  Or Pixar, for that matter.  His attitude is that happy endings are smarmy, cheesy, and unrealistic.  In real life, he says, the bad guys often win, having good motives doesn't guarantee you'll succeed, and true love fails to survive as often as not.  Life is, at best, a zero-sum game.  Movies and books that try to tell us otherwise are lying -- and doing it purely to draw in audiences to bilk them of their money.

My response was, "Okay, but even if you're right, why would we want to immerse ourselves in fiction that's just as bad as the real world?"

One of fiction's purposes, it seems to me, is to elevate us, to give us hope that we can transcend the ugliness that we see on the news every night.  Especially with kids' movies and books, what possible argument could there be for not giving children that hope?  But even with adult fiction, I would argue that all of us need to have that lift of the spirit that we can only get from leaving behind what poet John Gillespie Magee called "the surly bonds of Earth" for a while.

I don't mean it's always got to have an unequivocally happy ending, of course; you can have your heart moved and broken at the same time.  Consider the impact of Dead Poets Society, for example.  Okay, maybe John Keating lost, in a sense; but in the end, when one by one his students stand up and say "O captain, my captain!" who can doubt that he made a difference?  My all-time favorite book -- Umberto Eco's Foucault's Pendulum -- ends with two of the main characters dead and the third waiting to be killed, but even so, the last lines are:

It makes no difference whether I write or not.  They will look for other meanings, even in my silence.  That's how They are.  Blind to revelation....  But try telling Them.  They of little faith.

So I might as well stay here, wait, and look at the sunlight on the hill.

It's so beautiful.

My own writing tends toward bittersweet endings -- perhaps not unequivocally happy, but with a sense that the fight was still very much worth it.  My character Duncan Kyle, in Sephirot, goes through hell and back trying to get home, but in the end when he's about to take his final leap into the dark and is told, "Good luck.  I hope you see wonders," he responds simply, "I already have."

No one understood this better than J. R. R. Tolkien.  Does The Lord of the Rings have a happy ending?  I don't know that you could call it that; Frodo himself, after the One Ring is destroyed, tells his beloved friend Sam, "Yes, the Shire was saved.  But not for me."  The end of the movie makes me bawl my eyes out, but could it have ended any other way without cheapening the beauty of the entire tale?

To quote writer G. K. Chesterton: "Fairy tales are more than true – not because they tell us dragons exist, but because they tell us dragons can be beaten."

We've been telling stories as long as we've been human, and we need all of them.  Even the ones my friend would call unrealistic and cheesy happily-ever-afters.  They remind us that happiness is possible, that even if the world we see around us can be tawdry and cheap and commercial and all of the things he so loudly criticizes, there is still love and kindness and compassion and creativity and courage.

And those are at least as powerful, and as real, as the ugly parts.

We need stories.  They keep us hopeful.  They keep us yearning for things to be better, for the world to be a sweeter place.  They raise our spirits, renew our commitment to treat each other with respect and honor and dignity, and keep us putting one foot in front of the other even when things seem dismal.

The best fiction recalls the last lines of Max Ehrmann's deservedly famous poem "Desiderata": "Whatever your labors and aspirations, in the noisy confusion of life, keep peace in your soul.  With all its sham, drudgery and broken dreams, it is still a beautiful world.  Be cheerful.  Strive to be happy."

****************************************


Friday, February 10, 2023

Earthquakes and sharpshooters

A guy is driving through Texas, and passes a barn.  It's got a bullseye painted on the side -- with a bullet hole in the dead center.

He sees two old-timers leaning on a fence nearby, and pulls over to talk to them.

"Did one of you guys make that bullseye shot?" he says.

One of them says, a proud smile on his face, "Yeah.  That was me."

"That's some amazing shooting!"

The man says, "Yeah, I guess it was a pretty good shot."

The old-timer's friend gives a derisive snort.  "Don't let him fool you, mister," he says.  "He got drunk, shot a hole in the side of his own barn, and the next day painted the bullseye around the bullet hole."

This is the origin of the Texas sharpshooter fallacy, the practice of analyzing an outcome out of context and after the fact, and overemphasizing its accuracy.  Kind of the bastard child of cherry-picking and confirmation bias.  And I ran into a great example of the Texas sharpshooter fallacy just yesterday -- a Dutch geologist who has gone viral for allegedly predicting the devastating earthquake that hit southeastern Turkey and northwestern Syria on February 6.

The facts of the story are that on February 3, a man named Frank Hoogerbeets posted on Twitter, "Sooner or later there will be a ~M 7.5 earthquake in this region (South-Central Turkey, Jordan, Syria, Lebanon)."  This, coupled with the fact that the day before, the SSGEOS (the organization for which Hoogerbeets works) had posted on its website, "Larger seismic activity may occur from 4 to 6 February, most likely up to mid or high 6 magnitude. There is a slight possibility of a larger seismic event around 4 February," has led many to conclude that they were either prescient or else have figured out a way to predict earthquakes accurately -- something that has eluded seismologists for years.  The result is that Hoogerbeets's tweet has gone viral, and has had over thirty-three million views and almost forty thousand retweets.

Okay, let's look at this claim carefully.

First, if you'll look at Hoogerbeets's Twitter account and the SSGEOS website, you'll see a few things right away.  One is that they specialize in linking earthquake frequency to the weather and to the positions of bodies in the Solar System, correlations most scientists find dubious at best.  Another is that Hoogerbeets and the SSGEOS have made tons of predictions of earthquakes that didn't pan out; in fact, the misses far outnumber the hits.

Lastly, the East Anatolian Fault, where the earthquake occurred, is one of the most active fault zones in the world; saying an earthquake would happen there "sooner or later" doesn't take a professional geologist.

[Image licensed under the Creative Commons Roxy, Anatolian Plate Vectoral, CC BY-SA 3.0]

What seems to have happened here is that the people who are astonished at Hoogerbeets's prediction have basically taken that one tweet and painted a bullseye around it.  The problem, of course, is that this isn't how science works.  You can't just take this guy's one spot-on prediction and say it's proof; in order to support a claim, you need a mass of evidence that all points to a strong correlation.

Put a different way: the plural of anecdote is not data.
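To make that concrete, here's a toy Monte Carlo sketch (all numbers invented for illustration, not drawn from the actual seismic record): a "forecaster" who issues date-window predictions at random over an active fault will inevitably rack up occasional hits purely by chance -- which is why one viral success means nothing without the full ledger of hits and misses.

```python
import random

def hit_rate(n_days=3650, quake_prob=0.01, n_predictions=200,
             window=3, seed=42):
    """Fraction of random date-window 'predictions' that land on a
    quake, on a fault where quakes occur independently each day."""
    rng = random.Random(seed)
    # Days on which a quake happens (independent chance each day).
    quakes = {d for d in range(n_days) if rng.random() < quake_prob}
    hits = 0
    for _ in range(n_predictions):
        start = rng.randrange(n_days - window)
        # A 'hit' = any quake inside the predicted window.
        if any(day in quakes for day in range(start, start + window)):
            hits += 1
    return hits / n_predictions

# Even pure guessing scores some hits on a fault this active.
print(f"chance hit rate: {hit_rate():.2%}")
```

The more active the fault (or the wider the prediction window), the higher the chance-only hit rate climbs -- exactly the bullseye-after-the-fact problem.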

No less an authority than the United States Geological Survey has stated outright that despite improvements in fault monitoring and our general knowledge about how earthquakes work, quakes are still unpredictable.  "Neither the USGS nor any other scientists have ever predicted a major earthquake," their website states.  "We do not know how, and we do not expect to know how any time in the foreseeable future.  USGS scientists can only calculate the probability that a significant earthquake will occur (shown on our hazard mapping) in a specific area within a certain number of years."

So what Hoogerbeets and the SSGEOS did was basically nothing more than an unusually shrewd guess, and I'd be willing to bet that the next "sooner or later" prediction from that source will turn out to be inaccurate at best.  Unfortunate, really; having an accurate way to forecast earthquakes could save lives.

But realistically speaking, we are nowhere near able to do that -- viral tweets and spurious bullseyes notwithstanding.

****************************************


Thursday, February 9, 2023

The glass grass

The attitudes and practices of colonialism did incalculable damage, and not least on the list is the fact that (by and large) the colonizers completely disregarded indigenous people's knowledge of their own lands.

The inevitable result was that much of that knowledge was lost.  Not only general, broad-brush information such as how to raise food in climates unfamiliar to the colonial cultures, but specific details like the uses of native plant and animal species.  The colonizers, secure in their own arrogance, instead imported the species they had back home -- thus adding another problem on top of the first.

Because, of course, this is a huge part of why there's such a problem with invasive exotics.  Some arrived accidentally, but a great many were deliberate imports that have proceeded to wreak havoc on native ecosystems.  Consider, for example, the problems caused by the introduction of European rabbits to Australia -- and the millions of dollars that have been spent since trying to control them.

I bring up Australia deliberately, because it's a prime example of colonizers completely ignoring millennia of experience and knowledge by indigenous people, embodying Adam Savage's oft-quoted line "I reject your reality and substitute my own."  You'd think they would have listened, wouldn't you?  Not only does Australia have a tough climate by most anyone's standards, plagued by droughts and floods that seem to alternate on a monthly basis, its native species have adapted by becoming tough and resilient.  The indigenous Australians managed in much the same way, learning how to deal with the climate's vagaries -- and relying on the native plants and animals to provide sustenance.

This meant making use of damn near everything, including species that seem on first glance to be worse than useless.  Take, for example, spinifex grass (Triodia spp.), which grows all over inland Australia.  Not only is it able to survive in broiling hot desert conditions -- it can survive temperatures of 60 C -- it puts down roots as long as thirty meters in an attempt to access what groundwater there is.  In a place where any kind of vegetation is fair game for herbivores, spinifex has developed ways to defend itself; it absorbs silica from the soil and deposits it in the tips of the leaves.  Silica, I probably don't need to point out, is the main ingredient of glass.

Walking through a field of spinifex in shorts is a good way to come out with your legs embedded with thousands of glass splinters.

An Australian grassland ecosystem, with two species of spinifex -- the green plants are soft spinifex (Triodia pungens), and the gray-green ones are lobed spinifex (Triodia basedowii). [Image licensed under the Creative Commons Hesperian, Triodia hummock grassland, CC BY-SA 3.0]

Despite these hazards, the indigenous Australians made full use of this odd plant.  The fibers of the stems were used for weaving and thatching huts; the waxes and oils extracted from it were hardened into a resin that could be used as a glue or a sealant.  And now, spearheaded by the Indjalandji-Dhidhanu people of the upper Georgina River, spinifex is being reintroduced as a 21st-century commodity -- with potential international markets.

Scientists at the University of Queensland, working with Indjalandji-Dhidhanu elder Colin Saltmere (himself an adjunct professor of architecture), have analyzed spinifex's unique properties, and found that not only does the resin (used for thousands of years by indigenous peoples) have properties similar to moldable plastic, the fibers in the stems have high flexibility, exceptional resistance to fatigue cracking -- and eight times the tensile strength of an equal diameter of steel.  The potential applications are already a very long list, including cable manufacture, production of resilient membranes (possibly superseding latex in gloves, for example), and creation of substitutes for wood, plastics, and carbon nanofibres.

"For thousands of years, spinifex was a building block for the Aboriginal societies in the desert; now it will continue to play a role in advancing local Aboriginal communities through business and employment opportunities," Saltmere said.  "The fine fibres at a nanoscale make this plant remarkable – and because it is so fine, we can make a fully renewable gel that is 98% water, and on a scale where we can sustainably generate hundreds of thousands of tonnes of material."

What seems to me to be nothing more than common sense -- "Listen to the people who know the land way better than you do" -- was effectively ignored for hundreds of years.  It's heartening that at least some of those voices are now being heard.  And given what's happening to the climate, we're going to need every advantage we have.  Better late than never, I suppose.  In this case, it means making use of a plant the European colonizers considered little more than a weed -- one whose many uses are only now becoming known outside the communities of indigenous Australians.

****************************************


Wednesday, February 8, 2023

The cardboard box ruse

My friend and fellow author Gil Miller, who has suggested many a topic for me here at Skeptophilia, threw a real doozy my way a couple of days ago.  He shares my interest in all things scientific, and is especially curious about where technology is leading.  (It must be said that he knows way more about tech than I do; if you look up the word "Luddite" in the dictionary you'll find a little pic of me to illustrate the concept.)  The topic he suggested was, on its surface, flat-out hilarious, but beyond the amusement value it raises some deep and fascinating questions about the nature of intelligence.

He sent me a link to an article that appeared at the site PC Gamer, about an artificial intelligence system that was being tested by the military.  The idea was to beef up a defensive AI's ability to detect someone approaching -- something that would have obvious military applications, and could also potentially be useful in security systems.  So an AI that had been specifically developed to recognize humans and sense their proximity was placed in the center of a traffic circle, and eight Marines were given the task of trying to reach it undetected; whoever got there without being seen won the game.

The completely unexpected outcome was that all eight Marines handily defeated the AI.

A spokesperson for the project described what happened as follows:

Eight marines: not a single one got detected.  They defeated the AI system not with traditional camouflage but with clever tricks that were outside of the AI system's testing regime.  Two somersaulted for three hundred meters; never got detected.  Two hid under a cardboard box.  You could hear them giggling the whole time.  Like Bugs Bunny in a Looney Tunes cartoon, sneaking up on Elmer Fudd in a cardboard box.  One guy, my favorite, he field-stripped a fir tree and walked like a fir tree.  You can see his smile, and that's about all you see.  The AI system had been trained to detect humans walking, not humans somersaulting, hiding in a cardboard box, or disguised as a tree.
Remember Ralph the Wolf disguising himself as a bush?  Good thing the sheep had Sam the Sheepdog looking after them, and not some stupid AI.

This brings up some really interesting questions about our own intelligence, because I think any reasonably intelligent four-year-old would have caught the Marines at their game -- and thus outperformed the AI.  In a lot of ways we're exquisitely sensitive to our surroundings (although I'll qualify that in a moment); as proto-hominids on the African savanna, we had to be really good at detecting anything anomalous in order to survive, because sometimes those anomalies were the swishing tails of hungry lions.  For myself, I have an instinctive sense of spaces with which I'm familiar.  I recall distinctly walking into my classroom one morning, and immediately thinking, Someone's been in here since I locked up last night.  There was nothing hugely different -- a couple of things moved a little -- but having taught in the same classroom for twenty years, I knew it so well that I immediately recognized something was amiss.  It turned out to be nothing of concern; I asked the principal, and she said the usual room the school board met in was being used, so they'd held their session in my room the previous evening.

But even the small shifts they'd made stood out to me instantly.

It seems as if the only way you could get an AI to key in on what humans do more or less automatically is to program it explicitly to keep track of where everything is -- or to recognize humans somersaulting, hiding under cardboard boxes, and disguised as fir trees.  Which kind of runs counter to the bottom-up approach that most AI designers are shooting for.
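A toy sketch of why the Marines won (features and numbers entirely invented for illustration): imagine a detector that scores a silhouette by how close it is to the walking-human examples it was trained on.  Anything far from that training set -- a somersault, a cardboard box -- simply scores as "not human":

```python
# Hypothetical training data: (height/width ratio, vertical bob)
# measured from upright walking humans -- the only thing this
# detector has ever seen.
TRAINING = [
    (3.0, 0.80), (2.8, 0.90), (3.2, 0.70), (2.9, 0.85),
]

def looks_human(sample, threshold=0.5):
    """Flag a sample as human if it lies near any training example."""
    def dist(a, b):
        return ((a[0] - b[0]) ** 2 + (a[1] - b[1]) ** 2) ** 0.5
    return min(dist(sample, t) for t in TRAINING) < threshold

print(looks_human((3.0, 0.80)))  # walking Marine -> True (detected)
print(looks_human((0.9, 0.10)))  # Marine under a box -> False (missed)
```

The box-shaped sample isn't "seen and dismissed"; it never registers as a candidate at all, because nothing like it existed in the training distribution -- the same gap, in miniature, that the somersaults and the fir-tree disguise exploited.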

What's most fascinating, though, is that our "exquisite sensitivity" I referred to earlier has some gaping holes.  We're programmed (as it were) to pay attention to certain things, and as a result are completely oblivious to others, usually based upon what our brains think is important to pay attention to at the time.  Regular readers of Skeptophilia may recall my posting the following mindblowing short video, called "Whodunnit?"  If you haven't seen it, take a couple of minutes and watch it before reading further:


This phenomenon, called inattentional blindness, results in our focusing so deeply on a few things that we effectively miss everything else.  (And honesty demands I admit that despite my earlier flex about my attention to detail in my classroom, I did terribly watching "Whodunnit?".)

Awareness is complex; trying to emulate our sensory processing systems in an AI would mean understanding first how ours actually work, and we're very far from that.  Obviously, no one would want to build inattentional blindness into a security system, but I have to wonder how you would program an AI to recognize what was trivial and what was crucial to notice -- like the fact that it was being snuck up on by a Marine underneath a cardboard box.  The fact that an AI that was good enough to undergo military testing failed so spectacularly, tricked by a ruse that wouldn't fool any normally-abled human being, indicates we have a very long way to go.

****************************************


Tuesday, February 7, 2023

The locked heavens

When you picture an exoplanet, it's easy to fall back into the typical science-fiction concept of an alien world -- almost always like some odd, vaguely hostile version of Earth, with a different-colored sky and lots of big rocks.

Kirk: "Six to beam down.  Be prepared to beam back three of us in about forty-five minutes."

As astronomers have discovered more and more actual exoplanets, though, they've found there's far more variety in their characteristics than the creators of Star Trek ever dreamed.  There are hot Jupiters -- like our own largest planet, made primarily of thick layers of hydrogen and helium, with a solid core, but so close to their parent stars that not only are they extremely hot, they have orbital periods of ten days or under.  There are ice planets, water planets, planets made entirely of molten rock.  One of the most curious features of the exoplanet menagerie, though, is that some are tidally locked -- like our own Moon, they always have the same face pointing toward their orbital center.

And just last week, scientists at the Max Planck Institute for Astronomy found a tidally-locked planet whose daylight face has temperatures in the habitable range.  If it has an atmosphere as thick as Earth's (something currently not known), it's estimated to have an average temperature of around 13 C.

Called Wolf 1069b, thirty-three light years away in the constellation Cygnus, it has almost exactly the same radius as Earth, but an orbital period of only 15.6 days.  Not that the inhabitants would be able to determine that easily; on a tidally-locked planet, their sun would always be in the same position in the sky, so it wouldn't be obvious that they were orbiting around anything.
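Temperature estimates like that start from the standard equilibrium-temperature formula -- absorbed starlight balanced against blackbody radiation.  Here's a minimal sketch checked against Earth's numbers (the Wolf 1069 stellar parameters aren't reproduced here); a real atmosphere's greenhouse effect then raises the surface temperature above this baseline:

```python
# Blackbody equilibrium: absorbed stellar flux = emitted thermal flux.
# T_eq = [ S * (1 - A) / (4 * sigma) ]^(1/4), where the factor of 4
# assumes heat is spread over the whole sphere.
SIGMA = 5.670374419e-8  # Stefan-Boltzmann constant, W m^-2 K^-4

def t_equilibrium(flux_w_m2, bond_albedo):
    """Equilibrium temperature (K) for a given stellar flux and albedo."""
    return (flux_w_m2 * (1 - bond_albedo) / (4 * SIGMA)) ** 0.25

# Sanity check with Earth: solar constant ~1361 W/m^2, Bond albedo ~0.3
print(f"Earth: {t_equilibrium(1361, 0.3):.0f} K")  # ~255 K; greenhouse adds ~33 K
```

For a tidally-locked world the "spread over the whole sphere" assumption is exactly what's in question: with poor heat redistribution, the dayside runs hotter and the nightside far colder than this single average suggests.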

Think of how bizarre it would be to live on Wolf 1069b.  On the always-daylight side, things are reasonably clement, but as you approach the twilit border, conditions go downhill fast.  Because weather is caused by convection -- changes in air pressure driven by uneven heating -- Wolf 1069b would experience a degree of weather never seen on Earth.  Its star would heat the atmosphere on the daylight side, causing the air to expand and rise; this would pull cold air from the nighttime side of the planet, creating a convection cell that dwarfs the strongest trade winds imaginable.  As you got closer to the edge between day and night, you'd walk into an increasingly powerful, freezing cold headwind, moving at a speed that would make a hurricane seem like a gentle breeze.

Then, on the nighttime side -- nothing but frozen wasteland, forever pointing outward into the starlit sky, never seeing the warm light of the parent star.  Not survivable for any life form I can conceive of, certainly not the ones evolved on and adapted to the daylight side.  Imagine what kinds of stories the inhabitants would tell -- of a hospitable region where the sun shines down from high in the heavens, fixed in place as if the sky was a crystalline sphere, eternal and unchangeable.  The known world ringed by an impassable boundary of screaming winds and bitter cold.  On the other side of which is... the unknown.  Eventually, if they developed sophisticated enough technology, surely they'd venture there, as we now dive down in submarines to investigate the deepest oceanic trenches.

What would they think, the first time they traveled into the region of perpetual night -- and saw stars?

The wild diversity of astronomical objects we're discovering absolutely beggars belief.  There's a planet called TrES-2b that is the darkest exoplanet ever studied -- the same overall hue as a piece of charcoal -- and no one knows why.  55 Cancri-e is hot and carbon-rich, and might be composed chiefly of diamond.  HR 5183b has an extremely elliptical orbit, lasting around 74 Earth years -- starting out farther away from its parent star than Jupiter is from the Sun, but screaming in to slingshot around it closer than the orbit of Mercury -- earning it the nickname of "the whiplash planet."  WASP-76b has a surface hot enough to vaporize iron -- meaning it rains, but it rains molten droplets of iron metal.

Fiction writers like myself would have a hard time coming up with anything odder than what the astronomers are actually discovering in the skies above us.  It recalls the quote from Carl Sagan: "We all have a thirst for wonder.  It's a deeply human quality.  Science and religion are both bound up with it.  What I'm saying is, you don't have to make stories up, you don't have to exaggerate.  There's wonder and awe enough in the real world.  Nature's a lot better at inventing wonders than we are."

****************************************


Monday, February 6, 2023

The next phase

When I put on water for tea, something peculiar happens.

Of course, it happens for everyone, but a lot of people probably don't think about it.  For a while, the water quietly heats.  It undergoes convection -- the water in contact with the element at the bottom of the pot heats up, and since warmer water is less dense, it rises and displaces the cooler layers above.  So there's a bit of turbulence, but that's it.

Then, suddenly, a bit of the water at the bottom hits 100°C and vaporizes, forming bubbles.  Those bubbles rapidly rise, dispersing heat throughout the pot.  Very quickly afterward, the entire pot of water is at what cooks call "a rolling boil."

This quick shift from liquid to gas is called a phase transition.  The most interesting thing about phase transitions is that when they occur, what had been a smooth and gradual change in physical properties (like the density of the water in the teapot) undergoes an enormous, abrupt shift -- consider the difference in density between liquid water and water vapor.
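Just how big that density shift is can be estimated on the back of an envelope.  A minimal sketch, treating steam as an ideal gas (an approximation, but a reasonable one at atmospheric pressure):

```python
# Rough estimate of the density jump when water boils at 100°C,
# treating the resulting steam as an ideal gas: rho = P*M / (R*T).

R = 8.314      # gas constant, J/(mol K)
P = 101325     # standard atmospheric pressure, Pa
T = 373.15     # boiling point of water at that pressure, K
M = 0.018015   # molar mass of water, kg/mol

steam_density = P * M / (R * T)   # kg/m^3
liquid_density = 958.0            # liquid water at 100°C, kg/m^3

print(f"steam:  {steam_density:.2f} kg/m^3")
print(f"liquid: {liquid_density:.0f} kg/m^3")
print(f"ratio:  {liquid_density / steam_density:.0f} to 1")
```

The same substance, at the same temperature, drops in density by a factor of roughly sixteen hundred the instant it crosses the transition.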

The reason this comes up is that some physicists in Denmark and Sweden have proposed a phase transition mechanism to account for the evolution of the (very) early universe -- and that proposal may solve one of the most vexing questions in astrophysics today.

A little background.

As no doubt all of you know, the universe is expanding.  This fact, established through the work of astronomer Edwin Hubble and others, was based upon the observation that light from distant galaxies was significantly red-shifted, indicating that they were moving away from us.  More to the point, the farther away the galaxies were, the faster they were receding.  This suggested that some very long time in the past, all the matter and energy in the universe was compressed into a very small space.

Figuring out how long ago that was -- i.e., the age of the universe -- depends on knowing how fast that expansion is taking place.  This number is called the Hubble constant.
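The connection between expansion rate and age is easy to see from the units: the reciprocal of the Hubble constant is a time.  A simplified estimate, using the 72 km/s/Mpc figure that comes up later in this post (the real calculation has to account for how the expansion rate has changed over cosmic history, so treat this as a ballpark only):

```python
# The "Hubble time" -- the age the universe would have if it had always
# expanded at its present rate.  1/H0 has units of time once H0 is
# converted from the astronomers' km/s/Mpc into inverse seconds.

KM_PER_MPC = 3.086e19      # kilometers in one megaparsec
SECONDS_PER_YEAR = 3.156e7

H0 = 72.0                              # km/s/Mpc
H0_inverse_seconds = H0 / KM_PER_MPC   # now in units of 1/s

hubble_time_years = 1 / H0_inverse_seconds / SECONDS_PER_YEAR
print(f"Hubble time: {hubble_time_years / 1e9:.1f} billion years")
```

That lands at about 13.6 billion years, right in the neighborhood of the accepted age of the universe -- which is why pinning down the Hubble constant matters so much.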

[Image licensed under the Creative Commons Munacas, Big-bang-universo-8--644x362, CC BY-SA 4.0]

This brings up an issue with any kind of scientific measurement, and that's the difference between precision and accuracy.  While we use those words pretty much interchangeably in common speech, to a scientist they aren't the same thing at all.  Precision in an instrument means that every time you use it to measure something, it gives you the same answer.  Accuracy, on the other hand, means that the value you get from one instrument agrees with the value you get from using some other method for measuring the same thing.  So if my car's odometer tells me, every time I drive to my nearby village for groceries, that the store is exactly eight hundred kilometers from my house, the odometer is highly precise -- but extremely inaccurate.
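The odometer analogy can be made concrete with a quick simulation (the numbers here are invented purely for illustration):

```python
import statistics

# Ten "trips to the grocery store" as reported by an odometer with a
# huge systematic error but almost no random scatter.
true_distance = 8.0   # km -- the actual distance to the store
readings = [800.1, 799.9, 800.0, 800.2, 799.8,
            800.0, 800.1, 799.9, 800.0, 800.0]

spread = statistics.stdev(readings)               # measures precision
bias = statistics.mean(readings) - true_distance  # measures accuracy

print(f"spread (precision): {spread:.2f} km")  # tiny -- very precise
print(f"bias (accuracy):    {bias:.1f} km")    # enormous -- very inaccurate
```

The readings agree with each other to within a couple hundred meters -- high precision -- while missing the true value by 792 kilometers, which is about as inaccurate as an instrument can get.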

The problem with the Hubble constant is that there are two ways of measuring it.  One is using the aforementioned red shift; the other is using the cosmic microwave background radiation.  Those two methods, each taken independently, are extremely precise; they always give you the same answer.

But... the two answers don't agree.  (If you want a more detailed explanation of the problem, I wrote a piece on the disagreement over the value of the Hubble constant a couple of years ago.)
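To put rough numbers on the disagreement: the commonly cited figures from the broader literature (not from this post) are about 73 km/s/Mpc from the red-shift "distance ladder" and about 67 from the cosmic microwave background.  A sketch of why that gap is so troubling, using those approximate values and uncertainties:

```python
import math

# Approximate published values of the Hubble constant, in km/s/Mpc,
# with rough one-sigma uncertainties for each method.
distance_ladder, err_ladder = 73.0, 1.0   # from red shifts
cmb, err_cmb = 67.4, 0.5                  # from the microwave background

gap = distance_ladder - cmb
combined_err = math.sqrt(err_ladder**2 + err_cmb**2)

print(f"gap: {gap:.1f} km/s/Mpc, about {gap / combined_err:.0f} sigma")
```

A discrepancy of roughly five standard deviations is, by physicists' usual conventions, far too large to wave away as a statistical fluke.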

Hundreds of measurements and re-analyses have failed to reconcile the two, and the best minds of theoretical physics have been unable to figure out why. 

Perhaps... until now.

Martin Sloth and Florian Niedermann, of the University of Southern Denmark and the Nordic Institute for Theoretical Physics, respectively, just published a paper in Physics Letters B that proposes a new model for the early universe which makes the two different measurements agree -- on a rate of 72 kilometers per second per megaparsec.  Their proposal, called New Dark Energy, suggests that very quickly after the Big Bang, the energy of the universe underwent an abrupt phase transition, a bit like the water in my teapot suddenly boiling.  In this picture, "bubbles" of the new phase formed and rapidly released their energy, driving apart the embryonic universe.

"If we trust the observations and calculations, we must accept that our current model of the universe cannot explain the data, and then we must improve the model," Sloth said.  "Not by discarding it and its success so far, but by elaborating on it and making it more detailed so that it can explain the new and better data.  It appears that a phase transition in the dark energy is the missing element in the current Standard Model to explain the differing measurements of the universe's expansion rate.  It could have lasted anything from an insanely short time -- perhaps just the time it takes two particles to collide -- to 300,000 years.  We don't know, but that is something we are working to find out...  If we assume that these methods are reliable -- and we think they are -- then maybe the methods are not the problem.  Maybe we need to look at the starting point, the basis, that we apply the methods to.  Maybe this basis is wrong."

It's this kind of paradigm shift in understanding -- itself a sort of phase transition -- that triggers great leaps forward in science.  To be fair, some of them fizzle.  Most of them, honestly.  But sometimes, there are visionary scientists who take previously unexplained knowledge and turn our view of the universe on its head, and those are the ones who revolutionize science.  Think of how Galileo and Copernicus (heliocentrism), Kepler (planetary motion), Darwin (biological evolution), Mendel (genetics), Einstein (relativity), de Broglie and Schrödinger (quantum physics), Watson, Crick, and Franklin (DNA), and Matthews and Vine (plate tectonics) changed our world.

Will Sloth and Niedermann join that list?  Way too early to know.  But just the fact that one shift in the fundamental assumptions about the early universe reconciled measurements that heretofore had stumped the best theoretical physicists is a hopeful sign.

Time will tell if this turns out to be the next phase in cosmology.

****************************************