Skeptophilia (skep-to-fil-i-a) (n.) - the love of logical thought, skepticism, and thinking critically. Being an exploration of the applications of skeptical thinking to the world at large, with periodic excursions into linguistics, music, politics, cryptozoology, and why people keep seeing the face of Jesus on grilled cheese sandwiches.

Wednesday, January 31, 2024

Conspiracy crackpots

Okay, y'all, can we agree to stop calling them conspiracy theories?  A theory is a scientific model backed up by experimentation and/or observation, which is consistent with everything we know about the topic in question.

These are not theories.  We need a new term.

Maybe conspiracy batshit lunacy.  I dunno, that's more accurate, but it's a little clunky.  I'll keep thinking on it.

The topic comes up (again) because of mega-pop-star Taylor Swift and her boyfriend Travis Kelce, tight end for the Kansas City Chiefs, who will be playing in the Super Bowl on February 11.  Well, Swift and Kelce made two huge mistakes, at least if you're a MAGA type; Swift endorsed Joe Biden for president in the 2020 election and is expected to endorse him again in 2024, and Kelce has appeared in commercials promoting the idea that the Pfizer COVID-19 vaccine is safe and effective.

Well.  You'd think they... I dunno.  I was gonna say "stomped all over the Constitution," but Trump himself basically did that.  Then I was going to say "threatened to drown small children," but Texas Governor Greg Abbott did that.  Then I was going to say "wanted to restrict freedom of speech," but Florida Governor (and failed presidential candidate) Ron DeSantis did that.

So comparisons kind of fail me.  Let's just say "You'd think they were really really really bad" and leave it there.

[Image licensed under the Creative Commons Eva Rinaldi, Taylor Swift 2012, CC BY-SA 2.0]

In any case, the ultra-right-wing types couldn't just shrug and say, "Taylor Swift is an American citizen and can vote for whom she likes, and Travis Kelce is free to promote the vaccine if he thinks it's the right thing to do."  Oh, no.  There has to be more to it than that.  The firestorm started almost as soon as Swift and Kelce announced they were dating, and Swift started showing up to Kelce's games.  Then Swift was named Time magazine's 2023 Person of the Year, and things really started rolling.

Here are a few quotes, to give you the idea of what sort of things are being batted about on far-right media:

  • "I 'wonder' who’s going to win the Super Bowl next month.  And I 'wonder' if there’s a major presidential endorsement coming from an artificially culturally propped-up couple this fall.  Just some wild speculation over here, let’s see how it ages over the next eight months." -- Vivek Ramaswamy
  • "The Democratic Party and other powers are gearing up for an operation to use Taylor Swift in the election against Donald Trump." -- Jack Posobiec
  • "Taylor Swift is an op.  It’s all fake.  You’re being played." -- Benny Johnson
  • "The Democrats’ Taylor Swift election interference psyop is happening in the open.  It’s not a coincidence that current and former Biden admin officials are propping up Taylor Swift and Travis Kelce.  They are going to use Taylor Swift as the poster child for their pro-abortion GOTV Campaign." -- Laura Loomer
  • "All the Swifties want is a swift abortion." -- Charlie Kirk
  • "The NFL is totally RIGGED for the Kansas City Chiefs, Taylor Swift, Mr. Pfizer (Travis Kelce).  All to spread DEMOCRAT PROPAGANDA.  Calling it now: KC wins, goes to Super Bowl, Swift comes out at the halftime show and ‘endorses’ Joe Biden with Kelce at midfield.  It’s all been an op since day one." -- Mike Crispi
  • "We're declaring a Holy War on Taylor Swift if she publicly backs the Democrats." -- an "unnamed source" quoting Donald Trump
  • "Who thinks this country needs a lot more women like Alina Habba, and a lot less like Taylor Swift?" -- unsurprisingly, Alina Habba
  • "Taylor Swift is a Pentagon psyop and a front for a covert political agenda." -- Jesse Watters
I could go on, but I probably don't need to.

What is astonishing to me is that very few folks listen to this and then say, "Okay, have you people been doing sit-ups underneath parked cars?  Or what?"  Evidently a significant fraction of Americans hear this stuff -- and think that it makes perfect sense.

Look, it's not that I don't know politics can get nasty, and that people -- certainly on both sides -- can do some really underhanded stuff to get elected.  But when a celebrity endorses Your Guy, and that's all hunky-dory and an example of a True American Standing Tall, but when a celebrity endorses The Other Guy it's gotta be a covert Pentagon psyop worthy of launching a Holy War, you might just want to check your thought processes for bias.

At least some mainstream media outlets are branding this wingnuttery for what it is.  CNN, in its article on the issue (linked above), labeled this stuff "loony thinking bearing little resemblance to reality," and that's not bad considering that CNN doesn't exactly have a sterling track record of calling out lunacy when they see it.  In fact, there's a good case to be made that back in 2015 the mainstream media created Donald Trump as a viable candidate by treating him as if he were one, instead of labeling him what he is right from the get-go -- an incompetent compulsive liar, a serial philanderer, a sexual predator, and a "businessman" who has a list of failed businesses as long as my arm.  But because his incendiary theatrics got listeners and readers, they uncritically publicized everything he said and did in order to keep readers and viewers engaged -- and that's a large part of why we're in the situation we now are.

At least -- maybe -- some media sources have learned their lesson.

But to return to my original point, these are not theories.  They are one of two things:
  1. deliberately crazy-sounding ideas thrown out by cynical individuals who don't actually believe what they're saying, but say it anyhow because they know it'll keep the public tuned in; or
  2. wild ramblings from people who think this stuff actually makes sense, in which case -- to borrow a line from C. S. Lewis -- "they're on the level of a man who says he is a poached egg."
And in neither case should we give them the slightest bit of attention, short of laughing directly into their faces.  Which is, honestly, what I'm hoping to accomplish here.

How about the Conspiracy Comedy Channel?  That at least captures the spirit of it.


Tuesday, January 30, 2024

The fingerprints of the Manatee

Cosmic ray is a catch-all term for the high-energy particles that constantly bombard the Earth's upper atmosphere.  The majority of them are deflected by the Earth's magnetic field or absorbed by the atmosphere, but a very few are energetic enough to reach the surface of the planet.  About 90% of cosmic rays are protons; a good chunk of the remaining 10% are alpha particles (helium nuclei, consisting of two protons and two neutrons bound together).  The rest are varying mixes of particles from the subatomic zoo, sometimes even including positrons and antiprotons -- particles of antimatter.  Cosmic rays were discovered in 1912 by Austrian-American physicist Victor Hess, for which he won the 1936 Nobel Prize in Physics.

The lion's share of cosmic rays that strike the Earth originate from the Sun, but some come from much farther away.  As we've seen here several times at Skeptophilia, the universe is an energetic and often violent place, not lacking in mechanisms for sending bits of matter careening across the universe at a significant fraction of the speed of light.  As you might expect, supernovae produce cosmic rays; so do gamma ray bursters, Wolf-Rayet stars, and quasars.  The last-mentioned are thought to be supermassive black holes surrounded by an inward-spiraling accretion disk of gas and dust, which accelerates as it tumbles toward the event horizon and gives off one final death scream of radiation.  This makes quasars some of the brightest objects in the known universe, with luminosities tens of thousands of times that of the Milky Way.

Trying to pinpoint the origin of particular cosmic rays is tricky.  Being mostly made of charged particles, they're deflected by magnetic fields; so even if you find one and know the direction it was traveling when it hit your detector, you can't just trace the line backwards and assume that's the point in the sky where it originated.  So scientists who are interested in figuring out where the highest-energy cosmic rays come from -- ones that almost certainly weren't created by our placid, stable home star -- have a difficult task.
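To give a sense of just how thoroughly those directions get scrambled, here's a rough back-of-the-envelope sketch.  The numbers in it (a 1 PeV proton and a 3-microgauss galactic magnetic field) are typical textbook values, not figures from the study; the point is only that the radius of the circle a charged particle spirals along is minuscule compared to galactic distances.

```python
# Rough illustration of why charged cosmic rays can't be traced backwards:
# the Larmor (gyration) radius of an ultrarelativistic proton in the
# galactic magnetic field, r = E / (q * B * c).
# Assumed textbook values: E = 1 PeV proton, B = 3 microgauss.

E = 1e15 * 1.602e-19      # 1 PeV, converted from electron-volts to joules
q = 1.602e-19             # proton charge, in coulombs
B = 3e-10                 # 3 microgauss, converted to tesla
c = 3.0e8                 # speed of light, m/s

r_meters = E / (q * B * c)
r_parsec = r_meters / 3.086e16   # one parsec is about 3.086e16 meters

print(f"Larmor radius: ~{r_parsec:.2f} pc")  # roughly a third of a parsec
```

A third of a parsec is nothing against the roughly eighteen-thousand-light-year trip from SS 433, so a proton's arrival direction tells you essentially nothing about where it started.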

A team led by Laura Olivera-Nieto of the Max Planck Institute for Nuclear Physics has tackled this problem, and in a paper published last week in Science, came up with an answer for at least some of these mysterious particles.  Working at the High-Energy Stereoscopic System (HESS -- a nice nod to the discoverer of cosmic rays) in Namibia, Olivera-Nieto and her team are studying a curious source of cosmic rays -- black holes that are in a binary system with another star.

The current study is of an object called SS 433, a source of x-rays so powerful it's been nicknamed a "microquasar."  It lies in the middle of the Manatee Nebula in the constellation Aquila, a shell of gas and dust blown outward when a star went supernova between ten thousand and a hundred thousand years ago.  The supernova resulted in a black hole as the doomed star's core collapsed, but its companion star lived on.

The Manatee Nebula [Image credit: B. Saxton (NRAO/AUI/NSF) from data provided by M. Goss, et al.]

Well, after a fashion.  The enormous gravitational pull of the black hole is siphoning off matter from the companion star, and as that plume of gas spirals inward, it accelerates and gives off radiation -- just as the accretion disk of a quasar does.  The result is a jet of cosmic rays, including not only the typical charged particles but x-rays and gamma rays, which (unlike charged particles) are unaffected by magnetic fields.  This allows astronomers to pinpoint their sources.

So in the midst of this seemingly placid bit of space is a whirling hurricane of gas and dust that is accelerated so strongly it creates jets of particles moving at nearly the speed of light.  (Exactly the speed of light, in the case of the x-rays and gamma rays.)  Some of those particles eventually reach the Earth -- a few of which are picked up by Olivera-Nieto's team at HESS.

And those cosmic rays allow us to discern the fingerprints of an incredibly violent process taking place eighteen thousand light years away.


Monday, January 29, 2024

The writing on the stone

It can often be difficult to sort fact from fiction, especially when multiple people become involved, each with his or her own agenda -- and varying determination to adhere to the truth.

Take, for example, the Brandenburg Stone.  It's a 74 by 39 centimeter slab of oolite (a sedimentary rock) that appears to have writing-like marks scratched into the surface.  Without further ado, here's a photograph of the alleged artifact:

It was found in 1912 near Brandenburg, Kentucky by a farmer named Craig Crecelius.  Crecelius clearly thought the marks were writing -- and you can see for yourself that they look like it -- and he made a good effort to contact linguists who might be able to identify the script, but without success.  He exhibited the stone several times in nearby towns, but wasn't able to drum up much in the way of interest.

In 1965, the stone passed into the hands of one Jon Whitfield, and that's where things start to get interesting.

Whitfield thought he knew what the script was.  The letters, he said, were Coelbren y Beirdd (Welsh for "Bard's Lot"), a script for writing the Welsh language that in the early nineteenth century was the center of a linguistic controversy regarding its origins.  The man who promoted it, one Edward Williams (more often known by his "bardic name" of Iolo Morganwg), was absolutely obsessed with ancient Welsh history and traditions, and achieved fame as a collector of rare medieval Welsh manuscripts.

But why would there be Welsh script on a stone in Kentucky?

Whitfield thought he knew the answer.  There was a story circulating that the medieval Welsh prince Madoc ab Owain Gwynedd had crossed the Atlantic in around the year 1170 C. E. with a handful of friends, and the lot of them had stayed in North America and intermarried with Native Americans.  (Fans of Madeleine L'Engle will recognize this legend from her book A Swiftly Tilting Planet.)  This, said Whitfield, was proof that the legend was true -- and that Welsh-speaking Natives who descended from Madoc and his comrades had gotten as far inland as Kentucky.

There's only one problem with this.  Coelbren y Beirdd almost certainly wasn't an ancient script at all, but had been invented by Iolo Morganwg in 1791 -- who then passed it off as authentic.

It's pretty clear that despite his legitimate work in preserving ancient Welsh manuscripts, Williams/Morganwg also was a champion forger.  He was exposed as such long after his death by Welsh linguist and poet John Morris-Jones, who decried Williams's dishonesty, saying "it will be an age before our literature and history are clean of the traces of his dirty fingers."  Several of the works he "transcribed" were apparently written by him -- weaving his own fiction and philosophy into allegedly ancient legends and poetry, thus confusing the hell out of scholars who simply wanted to know what historical cultures actually believed.

So even if the marks on the Brandenburg Stone are actually Coelbren y Beirdd, it can't be any older than 1791, and probably much more recent than that.  Skeptic Jason Colavito points out that Morganwg's writing became really popular in the mid to late nineteenth century, when his son Taliesin began publishing and promoting his father's works.  Colavito writes:
The alphabet was widely published in the 1830s and 1840s, and whoever forged the Brandenburg Stone (it was not actually either Williams, who were never in Kentucky) almost certainly used such publications, possibly Taliesin Williams’s widely-read book about the alphabet, in forging the stone.  The younger Williams’s popular book was published to scholarly acclaim in 1840 (having won a prestigious prize two years before) and the alphabet was exposed as a hoax in 1893 (though suspicions had been raised earlier, until Taliesin successfully combated them), which makes it much more likely that the stone was actually carved between 1840 and 1912, though a date as early as 1792 cannot be excluded.  In the United States, libraries had dozens of different volumes on Coelbren y Beirdd, including the Iolo Manuscripts (1848), Bardaas (1862 and 1874), etc., but I am not able to find evidence that the alphabet itself would have been widely available in rural America prior to Taliesin’s book, though it is possible that some of Edward’s specialist publications imported from Britain were available in some places.  After 1862, the largest collection of the Williams forgeries was in print and the alphabet was at the height of its popularity.  Thus, the latter nineteenth or early twentieth century seems the best candidate for the time of forgery.
So we have Craig Crecelius, the farmer who found the stone, and who appears to have been genuinely unaware that it was a forgery; Jon Whitfield, who was the one who identified the writing as Coelbren y Beirdd, but was too young to have been responsible for the creation of the stone, and seems to have thought it was authentic as well; and Edward Williams, who created the fake script but never went to Kentucky and so can't have been the stone's creator, either.

In the end, we're left with a mystery.  An unknown person scratched some mysterious letters on a stone, probably in the last half of the nineteenth century, and left it for someone to find.  And someone did... starting a domino effect of speculation that still shows up on television shows specializing in archaeological weirdness.  The fact remains, though, that everything about it is certainly a forgery -- not only the artifact itself, but the script in which the inscription is written.

But as far as who perpetrated the hoax, we'll probably never know.


Saturday, January 27, 2024

Missing the target

Lately I've been seeing a lot of buzz on social media apropos of the Earth being hit by a killer asteroid.

Much of this appears to be wishful thinking.

Most of it seems to focus on the asteroid 2007 FT3, which is one of the bodies orbiting the Sun that is classified as a "near-Earth object" -- something with an orbit that crosses Earth's, and could potentially hit us at some point in the future.  It bears keeping in mind, however, that even on the scale of the Solar System, the Earth is a really small target.  This "deadly asteroid," we're told, is "on a collision course with Earth" -- but then you find out that the likelihood of its actually striking us on the date of Doomsday, March 3, 2030, is around one in ten million.

Oh, but there's "an altogether more sinister estimate" that 2007 FT3 could hit us on October 5, 2024, but the chances there are one in 11.5 million.  Why this is "altogether more sinister," I'm not sure.  Maybe just because it's sooner.  Or maybe the author of the article doesn't understand how math works and thinks that the bigger the second number, the worse it is.  I dunno.
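If you want to see the arithmetic the author of the article apparently fumbled, a two-line check settles it.  "One in N" odds are just a probability of 1/N, so the bigger the N, the smaller the risk (the figures are the ones quoted above):

```python
# Converting "one in N" odds to probabilities shows the supposedly
# "more sinister" 2024 estimate is actually the *smaller* risk.
odds_2030 = 1 / 10_000_000    # quoted chance of a March 3, 2030 impact
odds_2024 = 1 / 11_500_000    # quoted chance of an October 5, 2024 impact

print(f"2030: {odds_2030:.2e}")                     # 1.00e-07
print(f"2024: {odds_2024:.2e}")                     # 8.70e-08
print("2024 less likely:", odds_2024 < odds_2030)   # 2024 less likely: True
```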

Then there's the much-hyped asteroid 99942 Apophis, which was first thought to have a 2.7% chance of hitting the Earth in April of 2029 (more accurate observations of its orbit eliminated that possibility entirely), and then gets a second shot at us in April of 2036.  The 2036 collision depends on it passing through a gravitational keyhole during its 2029 close approach -- a tiny region in space where the pull of a much larger planet shifts the orbit of a smaller body in such a way that they then collide on a future pass.  Initially, the keyhole was estimated to be eight hundred kilometers in diameter, and this caused the physicists at NASA to rate Apophis at a four out of ten on the Torino Impact Scale -- the highest value any object has had since such assessments began.  (A rating of four means "A close encounter, meriting attention by astronomers.  Current calculations give a 1% or greater chance of collision capable of regional devastation.  Most likely, new telescopic observations will lead to reassignment to Level 0.  Attention by public and by public officials is merited if the encounter is less than a decade away.")  If it hit, the impact site would be in the eastern Pacific, which would be seriously bad news for anyone living in coastal California.

The close approach in 2029 [Image licensed under the Creative Commons Phoenix7777, Animation of 99942 Apophis orbit around Sun, CC BY-SA 4.0]

This, of course, spurred the scientists to try to refine their measurements, and when they did -- as the scale suggested -- they found out we're not in any danger.  The gravitational keyhole turns out to be only a kilometer wide, and Apophis will miss it completely.

In fact, there are currently no known objects with a Torino Scale rating greater than zero.

It's always possible, of course, that we could be hit out of the blue by something we never saw coming.  But given that we're talking about an unknown risk from an unknown object of unknown size hitting in an unknown location at an unknown time, I think we have more pressing things to worry about.  Sure, something big will eventually hit the Earth, but it's not going to happen in the foreseeable future.  NASA and the other space monitoring agencies in the world are doing a pretty good job of watching the skies, so maybe we should all just turn our attention to more important matters, like trying to figure out how nearly half of Americans think the best choice for president is a multiply-indicted, incompetent compulsive liar who shows every sign of incipient dementia.

In any case, I'm not concerned about asteroid impacts, and all the hype is just more clickbait.  So if you live on the West Coast and were planning on moving inland, or are considering cancelling your plans for a big Halloween bash this year, you probably should just simmer down.


Friday, January 26, 2024

Blind spots

Authors reveal more in their work, sometimes, than they may have intended.

That thought crossed my mind more than once while reading the book Hadrian by British historian, antiquarian, diplomat, and writer Stewart Perowne.  The book is a history and biography of the Roman Emperor Hadrian, who ruled from 117 to 138 C.E.  Hadrian is considered to be one of the better rulers Rome had -- generally fair-minded, astute, and intelligent -- although considering he's competing against guys like Caligula, Nero, Domitian, and Elagabalus, that may not be a very high bar.

A sculpture of the emperor Hadrian, circa 130 C. E. [Image licensed under the Creative Commons Djehouty, München SMAEK 2019-03-23n, CC BY-SA 4.0]

The book, which was published in 1960, was interesting enough, if a bit dry and pedantic at times (did we really need an entire chapter devoted to minute details about the architecture of the Pantheon?).  But there were a couple of times that what he wrote made me do a double-take.

The first time came when he was discussing the Roman program of expansion and colonization, and engaged in a digression comparing it to the policies of the British Empire between the eighteenth and mid-twentieth centuries.  Perowne writes:

No other country has ever had a finer or more generous record in its dealings with other races than the English.  No great power, since history began, has occupied, and advanced to autonomous sovereignty, so large an extent of territory in so short a period.  The advance, it is true, was from the very first, when the American colonists set the precedent, encouraged by the inhabitants of the territory concerned; nevertheless, it did not take long for England to adopt as a principle that the aim of all colonial enterprise is the elevation of the colonials, and their establishment as independent states, in whatever form of association they may choose with Great Britain. 

Say what?

I think there are citizens of a few nations I can think of who would beg to differ.  Great Britain fought like hell not to let a good many of their colonies gain their independence.  It was only when faced with sustained revolt -- and the impossibility of continuing a minority rule over the unwilling -- that they grudgingly granted sovereignty.  (And a great many of those nations are still struggling to overcome the long-term effects of colonialism -- oppression, exploitation, wealth inequality, and bigotry.)

I know there's the whole "man of his time" thing you hear about writers in the past, which has been used to look past even the horrific racism that threads through a lot of the fiction of H. P. Lovecraft.  Here, it's not quite that extreme, but it was still kind of startling to read.  And perhaps there are still a good many of us who have the tendency to consider our own country as intrinsically superior, even if we wouldn't necessarily put it that way.  But it's somewhere between baffling and appalling that someone who was a historian, who devoted his life to investigating and understanding other cultures -- who, in fact, worked as a diplomat in Malta, Aden, Iraq, Barbados, Libya, and Israel -- could come away with the impression of the British Empire as the Gentle Guides of the Civilized World.

Stewart Perowne in 1939, while serving in the British diplomatic corps in Libya [Image is in the Public Domain]

Now, mind you, I'm not saying the British were any worse than a lot of other militaristic colonial powers.  The history of the world is one long sorry tale of the powerful exploiting the weak.  But to write what Perowne did, especially with his extensive knowledge and experience, is evidence of a blind spot a light year wide.

Then there was the sniffy, superior bit he threw in about Hadrian's male lover, Antinoüs.  Hadrian, in fact, was pretty clearly gay.  He was married to an apparently rather obnoxious woman named Vibia Sabina, but the marriage was an unhappy one and produced no children.  His devotion and love for Antinoüs, however, was the stuff of legends; the two were inseparable.

Hadrian and Antinoüs [Image licensed under the Creative Commons Carole Raddato from FRANKFURT, Germany, Marble Busts of Hadrian & Antinoüs, from Rome, Roman Empire, British Museum (16517587460), CC BY-SA 2.0]

Perowne writes:

It was in Bithynia that Hadrian formed his famous and fatal attachment to Antinoüs, a lad of whose origin nothing is known, except that he came from the city of Bithynion...  Antinoüs, at the time when Hadrian met him, must have been a lad of about eighteen.  He was broad-shouldered and quite exceptionally handsome...  Whether the relations between the emperor Hadrian and his beautiful young favorite were carnal or not, we cannot be sure.  But what we can be certain of is this: that for the next nine years Antinoüs was the emperor's inseparable companion, that many people did suppose their association was based on a physical relationship, and that they did not reprobate it in the least...  However much we may deplore this fact, it simply is not possible to equate ancient and modern canons of morality.

He can't even bring himself to write "homosexual" -- but comments that it is unsurprising that later Roman authors used the word Bithynian as "a euphemism for something vile."

After reading this, you may be shocked to find out that Stewart Perowne himself was gay.

In a bizarre parallel to Hadrian's own life, Perowne reluctantly agreed to marry explorer and writer Freya Stark in 1947, but the marriage was unhappy, childless, and possibly even unconsummated.  Eventually the two divorced after it became obvious that Perowne's sexual orientation wasn't going to change.  He finally put it into writing to his wife, but once again meticulously avoided using the word homosexual:

It is difficult to say what "normal" is – my friend a counsellor of St. George's Hospital always refuses to use the word and in both men and women, you have a wide and graded range from ultra-male to ultra-female with naturally most people in the middle ranges...  Now for myself, I put myself in the middle group.  I have ordinary male abilities.  I like male sports some of them, and I love the company of women.  In fact, I find it hard to exist without it.  At the same time, I am occasionally attracted by members of my own sex – generally.  For some even pleasurable reason – by wearers of uniform.

I was simultaneously appalled and heartbroken to read those words, from the pen of the same man who called Hadrian's love for Antinoüs "something vile" and implied people were right to "deplore" it.  How deeply sunk in self-loathing would you have to be to be able to write both of those passages?

That a culture could produce such a tortured and damaged soul is a horrible tragedy.  And how many others did this happen to, men and women we don't know about because they never ended up in the public eye, but lived their entire lives in fear, shame, and obscurity, never able to openly love who they loved for fear of condemnation, imprisonment, or even death?

I'd like to think we've grown beyond that, but then I look around me at my own culture, where books are currently being banned merely for including queer people -- where even mentioning we exist is apparently improper -- and I realize that it's still going on.

So my reading of Hadrian got me thinking about way more than just a long-ago emperor of a classical European civilization.  It started me wondering about my own blind spots, things about myself and my culture that I take for granted as The Way Things Should Be, and which a future civilization might rightly shake their heads at.  

And thinking about Perowne himself made me recognize what complex, contradictory, and fragile creatures we humans are.  Will we ever find a way to move past all the antiquated hidebound moralizing, and simply treat each other with kindness, dignity, and compassion?  To live by the rule that has been set up as a guiding light in many cultures, but is best known in its biblical form -- "Do unto others as you would have them do unto you"?


Thursday, January 25, 2024

The man who listened to the sky

Arno Allan Penzias was born on the 26th of April, 1933, in Munich, Germany.  It was a fractious time for Germany, and downright dangerous for anyone of Jewish descent, which Penzias was; his grandparents had come from Poland and were prominent members of the Reichenbachstrasse Synagogue.  Fortunately for the family, his parents saw which way the wind was blowing and evacuated Arno and his brother Gunther to Britain as part of the Kindertransport Rescue Operation.  Their father and mother, Karl and Justine (Eisenreich) Penzias, were also able to get out before the borders closed, eventually making their way (as so many Jewish refugees did) to New York City, where they settled in the Garment District.

The younger Penzias had shown a fascination and aptitude for science at a young age, so his choice of a major was never really in doubt.  He went to City College of New York, graduating with a degree in physics in 1954 and ranking near the top of his class.  For a time after graduating he worked as a radar officer in the U. S. Army Signal Corps, but the pull of research drew him back into academia.  In 1962, he earned a Ph.D. in microwave physics from Columbia University, studying with the inventor of the maser, Charles Townes.

Penzias then got a job with Bell Labs in Holmdel, New Jersey, where he worked on developing receivers for the (then) brand-new field of microwave astronomy.  He teamed up with Robert Wilson, an American astronomer, to develop a six-meter-diameter horn reflector antenna with a seven-centimeter ultra-low-noise receiver, at that point by far the most sensitive microwave detector in the world.

And while using that antenna in 1964, he and Wilson discovered something extremely odd.

At a wavelength of 7.35 centimeters, corresponding to a temperature of around three degrees Kelvin, there was a strong microwave signal -- coming from everywhere.  It seemed to be absolutely uniform in intensity, and was present in the input no matter which direction they aimed the antenna.  It was so perplexing that Penzias and Wilson thought it was an artifact of some purely terrestrial cause -- at first, they thought it might be from pigeon poo on the antenna.  Even after ruling out whatever they could think of (and cleaning up after the pigeons), the signal was still there, a monotonous hiss coming from every spot in the sky.

Before publishing their findings, they started looking for possible explanations, and they found a profound one.  Almost twenty years earlier, physicists Ralph Alpher, Robert Herman, and Robert Dicke had predicted the presence of cosmic microwave background radiation, the relic left behind by the Big Bang.  If the Big Bang model was correct, the unimaginably intense electromagnetic radiation generated by the beginning of the universe would have, in the 13.8-odd billion years since, been "stretched out" by the expansion of the fabric of spacetime, increasing its wavelength and dropping into the microwave region of the spectrum.  Alpher, Herman, and Dicke had predicted that the relic radiation should be under twenty centimeters in wavelength, and should be isotropic -- coming from everywhere in space at a uniform intensity.
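The "stretching" arithmetic is simple enough to sketch.  Assuming the standard textbook redshift of recombination, z ≈ 1090 (a value not mentioned in the post), both the wavelength stretch and the blackbody temperature drop are governed by the same factor of (1 + z):

```python
# Cosmic expansion stretches wavelengths by a factor of (1 + z) and cools
# blackbody radiation by the same factor.  z = 1090 is the standard
# textbook redshift of recombination (an assumed value, not from the post).
z = 1090
T_now = 2.725                  # measured CMB temperature today, in kelvins
T_emission = T_now * (1 + z)   # temperature when the light set out

print(f"Temperature at emission: ~{T_emission:.0f} K")  # ~2973 K: glowing-hot gas

# A 1-micron (near-infrared) wavelength emitted back then arrives today as:
wavelength_mm = 1e-6 * (1 + z) * 1e3
print(f"Stretched wavelength: {wavelength_mm:.2f} mm")  # ~1.09 mm: microwave
```

That millimeter-scale result sits comfortably inside the "under twenty centimeters" prediction, and well outside the range of human vision.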

That's just what Penzias and Wilson had observed.

In July of 1965, they published their results in the Astrophysical Journal, and suddenly Penzias and Wilson found themselves famous.

Penzias and Wilson at the Holmdel Horn Antenna in June of 1962 [Image is in the Public Domain courtesy of NASA]

At the time, there were two competing theories in cosmology -- the Big Bang model and the Steady-State model.  The latter theorized that the universe was expanding (that much had been undeniable since the discovery of red shift and Hubble's Law) but that as space expanded, matter was continuously being created, so the universe had no fixed start point.  Steady-State was championed by some big names in cosmological research -- Hermann Bondi, Thomas Gold, and Fred Hoyle amongst them -- and trying to figure out a way to discern which was correct had become something of a battle royale in astronomical circles.

But now Penzias and Wilson had made an accidental discovery, coupled it with a pair of (at the time) obscure papers making predictions about the temperature and wavelength of background radiation, and in one fell swoop blew the Steady-State model out of the water.

In 1978 Penzias and Wilson were awarded the Nobel Prize in Physics for research that changed the way we see the universe.

Since then, the cosmic microwave background radiation has been studied in phenomenal detail, and we've learned a great deal more about it -- starting with the fact that it isn't perfectly isotropic.  There are tiny but significant irregularities in the temperature of the radiation, something that has yet to be fully explained.  But the majority of the implications of the discovery have stood firm for nearly sixty years; 13.8 billion years ago, spacetime started to expand, and everything we see around us -- all the matter and energy in the universe -- condensed out of that colossally powerful event.  And coming from everywhere in the sky, like a ghostly afterimage of an explosion, is the radiation left behind, stretched out so much that it is outside of the range of human vision, and can only be detected by a telescope tuned to the microwave region of the spectrum.

On Monday, the 22nd of January, 2024, Arno Penzias died at the venerable age of ninety.  The world has lost a brilliant and innovative thinker whose contributions to science are so profound they're hard even to estimate.  The boy who escaped Nazi Germany with his family in the nick of time grew up to be a man who listened to the sky, and in doing so forever altered our understanding of how the universe began.


Wednesday, January 24, 2024

Water worlds

Water is one of those things that seems ordinary until you start looking into it.

The subject always puts me in mind of the deeply poignant Doctor Who episode "The Waters of Mars," which has to be in my top five favorite episodes ever.  (If you haven't seen it, you definitely need to, even if you're not a fanatical Whovian like I am -- but be ready for the three-boxes-of-kleenex ending.)  Without giving you any spoilers, let's just say that the Mars colonists shouldn't have decided to use thawed water from glaciers for their drinking supply.

Once things start going sideways, the Doctor warns the captain of the mission, Adelaide Brooke, that trying to fight what's happening is a losing battle, and says it in a truly shiver-inducing way: "Water is patient, Adelaide.  Water just waits.  Wears down the cliff tops, the mountains.  The whole of the world.  Water always wins."

Even beyond science fiction, water has some bizarre properties.  It's one of the only substances that gets less dense when it freezes -- if water behaved like 99% of the compounds in the world, ice would sink, and lakes and oceans would freeze from the bottom up.  Compared to most other liquids, it has a sky-high specific heat (the ability to absorb heat energy without much increase in temperature) and heat of vaporization (the heat energy required for it to evaporate), both of which not only make our body temperature easier to regulate, but also make climates near bodies of water warmer in winter and cooler in summer than they otherwise would be.  It's cohesive, which is the key to how water can be transported a hundred meters up the trunk of a redwood tree, and is also why a bellyflop hurts like a mofo.  It's highly polar -- the molecules have a negatively-charged side and a positively-charged side -- making it an outstanding solvent for other polar compounds (and indirectly leading to several of the other properties I've mentioned).
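To put a number on "sky-high specific heat": the heat needed to warm something is q = m·c·ΔT, and water's c dwarfs most common materials'.  A quick sketch (my own illustration; the specific-heat values are rough textbook figures, not from the post):

```python
# Approximate specific heats in J/(g*K) -- rough, commonly cited values.
SPECIFIC_HEAT = {
    "water": 4.18,
    "ethanol": 2.44,
    "granite": 0.79,
    "iron": 0.45,
}

def heat_required_joules(mass_g, substance, delta_t_k):
    """q = m * c * dT: energy to warm mass_g grams of substance by delta_t_k."""
    return mass_g * SPECIFIC_HEAT[substance] * delta_t_k

# Energy to warm one kilogram of each by ten degrees:
for s in SPECIFIC_HEAT:
    print(f"{s:>8}: {heat_required_joules(1000, s, 10) / 1000:.1f} kJ")
```

A kilogram of water soaks up nearly ten times the heat a kilogram of iron does for the same temperature rise -- which is exactly why lakes and oceans moderate nearby climates.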

And those are the characteristics water has at ordinary temperatures and pressures.  If you start changing either or both of these, things get weirder still.  In fact, the whole reason the topic comes up is because of a paper in Astrophysical Journal Letters called "Irradiated Ocean Planets Bridge Super-Earth and Sub-Neptune Populations," by astrophysicist Olivier Mousis of Aix-Marseille University, about a very strange class of planets where water is in a bizarre state where it's not quite a liquid and not quite a gas.

This state is called supercritical -- the fluid can seep through solids like a gas but dissolve materials like a liquid.  For water, the critical point is about 374 C and a pressure 217 times the average atmospheric pressure at sea level, so nothing you'll run into under ordinary circumstances.  This weird fluid has a density about a third that of liquid water at room temperature -- far denser than your typical gas and far less dense than your typical liquid.
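The defining condition is simple: a fluid is supercritical when both its temperature and pressure exceed the critical point.  A minimal sketch of that check for water (my own illustration, using the approximate critical-point values above; it deliberately ignores the rest of the phase diagram):

```python
# Hedged sketch: flag whether water at a given temperature and pressure is
# in the supercritical region.  Constants are approximate; this only tests
# the supercritical condition, not the melting/boiling curves.

CRITICAL_T_C = 374.0     # critical temperature, degrees Celsius (approx.)
CRITICAL_P_ATM = 217.7   # critical pressure, atmospheres (approx.)

def is_supercritical(temp_c, pressure_atm):
    """True when both temperature and pressure exceed the critical point."""
    return temp_c > CRITICAL_T_C and pressure_atm > CRITICAL_P_ATM

print(is_supercritical(25, 1))      # a glass of water: False
print(is_supercritical(400, 250))   # deep in a sub-Neptune envelope: True
```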

Mousis et al. have found that some of the "sub-Neptune" exoplanets that have been discovered recently are close enough to their parent stars to have a rocky core surrounded by supercritical water and a steam-bath upper atmosphere -- truly a strange new kind of world even the science fiction writers don't seem to have anticipated.  One of these exoplanets -- K2-18b, which orbits a red dwarf star about 110 light years from Earth -- fits the bill perfectly, and in fact mass and diameter measurements suggest it could be made up of as much as 37% water.

So there you are -- some strange features of a substance we all think we know.  Odd stuff, water, however familiar it is.  Even if you don't count the extraterrestrial contaminants that Captain Brooke and her crew had to contend with.


Tuesday, January 23, 2024

Never seen it before

Ever heard of the opposite of déjà vu -- jamais vu?

This may sound like it's the setup for some sort of abstruse bilingual joke, but it's not.  Déjà vu ("already seen" in French) is, as you undoubtedly know, the sensation that something you're experiencing has happened exactly that way before even though you're certain it can't have (a phenomenon, by the way, which has still not been fully explained, although there was a suggestive study out of Colorado State University five years ago that gave us some interesting clues about it).  Jamais vu ("never seen") is indeed the opposite; the eerie sense that something completely familiar is unfamiliar, uncertain, or simply incorrect.

One of the most common forms of jamais vu is an experience a lot of us have had: looking at a word and convincing ourselves that it's misspelled.  It can happen even with simple and ridiculously common words.  I remember being a teenager and working on a school assignment, and staring at the word "were" for what seemed like ages because suddenly it looked wrong.  The same thing can happen with music -- skilled musicians can reach a point in a piece they've practiced over and over, and suddenly it feels unfamiliar.  Less common, but even more unsettling, are reports where people look at the faces of family and friends, and have the overwhelming sensation that they have never seen them before.

The emphasis here is on "looks" and "feels" and "sensation."  This seems not to be a cognitive issue but a sensory-emotional one; when I've had jamais vu over the spelling or definition of a word and looked it up, almost always what I'd written turned out to be correct even though it felt wrong.  The people who had the sense that their loved ones' faces were somehow unfamiliar still knew their names and relationships, so their cognitive understanding of who those people were was undiminished; it was the "gut feeling" that was all wrong.

[Image courtesy of creator © Michel Royon / Wikimedia Commons Brain memory, CC0 1.0]

The reason the subject comes up is that a team led by Chris J. A. Moulin of the Université Grenoble Alpes has done a preliminary look into the strange phenomenon of jamais vu, and their results were the subject of a paper in the journal Memory.  Their research started with a simple question: can jamais vu be induced?  The answer was yes, and by a simple protocol -- repeat something often enough, and it starts to look strange.

The researchers took familiar words like "door" and less familiar ones like "sward," and asked volunteers to write them repeatedly until they wanted to stop.  They were told they could stop for whatever reason they wanted -- tired hand, bored, feeling peculiar, whatever -- but to be aware of why they stopped.  It turned out that by far the most common reason for stopping was "feeling strange," which was cited as the cause by seventy percent of the volunteers.  The effect was more pronounced with common words than uncommon ones, as if we kind of expect to see uncommon words as odd, so it doesn't strike us as off.

It even happened with the most common word in the English language -- "the."  It only took 27 repetitions, on average, for people to halt.  One volunteer said, "[Words] lose their meaning the more you look at them."  Another, even more interestingly, said, "It doesn't seem right.  It almost looks like it's not really a word, but someone's tricked me into thinking it is."

The researchers believe that jamais vu isn't just some kind of psychological fluke.  It may serve a purpose in jolting us when our cognitive processes are going onto autopilot -- as they can, when we're asked to do a repetitive task too many times.  That feeling of strangeness brings us back to a state of high alertness, where we're paying attention to what we're doing, even if the downside is that it makes us think we've made mistakes when we haven't.

"Jamais vu is a signal to you that something has become too automatic, too fluent, too repetitive," the authors write.  "It helps us 'snap out' of our current processing, and the feeling of unreality is in fact a reality check.  It makes sense that this has to happen.  Our cognitive systems must stay flexible, allowing us to direct our attention to wherever is needed rather than getting lost in repetitive tasks for too long."

So a sense of peculiarity when we're doing ordinary stuff might actually have an adaptive benefit.  Good to know, because it's really unsettling when it happens.

But for what it's worth, I still don't think "were" should be spelled like that.


Monday, January 22, 2024

Bear with us

A paper appeared last week in the Journal of Zoology that has elicited a good bit of self-satisfied chortling amongst the people who think cryptids are abject nonsense.  It was written by a data scientist named Floe Foxon, and is entitled, "Bigfoot: If It's There, Could It Be a Bear?"

Foxon's conclusion was, "Yeah, it probably is."  Foxon writes:

Previous analyses have identified a correlation between ‘Sasquatch’ or ‘Bigfoot’ sightings and black bear populations in the Pacific Northwest using ecological niche models and simple models of expected animal sightings.  The present study expands the analysis to the entire US and Canada by modeling Sasquatch sightings and bear populations in each state/province while adjusting for human population and forest area in a generalized linear model.  Sasquatch sightings were statistically significantly associated with bear populations such that, on the average, every 1000 bear increase in the bear population is associated with a 4% increase in Sasquatch sightings.  Thus, as black bear populations increase, Sasquatch sightings are expected to increase.  On average, across all states and provinces in 2006, after controlling for human population and forest area, there were approximately 5000 bears per Sasquatch sighting.  Based on statistical considerations, it is likely that many supposed Sasquatch are really misidentified known forms.  If Bigfoot is there, it could be a bear.
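Because a generalized linear model of count data typically uses a log link, that "4% increase per 1000 bears" compounds multiplicatively rather than adding up.  Here's a small sketch of how to read the coefficient -- my own illustration of the interpretation, not Foxon's code; only the 4%-per-1000-bears figure comes from the abstract:

```python
# Hedged sketch: interpreting a log-link GLM coefficient.  Each additional
# 1000 bears multiplies the expected number of Sasquatch sightings by 1.04,
# so the effect compounds rather than adding linearly.

PCT_PER_1000_BEARS = 0.04  # from the abstract: 4% more sightings per 1000 bears

def sightings_multiplier(extra_bears):
    """Multiplicative change in expected sightings for extra_bears more bears."""
    return (1.0 + PCT_PER_1000_BEARS) ** (extra_bears / 1000.0)

print(f"{sightings_multiplier(1000):.2f}x")   # 1.04x
print(f"{sightings_multiplier(10000):.2f}x")  # about 1.48x, not 1.40x
```

So a state gaining ten thousand bears would be expected to see sightings rise by about 48%, not 40% -- a small but real difference between multiplicative and additive readings of the coefficient.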

While this certainly is a suggestive correlation, it's not the slam-dunk the scoffers would like it to be.  There are no known black bear populations in Delaware, Illinois, Indiana, Iowa, Kansas, Nebraska, North Dakota, and South Dakota, but all of those states have had significant numbers of Bigfoot sightings; Illinois, in fact, is fifth in the nation for the number of sightings (exceeded only by Washington, California, Florida, and Ohio).

This may seem like an odd stance for a self-styled skeptic to take, so don't interpret this as saying more than it does.  My point is that it is a significant jump (and Foxon himself is clear on this point) from saying "many, perhaps most, Sasquatch sightings are actually black bears" to saying "all Sasquatch sightings are actually black bears," which is the reaction I'm mostly seeing.  My issue is not with Foxon and his analysis, which is excellent, but with the doubters who are saying, "Ha-ha, we toldja so" and thinking this settles the question.

It's precisely the same reason I agreed with controversial physicist Michio Kaku when he said that even if only one in a hundred credible UFO sightings is unexplainable as a natural phenomenon, that one percent is still worth looking into.  To my mind, both Kaku and most Bigfoot aficionados go a lot further into the True Believer column than I'm willing to; but an abject statement of disbelief is no better than an abject statement of belief, given that in both cases there are plenty of data left to explain.

So the whole thing leaves me pretty much where I was.  We don't have any convincing hard evidence either of Bigfoot or of alien visitation, so my opinion is they're both unlikely to be real phenomena.  But "unlikely" doesn't mean "certain," and my opinion is just my opinion.  In neither case should we stop looking, nor close our minds to the possibility that we doubters could be wrong.

The burden of proof, of course, still rests on the ones making the claim.  You can't prove a negative, Extraordinary Claims Require Extraordinary Evidence, and all that sorta stuff.  So Foxon's paper gives us a good reason to be cautious about accepting Bigfoot sightings as conclusive -- but then, we really should be cautious about accepting damn near anything without due consideration of alternative explanations.


Saturday, January 20, 2024

The empty galaxy

A couple of weeks ago, I began a post with a quote from physicist Albert Michelson in which he confidently claimed that everything in physics was pretty well settled -- in 1894.  Right before the discoveries of relativity and quantum mechanics shook science to its foundations.

I read yet another paper just yesterday highlighting the inadvertent irony of Michelson's statement, and which once again shows us that we are very far from understanding everything there is to understand.  This one was about the accidental discovery of a galaxy that has an extremely odd characteristic.

It appears to have no stars whatsoever.

The object, dubbed J0613+52, is about ten times less massive than the Milky Way -- so smaller than your typical galaxy, but still pretty damn huge, weighing in at about two billion solar masses.  But the entire thing is made up of diffuse gas and dust -- no stars at all.

Because of this, it has an extremely low luminosity.  It was only discovered because of a mistake -- the astronomers at the Green Bank Observatory were trying to aim the telescope elsewhere, but had mistyped the coordinates -- but when the telescope focused on the spot, they saw a blip of hydrogen spectral emission lines in what appeared to be an empty region of space.  More detailed study of the spot found that the emission lines were coming from a huge but faint cloud of gas and dust that was on the scale of galaxies mass-wise but seemed to have undergone no star formation.

"It’s likely there is a decent amount of dark matter present as well," said Karen O’Neil, senior scientist at Green Bank, who led the research.  "But lingering uncertainties about the dark galaxy’s exact physical size make associated dark-matter estimates hazy at best...  J0613+52 is completely isolated, with no neighboring galaxy closer than 330 million light-years or so; our own Milky Way, in fact, appears to be the object’s closest-known companion.  In these void areas of the universe, gas should be too diffuse to form any galaxy-like object.  Clearly that’s not quite true."

Robert Minchin, of the National Radio Astronomy Observatory in New Mexico, heard O'Neil present the findings at last week's meeting of the American Astronomical Society, and was obviously impressed.  "I think it’s definitely a real detection," Minchin said.  "It does look like a primordial object.  It’s a bit like discovering a living dinosaur and having it there to study."

Artist's depiction of J0613+52 [Image credit: STScI POSS-II (starfield); additional illustration by NSF/GBO/P.Vosteen]

What puzzles me is that J0613+52 is only ("only") 330 million light years away, so not even close to being the farthest galaxy we've seen.  The universe as a whole is roughly forty times older than the light-travel time from this bizarre empty galaxy, so you'd think it'd have had plenty of time to form stars from all that hydrogen gas.  Instead, it seems to be a relatively homogeneous dust cloud.  You have to wonder, what's keeping it that way?  Gravity is relentless and inexorable -- the current models indicate that even tiny anisotropies (unevenness) in the mass distribution will result in the denser regions gaining mass at the expense of the less dense regions, resulting in clumps of matter that eventually coalesce into stars.
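The "forty times" figure is just the universe's age divided by the light-travel time.  A one-liner check (my own back-of-the-envelope arithmetic, using the round numbers from the post):

```python
# Quick check: how many times older is the universe than the light we're
# seeing from J0613+52?  (Light-travel time ~ distance in light-years.)
UNIVERSE_AGE_YR = 13.8e9   # age of the universe in years
DISTANCE_LY = 330e6        # distance to J0613+52 in light-years

ratio = UNIVERSE_AGE_YR / DISTANCE_LY
print(f"about {ratio:.0f} times")  # about 42 times
```

In other words, the light we're seeing left the galaxy when the universe was already more than 97% of its current age -- this is not some snapshot of the early cosmos.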

For a dust cloud that massive to last over twelve billion years without forming stars is somewhere beyond peculiar.

It may be that I'm missing something, here.  (Okay, given that I'm not an astrophysicist, it's certain that I'm missing something.)  But even with my no-more-than-basic understanding of astronomy, this object seems really peculiar.

As is the fact that it was discovered accidentally because one of the astronomers had entered a typo in the coordinates.

I'm sure the astronomers are going to be busy looking at the empty galaxy and trying to figure out what it is, and also looking for others.  Given its extremely low luminosity, and the fact that we found it by basically aiming a big telescope at a random spot in the sky, you have to wonder how many other similar structures there are.

I'll end with the words spoken by Hamlet, which have been quoted many times before but seem apposite: "There are more things in Heaven and Earth, Horatio, than are dreamt of in your philosophy."