Skeptophilia (skep-to-fil-i-a) (n.) - the love of logical thought, skepticism, and thinking critically. Being an exploration of the applications of skeptical thinking to the world at large, with periodic excursions into linguistics, music, politics, cryptozoology, and why people keep seeing the face of Jesus on grilled cheese sandwiches.

Friday, February 2, 2024

Going against the flow

Two of the most extensively tested laws of physics are the First and Second Laws of Thermodynamics -- and in the nearly two centuries since they were first formulated, not a single exception has been found.

The First Law is the less shocking one.  It's sometimes called the Law of Conservation of Matter and Energy, and says simply that in a closed system, the total amount of matter and energy does not change.  You can turn one into the other, or change its form, but the total quantity doesn't vary.  Unsurprising -- and in fact it can seem a little circular, given that this is more or less how a closed system is defined in the first place.

The Second Law is where things get interesting.  It can be formulated a variety of ways, but the simplest is that in a closed system, the amount of entropy (disorder) always increases.  If entropy is being decreased somewhere (the system is becoming more orderly) it always requires (1) an input of energy, and (2) that somewhere else entropy is increasing, and that increase is larger than the localized decrease.  An example is the human body.  When you develop from a single fertilized egg cell to an adult, your overall entropy decreases significantly.  But in the process, you are taking the food molecules you eat and (1) extracting their energy, and (2) increasing their entropy monumentally by chopping them up into little pieces and strewing the pieces about.  So you're able to locally decrease your own entropy, but you leave behind a trail of chaos wherever you go.
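
As a back-of-the-envelope illustration of that bookkeeping (the temperatures here are made up for the example, not taken from anywhere), here's the standard entropy accounting for a bit of heat flowing between two reservoirs:

```python
# Entropy bookkeeping for heat flowing between two bodies.
# When a small amount of heat Q leaves a body at absolute temperature T,
# that body's entropy changes by -Q/T; the body receiving it gains +Q/T.

def total_entropy_change(q_joules, t_source_k, t_sink_k):
    """Net entropy change (J/K) when heat q flows from source to sink."""
    return -q_joules / t_source_k + q_joules / t_sink_k

# One joule flowing the "normal" way, hot (350 K) to cold (280 K):
ds_normal = total_entropy_change(1.0, 350.0, 280.0)    # positive: allowed

# One joule flowing cold (280 K) to hot (350 K), with nothing else going on:
ds_backward = total_entropy_change(1.0, 280.0, 350.0)  # negative: forbidden by itself

print(f"hot -> cold: {ds_normal:+.5f} J/K")
print(f"cold -> hot: {ds_backward:+.5f} J/K")
```

The cold-to-hot direction only becomes legal if something else in the system generates at least that much entropy to compensate -- which is exactly the escape hatch the Second Law allows.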

Or, as my thermodynamics professor in college put it, a lot of years ago: the First Law says you can't win; the Second Law says you can't break even.  Which explains why the United States Patent Office's official policy is that any application claiming a working model of a perpetual motion machine goes directly into the trash without being read any further.

The Carnot Heat Engine [Image is in the Public Domain]

All of this is by way of background for a paper that I ran across in Science Advances, called "Heat Flowing From Cold to Hot Without External Intervention by Using a 'Thermal Inductor,'" by Andreas Schilling, Xiaofu Zhang, and Olaf Bossen of the University of Zürich.  Because in this paper, the three physicists have demonstrated the passage of heat energy from a colder object to a warmer one, without any external energy input -- something first shown to be impossible by French physicist Sadi Carnot in 1824.

The authors write:
The cooling of boiling water all the way down to freezing, by thermally connecting it to a thermal bath held at ambient temperature without external intervention, would be quite unexpected.  We describe the equivalent of a “thermal inductor,” composed of a Peltier element and an electric inductance, which can drive the temperature difference between two bodies to change sign by imposing inertia on the heat flowing between them, and enable continuing heat transfer from the chilling body to its warmer counterpart without the need of an external driving force.
When I read this, I sat up, squinted at my computer screen, and uttered an expression of surprise that I will leave to your imagination.  In my AP Biology class, I always described the Laws of Thermodynamics as two of the most unshakeable laws of science -- two rules that are never, ever broken.  The idea that three scientists in Switzerland had taken a simple Peltier element -- a type of heat pump often found in refrigerators -- and made it run without expending any energy was earth-shattering.

But before you dust off your plans for a perpetual motion machine, read the next lines in the paper:
We demonstrate its operation in an experiment and show that the process can pass through a series of quasi-equilibrium states while fully complying with the second law of thermodynamics.  This thermal inductor extends the analogy between electrical and thermal circuits and could serve, with further progress in thermoelectric materials, to cool hot materials well below ambient temperature without external energy supplies or moving parts.
I'm not going to claim I fully understand how this all works, and how despite the system's bizarre behavior it still obeys the Second Law, but apparently the key point is that despite the heat energy flowing the "wrong way," the system still gains entropy overall.

Which, I must say, was a bit of a relief.

It's still a pretty fantastic discovery.  "With this very simple technology, large amounts of hot solid, liquid or gaseous materials could be cooled to well below room temperature without any energy consumption," study co-author Andreas Schilling said in a press release carried by Phys.org.  "Theoretically, this experimental device could turn boiling water to ice, without using any energy."

So don't believe any of the hype that I'm already seeing on dubiously accurate websites, to the effect that "An Exception Has Been Discovered to the Laws of Thermodynamics!  Physicists Dismayed!  Textbooks Will Have to be Rewritten!"  It's a curiosity, sure, and pretty cool, and sounds like it will have a good many applications, but you shouldn't discount everything you learned in physics class quite yet.

****************************************



Thursday, February 1, 2024

Paleobling

We are hardly the only animal species that sports adornments, but most of the others' -- bright colors, flashy feathers, ornate fins, and so on -- are encoded in the genes and produced by the animal's own body.  We're one of the only species that fashions its adornments out of other objects.

It's a curious thing when you think about it.  Virtually everyone wears clothes even when there's no particular necessity for purposes of protection or warmth; and a great many of us don such accessories as ties, scarves, hats, necklaces, bracelets, and rings.  The significance of these objects is largely culturally determined (e.g. in western society a guy wearing a tie is a professional, someone with a ring on the fourth left finger is probably married, and so on).  Some have ritual meanings (clothing or jewelry that marks you as belonging to a particular religion, for example).  Others are simply for the purpose of increasing attractiveness to one's preferred gender.

But the odd fact remains that in the animal world, such items are almost entirely confined to the human species.

However such practices got started, what's certain is that they go back a long way.  A study that came out in Nature Human Behaviour this week, by a team led by Jack Baker of the University of Bordeaux, has shown that not only do jewelry-making and -wearing go back at least 34,000 years, but the jewelry of prehistoric Europe falls into nine discernibly different styles -- suggesting that beads, necklaces, and the like may have been used as markers of belonging to particular cultures.

A few of the shells, beads, teeth, and other trinkets used in the Baker et al. study

The study was comprehensive, analyzing artifacts from Paviland, Wales east to Kostenki, Russia, and covering a period of nearly ten thousand years.  "We've shown that you can have two [distinct] genetic groups of people who actually share a culture," Baker said.  "In the East, for example, they were very, very much more focused on ivory, on teeth, on stone.  But on the other side of the Alps, people would have adorned themselves with really flamboyant colors: reds, pinks, blues, really vibrant colors.  If you were to see one person from each group, you could say, ‘He's from the East, and he's from the West,’ at a quick glance."

The intricacy and complexity of a particular adornment, Baker said, were probably reflective of wealth or social status -- just as they are today.

Interestingly, there was no particularly good correlation between the genetic relatedness of two groups and the similarity in their jewelry.  As Baker put it, "This study has shown really nicely that genetics does not equal culture."

Given its ubiquity -- there are very few cultures that don't wear some sort of jewelry -- you have to wonder how it got started.  Who was the first early human who thought, "Hey, I could string this shell on a piece of leather and hang it around my neck"?  Why would that thought have occurred to him/her?

And how did the other early humans react?  I picture them looking at their necklace-wearing friend and saying something like the Gary Larson/The Far Side line, "Hey!  Look what Zog do!"

It's interesting to try to consider it from the standpoint of an alien scientist studying anthropology.  How would you answer the question, "Why are you wearing that bracelet?"  Okay, you think it looks good, but why?

I'm not sure I have an answer to that.

****************************************



Wednesday, January 31, 2024

Conspiracy crackpots

Okay, y'all, can we agree to stop calling them conspiracy theories?  A theory is a scientific model backed up by experimentation and/or observation, which is consistent with everything we know about the topic in question.

These are not theories.  We need a new term.

Maybe conspiracy batshit lunacy.  I dunno, that's more accurate, but it's a little clunky.  I'll keep thinking on it.

The reason the topic comes up (again) is because of mega-pop-star Taylor Swift and her boyfriend Travis Kelce, tight end for the Kansas City Chiefs, who will be playing in the Super Bowl on February 11.  Well, Swift and Kelce made two huge mistakes, at least if you're a MAGA type; Swift endorsed Joe Biden for president in the 2020 election and is expected to endorse him again in 2024, and Kelce has appeared in commercials promoting the idea that the Pfizer COVID-19 vaccine is safe and effective.

Well.  You'd think they... I dunno.  I was gonna say "stomped all over the Constitution," but Trump himself basically did that.  Then I was going to say "threatened to drown small children," but Texas Governor Greg Abbott did that.  Then I was going to say "wanted to restrict freedom of speech," but Florida Governor (and failed presidential candidate) Ron DeSantis did that.

So comparisons kind of fail me.  Let's just say "You'd think they were really really really bad" and leave it there.

[Image licensed under the Creative Commons Eva Rinaldi, Taylor Swift 2012, CC BY-SA 2.0]

In any case, the ultra-right-wing types couldn't just shrug and say, "Taylor Swift is an American citizen and can vote for whom she likes, and Travis Kelce is free to promote the vaccine if he thinks it's the right thing to do."  Oh, no.  There has to be more to it than that.  The firestorm started almost as soon as Swift and Kelce announced they were dating, and Swift started showing up to Kelce's games.  Then Swift was named Time magazine's 2023 Person of the Year, and things really started rolling.

Here are a few quotes, to give you the idea of what sort of things are being batted about on far-right media:

  • "I 'wonder' who’s going to win the Super Bowl next month.  And I 'wonder' if there’s a major presidential endorsement coming from an artificially culturally propped-up couple this fall.  Just some wild speculation over here, let’s see how it ages over the next eight months." -- Vivek Ramaswamy
  • "The Democratic Party and other powers are gearing up for an operation to use Taylor Swift in the election against Donald Trump." -- Jack Posobiec
  • "Taylor Swift is an op.  It’s all fake.  You’re being played." -- Benny Johnson
  • "The Democrats’ Taylor Swift election interference psyop is happening in the open.  It’s not a coincidence that current and former Biden admin officials are propping up Taylor Swift and Travis Kelce.  They are going to use Taylor Swift as the poster child for their pro-abortion GOTV Campaign." -- Laura Loomer
  • "All the Swifties want is a swift abortion." -- Charlie Kirk
  • "The NFL is totally RIGGED for the Kansas City Chiefs, Taylor Swift, Mr. Pfizer (Travis Kelce).  All to spread DEMOCRAT PROPAGANDA.  Calling it now: KC wins, goes to Super Bowl, Swift comes out at the halftime show and ‘endorses’ Joe Biden with Kelce at midfield.  It’s all been an op since day one."  -- Mike Crispi
  • "We're declaring a Holy War on Taylor Swift if she publicly backs the Democrats." -- an "unnamed source" quoting Donald Trump
  • "Who thinks this country needs a lot more women like Alina Habba, and a lot less like Taylor Swift?" -- unsurprisingly, Alina Habba
  • "Taylor Swift is a Pentagon psyop and a front for a covert political agenda." -- Jesse Watters
I could go on, but I probably don't need to.

What is astonishing to me is that very few folks listen to this and then say, "Okay, have you people been doing sit-ups underneath parked cars?  Or what?"  Evidently a significant fraction of Americans hear this stuff -- and think that it makes perfect sense.

Look, it's not that I don't know politics can get nasty, and that people -- certainly on both sides -- can do some really underhanded stuff to get elected.  But when a celebrity endorses Your Guy, and that's all hunky-dory and an example of a True American Standing Tall, but when a celebrity endorses The Other Guy it's gotta be a covert Pentagon psyop worthy of launching a Holy War, you might just want to check your thought processes for bias.

At least some mainstream media outlets are branding this wingnuttery for what it is.  CNN, in its article on the issue (linked above), labeled this stuff "loony thinking bearing little resemblance to reality," and that's not bad considering that CNN doesn't exactly have a sterling track record of calling out lunacy when they see it.  In fact, there's a good case to be made that back in 2015 the mainstream media created Donald Trump as a viable candidate by treating him as if he were one, instead of labeling him what he is right from the get-go -- an incompetent compulsive liar, a serial philanderer, a sexual predator, and a "businessman" who has a list of failed businesses as long as my arm.  But because his incendiary theatrics got listeners and readers, they uncritically publicized everything he said and did in order to keep readers and viewers engaged -- and that's a large part of why we're in the situation we now are.

At least -- maybe -- some media sources have learned their lesson.

But to return to my original point, these are not theories.  They are one of two things:
  1. deliberately crazy-sounding ideas thrown out by cynical individuals who don't actually believe what they're saying, but say it anyhow because they know it'll keep the public tuned in; or
  2. wild ramblings from people who think this stuff actually makes sense, in which case -- to borrow a line from C. S. Lewis -- "they're on the level of a man who says he is a poached egg."
And in neither case should we give them the slightest bit of attention, short of laughing directly into their faces.  Which is, honestly, what I'm hoping to accomplish here.

How about the Conspiracy Comedy Channel?  That at least captures the spirit of it.

****************************************



Tuesday, January 30, 2024

The fingerprints of the Manatee

Cosmic ray is a catch-all term for the high-energy particles that constantly bombard the Earth's upper atmosphere.  The majority of them are deflected by the Earth's magnetic field or absorbed by the atmosphere, but a very few are energetic enough to reach the surface of the planet.  About 90% of cosmic rays are protons; a good chunk of the remaining 10% are alpha particles (helium nuclei, consisting of two protons and two neutrons bound together).  The rest are varying mixes of particles from the subatomic zoo, sometimes even including positrons and antiprotons -- particles of antimatter.  Cosmic rays were discovered in 1912 by Austrian-American physicist Victor Hess, a discovery for which he won the 1936 Nobel Prize in Physics.

The lion's share of cosmic rays that strike the Earth originate from the Sun, but some come from much farther away.  As we've seen here several times at Skeptophilia, the universe is an energetic and often violent place, not lacking in mechanisms for sending bits of matter careening across space at a significant fraction of the speed of light.  As you might expect, supernovae produce cosmic rays; so do gamma-ray bursters, Wolf-Rayet stars, and quasars.  The last-mentioned are thought to be supermassive black holes surrounded by an inward-spiraling accretion disk of gas and dust, which accelerates as it tumbles toward the event horizon and gives off one final death scream of radiation.  This makes quasars some of the brightest objects in the known universe, with luminosities tens of thousands of times that of the Milky Way.

Trying to pinpoint the origin of particular cosmic rays is tricky.  Being mostly made of charged particles, they're deflected by magnetic fields; so even if you find one and know the direction it was traveling when it hit your detector, you can't just trace the line backwards and assume that's the point in the sky where it originated.  So scientists who are interested in figuring out where the highest-energy cosmic rays come from -- ones that almost certainly weren't created by our placid, stable home star -- have a difficult task.
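
To get a feel for why backtracking fails, you can estimate a cosmic-ray proton's gyroradius -- the radius of the circle its path curls into in a magnetic field.  The energy and field strength below are rough, commonly quoted ballpark figures of my own choosing, not numbers from the paper:

```python
# Rough gyroradius of an ultra-relativistic proton in the interstellar
# magnetic field.  For a highly relativistic particle, r ~ E / (q * B * c).

E_eV = 1e15        # a 1 PeV proton, ballpark for "high-energy cosmic ray"
B_tesla = 1e-10    # ~1 microgauss, a typical interstellar field strength
q = 1.602e-19      # proton charge, in coulombs
c = 3.0e8          # speed of light, m/s

E_joules = E_eV * q                       # convert eV to joules
r_meters = E_joules / (q * B_tesla * c)   # gyroradius
r_lightyears = r_meters / 9.461e15        # meters per light-year

print(f"gyroradius ~ {r_lightyears:.1f} light-years")
```

That comes out to only a few light-years -- so over a journey of thousands of light-years, the proton's path gets wound into a tangle, and its arrival direction tells you essentially nothing about where it started.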

A team led by Laura Olivera-Nieto of the Max Planck Institute for Nuclear Physics has tackled this problem, and in a paper published last week in Science, came up with an answer for at least some of these mysterious particles.  Working at the High-Energy Stereoscopic System (HESS -- a nice nod to the discoverer of cosmic rays) in Namibia, Olivera-Nieto and her team are studying a curious source of cosmic rays -- black holes that are in a binary system with another star.

The current study is of an object called SS 433, a source of x-rays so powerful it's been nicknamed a "microquasar."  It lies in the middle of the Manatee Nebula in the constellation Aquila, a shell of gas and dust blown outward when a star went supernova between ten thousand and a hundred thousand years ago.  The supernova resulted in a black hole as the doomed star's core collapsed, but its companion star lived on.

The Manatee Nebula [Image credit: B. Saxton, (NRAO/AUI/NSF) from data provided by M. Goss, et al.]

Well, after a fashion.  The enormous gravitational pull of the black hole is siphoning off matter from the companion star, and as that plume of gas spirals inward, it accelerates and gives off radiation -- just as the accretion disk of a quasar does.  The result is a jet of cosmic rays, including not only the typical charged particles but x-rays and gamma rays, which (unlike charged particles) are unaffected by magnetic fields.  This allows astronomers to pinpoint their sources.

So in the midst of this seemingly placid bit of space is a whirling hurricane of gas and dust that is accelerated so strongly it creates jets of particles moving at nearly the speed of light.  (Exactly the speed of light, in the case of the x-rays and gamma rays.)  Some of those particles eventually reach the Earth -- a few of which are picked up by Olivera-Nieto's team at HESS.

And those cosmic rays allow us to discern the fingerprints of an incredibly violent process taking place eighteen thousand light years away.

****************************************



Monday, January 29, 2024

The writing on the stone

It can often be difficult to sort fact from fiction, especially when multiple people become involved, each with his or her own agenda -- and varying determination to adhere to the truth.

Take, for example, the Brandenburg Stone.  It's a 74 by 39 centimeter slab of oolite (a sedimentary rock) that appears to have writing-like marks scratched into the surface.  Without further ado, here's a photograph of the alleged artifact:


It was found in 1912 near Brandenburg, Kentucky by a farmer named Craig Crecelius.  Crecelius clearly thought the marks were writing -- and you can see for yourself that they look like it -- and he made a good effort to contact linguists who might be able to identify the script, but without success.  He exhibited the stone several times in nearby towns, but wasn't able to drum up much in the way of interest.

In 1965, the stone passed into the hands of one Jon Whitfield, and that's where things start to get interesting.

Whitfield thought he knew what the script was.  The letters, he said, were Coelbren y Beirdd (Welsh for "the Bards' Lot"), a script for writing the Welsh language that in the early nineteenth century was the center of a linguistic controversy regarding its origins.  The man who promoted it, one Edward Williams (more often known by his "bardic name" of Iolo Morganwg), was absolutely obsessed with ancient Welsh history and traditions, and achieved fame as a collector of rare medieval Welsh manuscripts.

But why would there be Welsh script on a stone in Kentucky?

Whitfield thought he knew the answer.  There was a story circulating that the medieval Welsh prince Madoc ab Owain Gwynedd had crossed the Atlantic in around the year 1170 C. E. with a handful of friends, and the lot of them had stayed in North America and intermarried with Native Americans.  (Fans of Madeleine L'Engle will recognize this legend from her book A Swiftly Tilting Planet.)  This, said Whitfield, was proof that the legend was true -- and that Welsh-speaking Natives who descended from Madoc and his comrades had gotten as far inland as Kentucky.

There's only one problem with this.  Coelbren y Beirdd almost certainly wasn't an ancient script at all, but had been invented by Iolo Morganwg in 1791 -- who then passed it off as authentic.

It's pretty clear that despite his legitimate work in preserving ancient Welsh manuscripts, Williams/Morganwg also was a champion forger.  He was exposed as such long after his death by Welsh linguist and poet John Morris-Jones, who decried Williams's dishonesty, saying "it will be an age before our literature and history are clean of the traces of his dirty fingers."  Several of the works he "transcribed" were apparently written by him -- weaving his own fiction and philosophy into allegedly ancient legends and poetry, thus confusing the hell out of scholars who simply wanted to know what historical cultures actually believed.

So even if the marks on the Brandenburg Stone are actually Coelbren y Beirdd, it can't be any older than 1791, and probably much more recent than that.  Skeptic Jason Colavito points out that Morganwg's writing became really popular in the mid to late nineteenth century, when his son Taliesin began publishing and promoting his father's works.  Colavito writes:
The alphabet was widely published in the 1830s and 1840s, and whoever forged the Brandenburg Stone (it was not actually either Williams, who were never in Kentucky) almost certainly used such publications, possibly Taliesin Williams’s widely-read book about the alphabet, in forging the stone.  The younger Williams’s popular book was published to scholarly acclaim in 1840 (having won a prestigious prize two years before) and the alphabet was exposed as a hoax in 1893 (though suspicions had been raised earlier, until Taliesin successfully combated them), which makes it much more likely that the stone was actually carved between 1840 and 1912, though a date as early as 1792 cannot be excluded.  In the United States, libraries had dozens of different volumes on Coelbren y Beirdd, including the Iolo Manuscripts (1848), Barddas (1862 and 1874), etc., but I am not able to find evidence that the alphabet itself would have been widely available in rural America prior to Taliesin’s book, though it is possible that some of Edward’s specialist publications imported from Britain were available in some places.  After 1862, the largest collection of the Williams forgeries was in print and the alphabet was at the height of its popularity.  Thus, the latter nineteenth or early twentieth century seems the best candidate for the time of forgery.
So we have Craig Crecelius, the farmer who found the stone, and who appears to have been genuinely unaware that it was a forgery; Jon Whitfield, who was the one who identified the writing as Coelbren y Beirdd, but was too young to have been responsible for the creation of the stone, and seems to have thought it was authentic as well; and Edward Williams, who created the fake script but never went to Kentucky and so can't have been the stone's creator, either.

In the end, we're left with a mystery.  An unknown person scratched some mysterious letters on a stone, probably in the last half of the nineteenth century, and left it for someone to find.  And someone did... starting a domino effect of speculation that still shows up on television shows specializing in archaeological weirdness.  The fact remains, though, that everything about it is certainly a forgery -- not only the artifact itself, but the script in which the inscription is written.

But as far as who perpetrated the hoax, we'll probably never know.

****************************************



Saturday, January 27, 2024

Missing the target

Lately I've been seeing a lot of buzz on social media apropos of the Earth being hit by a killer asteroid.

Much of this appears to be wishful thinking.

Most of it seems to focus on the asteroid 2007 FT3, one of the bodies orbiting the Sun classified as "near-Earth objects" -- objects with orbits that cross Earth's, and which could potentially hit us at some point in the future.  It bears keeping in mind, however, that even on the scale of the Solar System, the Earth is a really small target.  This "deadly asteroid," we're told, is "on a collision course with Earth" -- but then you find out that the likelihood of its actually striking us on the date of Doomsday, March 3, 2030, is around one in ten million.

Oh, but there's "an altogether more sinister estimate" that 2007 FT3 could hit us on October 5, 2024, but the chances there are one in 11.5 million.  Why this is "altogether more sinister," I'm not sure.  Maybe just because it's sooner.  Or maybe the author of the article doesn't understand how math works and thinks that the bigger the second number, the worse it is.  I dunno.
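
For what it's worth, the arithmetic takes two lines to check -- the bigger the denominator, the smaller the chance:

```python
# Comparing the two quoted impact probabilities for 2007 FT3.

p_2030 = 1 / 10_000_000   # "one in ten million"
p_2024 = 1 / 11_500_000   # "one in 11.5 million"

print(p_2024 < p_2030)    # the "more sinister" 2024 estimate is the LESS likely one
print(f"2030: {p_2030:.2e}, 2024: {p_2024:.2e}")
```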

Then there's the much-hyped asteroid 99942 Apophis, which was first thought to have a 2.7% chance of hitting the Earth in April of 2029 (more accurate observations of its orbit eliminated that possibility entirely), and then gets a second shot at us in April of 2036.  The 2036 collision depends on it passing through a gravitational keyhole during its 2029 close approach -- a tiny region in space where the pull of a much larger planet shifts the orbit of a smaller body in such a way that they then collide on a future pass.  Initially, the keyhole was estimated to be eight hundred kilometers in diameter, and this caused the physicists at NASA to rate Apophis at a four out of ten on the Torino Impact Scale -- the highest value any object has had since such assessments began.  (A rating of four means "A close encounter, meriting attention by astronomers.  Current calculations give a 1% or greater chance of collision capable of regional devastation.  Most likely, new telescopic observations will lead to reassignment to Level 0.  Attention by public and by public officials is merited if the encounter is less than a decade away.")  If it hit, the impact site would be in the eastern Pacific, which would be seriously bad news for anyone living in coastal California.

The close approach in 2029 [Image licensed under the Creative Commons Phoenix7777, Animation of 99942 Apophis orbit around Sun, CC BY-SA 4.0]

This, of course, spurred the scientists to try to refine their measurements, and when they did -- as the scale suggested -- they found out we're not in any danger.  The gravitational keyhole turns out to be only a kilometer wide, and Apophis will miss it completely.

In fact, there are currently no known objects with a Torino Scale rating greater than zero.

It's always possible, of course, that we could be hit out of the blue by something we never saw coming.  But given that we're talking about an unknown risk from an unknown object of unknown size hitting in an unknown location at an unknown time, I think we have more pressing things to worry about.  Sure, something big will eventually hit the Earth, but it's not going to happen in the foreseeable future.  NASA and the other space monitoring agencies in the world are doing a pretty good job of watching the skies, so maybe we should all just turn our attention to more important matters, like trying to figure out how nearly half of Americans think the best choice for president is a multiply-indicted, incompetent compulsive liar who shows every sign of incipient dementia.

In any case, I'm not concerned about asteroid impacts, and all the hype is just more clickbait.  So if you live on the West Coast and were planning on moving inland, or are considering cancelling your plans for a big Halloween bash this year, you probably should just simmer down.

****************************************



Friday, January 26, 2024

Blind spots

Authors reveal more in their work, sometimes, than they may have intended.

That thought crossed my mind more than once while reading the book Hadrian by British historian, antiquarian, diplomat, and writer Stewart Perowne.  The book is a history and biography of the Roman emperor Hadrian, who ruled from 117 to 138 C.E.  Hadrian is considered to be one of the better rulers Rome had -- generally fair-minded, astute, and intelligent -- although considering he's competing against guys like Caligula, Nero, Domitian, and Elagabalus, that may not be a very high bar.

A sculpture of the emperor Hadrian, circa 130 C. E. [Image licensed under the Creative Commons Djehouty, München SMAEK 2019-03-23n, CC BY-SA 4.0]

The book, which was published in 1960, was interesting enough, if a bit dry and pedantic at times (did we really need an entire chapter devoted to minute details about the architecture of the Pantheon?).  But there were a couple of times that what he wrote made me do a double-take.

The first time came when he was discussing the Roman program of expansion and colonization, and engaged in a digression comparing it to the policies of the British Empire between the eighteenth and mid-twentieth centuries.  Perowne writes:

No other country has ever had a finer or more generous record in its dealings with other races than the English.  No great power, since history began, has occupied, and advanced to autonomous sovereignty, so large an extent of territory in so short a period.  The advance, it is true, was from the very first, when the American colonists set the precedent, encouraged by the inhabitants of the territory concerned; nevertheless, it did not take long for England to adopt as a principle that the aim of all colonial enterprise is the elevation of the colonials, and their establishment as independent states, in whatever form of association they may choose with Great Britain. 

Say what?

I think there are citizens of a few nations I can think of who would beg to differ.  Great Britain fought like hell not to let a good many of their colonies gain their independence.  It was only when faced with sustained revolt -- and the impossibility of continuing a minority rule over the unwilling -- that they grudgingly granted sovereignty.  (And a great many of those nations are still struggling to overcome the long-term effects of colonialism -- oppression, exploitation, wealth inequality, and bigotry.)

I know there's the whole "man of his time" argument you hear about writers of the past, which has been used to look past even the horrific racism that threads through a lot of the fiction of H. P. Lovecraft.  Here, it's not quite that extreme, but it was still kind of startling to read.  And perhaps there are still a good many of us who have the tendency to consider our own country as intrinsically superior, even if we wouldn't necessarily put it that way.  But it's somewhere between baffling and appalling that someone who was a historian, who devoted his life to investigating and understanding other cultures -- who, in fact, worked as a diplomat in Malta, Aden, Iraq, Barbados, Libya, and Israel -- could come away with the impression of the British Empire as the Gentle Guides of the Civilized World.

Stewart Perowne in 1939, while serving in the British diplomatic corps in Libya [Image is in the Public Domain]

Now, mind you, I'm not saying the British were any worse than a lot of other militaristic colonial powers.  The history of the world is one long sorry tale of the powerful exploiting the weak.  But to write what Perowne did, especially with his extensive knowledge and experience, is evidence of a blind spot a light year wide.

Then there was the sniffy, superior bit he threw in about Hadrian's male lover, Antinoüs.  Hadrian, in fact, was pretty clearly gay.  He was married to an apparently rather obnoxious woman named Vibia Sabina, but the marriage was an unhappy one and produced no children.  His devotion and love for Antinoüs, however, were the stuff of legend; the two were inseparable.

Hadrian and Antinoüs [Image licensed under the Creative Commons Carole Raddato from FRANKFURT, Germany, Marble Busts of Hadrian & Antinoüs, from Rome, Roman Empire, British Museum (16517587460), CC BY-SA 2.0]

Perowne writes:

It was in Bithynia that Hadrian formed his famous and fatal attachment to Antinoüs, a lad of whose origin nothing is known, except that he came from the city of Bithynion...  Antinoüs, at the time when Hadrian met him, must have been a lad of about eighteen.  He was broad-shouldered and quite exceptionally handsome...  Whether the relations between the emperor Hadrian and his beautiful young favorite were carnal or not, we cannot be sure.  But what we can be certain of is this: that for the next nine years Antinoüs was the emperor's inseparable companion, that many people did suppose their association was based on a physical relationship, and that they did not reprobate it in the least...  However much we may deplore this fact, it simply is not possible to equate ancient and modern canons of morality.

He can't even bring himself to write "homosexual" -- but comments that it is unsurprising that later Roman authors used the word Bithynian as "a euphemism for something vile."

After reading this, you may be shocked to find out that Stewart Perowne himself was gay.

In a bizarre parallel to Hadrian's own life, Perowne reluctantly agreed to marry explorer and writer Freya Stark in 1947, but the marriage was unhappy, childless, and possibly even unconsummated.  Eventually the two divorced after it became obvious that Perowne's sexual orientation wasn't going to change.  He finally admitted it in writing to his wife, but once again meticulously avoided using the word homosexual:

It is difficult to say what "normal" is – my friend a counsellor of St. George's Hospital always refuses to use the word and in both men and women, you have a wide and graded range from ultra-male to ultra-female with naturally most people in the middle ranges...  Now for myself, I put myself in the middle group.  I have ordinary male abilities.  I like male sports some of them, and I love the company of women.  In fact, I find it hard to exist without it.  At the same time, I am occasionally attracted by members of my own sex – generally.  For some even pleasurable reason – by wearers of uniform.

I was simultaneously appalled and heartbroken to read those words, from the pen of the same man who implied that Hadrian's love for Antinoüs was "something vile" and that people were right to "deplore" it.  How deeply sunk in self-loathing would you have to be to write both of those passages?

That a culture could produce such a tortured and damaged soul is a horrible tragedy.  And how many others did this happen to, men and women we don't know about because they never ended up in the public eye, but lived their entire lives in fear, shame, and obscurity, never able to openly love who they loved for fear of condemnation, imprisonment, or even death?

I'd like to think we've grown beyond that, but then I look around me at my own culture, where books are currently being banned merely for including queer people -- where even mentioning we exist is apparently improper -- and I realize that it's still going on.

So my reading of Hadrian got me thinking about way more than just a long-ago emperor of a classical European civilization.  It started me wondering about my own blind spots, things about myself and my culture that I take for granted as The Way Things Should Be, and which a future civilization might rightly shake their heads at.  

And thinking about Perowne himself made me recognize what complex, contradictory, and fragile creatures we humans are.  Will we ever find a way to move past all the antiquated hidebound moralizing, and simply treat each other with kindness, dignity, and compassion?  To live by the rule that has been set up as a guiding light in many cultures, but is best known in its biblical form -- "Do unto others as you would have them do unto you"?

****************************************