Skeptophilia (skep-to-fil-i-a) (n.) - the love of logical thought, skepticism, and thinking critically. Being an exploration of the applications of skeptical thinking to the world at large, with periodic excursions into linguistics, music, politics, cryptozoology, and why people keep seeing the face of Jesus on grilled cheese sandwiches.

Monday, December 19, 2022

Ken Paxton's registries

When the Nazi party first came to power in Germany in 1933, one of the first things they did was to dramatically improve the efficiency of record-keeping, especially with regard to people they considered "undesirables."

A 1946 report on their practices by Robert M. W. Kempner, which appeared in the Journal of Criminal Law and Criminology, is as impressive as it is horrifying. Kempner writes:
The most important are the Gestapo card indices: the register of persons politically undesirable to the National Socialist regime, such as former members of democratic parties, lodges, etc.  This register consists of five different sets of alphabetical card indices for (1) highly dangerous persons, (2) less dangerous persons, (3) dangerous persons, (4) Jews, (5) part Jews (Mischling).  These card indices were kept in the offices of the Secret Police (Gestapo), i.e., in the central headquarters of the Gestapo in Berlin, Prinz Albrechtstrasse. Duplicates were kept by the supervisory offices (Staatspolizeileitstellen) or the approximately 100 district offices of the secret police (Staatspolizeistellen) which are located in the larger cities throughout Germany, e.g., in Munich, Stuttgart and the seats of the district governments...  The index cards are brown for males and green for females.  The first item is year, day, month, and place and county of birth.  Then follow the statements about occupation, name, marital status, school, and professional education, examinations passed, residence in foreign countries, knowledge of foreign languages, special abilities, service in the armed forces, or in the labor service, and residence...  Cards of Jews are marked by black index tabs.
The horrors of the Nazi regime and the Holocaust were only possible because of the extensive records the government had on damn near everyone in the country.

You may be thinking, by this point, "how is this so different from the records governments today keep on citizens?  Most of this same information is now routinely kept by government agencies, and no one bats an eyelash."  It's a reasonable question.  Census and tax forms, drivers' license registration, school registration, job applications... unless you somehow have avoided all that, which is hard to imagine, you're a known quantity.

The difference, of course, is intent.  What do the governments of the United States and other democracies intend to do with the information they have?  My own probably Pollyanna-ish idea is that most of the time, the answer is "nothing."  As long as you pay your taxes and abide by the law, the powers-that-be have neither the time nor the interest to worry about what color your eyes are, what your marital status is, or what exactly you're doing.

It's the exceptions that are downright terrifying.  Which brings us, unsurprisingly, to Attorney General Ken Paxton of Texas.

It was just revealed that earlier this year, Paxton demanded a list of all of the people in the state who requested a change in their gender designation on their drivers' licenses.  In other words, trans individuals who wanted to have legal documentation of their gender identity.  It's disingenuous to claim there's no parallel to the "black index tabs" on Jewish registration cards in Nazi Germany.  Add to this the fact that earlier this year, Texas passed one of the harshest anti-trans laws in the nation -- it explicitly forbids gender-affirming medical treatment for teenagers, and mandates criminal prosecution and jail time not only for medical professionals who carry it out, but for teachers, counselors, therapists, and so on (people considered "mandated reporters" for child abuse) if they find out about a teenager's trans status and fail to notify the authorities.

Don't tell me that LGBTQ+ people are "overreacting" if they're terrified by Paxton's demand for a trans registry.  And if you think I'm engaging in hyperbole by comparing it to the Nazi registries, you're being willfully blind.

"This could be a mass outing of a whole bunch of trans people because a lot of us change our documents and then choose to live in private," said Eden Rose Torres, a trans Texan who chooses to be out.  "We don't have to disclose our transness."


The only good news is that the request from Paxton's office was denied -- not on the grounds that it could be used to harm trans Texans and the people who aid them, but on grounds of practicality.  "Ultimately, our team advised the AG’s office the data requested neither exists nor could be accurately produced," said Travis Considine, of the Department of Public Safety, to which the demand had been directed.  "Thus, no data of any kind was provided...  It [would] be very difficult to determine which records had a valid update without a manual review of all supporting documents."

What really needs to happen, of course, is that Paxton be required to produce, in writing, a statement (1) justifying why he has the right to the information, and (2) outlining in detail what he intends to do with it.  Not, frankly, that I trust Paxton as far as I could throw him.  But at least then it would push him into defending his actions, rather than what's happened thus far, which is giving him carte blanche.

The refusal of the DPS to cooperate is not going to be the end of this.  People like Ken Paxton are never, ever going to give up their campaign against LGBTQ+ people, despite the fact that all we queer people want is to live our truth in peace and safety, the same as straight White Americans do, and to be in control of when and to whom we reveal our private lives.  Paxton sees himself as the leader of a religious-inspired crusade, and if he succeeds with his attempt at a trans registry, that's only going to be the beginning.

We have only to look back ninety years to see where this kind of thing can lead.

****************************************


Saturday, December 17, 2022

Ignorant and proud of it

Way back in 1980, biochemist, writer, and polymath Isaac Asimov wrote something that is even more accurate today than it was back then:

There is a cult of ignorance in the United States, and there always has been.  The strain of anti-intellectualism has been a constant thread winding its way through our political and cultural life, nurtured by the false notion that democracy means that "my ignorance is just as good as your knowledge."

I remember the first time I ran headlong into the bizarre American "ignorant and proud of it" attitude Asimov describes, during the presidential campaign of George W. Bush.  Even Bush's supporters admitted he wasn't an intellectual; I heard one person say he was voting for Bush because he wanted someone in the White House who was one of the "common folk," someone he would want to sit down and have a pint of beer with.  I responded, in considerable bafflement, "Don't you want the president to be smarter than you and I are?  I know I'm not smart enough to run the country."

His response was that the intellectuals are out of touch, and don't understand on a visceral level the problems ordinary people face.  This, I have to admit, contains a kernel of truth.  Politics is a money game, and most (not all; I'm sure you'll find counterexamples) elected officials come from some level of wealth and privilege.  And it's true that this privilege can create a set of blinders.  People who have never been down to pennies at the end of a pay period -- as I, and many others, have -- don't understand what it's like for financial worries never to be far from your mind, twenty-four hours a day.

The problem, of course, is that while an "ordinary person" might empathize with the plight of other ordinary people, that doesn't mean (s)he knows how to fix it.  Experiencing a problem doesn't mean you have a clue how to solve it.

But as Asimov pointed out, the "we're equal as people, so my ideas are as good as yours" nonsense is woven deeply into the American psyche, and the result has been that increasingly you run into people who seem to be not only oblivious to their own ignorance, but actively proud of it.  I was just discussing this with my athletic trainer, Kevin, this week.  One of the points I made is that I know there are a lot of areas about which I am ignorant.  The internal workings of cars, for example.  I have only the vaguest notion of how automobile engines work -- which is why when something goes wrong with my car, I go down to my mechanic and say, "Car not go, please fix."  What I don't do is start blathering on to my friends and acquaintances about carburetors and alternators and fuel pumps, and getting all defensive when one of them tells me what I'm saying is bullshit.

This, surprisingly, is often not the approach people take.  Kevin told me he was at a party a while back, and someone was pontificating about how the problem with the COVID-19 vaccination was that it was a vaccine.  On the other hand, he said, he was fine with getting a flu shot, because that wasn't a vaccine, it was a shot.

Kevin said, "The flu shot is a vaccine, too."

The guy responded, "No, it's a shot.  COVID is a vaccine, which means it does stuff to your immune system."

A little goggle-eyed, Kevin said, "But... doing stuff to your immune system is what shots are supposed to do."

Undeterred, the guy said, "No, that's vaccines.  The flu shot just stops the flu virus from making you sick, it doesn't mess with your immune system."

At that point, Kevin decided that the guy had the IQ of a peach pit and gave up.

What gets me about this is not that some person had a goofy misconception about something.  We all have goofy misconceptions about some things, and a complete lack of knowledge about others.  But -- hopefully -- most of us know better than to broadcast our ignorance in front of a large group of people.

Or on a major news network.  Just a couple of days ago, Fox News commentator Tucker Carlson, who is himself no stranger to broadcasting his stupidity, had a guest who made Carlson's own beliefs look positively Ph.D.-worthy by comparison.  The guy's name is (I'm not making this up) Joe Bastardi, and you'll get a good idea of his scientific credibility when I tell you that he's the author of a book called The Weaponization of Weather in the Phony Climate War.  (He chose this title when it narrowly edged out his second-favorite choice, which was 99% of the Earth's Scientists Are Big Dumb Poopyheads.)  But what he said went way beyond just claiming that "the climate's just fine, keep on burnin' those fossil fuels."  Here is a direct quote, which (once again) I swear I'm not making up:

I’ve been giving [climate change policy] a lot of thought today, because I had to drive from Iowa City all the way to Pittsburgh, and when I went by South Bend, oddly enough it hit me.  There are three possibilities here, in my opinion, just looking at this, okay.

First is, they’ve all got climate vaccines.  We don’t know about them, but unlike the COVID vaccine, they actually work, so whatever they do, they’re immune from it.  So that’s a possibility.  That’s a long shot.

The second, Tucker, is, that if bad weather stops air travel, and it stops car travel, if you can cause more bad weather, right, then guess what?  Everybody can’t drive.  For instance, next week, and the week after?  Watch how much bad weather comes into the United States.  It’s going to be the coldest, snowiest period around the Christmas time since 2000.  So we’re gonna see planes, and trains, and all these other things shut down.  So if you just dump all this CO2 in the atmosphere, your assumption is, hey, CO2 causes bad weather, if I could cause more bad weather, then guess what?  Other people won’t be able to fly, and we’ll have less CO2 emissions.

Or the third possibility, exactly what you said: it’s a phony climate war, it’s fraudulent.  When we talked back in July, we talked about how it’s going to get cold earlier this year across the United States, that has nothing to do with CO2, what it has to do is the natural cycles of the weather, and what happens is these people are taking advantage of people who fall prey to this, and this is what they’re doing.  There’s no logic or reason for it except they are trying to establish a caste system that destroys the greatest experiment of freedom and individuality, which is this country.

I have a few responses to this, to wit:

  1. How the fuck do you vaccinate someone against the climate?
  2. Winter is frequently the coldest, snowiest part of the year in the United States.  That's because we're in the Northern Hemisphere and that's how seasons work.
  3. So, what he's saying is that the environmental scientists have created the whole climate change thing in order to destroy the United States.  Even though a great many of them live here.  Because that makes total sense.
  4. Does he really think that somehow, the climatologists are engineering bad weather across the entire United States?  Simultaneously?  How are they doing this, using magical laser beams from space, or something?
  5. No, wait -- it's not magical laser beams from space, he says.  It's something way less plausible than that.  What we're gonna do is dump carbon dioxide into the air to make travel difficult, which will stop travel, which will cause us to emit less carbon dioxide. 
Now that's what I call a cunning plan.


And through the entire conversation, Tucker Carlson sat there, nodding sagely, as if what Bastardi was saying was nearing Stephen Hawking levels of brilliance, instead of doing what I'd have done, which is to say to him, "What is clear from this conversation is that if the government taxed brains, you'd get a refund."

Which explains why I am not a commentator on Fox News.

So.  Yeah.  For some reason, there are people who are abjectly ignorant, and yet who consider it critical that the entire world finds out about it.  It all brings back the well-known aphorism -- one of my dad's favorites -- that "it's better to keep your mouth shut and be thought a fool than to open your mouth and prove it."

****************************************


Friday, December 16, 2022

Bugs of unusual size

Although you don't tend to hear much about it, the Ordovician Period was a very peculiar time in Earth's history.

From its beginning (485 million years ago) to its end (444 million years ago), it experienced one of the biggest global climatic swings the Earth has ever seen.  In the early Ordovician the climate was a sauna -- an intense greenhouse effect caused the highest temperatures the Paleozoic Era would see, and glacial ice all but vanished.  By the end, the center of the supercontinent of Gondwana was near the South Pole, and glaciers covered much of what is now Africa and South America, resulting in a massive extinction that wiped out an estimated sixty percent of life on Earth.

At this point, life was confined to the oceans.  The first terrestrial plants and fungi wouldn't evolve until something like twenty million years after the beginning of the next period, the Silurian, and land animals only followed after that.  So during the Ordovician, the shift in sea level had an enormous impact -- as the period progressed and more and more ocean water became locked up in the form of on-land glacial ice, much of what had been shallow, temperate seas dried up to form cold, barren deserts.

But during the beginning of the period, life thrived in the warm oceans, giving rise to huge ecosystems based on reef-building corals and sponges.  Just as today, back then coral reefs provided habitats to a tremendously diverse community, and fossil beds like the Fezouata Formation of Morocco give us a glimpse of a strange and wonderful world.

Here's one of the exceptionally well-preserved fossils from Fezouata, a marrellomorph arthropod called Furca mauritanica:

[Image licensed under the Creative Commons Didier Descouens, Furca mauritanica MHNT, CC BY-SA 3.0]

And here's a reconstruction of another one from the same group, the bizarre Mimetaster hexagonalis (the genus name means "mimics a starfish"):

[Image licensed under the Creative Commons Franzanth, Mimetaster hexagonalis reconstruction, CC BY-SA 4.0]

These arthropods, more closely related to trilobites than to any living species, were one of the dominant groups during the temperate early Ordovician, but vanished almost entirely under the icehouse conditions at the end of the period.

This comes up because of a paper out of the University of Exeter describing further research into the fossils of the Fezouata Formation.  And this study has turned up something phenomenal -- another kind of marrellomorph arthropod, related to Furca and Mimetaster, that was something on the order of two meters long.

That is one big swimming bug.

I found this surprising for reasons above and beyond the sheer shock of its size.  As far as I understand the physical chemistry, the greenhouse conditions of the early Ordovician should have implied two things: (1) higher carbon dioxide and lower oxygen levels, both in the atmosphere and the oceans; and (2) warmer temperatures making what oxygen there was less soluble in water.  Both of these would lead to more hypoxic conditions, and -- again, as far as my layperson's understanding goes -- should result in generally smaller body sizes, especially in arthropods.

Arthropods have a couple of limitations that keep cockroaches from getting as big as elephants (despite what you might have seen in any number of bad 1950s horror movies).  First, they aren't built to support a large body mass; a terrestrial insect expanded to enormous size, with its bodily proportions left intact, wouldn't be able to stand up, much less move.  This disadvantage is somewhat offset by living in the water, where buoyancy supports the body's mass.  (Note how much bigger oceanic mammals can get than terrestrial ones do.)

Second, and more apposite to this discussion, arthropods are limited by their rather shoddy respiratory systems.  They don't circulate oxygen using their blood, as we do; oxygen is absorbed passively, through channels called tracheal tubes (in terrestrial arthropods) and feathery gills (in aquatic ones).  Gills do have an edge, efficiency-wise, over tracheal tubes, but are working against water's much lower oxygen concentration (way less than one percent, as compared to air at sea level, which averages around twenty-one percent).  This is why terrestrial animals drown; their lungs are just not efficient enough to extract oxygen from a fluid that has so little of it.
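To put a rough number on that mismatch, here's a quick back-of-the-envelope sketch in Python.  The figures are textbook ballpark values (fresh water at room temperature, air at sea level), assumed for illustration, not exact measurements:

```python
# How much oxygen is available per liter of air vs. per liter of water?
# Assumed ballpark figures: air at sea level is ~21% O2 by volume,
# which works out to roughly 275 mg of O2 per liter of air; fresh
# water at room temperature saturates at only ~8 mg of O2 per liter.
o2_per_liter_air = 275.0    # mg of O2 per liter of air
o2_per_liter_water = 8.0    # mg of O2 per liter of water, fully saturated

ratio = o2_per_liter_air / o2_per_liter_water
print(round(ratio))         # a liter of air holds roughly 34x more oxygen
```

And warmer water saturates at an even lower concentration, which is exactly the crux of the puzzle about the early Ordovician giants.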

And, as I said before, the conditions of the early Ordovician likely combined to produce a far lower dissolved oxygen concentration in the oceans than we have now.

So how did marrellomorphs get so big?

At the moment, we don't know.  But the new study has shown that the early Ordovician seas were even weirder than we'd thought, with arthropods swimming around as long as a fully-grown human is tall.

No idea what those things ate, but if I ever get in a time machine and go back then, I'm sure as hell going to be careful if I go swimming.

****************************************


Thursday, December 15, 2022

Words as edged tools

Words matter.

This comes up because of a couple of unrelated social media interactions that got me thinking about the fact that many people use words and then want to avoid the implications and consequences of how they're perceived.  The first was a post from the Reverend Doctor Jacqui Lewis that (hearteningly) got a lot of responses of the high-five and applause variety, which said, "You don't 'hate pronouns.'  You hate the people who are using them.  If that makes you feel uncomfortable, then good.  It should.  You either respect how people are asking you to know and name them, or you don't.  But stop pretending it's about language."

The other was in response to a TikTok video I made for my popular #AskLinguisticsGuy series, in which I made the statement that prescriptivism -- the idea that one dialect of a language is to be preferred over another -- is inherently classist, and that we have to be extremely careful how we characterize differing pronunciations and word usages because they are often used as markers of class and become the basis for discrimination.  Most people were positive, but there was That One Guy who responded only with, "Great.  Another arrogant preachy prick."

Now, let me say up front that there are perhaps times when people are hypersensitive, and infer malice from the words we use when there was none intended.  On the other hand, it's critical that we as speakers and writers understand the power of words, and undertake educating ourselves about how they're perceived (especially by minorities and other groups who have experienced bigotry).  If someone in one of those groups says to me, "Please don't use that word, it's offensive," I am not going to respond by arguing with them about why it was completely appropriate.  I would far rather err on the side of being a little overcautious than unwittingly use a word or a phrase that carries ugly overtones.

Let me give you an example from my own personal experience.  I grew up in the Deep South -- as my dad put it, if we'd been any Deeper South, we'd'a been floating.  And I can say that it really pisses me off when I see a southern accent used as a marker of ignorance, bigotry, or outright stupidity.  I was appalled when a local middle school here in upstate New York put on a performance of Li'l Abner, a play written by Melvin Frank and Norman Panama (both northerners native to Chicago).  The entire play, in my opinion, can be summed up as "Oh, those goofy southerners, how comically dim-witted they are."  If you've never seen it, you'll get the flavor when you hear that it features characters named Mammy Yokum, General Bullmoose, and Jubilation T. Cornpone.  I don't blame the kids; they were doing their best with it.  I blame the adults who chose the play, and then chortled along at sixth and seventh graders hee-hawing their way through the lines of dialogue with fake southern accents, and acted as if it was all okay.

People who know me would readily tell you that I'm very comfortable with laughing at myself.  My reaction to Li'l Abner wasn't that I "can't take a joke" at my own expense.  The problem is that the show is based on a single premise: characterizing an entire group, rural southerners, using a ridiculous stereotype, and then holding that stereotype up for a bunch of smug northerners to laugh at.

And if taking offense at that makes me a "woke snowflake," then I guess that's just the way it has to be.

[Image licensed under the Creative Commons Kevin C Chen, SnowflakesOnWindshield, CC BY-SA 2.0 TW]

If, in your humor or your critical commentary, you're engaging in what a friend of mine calls "punching downward," you might want to think twice about it.

The bottom line, here, is that what I'm asking people to do (1) can make a world of difference to the way they come across, and (2) just isn't that hard.  When a trans kid in my class came up to me on the first day of class and said, "I go by the name ____, and my pronouns are ____," it literally took me seconds to jot that down, and next to zero effort afterward to honor that request.  To that student, however, it was deeply important, in a way I as a cis male can only vaguely comprehend.  Considering the impact of what you say or what you write, especially on marginalized groups, requires only that you educate yourself a little bit about the history of those groups and how they perceive language.

Refusing to do that isn't "being anti-woke."  It's "being an asshole."

Words can be edged tools, and we need to treat them that way.  Not be afraid of them; simply understand the damage they can do in the wrong hands or used in the wrong way.  If you're not sure how a word will be perceived, ask someone with the relevant experience whether they find it offensive, and then accept what they say as the truth.

And always, always, in everything: err on the side of kindness and acceptance.

****************************************


Wednesday, December 14, 2022

Ahead of the curve

I remember how stunned I was when I was in high school and found out that all energy release -- from striking a match to setting off a nuclear bomb -- goes back to Einstein's famous equation, that energy is equal to mass times the speed of light squared.

It all hinges on the fact that the Law of Conservation of Mass isn't quite right.  If I set a piece of paper on fire inside a sealed box, the oft-quoted line in middle school textbooks -- that if I'd weighed the paper and the air in the box beforehand and then reweighed the ash and the air in the box afterward, they'd have identical masses -- isn't true.  The fact is, the box would weigh less after the paper had burned completely.

The reason is that some (a very tiny amount, but some) of the mass of the paper would have been converted to energy according to Einstein's equivalency, and that's where the heat and light of the fire came from.  Thus, the box and its contents would have less mass than they started with.
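Just to put a number on how tiny that missing mass is, here's a back-of-the-envelope sketch in Python.  The inputs (a five-gram sheet of paper, fifteen kilojoules of heat released per gram) are assumed ballpark figures, not measurements:

```python
# How much mass does a burning sheet of paper actually lose?
# Assumed ballpark figures: a 5-gram sheet releasing ~15 kJ of heat
# per gram (roughly the heat of combustion of cellulose).
c = 299_792_458.0                    # speed of light, in m/s
energy_released = 5.0 * 15e3         # joules: 5 g x 15,000 J/g
mass_lost = energy_released / c**2   # rearranging E = mc^2 to m = E/c^2

print(f"{mass_lost:.1e} kg")         # less than a billionth of a gram
```

Which is why nobody noticed for centuries: the deficit is many orders of magnitude below anything an ordinary balance can detect.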

The mind-boggling truth is that when you burn a fossil fuel -- oil, coal, or natural gas -- you are re-releasing energy from the Sun that was stored in the tissues of plants in the form of a little bit of extra mass during the Carboniferous Period, three-hundred-odd million years ago.

So to fix the problem with the "Law," we have to account for the shifting back and forth between matter and energy.  If you change it to a conservation law of the total -- that the sum of the mass and energy stays constant in a closed system -- it's spot-on.  (In fact, this is the First Law of Thermodynamics.)

How much energy you can get out of anything depends, then, on only one thing: how much of its mass you can turn into energy.  This is the basis of (amongst many other things) what happens in a nuclear power plant.  As folks like Henri Becquerel, Marie Skłodowska Curie, Pierre Curie, and others showed in the early twentieth century, the atoms of an element can be turned into the atoms of a different element -- the dream of the alchemists -- and the amount of energy required or released by that process is described by something called the binding energy curve.


[Image: the binding energy curve -- binding energy per nucleon plotted against atomic mass]

This graph shows a number of interesting things.  First, the higher on the graph an atom is, the more stable it is.  Second, when you're going from one atom type to another, if you've moved upward on the graph, that transition releases energy; if you've moved downward, the transition requires energy.  Third, how big a jump you've made is a measure of the amount of energy you release or consume in the transition.  (Theoretically; as you'll see, doing this in the real world, and making practical use of the process, is another matter entirely.)

Note, for example, going from uranium (at the far right end of the graph) to any of the other mid-weight elements uranium breaks down into when it undergoes nuclear fission.  What those are, specifically, isn't that important; they all lie on the flattish part of the curve between iron (Fe, the most stable element) and uranium.  Going from uranium to any of those is an upward movement on the graph, and thus releases energy.  Seems like it must not be much, right?  Well, that "small" release is what generates the energy from a nuclear power plant -- and from bombs of the type that destroyed Hiroshima.

Now check out the other end of the graph -- the elements for which fusion is the energy-releasing transformation.

Go, for example, from hydrogen-1 (the very bottom left corner of the graph) to helium-4 (at the peak, right around 7 MeV), and compare the size of that leap with the one from uranium to any of its fission products.  This transition -- hydrogen-1 to helium-4 -- is the one that powers the Sun, and is what scientists would like to get going in a fusion reactor.

See why?  You hardly need to sit down and calculate the per-transition difference in energy release between fission and fusion to see that it's huge; per nucleon, fusion releases something like eight times as much -- close to an order of magnitude more.  Also, the fuel for fusion, hydrogen, is by far the most abundant element in the Solar System; it's kind of everywhere.  Not only that, the waste product -- helium -- is completely harmless and inert, compared to fission waste, which remains deadly for centuries.
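Here's a rough sketch of that comparison in Python, using approximate binding energies read off the curve.  The specific numbers are my own ballpark estimates for illustration, not values from any particular measured reaction:

```python
# Energy released per nucleon, estimated from approximate binding
# energies (in MeV per nucleon) read off the binding energy curve.
# These are ballpark values for illustration only.
BE_H1 = 0.0     # hydrogen-1: a lone proton, so zero binding energy
BE_HE4 = 7.07   # helium-4, at the sharp peak near the left edge
BE_U = 7.6      # uranium, at the far right end of the curve
BE_MID = 8.5    # a typical mid-weight fission product, near the plateau

fusion_per_nucleon = BE_HE4 - BE_H1    # ~7 MeV released per nucleon fused
fission_per_nucleon = BE_MID - BE_U    # ~0.9 MeV released per nucleon split

print(fusion_per_nucleon / fission_per_nucleon)  # roughly a factor of eight
```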

That's why the scientists want so desperately to get fusion going as a viable energy source.

The problem, as I noted earlier, is practicality.  The fusion reactions in the Sun are kept going because the heat and pressure in the core are sufficient for hydrogen nuclei to overcome their mutual electrostatic repulsion, crushing them together and triggering a chain reaction that leads to helium-4 (and releasing a crapload of energy in the process).  Maintaining those conditions in the lab has turned out to be extraordinarily difficult; it's always consumed (far) more energy to trigger nuclear fusion than came out of it, and the reactions are self-limiting, collapsing in a split-second.  It's what's given rise to the sardonic quip, "Practical nuclear fusion is fifty years in the future... and always will be."

Well -- it seems like "fifty years in the future" may have just gotten one step closer.

It was just announced that for the first time ever, scientists at the amusingly-named National Ignition Facility in Livermore, California have created a nuclear fusion reaction that produced more energy than it consumed.  This proof-of-concept is, of course, only the first step, but it demonstrates that practical nuclear fusion might not be the pipe dream it has seemed since its discovery almost a century ago.

"This is a monumental breakthrough," said Gilbert Collins of the University of Rochester in New York, a physicist who has collaborated in other NIF projects but was not involved in the current research.  "With this achievement, the landscape has changed...  comparable to the invention of the transistor or the Wright brothers’ first flight.  We now have a laboratory system that we can use as a compass for how to make progress very rapidly."

So keep your eyes on the news.  A common pattern in science is that once someone shows something is possible, the advances take off like a rocket.  Imagine how it would change the world if we could, once and for all, ditch our dependence on fossil fuels and dangerous nuclear fission technology, and power the planet using an energy source that runs on a ridiculously abundant fuel and produces a completely harmless waste product.

That dream may have just gotten one step closer.

****************************************


Tuesday, December 13, 2022

Timey-wimey light

I don't always need to understand things to appreciate them.

In fact, there's a part of me that likes having my mind blown.  I find it reassuring that the universe is way bigger and more complex than I am, and the fact that I actually can parse a bit of it with my little tiny mind is astonishing and cool.  How could it possibly be surprising that there's so much more out there than the fragment of it I can comprehend?

This explains my love for twisty, complicated fiction, in which you're not handed all the answers and everything doesn't get wrapped up with a neat bow at the end.  It's why I thoroughly enjoyed the last season of Doctor Who, the six-part story arc called "Flux."  Apparently it pissed a lot of fans off because it had a quirky, complicated plot that left a bunch of loose ends, but I loved that.  (I'm also kind of in love with Jodie Whittaker's Thirteenth Doctor, but that's another matter.)

I don't feel like I need all the answers.  I'm not only fine with having to piece together what exactly happened to whom; I'm also okay that sometimes I don't know.  You just have to accept that even with all the information right there in front of you, it's still not enough to figure everything out.

Because, after all, that's how the universe itself is.

[Nota bene: Please don't @ me about how much you hated Flux, or how I'm crediting Doctor Who showrunner Chris Chibnall with way too much cleverness by comparing his work to the very nature of the universe.  For one thing, you're not going to change my mind.  For another, I can't be arsed to argue about a matter of taste.  Thanks.]

In any case, back to actual science.  That sense of reality being so weird and complicated that it's beyond my grasp is why I keep coming back to the topic of quantum physics.  It is so bizarrely counterintuitive that a lot of laypeople hear about it, scoff, and say, "Okay, that can't be real."  The problem for the scoffers is that although sometimes we're not even sure what the predictions of quantum mechanics mean, they are superbly accurate.  It's one of the most thoroughly tested scientific models in existence, and it has passed every test.  There are measurements made using the quantum model that have been demonstrated to align with the predictions to the tenth decimal place.

That's a level of accuracy you find almost nowhere else in science.

The reason all this wild stuff comes up is because of a pair of papers (both still in peer review) that claim to have demonstrated something damn near incomprehensible -- the researchers say they have successfully split a photon and then triggered half of it to move backwards in time.

One of the biggest mysteries in physics is the question of the "arrow of time," a conundrum about which I wrote in some detail earlier this year.  The gist of the problem -- and I refer you to the post I linked if you want more information -- is that the vast majority of the equations of physics are time-reversible.  They work equally well backwards and forwards.  A simple example is that if you drop a ball with zero initial velocity, it will reach a speed of 9.8 meters per second after one second; if you toss a ball upward with an initial velocity of 9.8 meters per second, after one second it will have decelerated to a velocity of zero.  If you had a film clip of the two trajectories, the first one would look exactly like the second one running backwards, and vice versa; the physics works the same forwards as in reverse.
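The ball example above can be sketched in a few lines of Python (my own illustration, not from any paper mentioned here), tracing both trajectories over one second and checking that one is the other run backwards:

```python
# A minimal sketch of the time-reversibility described above: a ball
# dropped from rest, and a ball tossed upward at 9.8 m/s, each traced
# over one second in tenth-of-a-second steps.

G = 9.8  # gravitational acceleration, m/s^2

def velocity(v0, t):
    """Velocity after time t, given initial velocity v0 (up is positive)."""
    return v0 - G * t

# Dropped ball: starts at rest, speeds up downward to -9.8 m/s.
dropped = [velocity(0.0, k / 10) for k in range(11)]

# Tossed ball: starts at +9.8 m/s, decelerates to zero.
tossed = [velocity(9.8, k / 10) for k in range(11)]

# The tossed ball's speed profile is the dropped ball's in reverse --
# the "film clip" of one looks like the other played backwards.
assert all(abs(abs(a) - abs(b)) < 1e-9
           for a, b in zip(tossed, reversed(dropped)))
```

The assertion at the end is the whole point: nothing in the equation of motion distinguishes the forward clip from the reversed one.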

The question, then, is why is this so different from our experience?  We remember the past and don't know the future.  The physicists tell us that time is reversible, but it sure as hell seems irreversible to us.  If you see a ball falling, you don't think, "Hey, you know, that could be a ball thrown upward with time running backwards."  (Well, I do sometimes, but most people don't.)  The whole thing bothered Einstein no end.  "The distinction between past, present, and future," he said, "is only an illusion, albeit a stubbornly persistent one."

This skew between our day-to-day experience and what the equations of physics describe is why the recent papers are so fascinating.  What the researchers did was to take a photon, split it, and allow the two halves to travel through a crystal.  During its travels, one half had its polarization reversed.  When the two pieces were recombined, they produced an interference pattern -- a pattern of light and dark stripes -- only possible, the physicists say, if the reversed-polarization photon had actually been traveling backwards in time as it traveled forwards in space.

The scientists write:

In the macroscopic world, time is intrinsically asymmetric, flowing in a specific direction, from past to future.  However, the same is not necessarily true for quantum systems, as some quantum processes produce valid quantum evolutions under time reversal.  Supposing that such processes can be probed in both time directions, we can also consider quantum processes probed in a coherent superposition of forwards and backwards time directions.  This yields a broader class of quantum processes than the ones considered so far in the literature, including those with indefinite causal order.  In this work, we demonstrate for the first time an operation belonging to this new class: the quantum time flip.

This takes wibbly-wobbly-timey-wimey to a whole new level.


Do I really understand what happened here on a technical level?  Hell no.  But whatever it is, it's cool.  It shows us that our intuition about how things work is wildly and fundamentally incomplete.  And I, for one, love that.  It's amazing that not only are there things out there in the universe that are bafflingly weird, but that we're actually making some inroads into figuring them out.

To quote the eminent physicist Richard Feynman, "I can live with doubt and uncertainty and not knowing.  I think it's much more interesting to live not knowing than to have answers which might be wrong.  I have approximate answers and possible beliefs and different degrees of certainty about different things, but I'm not absolutely sure about anything."

To which I can only say: precisely.  (Thanks to the wonderful Facebook pages Thinking is Power and Mensa Saskatchewan for throwing this quote my way -- if you're on Facebook, you should immediately follow them.  They post amazing stuff like this every day.)

I'm afraid I am, and will always be, a dilettante.  There are only a handful of subjects about which I feel any degree of confidence in my depth of comprehension.  But that's okay.  I make up for my lack of specialization by being eternally inquisitive, and honestly, I think that's more fun anyhow.

Three hundred years ago, we didn't know atoms existed.  It was only in the early twentieth century that we figured out their structure, and that they aren't the little solid unbreakable spheres we thought they were.  (That concept is still locked into the word "atom" -- it comes from a Greek word meaning "can't be cut.")  Since then, we've delved deeper and deeper into the weird world of the very small, and what we're finding boggles the mind.  My intuition is that if you think it's gotten as strange as it can get, you haven't seen nothin' yet.

I, for one, can't wait.

****************************************


Monday, December 12, 2022

The origins of Thule

There's a logical fallacy called appeal to authority, and it's trickier than it sounds at first.

Appeal to authority occurs when you state that a claim is correct solely because it was made by someone who has credentials, prestige, or fame.  Authorities are, of course, only human, and make mistakes just like the rest of us, so the difficulty lies in part with the word "solely."  If someone with "M.S., Ph.D." after their name makes a declaration, those letters alone aren't any kind of argument that what they've said is correct, unless they have some hard evidence to back them up.

There's a subtler piece of this, though, and it comes in two parts.  The first is that because scientific research has become increasingly technical, jargon-dense, and specialized, laypeople sometimes are simply unqualified to evaluate whether a claim within a field is justified.  If Kip Thorne, Lee Smolin, or Steven Weinberg were to tell me about some new discovery in theoretical physics, I would be in well over my head (despite my B.S. in physics) and ridiculously out of line to say, "No, that's not right."  At that point, I don't have much of a choice but to accept what they say for the time -- and hope that if it is incorrect, further research and the peer-review process will demonstrate that.  This isn't so much avoiding appeal to authority as it is accepting that bias as an inevitable outcome of my own incomplete knowledge.

The second problem is that sometimes, people who are experts in one field will make statements in another, cashing in on their fame and name recognition to give unwarranted credence to a claim they are unqualified to make.  A good, if disquieting, example of this is the famous molecular geneticist James Watson.  As the co-discoverer of the double-helical structure of the DNA molecule, anything he had to say about genetic biochemistry should carry considerable gravitas.  On the other hand, he's moved on to making pronouncements about (for example) race that are nothing short of repellent -- including, "I am inherently gloomy about the prospect of Africa [because] all our social policies are based on the fact that their intelligence is the same as ours, whereas all the testing says not really."  Believing this statement "because James Watson said it, and he's a famous scientist" is appeal to authority at its worst.  In fact, he is wildly unqualified to make any such assessment, and the statement reveals little more than the fact that he's an asshole.  (Indeed, in 2019 that statement and others like it, including ones reflecting blatant sexism, resulted in Watson being stripped of all his honorary titles by Cold Spring Harbor Laboratory.)

My point here is that appeal to authority is sometimes difficult to pin down, which is why we have to rely on knowledgeable people policing each other.  Which brings us to philologist Andrew Charles Breeze.

Breeze has been a professor of philology at the University of Navarra for thirty-five years, and is a noted scholar of the classics.  His knowledge of Celtic languages, especially as used in ancient Celtic literature, is superb.  But he's also, unfortunately, known for his adherence to hypotheses based on evidence that is slim at best.  One example is his claim that the beautiful Welsh legend cycle The Mabinogion was written by a woman, Gwenllian ferch Gruffydd, daughter of Gruffydd ap Cynan, Prince of Gwynedd.  This claim has proven controversial to say the least.  He also has championed the idea that King Arthur et al. lived, fought, and died in Strathclyde rather than in southwestern England, a claim that has been roundly scoffed at.  Even Arthur's existence is questionable, given that his earliest mention in extant literature is Nennius's Historia Brittonum, which was written in 830 C.E., four hundred years after Arthur was allegedly King of the Britons.  As far as where he lived -- well, it seems to me that establishing if he lived is the first order of business.  

But even making the rather hefty assumption that the accounts of Nennius are true, we still have a problem with Breeze's claim.  Arthur's enemies the Saxons didn't really make any serious incursions into Strathclyde until the early seventh century, so an Arthur in Strathclyde would be in the position of fighting the Battle of Badon Hill against an enemy who wasn't there at the time. 

Awkward.

Anyhow, my point is that Breeze kind of has a reputation for putting himself out on the edge.  Nothing wrong with that; that's why we have peer review.  But I also have to wonder about people who keep making claims with flimsy evidence.  You'd think they'd become at least a little more cautious.

Why this comes up is that Breeze just made yet another claim, and this one is on a topic about which I'm honestly qualified to comment in more detail.  It has to do with the origin of the word "Thule."  You probably know that Thule is the name given in classical Greek and Roman literature to the "most northern place."  It was written in Greek as Θούλη, and has been identified variously as the Faeroe Islands, the Shetland Islands, northern Scotland, Greenland, Iceland, Norway, Finnish Lapland, an "area north of Scythia," the island of Saaremaa (off the coast of Estonia), and about a dozen other places.  The problem is -- well, one of many problems is -- there's no archaeological or linguistic evidence that the Greeks ever went to any of those places.  In the absence of hard evidence, you could claim that Thule was on Mars and your statement would carry equivalent weight.

Another difficulty is that even in classical times, the first source material mentioning Thule, written by Pytheas of Massalia, was looked at with a dubious eye.  The historian Polybius, writing only a century and a half after Pytheas's time, scathingly commented, "Pytheas... has led many people into error by saying that he traversed the whole of Britain on foot, giving the island a circumference of forty thousand stadia, and telling us also about Thule, those regions in which there was no longer any proper land nor sea nor air, but a sort of mixture of all three of the consistency of a jellyfish in which one can neither walk nor sail, holding everything together, so to speak."

Well, Breeze begs to differ.  In a recent paper, he said that (1) Thule is for sure Iceland, and (2) the Greeks (specifically Pytheas and his pals) got to Iceland first, preceding the Vikings by a thousand years.

[Image is in the Public Domain]

Bold claim, but there are a number of problems with it.

First, he seems to be making this claim based on one thing -- that the Greek word for Thule (Θούλη) is similar to the Greek word for altar (θῠμέλη), and that the whole thing was a transcription error in which the vowel was changed (ού substituted for ῠ) and the middle syllable (μέ) dropped.  Well, this is exactly the kind of thing I specialized in during my graduate studies, and I can say unequivocally that's not how historical linguistics works.  You can't just jigger around syllables in a couple of words and say "now they're the same, q.e.d."
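To make that point concrete, here's a small sketch (my own illustration -- the lowercase, diacritic-free transliterations are my assumption, and nothing like this appears in Breeze's paper): raw string similarity, such as Levenshtein edit distance, is trivial to compute and demonstrates nothing about historical relationship, because countless unrelated word pairs sit just as close together by chance.

```python
# A hypothetical sketch: Levenshtein edit distance between two words,
# computed by dynamic programming.  Surface closeness like this is NOT
# evidence of a historical connection between the words.

def edit_distance(a: str, b: str) -> int:
    """Minimum number of single-character insertions, deletions,
    and substitutions needed to turn string a into string b."""
    prev = list(range(len(b) + 1))
    for i, ca in enumerate(a, 1):
        curr = [i]
        for j, cb in enumerate(b, 1):
            curr.append(min(prev[j] + 1,                 # deletion
                            curr[j - 1] + 1,             # insertion
                            prev[j - 1] + (ca != cb)))   # substitution
        prev = curr
    return prev[-1]

# "Thule" and "thymele", stripped of diacritics, are only three edits
# apart -- but so are plenty of totally unrelated word pairs.
print(edit_distance("θουλη", "θυμελη"))
```

This is exactly why historical linguists demand regular sound correspondences attested across many words (and, ideally, independent documentary evidence), rather than a one-off resemblance between a single pair.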

He says his idea is supported by the fact that from the sea, the southern coast of Iceland looks kind of like an altar:

The term Thymele may have arisen from the orographic features of the south of the island, with high cliffs of volcanic rock, similar to that of Greek temple altars.  Probably, when Pytheas and his men sighted Iceland, with abundant fog, and perhaps with columns of smoke and ashes from volcanoes like Hekla, he thought of the altar of a temple.

This is what one of my professors used to call "waving your hands around in the hopes of distracting the audience into thinking you have evidence."  Also, the geologists have found evidence of only one major eruption in Iceland during Pytheas's lifetime -- the Mývatn eruption in around 300 B.C.E. -- and it occurred in the north part of Iceland, over three hundred kilometers from the southern coast of the island.

Oops.

Another thing that makes me raise an eyebrow is where the paper is published -- the Housman Society Journal, which is devoted to the study of the works of British classicist and poet A. E. Housman.  If Breeze's claim was all that and a bag of crisps, why hasn't it been published in a peer-reviewed journal devoted to historical linguistics?

Third, there's another classical reference to Thule that puts Breeze's claim on even thinner ice, which is from Strabo's Geographica, and states that when Pytheas got to Thule, he found it already thickly inhabited.  There is zero evidence that Iceland had any inhabitants prior to the Vikings -- it may be that the Inuit had summer camps in coastal western Iceland, but that is pure speculation without any hard evidential support.  The earliest Norse writings about Iceland describe it as "a barren and empty land, devoid of people."  Despite all this, Strabo writes:

The people [of Thule] live on millet and other herbs, and on fruits and roots; and where there are grain and honey, the people get their beverage, also, from them.  As for the grain, he says, since they have no pure sunshine, they pound it out in large storehouses, after first gathering in the ears thither; for the threshing floors become useless because of this lack of sunshine and because of the rains.

Oops again.

I can say from experience that establishing linguistic evidence for contact between two cultures is difficult, requires rigorous evidence, and can easily be confounded by chance similarities between words.  My own work, which involved trying to figure out the extent to which Old Norse infiltrated regional dialects of Old English and Archaic Gaelic, was no easy task (and was made even more difficult by the fact that two of the languages, Old Norse and Old English, share a relatively recent common root language -- Proto-Germanic -- so if you see similarities, are they due to borrowing or parallel descent?  Sometimes it's mighty hard to tell).

I'm not in academia and I'm in no position to write a formal refutation of Breeze's claim, but I sure as hell hope someone does.  Historical linguistics is not some kind of bastard child of free association and the game of Telephone.  I've no doubt that Breeze's expertise in the realm of ancient Celtic literature is far greater than mine -- but maybe he should stick to that subject.

****************************************