Skeptophilia (skep-to-fil-i-a) (n.) - the love of logical thought, skepticism, and thinking critically. Being an exploration of the applications of skeptical thinking to the world at large, with periodic excursions into linguistics, music, politics, cryptozoology, and why people keep seeing the face of Jesus on grilled cheese sandwiches.

Thursday, December 15, 2022

Words as edged tools

Words matter.

This comes up because of a couple of unrelated social media interactions that got me thinking about the fact that many people use words and then want to avoid the implications and consequences of how they're perceived.  The first was a post from the Reverend Doctor Jacqui Lewis that (hearteningly) got a lot of responses of the high-five and applause variety, which said, "You don't 'hate pronouns.'  You hate the people who are using them.  If that makes you feel uncomfortable, then good.  It should.  You either respect how people are asking you to know and name them, or you don't.  But stop pretending it's about language."

The other was in response to a TikTok video I made for my popular #AskLinguisticsGuy series, in which I made the statement that prescriptivism -- the idea that one dialect of a language is to be preferred over another -- is inherently classist, and that we have to be extremely careful how we characterize differing pronunciations and word usages because they are often used as markers of class and become the basis for discrimination.  Most people were positive, but there was That One Guy who responded only with, "Great.  Another arrogant preachy prick."

Now, let me say up front that there are perhaps times when people are hypersensitive, and infer malice from the words we use when there was none intended.  On the other hand, it's critical that we as speakers and writers understand the power of words, and undertake educating ourselves about how they're perceived (especially by minorities and other groups who have experienced bigotry).  If someone in one of those groups says to me, "Please don't use that word, it's offensive," I am not going to respond by arguing with them about why it was completely appropriate.  I would far rather err on the side of being a little overcautious than unwittingly use a word or a phrase that carries ugly overtones.

Let me give you an example from my own personal experience.  I grew up in the Deep South -- as my dad put it, if we'd been any Deeper South, we'd'a been floating.  And I can say that it really pisses me off when I see a southern accent used as a marker of ignorance, bigotry, or outright stupidity.  I was appalled when a local middle school here in upstate New York put on a performance of Li'l Abner, a play written by Melvin Frank and Norman Panama (both northerners native to Chicago).  The entire play, in my opinion, can be summed up as "Oh, those goofy southerners, how comically dim-witted they are."  If you've never seen it, you'll get the flavor when you hear that it features characters named Mammy Yokum, General Bullmoose, and Jubilation T. Cornpone.  I don't blame the kids; they were doing their best with it.  I blame the adults who chose the play, and then chortled along at sixth and seventh graders hee-hawing their way through the lines of dialogue with fake southern accents, and acted as if it was all okay.

People who know me would readily tell you that I'm very comfortable with laughing at myself.  My reaction to Li'l Abner wasn't that I "can't take a joke" at my own expense.  The problem is that the show is based on a single premise: characterizing an entire group, rural southerners, using a ridiculous stereotype, and then holding that stereotype up for a bunch of smug northerners to laugh at.

And if taking offense at that makes me a "woke snowflake," then I guess that's just the way it has to be.

[Image licensed under the Creative Commons Kevin C Chen, SnowflakesOnWindshield, CC BY-SA 2.0 TW]

If, in your humor or your critical commentary, you're engaging in what a friend of mine calls "punching downward," you might want to think twice about it.

The bottom line here is that what I'm asking people to do (1) can make a world of difference to the way they come across, and (2) just isn't that hard.  When a trans kid came up to me on the first day of class and said, "I go by the name ____, and my pronouns are ____," it literally took me seconds to jot that down, and next to zero effort afterward to honor that request.  To that student, however, it was deeply important, in a way I as a cis male can only vaguely comprehend.  Considering the impact of what you say or what you write, especially on marginalized groups, requires only that you educate yourself a little bit about the history of those groups and how they perceive language.

Refusing to do that isn't "being anti-woke."  It's "being an asshole."

Words can be edged tools, and we need to treat them that way.  Not be afraid of them; simply understand the damage they can do in the wrong hands or used in the wrong way.  If you're not sure how a word will be perceived, ask someone with the relevant experience whether they find it offensive, and then accept what they say as the truth.

And always, always, in everything: err on the side of kindness and acceptance.

****************************************


Wednesday, December 14, 2022

Ahead of the curve

I remember how stunned I was when I was in high school and found out that all energy release -- from striking a match to setting off a nuclear bomb -- goes back to Einstein's famous equation, that energy is equal to mass times the speed of light squared.

It all hinges on the fact that the Law of Conservation of Mass isn't quite right.  If I set a piece of paper on fire inside a sealed box, the oft-quoted line in middle school textbooks -- that if I'd weighed the paper and the air in the box beforehand and then reweighed the ash and the air in the box afterward, they'd have identical masses -- isn't true.  The fact is, the box would weigh less after the paper had burned completely.

The reason is that some (a very tiny amount, but some) of the mass of the paper would have been converted to energy according to Einstein's equivalency, and that's where the heat and light of the fire came from.  Thus, the box and its contents would have less mass than they started with.
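
To get a feel for the size of the effect, here's a minimal back-of-the-envelope sketch, assuming a heat of combustion of roughly 16 MJ/kg for paper (a generic ballpark figure, not a measured value for any particular paper):

```python
# How much mass disappears when 1 kg of paper burns completely?
# Assumes ~16 MJ/kg heat of combustion -- a rough, generic figure.

c = 3.0e8                  # speed of light, m/s
energy_released = 16e6     # joules from burning 1 kg of paper (assumed)

mass_lost = energy_released / c**2   # Einstein: delta-m = E / c^2
print(f"Mass converted to energy: {mass_lost:.2e} kg")
# -> about 1.8e-10 kg, or 0.2 micrograms -- far too small for any
#    classroom scale to detect, which is why the "law" seems to hold.
```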

The mind-boggling truth is that when you burn a fossil fuel -- oil, coal, or natural gas -- you are re-releasing energy from the Sun that was stored in the tissues of plants in the form of a little bit of extra mass during the Carboniferous Period, three-hundred-odd million years ago.

So to fix the problem with the "Law," we have to account for the shifting back and forth between matter and energy.  If you change it to a conservation law of the total -- that the sum of the mass and energy stays constant in a closed system -- it's spot-on.  (This amended version is essentially the First Law of Thermodynamics: energy is conserved, once you count mass as one of the forms energy can take.)
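
In symbols (my notation, purely for illustration), with the subscripts marking any two moments in the history of a closed system:

$$m_1 c^2 + E_1 = m_2 c^2 + E_2$$

where $m$ is the total mass and $E$ is the total non-mass energy (heat, light, kinetic energy, and so on).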

How much energy you can get out of anything depends, then, on only one thing: how much of its mass you can turn into energy.  This is the basis of (amongst many other things) what happens in a nuclear power plant.  As Henri Becquerel, Marie Skłodowska Curie, Pierre Curie, and others showed around the turn of the twentieth century, the atoms of an element can be turned into the atoms of a different element -- the dream of the alchemists -- and the amount of energy required or released by that process is described by something called the binding energy curve.


This graph shows a number of interesting things.  First, the higher on the graph an atom is, the more stable it is.  Second, when you're going from one atom type to another, if you've moved upward on the graph, that transition releases energy; if you've moved downward, the transition requires energy.  Third, how big a jump you've made is a measure of the amount of energy you release or consume in the transition.  (Theoretically, anyway; as you'll see, doing this in the real world, and making practical use of the process, is another matter entirely.)

Consider, for example, going from uranium (at the far right end of the graph) to any of the mid-weight elements it breaks down into when it undergoes nuclear fission.  What those are, specifically, isn't that important; they all lie on the flattish part of the curve between iron (Fe, the most stable element) and uranium.  Going from uranium to any of those is an upward movement on the graph, and thus releases energy.  Seems like it must not be much, right?  Well, that "small" release is what generates the energy of a nuclear power plant -- and of bombs of the type that destroyed Hiroshima.

Now check out the other end of the graph -- the elements for which fusion is the energy-releasing transformation.

Go, for example, from hydrogen-1 (at the very bottom left corner of the graph) to helium-4 (the sharp local peak at around 7 MeV), and compare the size of that leap with the one from uranium to any of its fission products.  This transition -- hydrogen-1 to helium-4 -- is the one that powers the Sun, and is what scientists would like to get going in a fusion reactor.

See why?  Per nucleon of fuel, fusion releases nearly an order of magnitude more energy than fission -- a rough version of the calculation is sketched below.  Also, the fuel for fusion, hydrogen, is by far the most abundant element in the Solar System; it's kind of everywhere.  Not only that, the waste product -- helium -- is completely harmless and inert, by comparison to fission waste, which remains deadly for centuries.
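
Here's a minimal back-of-the-envelope version of that fusion-versus-fission comparison, using rounded binding-energy-per-nucleon figures read off the curve (treat all the numbers as approximate textbook values):

```python
# Rough fusion-vs-fission comparison from the binding energy curve.
# Binding energy per nucleon, in MeV (rounded values from the curve):
be_h1  = 0.0    # hydrogen-1: a lone proton, so nothing is bound
be_he4 = 7.07   # helium-4, the sharp peak near the left edge
be_u   = 7.6    # the uranium region, at the far right
be_mid = 8.5    # mid-weight fission products, near iron

# Energy released per nucleon = increase in binding energy per nucleon
fusion_per_nucleon  = be_he4 - be_h1    # ~7.1 MeV per nucleon
fission_per_nucleon = be_mid - be_u     # ~0.9 MeV per nucleon

print(f"Fusion:  {fusion_per_nucleon:.1f} MeV/nucleon")
print(f"Fission: {fission_per_nucleon:.1f} MeV/nucleon")
print(f"Ratio:   {fusion_per_nucleon / fission_per_nucleon:.0f}x")  # ~8x
```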

That's why the scientists want so desperately to get fusion going as a viable energy source.

The problem, as I noted earlier, is practicality.  The fusion reactions in the Sun are kept going because the heat and pressure in the core are sufficient for hydrogen nuclei to overcome their mutual electrostatic repulsion, crushing them together and triggering a chain reaction that leads to helium-4 (and releasing a crapload of energy in the process).  Maintaining those conditions in the lab has turned out to be extraordinarily difficult; it's always consumed (far) more energy to trigger nuclear fusion than came out of it, and the reactions are self-limiting, collapsing in a split-second.  It's what's given rise to the sardonic quip, "Practical nuclear fusion is fifty years in the future... and always will be."

Well -- it seems like "fifty years in the future" may have just gotten one step closer.

It was just announced that for the first time ever, scientists at the amusingly-named National Ignition Facility in Livermore, California have created a nuclear fusion reaction that produced more energy than it consumed -- more, at least, than the laser energy delivered to the target, though the facility as a whole still drew far more power from the grid.  This proof-of-concept is, of course, only the first step, but it demonstrates that practical nuclear fusion might not be the pipe dream it has seemed since its discovery almost a century ago.
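
For scale: the widely reported figures from the shot were about 2.05 megajoules of laser energy delivered to the target and about 3.15 megajoules of fusion yield.  Taking those numbers at face value (they're approximate) gives a target gain of roughly 1.5:

```python
# Target gain for the NIF shot, using the widely reported (approximate)
# figures.  Note this counts only the laser energy that hit the target,
# not the far larger wall-plug energy needed to power the lasers.
laser_energy_mj = 2.05   # MJ of laser energy on target
fusion_yield_mj = 3.15   # MJ of fusion energy out

gain = fusion_yield_mj / laser_energy_mj
print(f"Target gain = {gain:.2f}")   # ~1.54: more out than in, at the target
```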

"This is a monumental breakthrough," said Gilbert Collins of the University of Rochester in New York, a physicist who has collaborated in other NIF projects but was not involved the current research.  "With this achievement, the landscape has changed...  comparable to the invention of the transistor or the Wright brothers’ first flight.  We now have a laboratory system that we can use as a compass for how to make progress very rapidly."

So keep your eyes on the news.  A common pattern in science is that once someone shows something is possible, the advances take off like a rocket.  Imagine how it would change the world if we could, once and for all, ditch our dependence on fossil fuels and dangerous nuclear fission technology, and power the planet using an energy source that runs on a ridiculously abundant fuel and produces a completely harmless waste product.

That dream may have just gotten one step closer.

****************************************


Tuesday, December 13, 2022

Timey-wimey light

I don't always need to understand things to appreciate them.

In fact, there's a part of me that likes having my mind blown.  I find it reassuring that the universe is way bigger and more complex than I am, and the fact that I actually can parse a bit of it with my little tiny mind is astonishing and cool.  How could it possibly be surprising that there's so much more out there than the fragment of it I can comprehend?

This explains my love for twisty, complicated fiction, in which you're not handed all the answers and everything doesn't get wrapped up with a neat bow at the end.  It's why I thoroughly enjoyed the last season of Doctor Who, the six-part story arc called "Flux."  Apparently it pissed a lot of fans off because it had a quirky, complicated plot that left a bunch of loose ends, but I loved that.  (I'm also kind of in love with Jodie Whittaker's Thirteenth Doctor, but that's another matter.)

I don't feel like I need all the answers.  I'm not only fine with having to piece together what exactly happened to whom, but I'm okay that sometimes I don't know.  You just have to accept that even with all the information right there in front of you, it's still not enough to figure everything out.

Because, after all, that's how the universe itself is.

[Nota bene: Please don't @ me about how much you hated Flux, or how I'm crediting Doctor Who showrunner Chris Chibnall with way too much cleverness by comparing his work to the very nature of the universe.  For one thing, you're not going to change my mind.  For another, I can't be arsed to argue about a matter of taste.  Thanks.]

In any case, back to actual science.  That sense of reality being so weird and complicated that it's beyond my grasp is why I keep coming back to the topic of quantum physics.  It is so bizarrely counterintuitive that a lot of laypeople hear about it, scoff, and say, "Okay, that can't be real."  The problem with the scoffers is that although sometimes we're not even sure what the predictions of quantum mechanics mean, they are superbly accurate.  It's one of the most thoroughly tested scientific models in existence, and it has passed every test.  There are measurements made using the quantum model that have been demonstrated to align with the predictions to the tenth decimal place.

That's a level of accuracy you find almost nowhere else in science.

The reason all this wild stuff comes up is because of a pair of papers (both still in peer review) that claim to have demonstrated something damn near incomprehensible -- the researchers say they have successfully split a photon and then triggered half of it to move backwards in time.

One of the biggest mysteries in physics is the question of the "arrow of time," a conundrum about which I wrote in some detail earlier this year.  The gist of the problem -- and I refer you to the post I linked if you want more information -- is that the vast majority of the equations of physics are time-reversible.  They work equally well backwards and forwards.  A simple example is that if you drop a ball with zero initial velocity, it will reach a speed of 9.8 meters per second after one second; if you toss a ball upward with an initial velocity of 9.8 meters per second, after one second it will have decelerated to a velocity of zero.  If you had a film clip of the two trajectories, the first one would look exactly like the second one running backwards, and vice versa; the physics works the same forwards as in reverse.
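
If you want to see that symmetry concretely, here's a minimal sketch (variable names mine, purely for illustration): simulate the dropped ball, play its trajectory backwards, and check that it exactly matches the thrown ball.

```python
# The dropped ball's trajectory, run in reverse, is the thrown ball's.
g = 9.8                               # gravitational acceleration, m/s^2
ts = [i * 0.1 for i in range(11)]     # time samples from 0.0 to 1.0 s

# Ball dropped from rest: distance fallen below the start point
drop = [0.5 * g * t**2 for t in ts]

# Ball thrown up at 9.8 m/s: height above the launch point
throw = [9.8 * t - 0.5 * g * t**2 for t in ts]

# Play the drop backwards, measuring from where it ended up
reversed_drop = [drop[-1] - d for d in reversed(drop)]

assert all(abs(a - b) < 1e-9 for a, b in zip(throw, reversed_drop))
print("The film of the drop, run backwards, is exactly the upward throw.")
```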

The question, then, is why is this so different from our experience?  We remember the past and don't know the future.  The physicists tell us that time is reversible, but it sure as hell seems irreversible to us.  If you see a ball falling, you don't think, "Hey, you know, that could be a ball thrown upward with time running backwards."  (Well, I do sometimes, but most people don't.)  The whole thing bothered Einstein no end.  "The distinction between past, present, and future," he said, "is only an illusion, albeit a stubbornly persistent one."

This skew between our day-to-day experience and what the equations of physics describe is why the recent papers are so fascinating.  What the researchers did was to take a photon, split it, and allow the two halves to travel through a crystal.  During its travels, one half had its polarization reversed.  When the two pieces were recombined, the result was an interference pattern -- a pattern of light and dark stripes -- only possible, the physicists say, if the reversed-polarization half had actually been traveling backwards in time as it traveled forwards in space.

The scientists write:

In the macroscopic world, time is intrinsically asymmetric, flowing in a specific direction, from past to future.  However, the same is not necessarily true for quantum systems, as some quantum processes produce valid quantum evolutions under time reversal.  Supposing that such processes can be probed in both time directions, we can also consider quantum processes probed in a coherent superposition of forwards and backwards time directions.  This yields a broader class of quantum processes than the ones considered so far in the literature, including those with indefinite causal order.  In this work, we demonstrate for the first time an operation belonging to this new class: the quantum time flip.

This takes wibbly-wobbly-timey-wimey to a whole new level.


Do I really understand what happened here on a technical level?  Hell no.  But whatever it is, it's cool.  It shows us that our intuition about how things work is wildly and fundamentally incomplete.  And I, for one, love that.  It's amazing that not only are there things out there in the universe that are bafflingly weird, but that we're actually making some inroads into figuring them out.

To quote the eminent physicist Richard Feynman, "I can live with doubt and uncertainty and not knowing.  I think it's much more interesting to live not knowing than to have answers which might be wrong.  I have approximate answers and possible beliefs and different degrees of certainty about different things, but I'm not absolutely sure about anything."

To which I can only say: precisely.  (Thanks to the wonderful Facebook pages Thinking is Power and Mensa Saskatchewan for throwing this quote my way -- if you're on Facebook, you should immediately follow them.  They post amazing stuff like this every day.)

I'm afraid I am, and will always be, a dilettante.  There are only a handful of subjects about which I feel any degree of confidence in my depth of comprehension.  But that's okay.  I make up for my lack of specialization by being eternally inquisitive, and honestly, I think that's more fun anyhow.

Three hundred years ago, we didn't know atoms existed.  It was only in the early twentieth century that we figured out their structure, and that they aren't the little solid unbreakable spheres we thought they were.  (That concept is still locked into the word "atom" -- it comes from a Greek word meaning "can't be cut.")  Since then, we've delved deeper and deeper into the weird world of the very small, and what we're finding boggles the mind.  My intuition is that if you think it's gotten as strange as it can get, you ain't seen nothin' yet.

I, for one, can't wait.

****************************************


Monday, December 12, 2022

The origins of Thule

There's a logical fallacy called appeal to authority, and it's trickier than it sounds at first.

Appeal to authority occurs when you state that a claim is correct solely because it was made by someone who has credentials, prestige, or fame.  Authorities are, of course, only human, and make mistakes just like the rest of us, so the difficulty lies in part with the word "solely."  If someone with "M.S., Ph.D." after their name makes a declaration, those letters alone aren't any kind of argument that what they've said is correct, unless they have some hard evidence to back them up.

There's a subtler piece of this, though, and it comes in two parts.  The first is that because scientific research has become increasingly technical, jargon-dense, and specialized, laypeople sometimes are simply unqualified to evaluate whether a claim within a field is justified.  If Kip Thorne, Lee Smolin, or Steven Weinberg were to tell me about some new discovery in theoretical physics, I would be in well over my head (despite my B.S. in physics) and ridiculously out of line to say, "No, that's not right."  At that point, I don't have much of a choice but to accept what they say for the time -- and hope that if it is incorrect, further research and the peer-review process will demonstrate that.  This isn't so much avoiding appeal to authority as it is accepting that bias as an inevitable outcome of my own incomplete knowledge.

The second problem is that sometimes, people who are experts in one field will make statements in another, cashing in on their fame and name recognition to give unwarranted credence to a claim they are unqualified to make.  A good, if disquieting, example of this is the famous molecular geneticist James Watson.  As the co-discoverer of the double-helical structure of the DNA molecule, anything he had to say about genetic biochemistry should carry considerable gravitas.  On the other hand, he's moved on to making pronouncements about (for example) race that are nothing short of repellent -- including, "I am inherently gloomy about the prospect of Africa [because] all our social policies are based on the fact that their intelligence is the same as ours, whereas all the testing says not really."  Believing this statement "because James Watson said it, and he's a famous scientist" is appeal to authority at its worst.  In fact, he is wildly unqualified to make any such assessment, and the statement reveals little more than the fact that he's an asshole.  (Indeed, in 2019 that statement and others like it, including ones reflecting blatant sexism, resulted in Watson being stripped of all his honorary titles by Cold Spring Harbor Laboratory.)

My point here is that appeal to authority is sometimes difficult to pin down, which is why we have to rely on knowledgeable people policing each other.  Which brings us to philologist Andrew Charles Breeze.

Breeze has been a professor of philology at the University of Navarra for thirty-five years, and is a noted scholar of the classics.  His knowledge of Celtic languages, especially as used in ancient Celtic literature, is superb.  But he's also, unfortunately, known for his adherence to hypotheses based on evidence that is slim at best.  One example is his claim that the beautiful Welsh legend cycle The Mabinogion was written by a woman, Gwenllian ferch Gruffydd, daughter of Gruffydd ap Cynan, Prince of Gwynedd.  This claim has proven controversial, to say the least.  He has also championed the idea that King Arthur et al. lived, fought, and died in Strathclyde rather than in southwestern England, a claim that has been roundly scoffed at.  Even Arthur's existence is questionable, given that his earliest mention in extant literature is Nennius's Historia Brittonum, written around 830 C.E., more than three centuries after Arthur was allegedly King of the Britons.  As far as where he lived -- well, it seems to me that establishing if he lived is the first order of business.

But even making the rather hefty assumption that the accounts of Nennius are true, we still have a problem with Breeze's claim.  Arthur's enemies the Saxons didn't really make any serious incursions into Strathclyde until the early seventh century, so an Arthur in Strathclyde would be in the position of fighting the Battle of Badon Hill against an enemy who wasn't there at the time. 

Awkward.

Anyhow, my point is that Breeze kind of has a reputation for putting himself out on the edge.  Nothing wrong with that; that's why we have peer review.  But I also have to wonder about people who keep making claims with flimsy evidence.  You'd think they'd become at least a little more cautious.

Why this comes up is that Breeze just made yet another claim, and this one is on a topic about which I'm honestly qualified to comment in more detail.  It has to do with the origin of the word "Thule."  You probably know that Thule is the name given in classical Greek and Roman literature to the "most northern place."  It was written in Greek as Θούλη, and has been identified variously as the Faeroe Islands, the Shetland Islands, northern Scotland, Greenland, Iceland, Norway, Finnish Lapland, an "area north of Scythia," the island of Saaremaa (off the coast of Estonia), and about a dozen other places.  The problem is -- well, one of many problems is -- there's no archaeological or linguistic evidence that the Greeks ever went to any of those places.  In the absence of hard evidence, you could claim that Thule was on Mars and your statement would carry equivalent weight.

Another difficulty is that even in classical times, the first source material mentioning Thule, written by Pytheas of Massalia, was looked at with a dubious eye.  The historian Polybius, writing only a century and a half after Pytheas's time, scathingly commented, "Pytheas... has led many people into error by saying that he traversed the whole of Britain on foot, giving the island a circumference of forty thousand stadia, and telling us also about Thule, those regions in which there was no longer any proper land nor sea nor air, but a sort of mixture of all three of the consistency of a jellyfish in which one can neither walk nor sail, holding everything together, so to speak."

Well, Breeze begs to differ.  In a recent paper, he said that (1) Thule is for sure Iceland, and (2) the Greeks (specifically Pytheas and his pals) got to Iceland first, preceding the Vikings by a thousand years.

[Image is in the Public Domain]

Bold claim, but there are a number of problems with it.

First, he seems to be making this claim based on one thing -- that the Greek word for Thule (Θούλη) is similar to the Greek word for altar (θῠμέλη), and that the whole thing was a transcription error in which the vowel was changed (ού substituted for ῠ) and the middle syllable (μέ) dropped.  Well, this is exactly the kind of thing I specialized in during my graduate studies, and I can say unequivocally that's not how historical linguistics works.  You can't just jigger around syllables in a couple of words and say "now they're the same, q.e.d."
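
Just to make that concrete, here's a toy sketch -- my own illustration, using rough Latin transliterations, and emphatically not a real linguistic analysis.  The edit distance between the two forms is indeed small, but equally small distances turn up between words with no connection at all, which is exactly why this kind of resemblance proves nothing:

```python
def edit_distance(a: str, b: str) -> int:
    """Classic dynamic-programming Levenshtein distance."""
    prev = list(range(len(b) + 1))
    for i, ca in enumerate(a, 1):
        curr = [i]
        for j, cb in enumerate(b, 1):
            cost = 0 if ca == cb else 1
            curr.append(min(prev[j] + 1,          # deletion
                            curr[j - 1] + 1,      # insertion
                            prev[j - 1] + cost))  # substitution
        prev = curr
    return prev[-1]

print(edit_distance("thoule", "thumele"))  # 3 edits: Thule vs. "altar"
print(edit_distance("thoule", "thistle"))  # also 3: just as "close," meaninglessly
```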

He says his idea is supported by the fact that from the sea, the southern coast of Iceland looks kind of like an altar:

The term Thymele may have arisen from the orographic features of the south of the island, with high cliffs of volcanic rock, similar to that of Greek temple altars.  Probably, when Pytheas and his men sighted Iceland, with abundant fog, and perhaps with columns of smoke and ashes from volcanoes like Hekla, he thought of the altar of a temple.

This is what one of my professors used to call "waving your hands around in the hopes of distracting the audience into thinking you have evidence."  Also, the geologists have found evidence of only one major eruption in Iceland during Pytheas's lifetime -- the Mývatn eruption in around 300 B.C.E. -- and it occurred in the north part of Iceland, over three hundred kilometers from the southern coast of the island.

Oops.

Another thing that makes me raise an eyebrow is where the paper is published -- the Housman Society Journal, which is devoted to the study of the works of British classicist and poet A. E. Housman.  If Breeze's claim was all that and a bag of crisps, why hasn't it been published in a peer-reviewed journal devoted to historical linguistics?

Third, there's another classical reference to Thule, from Strabo's Geographica, that puts Breeze's claim on even thinner ice: it states that when Pytheas got to Thule, he found it already thickly inhabited.  There is zero evidence that Iceland had any inhabitants prior to the Vikings -- it may be that the Inuit had summer camps in coastal western Iceland, but that is pure speculation without any hard evidential support.  The earliest Norse writings about Iceland describe it as "a barren and empty land, devoid of people."  Despite all this, Strabo writes:

The people [of Thule] live on millet and other herbs, and on fruits and roots; and where there are grain and honey, the people get their beverage, also, from them.  As for the grain, he says, since they have no pure sunshine, they pound it out in large storehouses, after first gathering in the ears thither; for the threshing floors become useless because of this lack of sunshine and because of the rains.

Oops again.

I can say from experience that establishing linguistic evidence for contact between two cultures is difficult, requires rigorous evidence, and can easily be confounded by chance similarities between words.  My own work, which involved trying to figure out the extent to which Old Norse infiltrated regional dialects of Old English and Archaic Gaelic, was no easy task (and was made even more difficult by the fact that two of the languages, Old Norse and Old English, share a relatively recent common root language -- Proto-Germanic -- so if you see similarities, are they due to borrowing or parallel descent?  Sometimes it's mighty hard to tell).

I'm not in academia and I'm in no position to write a formal refutation of Breeze's claim, but I sure as hell hope someone does.  Historical linguistics is not some kind of bastard child of free association and the game of Telephone.  I've no doubt that Breeze's expertise in the realm of ancient Celtic literature is far greater than mine -- but maybe he should stick to that subject.

****************************************


Saturday, December 10, 2022

Christmas cheer

It sometimes comes as a shock to my friends and acquaintances when they find out that even though I'm a staunch unbeliever in anything even resembling organized religion, I love Christmas music.

Well, some Christmas music.  There are modern Christmas songs that make me want to stick any available objects in my ears, even if those objects are fondue forks.  Abominations like "I Saw Mommy Kissing Santa Claus" leave me leery of entering any public spaces with ambient music from November 15 to December 25.  In my opinion, there should be jail time associated with writing lines like, "Little tin horns and little toy drums, rooty-toot-toot and rummy-tum-tum," and whoever wrote "Let It Snow" should be pitched, bare-ass naked, head-first into a snowdrift.

Each year I participate in something called the Little Drummer Boy Challenge, which is a contest to see if you can make it from Thanksgiving to Christmas without once hearing "The Little Drummer Boy."  So far, I'm still in the game this year, although it must be said that I've done this for nine years and have hardly ever survived.  I've never been taken out as ignominiously, though, as I was a few years ago, when I made it all the way to the week before Christmas, and stopped by a hardware store to pick some stuff up.  And while I was waiting to check out, a stock clerk walked by jauntily singing the following:

Come, they LA LA pah-rum-puh-pum-pum
A newborn LA LA LA pah-rum-puh-pum-pum
LA LA LA gifts we bring pah-rum-puh-pum-pum
LA LA before the king pah-rum-puh-pum-pum, rum-puh-pum-pum, rum-puh-pum-pum

Dude didn't even know all the damn lyrics, but I had to play fair and admit I'd been felled by the Boy one more time.  Before I could stop myself, I glared at him and said, "Are you fucking kidding me right now?" in a furious voice, which led to a significant diminishment of the Christmas cheer in the store, but I maintain to this day I had ample justification.  The alarmed stock clerk scurried off, clearly afraid that if he stuck around much longer, the Batshit Crazy Scruffy Blond Customer was going to Deck his Halls but good.

I know this makes me sound like a grumpy curmudgeon.  I can accept that, because I am a grumpy curmudgeon.  But even so, I absolutely love a lot of Christmas music.  I think "O Holy Night" is a stunning piece of music, and "Angels We Have Heard On High" is incredible fun to sing (as long as it's not sung like a dirge, but as the expression of joy consistent with the lyrics).  Speaking of doing things the right way, check out Annie Lennox's stupendous music video of "God Rest Ye Merry, Gentlemen:"


Despite the impression I probably gave at the start of this post, the list of Christmas songs I like is way longer than the list of ones I don't.  I grew up singing wonderful French carols like "Il Est Né, Le Divin Enfant" and "Un Flambeau, Jeanette Isabella," and to this day hearing those songs makes me smile.

And I can include not only seasonal religious music, but religious music in general, in this discussion; one of my favorite genres of music is Renaissance and Baroque religious music, especially the works of William Byrd, Henry Purcell, J. S. Bach, William Cornysh, Giovanni da Palestrina, and Thomas Tallis.  If you want to hear something truly transcendent, listen to this incredible performance of Tallis's Spem in Alium ("Hope in Any Other"), a forty-part motet here sung by seven hundred people:


I know it might seem like a contradiction for a non-religious person to thoroughly enjoy such explicitly religious music, but in my opinion, beauty is beauty wherever you find it.  I can be moved to tears by Bach's Mass in B Minor without necessarily believing the story it tells.  And it also pleases me that it gives me common ground with my friends who do believe, for whom the lovely "Mary's Boy Child" isn't just a cool calypso tune, but a joyous expression of something near and dear to them.

I guess I'm a bit of a contradiction in terms sometimes, but that's okay.  I still deeply resent any attempt to force belief on others (or lack of belief, for that matter), and my anger runs deep at the damage done, and still being done, by the religious to members of the LGBTQ community.  The likelihood of my ending up back in church is minuscule at best.

Even so, I still love the holiday season.  It's a chance to give gifts and express my appreciation for my friends and family, and to enjoy the pretty decorations and sweet music.  Honestly, I think a lot of us godless heathens feel the same way, which is why I'm glad to see that this year -- so far, at least -- the Religious Right has backed off on the whole idiotic "War On Christmas" nonsense.  After all, it's been, what, fifteen years or so since Bill O'Reilly gave the clarion call that the Atheists Were Comin' For Your Christmas Trees, and if you'll look around you'll notice that everyone's still saying "Merry Christmas" and giving gifts and everything else just like they've always done, so the whole trope has finally fallen a little flat.  It couldn't have gone any other way, honestly.  A great many of us atheistic types are also pretty dedicated to live-and-let-live, and most of us don't care if you have Christmas displays in your front yard so bright they disrupt nearby air traffic, as long as you're not going to pull out your AR-15 when a non-believer says "Happy Holidays" instead of "Merry Christmas."

I do, however, draw the line at piping in "The Little Drummer Boy" over mall loudspeakers.  That's just a bridge too far.  I mean, what kind of stupid song is that, anyhow?  It's about a kid who sees a mom and dad with a quietly sleeping newborn baby, and thinks, "You know what these people need?  A drum solo."

In my opinion, Mary would have been well within her rights to smack him over the head with the frankincense.  Pah-rum-puh-pum-pow, you odious little twerp.

****************************************


Friday, December 9, 2022

It's a bird, it's a plane... no, it's both

One topic I've come back to over and over again here at Skeptophilia is how flawed our sensory/perceptive apparatus is.  Oh, it works well enough; most of the time, we perceive the external world with sufficient clarity not to walk into walls or get run over by oncoming trains.  But our impression that we experience the world as it is -- that our overall ambient sense of everything around us, what the brilliant neuroscientist David Eagleman calls our umwelt, is a crystal-clear reflection of the real universe -- is simply false.

All it takes is messing about with optical illusions to convince yourself how easy our brains and sensory organs are to fool.  For example, in the following drawing, which is darker: square A or square B?


They're exactly the same.  Don't believe me?  Here's the same drawing, with a pair of gray lines superimposed on it:



Because your brain decided that B was in the shadow and A wasn't, it concluded that A had to be intrinsically darker.  What baffles me still about this illusion is that even once you know how the trick works, it's impossible to see it any other way.

As astronomer Neil deGrasse Tyson put it, "Our brains are rife with ways of getting it wrong.  You know optical illusions?  That's not what they should call them.  They should call them brain failures.  Because that's what they are.  A few cleverly drawn lines, and your brain can't handle it."

Well, we just got another neat hole shot in our confidence that what we're experiencing is irrefutable concrete reality, with a study that appeared in the journal Psychological Science this week.  What the researchers did was attempt to confound the senses of sight and hearing by showing test subjects a photograph of one object morphing into another -- say, a bird into an airplane.  While the subjects studied the photograph, they were exposed to a selection from a list of sounds, two of which were relevant (birdsong and the noise of a jet engine) and a number of which were irrelevant distractors (like a hammer striking a nail).

They were then told to use a sliding scale to estimate where in the transformation of bird-into-airplane the image was (e.g. seventy percent bird, thirty percent airplane).  What the researchers found was that people were strongly biased by what they were hearing; birdsong biased the test subjects to overestimate the birdiness of the photograph, and the reverse happened with the sound of a jet engine.  The irrelevant noises didn't affect the subjects' choices (and thus, when exposed to the irrelevant noises, their visual perceptions of the image were more accurate).

"When sounds are related to pertinent visual features, those visual features are prioritized and processed more quickly compared to when sounds are unrelated to the visual features," said Jamal Williams, of the University of California - San Diego, who led the study, in an interview with Science Daily.  "So, if you heard the sound of a birdsong, anything bird-like is given prioritized access to visual perception.  We found that this prioritization is not purely facilitatory and that your perception of the visual object is actually more bird-like than if you had heard the sound of an airplane flying overhead."

I guess it could be worse; at least hearing birdsong didn't make you see a bird that wasn't there.  But it does once again make me wonder how eyewitness testimony is still considered to carry the most weight in a court of law when experiment after experiment has demonstrated not only how incomplete and easily biased our perceptions are, but how flawed our memories are.

Something to keep in mind next time you are tempted to say "I know it happened that way, I saw it with my own eyes."

****************************************


Thursday, December 8, 2022

Death metal bat

My favorite wild animals are bats.

I think the flying fox -- a large diurnal species of fruit bat -- has got to be one of the coolest animals in the world.  Think about how amazing it would be, being a flying fox.  You have great big wings and can fly anywhere you want, you get to eat figs and dates all day, and you're cute as the dickens.  What could be better than that?

Fruit-eating sky puppies, is what they are.

[Image licensed under the Creative Commons Trikansh sharma, Eye contact with flying fox, CC0 1.0]

Unfortunately, bats in general have gotten a bad name, even though they're unequivocally beneficial.  (The insectivorous kinds can eat up to a thousand small flying insects -- including disease-carrying mosquitoes -- in an hour.)  The negative reputation comes from two sources: first, an association with drinking blood (only three of the more than a thousand species of bats do that; all three live in Central and South America and almost never bite humans); and second, that they carry rabies (which can happen -- but so do raccoons, foxes, skunks, feral cats and dogs, and even deer).

Bats are good guys.  They're also incredibly cool.  I did a piece last year about the wild adaptations for echolocating in nocturnal bats, an ability I still find mind-boggling.  Which is why I was so psyched to run across a paper this week in PLOS Biology about the fact that their ability to produce such an amazing array of sounds is due to the same feature death metal singers use to get their signature growl.

In "Bats Expand Their Vocal Range By Recruiting Different Laryngeal Structures for Echolocation and Social Communication," biologists Jonas Håkonsson, Cathrine Mikkelsen, Lasse Jakobsen, and Coen Elemans, of the University of Southern Denmark, write:

Echolocating bats produce very diverse vocal signals for echolocation and social communication that span an impressive frequency range of 1 to 120 kHz or 7 octaves.  This tremendous vocal range is unparalleled in mammalian sound production and thought to be produced by specialized laryngeal vocal membranes on top of vocal folds.  However, their function in vocal production remains untested. By filming vocal membranes in excised bat larynges (Myotis daubentonii) in vitro with ultra-high-speed video (up to 250,000 fps) and using deep learning networks to extract their motion, we provide the first direct observations that vocal membranes exhibit flow-induced self-sustained vibrations to produce 10 to 95 kHz echolocation and social communication calls in bats.  The vocal membranes achieve the highest fundamental frequencies (fo’s) of any mammal, but their vocal range is with 3 to 4 octaves comparable to most mammals.  We evaluate the currently outstanding hypotheses for vocal membrane function and propose that most laryngeal adaptations in echolocating bats result from selection for producing high-frequency, rapid echolocation calls to catch fast-moving prey.  Furthermore, we show that bats extend their lower vocal range by recruiting their ventricular folds—as in death metal growls—that vibrate at distinctly lower frequencies of 1 to 5 kHz for producing agonistic social calls.  The different selection pressures for echolocation and social communication facilitated the evolution of separate laryngeal structures that together vastly expanded the vocal range in bats.

NPR did a story on the research, and followed it up by talking to some death metal singers, all of whom were pretty fascinated to find out bats can do it, too.  "In a [masochistic] sort of way ... I think that when I can feel that my vocal cords are getting kind of shredded or beat up, that it sounds better," said Chase Mason, lead singer of the band Gatecreeper.  "You know, like, if there's a little taste of blood in the back of my throat, I think that I'm doing a good job...  A lot of people will compare you to sounding like a bear or something like that, like an animal growling or roaring even... I think it's cool.  It's very dark and gothic.  The imagery of a bat is always associated with the darker sort of things, like vampires and stuff.  So it definitely makes sense."

I still favor the Sky Puppy model of bats, but hey, I'm not arguing with a guy who can make noises like Chase Mason can.

In any case, add one more thing to the "cool" column for bats, which was pretty lengthy already.  It's incredible that however much we learn about nature, it always finds a way to come back and surprise us.  That's why, if you have a curious side, you should learn some science -- you'll never be short of new things to wonder at.

****************************************