Skeptophilia (skep-to-fil-i-a) (n.) - the love of logical thought, skepticism, and thinking critically. Being an exploration of the applications of skeptical thinking to the world at large, with periodic excursions into linguistics, music, politics, cryptozoology, and why people keep seeing the face of Jesus on grilled cheese sandwiches.

Saturday, November 5, 2022

The gift of a voice

I can think of no more terrifying disorder than locked-in syndrome.

Locked-in syndrome, also called a pseudocoma or cerebromedullospinal disconnection, is a (fortunately) rare condition in which the entire voluntary muscle system shuts down while leaving the cognitive faculties relatively intact.  The result: you can't move, speak, or respond, but your brain is otherwise functioning normally.

You are a prisoner inside your own useless body.

The most famous case of locked-in syndrome was French journalist Jean-Dominique Bauby, who suffered a massive stroke at age 43 and lost control of his entire muscular system except for partial control over his left eye.  This at least allowed him to eventually communicate to his doctors that he was still conscious, and -- astonishingly -- he used that tiny bit of voluntary muscular movement to dictate, painstakingly, letter by letter, an entire book about his experience.  It's called The Diving Bell and the Butterfly, and it is simultaneously devastating and uplifting -- a paean to the human spirit's ability to rise above a level of adversity that, thankfully, very few of us will ever face.

Thanks to an article last week in IEEE Spectrum, I just found out about a new prosthetic device that will give back the voices of people who have lost the ability to communicate -- by converting their brain waves into words.

A team at the University of California - San Francisco has run successful clinical trials of a device, implanted through a small port in the patient's skull, that detects the neural signals the cerebellum and motor cortex send to the larynx and mouth when the patient thinks about speaking.  From those encoded electrical impulses -- representing the movements the person would have made, had they been able to speak -- the prosthesis produces whole sentences of text, at eighteen words a minute.

Pretty impressive, especially considering that this was just a proof of concept; as the technology is refined, it'll only get better.  And given that Bauby was communicating at two to three letters per minute, this is an incredible leap forward.
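
A rough back-of-the-envelope comparison (my own arithmetic, assuming an average English word of about five letters and splitting Bauby's two-to-three letters a minute down the middle):

$$ \frac{18\ \text{words/min} \times 5\ \text{letters/word}}{2.5\ \text{letters/min}} = \frac{90}{2.5} = 36 $$

In other words, something like a thirty-five-fold speedup -- and that's before any of the promised refinements.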

"We’re now pushing to expand to a broader vocabulary," said Edward Chang, who led the team that developed the prosthesis.  "To make that work, we need to continue to improve the current algorithms and interfaces, but I am confident those improvements will happen in the coming months and years.  Now that the proof of principle has been established, the goal is optimization.  We can focus on making our system faster, more accurate, and—most important— safer and more reliable.  Things should move quickly now."

They are also working toward developing a wireless version that would be able to pick up the relevant brain waves from outside, thus obviating the need to place a port in the patient's skull.  With further refinements, it might become possible to create a device that could be used on any individual who is unable to speak -- the present ones require a training period during which they learn the patient's specific neural firing patterns.  Once those are generalized, it might be possible to create a "universal translator," something Chang calls "plug-and-play."

Even what they have now is amazing, however.  Imagine regaining your voice after months or years of muteness.  I never fail to be astonished at the progress we're making in science; it seems like everywhere you turn, there are new discoveries, new inventions, new insights.

In a time when so much seems hopeless, it's wonderful that we have such stories to remind us that there are people who are working to ease the burdens of others -- and that, in the words of Max Ehrmann's beautiful poem "Desiderata," "With all its sham, drudgery, and broken dreams, it is still a beautiful world."

****************************************


Friday, November 4, 2022

Tut tut

I ran into an interesting article in Science News yesterday about a new museum in Egypt that will feature the famous treasure trove of King Tutankhamun's tomb.  Tutankhamun was, as you undoubtedly know, the pharaoh of Egypt from about 1332 to 1323 B.C.E., before dying at the age of nineteen (probably of complications from malaria).  Because of his short reign and youth he's been nicknamed "the Boy King," and prior to his tomb's discovery in 1922 he held a relatively obscure spot in Egyptian history.  This may have been what saved his tomb nearly intact for archaeologists to find; no one knew it was there.

He was also eclipsed by his infamous father, the Pharaoh Akhenaten, the "heretic king" who attempted to replace the Egyptian pantheon of gods with a single monotheistic religion, the worship of the god Aten.  Trying to decree a change in people's religion went about as well as you'd expect, and after Akhenaten's death everyone went back to worshiping Ra and Horus and Thoth and Anubis and the rest of the gang, not to mention erasing every trace of Akhenaten they could find.

The whole thing, though, put me in mind of the famous "King Tut's Curse," which supposedly claimed the lives of a number of people who investigated the tomb and has since spawned countless movies and horror novels about evil befalling those who violate the dead's final resting places.

[Image licensed under the Creative Commons Roland Unger, CairoEgMuseumTaaMaskMostlyPhotographed, CC BY-SA 3.0]

The story goes that shortly after Tut's tomb was opened, people associated with the expedition began to die.  The first was Lord Carnarvon, who had funded Howard Carter's expedition; he cut himself badly while shaving and died shortly thereafter of sepsis.  While it's easy enough to explain a death from infection in Egypt prior to the advent of modern antibiotics, the deaths continued after the members of the expedition returned to London:

  • Richard Bethell, Carter's personal secretary, was found smothered in a Mayfair club.
  • Bethell's father, Lord Westbury, fell to his death from his seventh-floor flat -- where he had kept artifacts from the tomb, given to him by his son.
  • Aubrey Herbert, half-brother of the first victim Lord Carnarvon, died in a London hospital "of mysterious symptoms."
  • Ernest Wallis Budge, of the British Museum, was found dead in his home shortly after arranging for the first public show of King Tut's sarcophagus.
And so on.  All in all, twenty people associated with the expedition died within the first few years after returning to England.  (It must be said that Howard Carter, who led the expedition, lived for another sixteen years; and you'd think that if King Tut had wanted to smite anyone, it would have been Carter.  And in fact, a statistical study of Egyptologists who had entered pharaohs' tombs found that their average age at death was no lower than that of the background population.)

Still, that leaves some decidedly odd deaths to explain.  And historian Mark Benyon thinks he's figured out how to explain them.

In his book London's Curse: Murder, Black Magic, and Tutankhamun in the 1920s West End, Benyon lays the deaths of Carter's associates in London -- especially those of Bethell, Westbury, Herbert, and Budge, all of which involved foul play -- at the feet of none other than Aleister Crowley.

Crowley, you may recall, was the subject of a seriocomic post about a magical battle only a couple of months ago, so he's a bit of a frequent flyer here at Skeptophilia.  For those of you who missed that one, Crowley is the guy who proclaimed himself the "Wickedest Man on Earth," a sex-obsessed heroin addict who became notorious for founding a magical society called "Thelema."  Thelema's motto was "Do what thou wilt," which narrowly edged out Crowley's second favorite, "Fuck anything or anyone that will hold still long enough."  His rituals were notorious all over London for drunken debauchery, and few doubted then (and fewer doubt now) that no activity was too depraved for Crowley to happily indulge in.

Crowley ca. 1912 [Image is in the Public Domain]

One of Crowley's obsessions was Jack the Ripper.  He believed that the Ripper murders had been accomplished through occult means, and he was frequently heard to speak of the killer with reverence.  Benyon believes that when Crowley heard about Howard Carter's discoveries, he was outraged -- many of Thelema's rituals and beliefs were derived from Egyptian mythology -- and came up with the idea of a series of copycat murders to get even with the men who had (in his mind) desecrated Tutankhamun's tomb.

It's an interesting hypothesis.  Surely all of the expedition members knew of Crowley; after all, almost everyone in London at the time did.  At least one (Budge) was an occultist who ran in the same circles as Crowley.  That Crowley was capable of such a thing is hardly to be questioned.  Whether Benyon has proven the case is debatable, and it's probably impossible at this point to establish whether his claim is correct in all its details -- rather like the dozens of explanations put forward for the Ripper murders themselves.  But even at first glance it certainly makes better sense than the Pharaoh's Curse malarkey, and it makes me inclined to file the "Mummy's Curse" under "Another woo-woo claim plausibly explained by logic and rationality."

In any case, I'm glad to hear the archaeologists are still working on the discoveries from Tutankhamun's tomb, and not afraid that they themselves will be struck down by the ghost of the Boy King.  I'll take actual scientific research over loony superstition any day of the week.

****************************************


Thursday, November 3, 2022

Damage control

The human psyche is a fragile thing.

I was going to start that sentence with, "At the risk of being called a snowflake...", but then I decided I don't give a flying rat's ass if anyone does call me a snowflake.  Or "woke."  "Snowflake" has become some kind of jerk code for "someone who cares deeply about how others feel," and "woke" for "awareness that others' experiences and perceptions are as valid as my own, even if I don't share them," and on that basis I'm happy to accept the appellation of Woke Snowflake.

The fact is, all of us, even Un-Woke Non-Snowflakes, can be hurt.  It's all too easy.  Whether we react to that hurt by crying, retreating, laughing it off, or getting angry, the fact remains that none of us are impervious to what others say and think.  It's why dealing with bullying is so critical, and the correct response is not to tell the victim "toughen up, develop a thicker skin, grow some balls," or whatever, all things I was told repeatedly when I was a child.  Unsurprisingly, none of that sage advice had the slightest effect, other than letting the bullies know that no one was going to do a damn thing about it.  It's amazing the number of people who don't recognize this for what it is, which is a game of "blame the victim."

For what it's worth, the correct response is for someone with appropriate authority to tell the bully, "This stops, and it stops now.  I will be watching you."

It's why, when I was asked a while back what the three most important words you could say to someone were -- other than "I love you" -- my response was, "You are safe."  I never felt safe when I was a kid.  And if you don't think that leaves a mark on someone that persists into adulthood, you are sadly mistaken.

It's why I was sickened by the revelation this week that British actor Kit Connor, best known for playing the character of Nick Nelson on the lovely coming-of-age series Heartstopper, was being harassed online by "fans" who accused him of "queerbaiting" -- pretending to be queer (or being cagey about it) in order to benefit from the cachet of being associated with the LGBTQ community without committing himself outright.  Connor ignored the accusations for a while, but they became so strident that he got onto Twitter on Halloween and posted:

[Screenshot of Connor's tweet, in which he came out as bisexual and remarked that some of his followers had missed the point of the show]

The number of ways this is fucked up leaves me not knowing where to begin.  Apparently part of the firestorm started with photographs of him holding hands with actress Maia Reficco, which adds a whole nasty gloss of "bi people in straight-presenting relationships aren't actually queer" to a situation that is already ugly enough.  I find this infuriating (for obvious reasons); we bisexual people are under no obligation to meet some kind of queerness litmus test set by someone -- anyone -- else.

The deeper problem here, of course, is that nobody should ever push someone to come out before they're ready.  Ever.  This sort of thing happens all too often with actors and musicians, and not just about sexual orientation but about everything.  Fans become desperate to peer into their lives, as if somehow enjoying their skill, talent, and hard work when they perform justifies forgetting that they are real humans who need privacy and have the right to reveal about their personal lives exactly what, when, and how much they choose.  At the far end of this horrible scale is the phenomenon of paparazzi, parasites who are fed by fans' insatiable appetite for lurid details, accurate or not.

The worst part in this particular case is that the lion's share of the accusations of queerbaiting that Connor faced came from people who are LGBTQ themselves.  People who should fucking well know better.  People who themselves have undoubtedly faced harassment and discrimination and unfair social pressures, and now have apparently forgotten all that and turned on someone whose only crimes were (1) playing a bisexual character in a television show, and (2) wanting to come out by his own choice and at his own time.

How dare you force someone into this situation.

I can only hope that Kit's trenchant "I think some of you missed the point of the show" drove the message home with these people.  I also hope that the harm done to Kit himself, and potentially to his relationships (whatever those are), doesn't leave a lasting mark.  To the fandom's credit, there was a huge groundswell of people supporting him unconditionally and decrying what had happened, and with luck, that did enough damage control to lessen the pain he endured.

So for heaven's sake, people, start thinking before you speak, and realize that words can do incalculable harm.  Keep in mind that humans are fragile creatures who deserve careful handling.  Always err on the side of compassion.

And if you can't do all that, then at least have the common decency to keep your damn mouth shut.

****************************************


Wednesday, November 2, 2022

Exploding the birth-order myth

How often have you heard a friend mention an odd characteristic of a mutual acquaintance, and follow it up with a statement like, "Well, he's a middle child," leaving you both nodding knowingly as if that explained it?

Conscientious, strong-willed eldest children.  Lost, rebellious middles.  Immature, demanding youngests.  Then there's my category -- the spoiled, tightly-wound only children, doted upon by their parents and with their every whim met immediately.

I know that wasn't really true in my case; far from being overprotected, I was the product of free-range parenting, a child who was damn close to feral.  After school, and all summer long, my parents' style could be summed up as "Be back by dinner and try not to break any bones.  Either yours or anyone else's."  So I knew, at least from a sample size of one, that there was something wrong with the birth-order-determines-personality model.


Even seeing other exceptions here and there never left me confident enough to contradict the prevailing wisdom.  After all, the plural of anecdote is not data.  But now a pair of studies has conclusively disproven the connection between birth order and... anything.

In the first, a trio of psychologists at the University of Leipzig analyzed personality assessments for a huge sample (they had data for over 20,000 individuals), looking at how the subjects scored on what are called the "Big Five" features of personality -- extraversion, emotional stability, agreeableness, conscientiousness, and imagination.  They found no correlation whatsoever between birth order and any of those.  In their words:
[W]e consistently found no birth-order effects on extraversion, emotional stability, agreeableness, conscientiousness, or imagination.  On the basis of the high statistical power and the consistent results across samples and analytical designs, we must conclude that birth order does not have a lasting effect on broad personality traits outside of the intellectual domain.
A similar but much larger study, done at the University of Illinois -- this one of 377,000 high school students -- also found no correlation whatsoever:
We would have to say that, to the extent that these effect sizes are accurate estimates of the true effect, birth order does not seem to be an important consideration for understanding either the development of personality traits or the development of intelligence in the between-family context.  One needs only to look at the “confounds,” such as parental socio-economic status and gender, for factors that warrant much more attention given the magnitude of their effects relative to the effects of birth order.
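
If you're wondering what "no correlation" looks like in practice, here's a minimal sketch of the kind of check such studies run -- purely illustrative, with synthetic numbers standing in for a real personality dataset (the actual papers used far more careful designs, with within-family comparisons and controls for things like sibship size, age, and socioeconomic status):

```python
# Illustrative only: synthetic data standing in for a real Big Five dataset.
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)
n = 20_000  # roughly the size of the Leipzig sample

# Pretend measurements: birth order (1 = firstborn, and so on) and one trait score.
birth_order = rng.integers(1, 5, size=n)
conscientiousness = rng.normal(loc=50, scale=10, size=n)  # no built-in birth-order effect

r, p = stats.pearsonr(birth_order, conscientiousness)
print(f"correlation r = {r:.3f} (p = {p:.3f})")
# With no real effect present, r hovers right around zero -- which is
# essentially the pattern both studies reported for their actual data.
```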
So if there really is nothing to the birth-order effect, why is it such a persistent myth?  I think there are two things going on here.  One is that during childhood, older children differ in maturity from their younger siblings because... well, they're older.  Of course a fifteen-year-old is going to be more conscientious and articulate than his seven-year-old brother.  There'd be something seriously wrong if he weren't.  So we see whatever differences do exist between siblings and interpret them in light of the model we already have, thus reinforcing the model itself -- even if it's wrong.

Which brings me to the second problem -- our old arch-nemesis, confirmation bias.  Once we think we know what's going on, our confidence in it becomes unshakable.  I have to wonder how many people are reading this post and thinking, "Yes, but for my own kids, the birth-order effect works.  So I still believe it."  It's a natural enough human tendency.

On the other hand, I think you have to admit that your own personal family's characteristics really aren't going to call into question a scholarly analysis of 377,000 people.

So that's pretty much that.  No more blaming your appreciation of fart jokes on being an immature youngest child.  And my friends and family will have to cast around for a different explanation for why I'm as neurotic as I am.  There probably is an explanation, but my being an only child isn't it.

****************************************


Tuesday, November 1, 2022

The dynamic Earth

The highlight of my trip to Iceland this past August was seeing the newly-erupting volcano of Fagradalsfjall, southwest of the capital city of Reykjavík.

Fagradalsfjall is Icelandic for "mountain of the beautiful valley."  I'm not sure I'd use the word "beautiful," which to me carries connotations of "benevolent."  When we were there, you could feel the eruption before you heard or saw it; the entire floor of the valley was vibrating, a subsonic rumble I felt in my gut.  Then came the roar, a guttural, low-pitched, thunderous booming.  Then the smell -- the characteristic sulfurous, rotten-egg stench of an active volcano.  Then we crested the top of a low hill and saw it for the first time.


We were close enough that we could feel the warmth radiating from the lava.  Much closer, and the combination of the heat and the sulfur gases would have been overwhelming.  Orange-hot plumes of molten rock exploded out of the fissure and splattered onto the sides of the cinder cone, almost instantly turning to shattered, jagged chunks of black basalt as they cooled and hardened.

It was one of the most spectacular things I've ever witnessed.  In the presence of this kind of power, you truly feel tiny and very, very fragile. 

We were really extraordinarily lucky to see what we did; we were there on the 15th of August, and -- for reasons unknown -- the eruption abruptly ceased on the 21st.  Fagradalsfjall is still very much an active volcano, though.  Just last week it started up again, and this cycle looks like it may actually be even more dramatic.

What brings all this up is a paper last week in Nature about some research out of the University of California - Santa Barbara that analyzed the lava from Fagradalsfjall and found that it ran counter to the conventional model of how volcanoes erupt.  The previous understanding was that magma chambers fill gradually, and undergo mixing from convection and the physical shaking from earthquakes; then, when the eruption happens, the chamber drains.  This would result in a relatively uniform chemistry of the rock produced from the beginning of the eruption to the end.

That's not what geologists saw with Fagradalsfjall.

"This is what we see at Mount Kilauea, in Hawaii," said Matthew Jackson, who co-authored the study.  "You'll have eruptions that go on for years, and there will be minor changes over time.  But in Iceland, there was more than a factor of 1,000 higher rates of change for key chemical indicators.  In a month, the Fagradalsfjall eruption showed more compositional variability than the Kilauea eruptions showed in decades.  The total range of chemical compositions that were sampled at this eruption over the course of the first month span the entire range that has ever erupted in southwest Iceland in the last 10,000 years."

Why this happened is uncertain.  It could be that Fagradalsfjall is being fed by blobs of liquid magma rising from much deeper in the mantle, where the chemistry is different; those much hotter blobs then rose to the surface without a lot of mixing, resulting in a dramatic alteration of the rock being produced over the course of the eruption.  This adds a significant complication to interpreting records of past eruptions, not only in Iceland, but with other volcanoes.

"So when I go out to sample an old lava flow, or when I read or write papers in the future," Jackson said, "it'll always be on my mind: This might not be the complete story of the eruption."

It's fascinating that as far as science has come, we still have a lot to work out -- not only out in the far depths of space (as yesterday's post about MoND described) but right beneath our feet on our own home world.  As eminent astrophysicist Neil deGrasse Tyson put it, "You can’t be a scientist if you’re uncomfortable with ignorance, because scientists live at the boundary between what is known and unknown in the cosmos.  This is very different from the way journalists portray us.  So many articles begin, 'Scientists now have to go back to the drawing board.'  It’s as though we’re sitting in our offices, feet up on our desks—masters of the universe—and suddenly say, 'Oops, somebody discovered something!'  No.  We’re always at the drawing board.  If you’re not at the drawing board, you’re not making discoveries."

****************************************


Monday, October 31, 2022

Newton modified

Back in the 1970s and 1980s, astrophysicist Vera Rubin discovered something odd about the rates at which stars revolve around the centers of their home galaxies: the stars in the outer reaches of a galaxy were orbiting just about as quickly as the ones nearer to the center.

Called the "flat rotation curve problem," this observation flies in the face of an astronomical principle that has been known since the seventeenth century: Kepler's Third Law.  It states that for bodies orbiting the same central mass, the square of the orbital period (the time taken for the object to make a single orbit) is proportional to the cube of the average distance between the object and that mass.  Put more simply, the farther out an orbiting object is, the slower it should be moving.  This law holds beautifully for the planets, asteroids, and comets of the Solar System.
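
To see why that's the expected behavior (a standard textbook derivation, not something spelled out in the original post's sources): for a roughly circular orbit around a central mass $M$, gravity supplies the centripetal force,

$$ \frac{G M m}{r^{2}} = \frac{m v^{2}}{r} \quad\Longrightarrow\quad v = \sqrt{\frac{G M}{r}} \propto \frac{1}{\sqrt{r}}, $$

and since the period is $T = 2\pi r / v$, it follows that $T^{2} = \frac{4\pi^{2}}{G M}\, r^{3}$ -- Kepler's Third Law.  So if most of a galaxy's mass sat where its visible stars do, orbital speeds should drop off steadily as you move outward.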

Unfortunately, when Rubin looked at galactic rotation rates, she found that Kepler's Third Law appeared not to hold.  What it looked like was that there was a great deal more mass in the galaxy than could be seen, and that mass was spread out in some kind of invisible halo surrounding it.  That additional mass would account for the flatness of the rotation curves.

It was forthwith nicknamed dark matter.

The calculations of Rubin and others showed that the amount of dark matter was not insignificant.  Current estimates place it at around 27% of the total mass-energy content of the universe.  Only 5% is baryonic (ordinary) matter, so the matter we can't see outweighs ordinary matter by more than a factor of five.  (The other 68% is the even weirder and more elusive dark energy, about which we know next to nothing.)

The problem is, every experiment designed to directly detect dark matter has resulted in zero success.  Whatever it is, it seems not to interact with ordinary matter at all other than via its gravitational pull.  These repeated failures drew rueful comparisons between dark matter and the luminiferous aether, the mysterious substance through which light waves were alleged to propagate.  The aether was proposed back in the nineteenth century because it was hard to imagine how light waves could move through a vacuum unless they had a medium -- what, exactly, was waving?  The existence of the aether was conclusively disproven by the elegant Michelson-Morley experiment, which showed that, unlike the speed of any other kind of wave, the speed of light seemed to be invariant regardless of the direction of motion of the observer.  It remained for Albert Einstein to explain how that could possibly be -- and to work out all the strange and counterintuitive consequences of this phenomenon, with his Special and General Theories of Relativity.

More than one modern physicist has surmised that dark matter might similarly be the result of a fundamental misunderstanding of how gravity works -- and that we are just waiting for this century's Einstein to turn physics on its head by pointing out what we've missed.

Enter Israeli physicist Mordehai Milgrom.

Milgrom is the inventor of MoND (Modified Newtonian Dynamics), a model which -- like the Theories of Relativity -- proposes that the explanation for the anomalous observations is not that there's some unseen and undetectable substance causing the effect, but that our understanding of how physics works is incomplete.  In particular, Milgrom says, there needs to be a modification to the equations of motion at very small accelerations, such as the ones experienced by stars orbiting in the outer reaches of galaxies.

With those modifications, the orbital rates make perfect sense.  No dark matter needed.
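
For the mathematically inclined, here's the usual textbook statement of Milgrom's idea (my gloss; the post's sources don't spell out the equations): ordinary Newtonian dynamics holds when accelerations are large compared to a tiny constant $a_{0} \approx 1.2 \times 10^{-10}\ \mathrm{m/s^{2}}$, but in the low-acceleration ("deep MoND") regime the effective acceleration becomes the geometric mean of the Newtonian value and $a_{0}$:

$$ a = \sqrt{a_{N}\, a_{0}} = \frac{\sqrt{G M a_{0}}}{r}. $$

Setting that equal to the centripetal acceleration $v^{2}/r$ of a circular orbit gives

$$ \frac{v^{2}}{r} = \frac{\sqrt{G M a_{0}}}{r} \quad\Longrightarrow\quad v^{4} = G M a_{0}, $$

a rotation speed that doesn't depend on $r$ at all -- a flat rotation curve, with no invisible halo required.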

The Whirlpool Galaxy [Image licensed under the Creative Commons NASA/ESA/JPL/Hubble Heritage Team & C. Violette, M51 (2), CC BY-SA 4.0]

As with relativity -- and any other time someone has claimed to overturn a long-established paradigm -- MoND hasn't achieved anywhere near universal acclaim.  The Wikipedia article on it (linked above) states, gloomily, "no satisfactory cosmological model has been constructed from the hypothesis."  And it does lack the blindingly bright insight of Einstein's models, where taking the "problem of the seeming invariance of the speed of light" and turning it into the "axiom of the actual invariance of the speed of light" triggered a shift in our understanding that has since passed every empirical test ever designed.  Compared to Einstein's model, MoND almost seems like "Newton + an add-on," with no particularly good explanation as to why high accelerations obey Newton's laws but low ones don't.  (Of course, there's a parallel here to Einstein, as well -- at low speeds, Newton's laws are accurate, while at near-light speeds, Einsteinian effects take over.  So maybe Milgrom is on to something after all.)

After all, it's not like the other option -- dark matter -- has much going for it experimentally.

And MoND just got a significant leg up from an observation of the behavior of star clusters that was the subject of a paper in Monthly Notices of the Royal Astronomical Society last week.  In open star clusters, newly ignited stars produce an outward push that can blow away material (including other stars), creating two "tidal tails" that precede and trail the cluster as it moves through space.  According to Newtonian dynamics (with or without dark matter), the two tails should have about the same mass.

"According to Newton's laws of gravity, it's a matter of chance in which of the tails a lost star ends up," explains Dr. Jan Pflamm-Altenburg of the Helmholtz Institute of Radiation and Nuclear Physics at the University of Bonn.  "So both tails should contain about the same number of stars.  However, in our work we were able to prove for the first time that this is not true: In the clusters we studied, the front tail always contains significantly more stars nearby to the cluster than the rear tail."

This peculiar observation fits the predictions of MoND much better than it does the predictions of the Newtonian model.

"The results [of simulations using MoND] correspond surprisingly well with the observations," said Ingo Thies, co-author of last week's paper.  "However, we had to resort to relatively simple computational methods to do this.  We currently lack the mathematical tools for more detailed analyses of modified Newtonian dynamics."

So the matter is very far from settled.  What's certain is that, much as with the physicists' situation in the late nineteenth century with regard to the behavior of light, there's something significant we're missing.  Whether that's some odd form of matter that doesn't interact with anything except via gravity, or whether we've simply got the equations for the laws of motion wrong, remains to be seen.

And of course, after that, we still have dark energy to explain.  I think the physicists are going to be busy over the next few decades, don't you?

****************************************


Saturday, October 29, 2022

Gold standard

I have a great fondness for a glass of fine red wine or single malt scotch, but I have to admit something up front: to say I have an "undiscerning palate" is a considerable understatement.

I basically have two taste buds: "thumbs up" and "thumbs down."  I know what I like, but that's about where it stops.  On the other end of the spectrum from me are "supertasters" -- people who have a much greater acuity for the sense of taste than the rest of us slobs -- and they are in high demand working for food and drink manufacturers as taste testers, because they can pick up subtleties in flavor that bypass most people.  They're the ones we have to thank for what you read on the labels in wine stores ("This vintage has a subtle nose of asphalt and boiled cabbage; the flavor contains notes of wet dog, garlic, and sour cream, with a delicate hint of chocolate at the finish").

I make fun, but I swear I once saw a sauvignon blanc described as tasting like "cat piss on a gooseberry bush."  I had to try it.  

It was actually rather nice.  "Thumbs up."

What's always struck me about all this is how subjective it seems.  So much of it is, both literally and figuratively, a matter of taste.  This is why I thought it was fascinating that a new study has found a way to quantify the presence of congeners -- the chemicals other than alcohol introduced by the fermentation and aging processes -- which are the source of most of the flavor in wines, beers, and spirits.

[Image licensed under the Creative Commons Pjt56, Glencairn Glass-pjt, CC BY-SA 4.0]

A paper in ACS Applied Nanomaterials, from a team led by Jennifer Gracie of the University of Glasgow, describes a simple test for flavor in whisky using less than a penny's worth of soluble gold ions.  It turns out that whiskies are aged in charred oak barrels, and the aging process introduces congeners that react strongly with gold, producing a striking red or purple color.  The deeper the color, the more congeners are present -- and the more flavorful the whisky.

The authors write:

The maturation of spirit in wooden casks is key to the production of whisky, a hugely popular and valuable product, with the transfer and reaction of molecules from the wooden cask with the alcoholic spirit imparting color and flavor.  However, time in the cask adds significant cost to the final product, requiring expensive barrels and decades of careful storage.  Thus, many producers are concerned with what “age” means in terms of the chemistry and flavor profiles of whisky.  We demonstrate here a colorimetric test for spirit “agedness” based on the formation of gold nanoparticles (NPs) by whisky.  Gold salts were reduced by barrel-aged spirit and produce colored gold NPs with distinct optical properties...  We conclude that age is not just a number, that the chemical fingerprint of key flavor compounds is a useful marker for determining whisky “age”, and that our simple reduction assay could assist in defining the aged character of a whisky and become a useful future tool on the warehouse floor.

Which is pretty cool.  Better than relying on people like me, whose approach to drinking a nice glass of scotch is not to analyze it, but to pour a second round.  I guess there's nothing wrong with knowing what you like -- even if you can't really put your finger on why you like it.

That's why we non-supertasters rely on studies like this one to provide a gold standard to make up for our own lack of perceptivity.

****************************************