Skeptophilia (skep-to-fil-i-a) (n.) - the love of logical thought, skepticism, and thinking critically. Being an exploration of the applications of skeptical thinking to the world at large, with periodic excursions into linguistics, music, politics, cryptozoology, and why people keep seeing the face of Jesus on grilled cheese sandwiches.
A couple of weeks ago, I wrote about the spike in atmospheric oxygen concentration -- by some estimates, rising to 35% -- during the Carboniferous Period, triggered by explosive growth of forests, and allowing arthropods like insects, arachnids, and millipedes to grow to enormous sizes.
The good times, though (for them at least), were not to last. Around three hundred million years ago, there was a sudden drop in oxygen and rise in carbon dioxide. This triggered rapid climatic shifts that resulted in the Late Carboniferous Rain Forest Collapse, which saw a major alteration from the swamp-dwelling plants and animals at the height of the period to species that could tolerate the dry heat that was to persist throughout the next period, the Permian. (This set up the rise of reptiles, which would see their peak in the dinosaurs of the Mesozoic.)
Artist's depiction of the mid-Carboniferous swamps (ca. 1887) [Image is in the Public Domain]
The source of the excess carbon dioxide was very likely volcanic. Besides the fact that lava can contain dissolved gases (mostly carbon dioxide and sulfur dioxide), the heat of the eruptions may have baked and oxidized the plentiful limestone and coal deposits formed during the earlier lush, wet part of the period, releasing their stored carbon -- a precursor of the much bigger disaster that was in store fifty million years later, when, at the end of the Permian, the Siberian Traps erupted and burned through a huge amount of the sequestered carbon, causing widespread global anoxia and climate change, and the largest mass extinction ever.
By some estimates, ninety percent of life on Earth died.
But the rain forest collapse at the end of the Carboniferous was bad enough. A study that came out this week in Proceedings of the National Academy of Sciences found that the anoxia/hypoxia hit the oceans hardest: oxygen levels rapidly dropped by between four and twelve percent, with a commensurate rise in dissolved carbon dioxide. When carbon dioxide dissolves in water, it produces a weak acid -- carbonic acid -- lowering the pH. Organisms that build their shells and skeletons out of calcium carbonate, like mollusks, brachiopods, and corals, literally dissolved.
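For the chemistry-minded, here's the gist of why that happens (a simplified sketch; the real seawater carbonate system involves several coupled equilibria, but this captures the essentials):

CO2 + H2O ⇌ H2CO3 (carbonic acid)
H2CO3 ⇌ H+ + HCO3- (the extra hydrogen ions are what lower the pH)
CaCO3 + H+ → Ca2+ + HCO3- (calcium carbonate -- shell and skeleton material -- dissolving)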
You ready for the kicker?
The study's estimate of the rate of carbon dioxide release during the Late Carboniferous Rain Forest Collapse is about one-hundredth of the rate at which we're putting carbon dioxide into the atmosphere today through burning fossil fuels.
"This is a huge discovery, because how do you take an ocean sitting under an atmosphere with much more oxygen than today and permit this?" said Isabel Montañez of the University of California - Davis, senior author of the study. "The message for us is, 'Don't be so sure that we can't do this again with our current human-driven release of carbon dioxide.'"
The problem is, the current administration is in the pocket of the fossil fuel industry, and is doing its level best to pretend this isn't happening, and to discredit anyone who says it is. Worse, actually: it has cancelled funding for any scientific research about climate.
Because apparently "la la la la la la not listening" is now considered wise political policy. This, despite warning signals like the eastern half of the United States sweltering this past week under the most extreme heat wave we've had in over fifty years.
So I'm expecting studies like the one released this week by Montañez et al. to receive exactly zero attention from the people who actually could work toward addressing this situation. It brings to mind a quote from Upton Sinclair, uttered almost a century ago: "It is difficult to get a man to understand something when his salary depends on his not understanding it."
A point I've made before, but one I think is absolutely critical to skeptics, is that sometimes we simply don't have answers -- and are forced to admit that unless further information turns up, we may never have them.
I get that it's intensely frustrating. It seems to be hardwired into our brains that some conclusion, any conclusion, is better than remaining in doubt. I recall once being asked by a student if I thought there was an afterlife. My answer was, "I don't know."
"But what do you think?" the student said.
"I don't think anything. The information we have from phenomena like near-death experiences is inconclusive. And no one comes back to report from an actual death experience. I'll find out for sure eventually, but at the moment, I don't have enough evidence to decide one way or the other. So the answer is 'I don't know.'"
This obviously irritated the hell out of the student, and he said, "But you must have an opinion."
"Why?" I answered. "If you want my opinion, it's that the world could do with a great many fewer opinions and a great many more facts."
The problem is, of course, that the intolerance for frustration stemming from a desperation to have the matter settled often drives us to unsupported speculation -- and these meanderings often end up passed along as fact. (How many different Jack the Ripper Mystery Solved! books have been written -- all equally confident, and all with different solutions?) As a less-known but equally fascinating (and, reassuringly, less violent) example, let's consider the strange case of the Wild Boy of Aveyron.
In 1797, near the town of Saint-Sernin-sur-Rance in the Aveyron département in southern France, three hunters came upon a boy of about nine. He was completely naked, and ran from them, but they trapped him when he climbed a tree. After capturing him -- and finding he couldn't (or wouldn't) speak, but didn't seem dangerous -- they brought him into the town, where he was taken in by an elderly widow who fed and clothed him.
Within a week, he disappeared -- leaving his clothes behind.
Over the next two years, he was spotted periodically in the woods, but always eluded capture. Then -- in January of 1800 -- he came out of the forest on his own. He eventually ended up in an orphanage in Rodez. Upon examination, psychiatrist Philippe Pinel suggested he was mentally disabled, probably from birth; scars on his body suggested he'd spent most of his childhood in the wilderness.
Of course, the question arose of how a small child could survive in the forest, even in the relatively temperate south of France. How he didn't die of exposure, from starvation, or from being attacked and eaten by wild animals, was a significant mystery. Were his scars from the cuts and scrapes of living outdoors naked -- or were they from early abuse? The physician Jean Marc Gaspard Itard, who worked extensively with the boy (whom he christened "Victor") believed that the evidence supported that Victor had "lived in an absolute solitude from his fourth or fifth almost to his twelfth year, which is the age he may have been when he was taken in the Caune woods."
Lithograph of Victor of Aveyron, ca. 1800 [Image is in the Public Domain]
At this period, France was just emerging from the chaos and horror of the Reign of Terror, and the question was raised of whether he was a child whose parents had been imprisoned or executed. Several couples were located in the region who had sons of the right age that had gone missing while they (the parents) were in jail -- but none of them recognized Victor.
Another curious twist is that one prominent philosophy amongst the intelligentsia at the time was the idea of the "Noble Savage" -- that taken away from the noise and filth and crowds of the city, placed in the tranquility of the forests and glades, humans would revert to some sort of pre-Adam-and-Eve-eating-the-apple blissful state of oneness with nature. People like the Swiss philosopher Jean-Jacques Rousseau tended to have a rather optimistic -- some might say unwarrantedly optimistic -- view of the potential of humanity and the natural world. (Alfred Lord Tennyson's observation that "Nature is red in tooth and claw" came later.) So Victor was studied intensively to see if he showed any signs of Edenic grace and innocence.
Not so much, it turned out. He still didn't like wearing clothes -- although he consented to wear them when it was cold -- and was fond of doing what just about all teenage boys do at least once a day. Other than being mute, and having peculiar eating habits (raw vegetables were by far his favorite food), he didn't seem to exhibit any sort of before-the-Fall chastity and sinlessness. He did show some signs of what we would now probably classify as autistic behavior -- rocking back and forth or hugging himself when stressed -- and there's been speculation that this, along with his lack of speech, may have been why he was abandoned in the first place.
Victor never learned to speak, other than the words lait ("milk") and oh, Dieu ("oh, God"). But he was never violent, and in fact seemed predisposed to being gentle and caring. When he was around eighteen, he went to stay with a Madame Guérin, with whom he lived for the rest of his life. And when Madame Guérin's husband died, and she was sitting at her dinner table weeping, Victor startled her by going up and putting his arms around her and holding her while she cried.
Victor of Aveyron died in 1828 of pneumonia, at the age of somewhere around forty, and took to his grave whatever he knew about his origins.
The lack of information here is what facilitates wild speculation. Much has been written about Victor -- some claiming that he was of below-average intelligence, others that he was of ordinary intelligence but autistic, others still that he was basically a normal young man whose early childhood trauma led him to hide the fact that he understood everything people were saying (and, perhaps, could speak as well, but simply refused to do so).
The truth, of course, is that we don't know who Victor was, what his mental capacity was, or where he came from. There just isn't enough in the way of hard data, despite the extensive studies of the young man done by Itard and others. So the correct conclusion is not to come to a conclusion at all.
It's frustrating, especially given such an intriguing story, but that's where we have to leave it.
Like I said, I get the human drive to understand, and how a mystery can nibble at your brain, keeping you puzzling over it. And this can be a very good thing; two examples I can think of from my own field of linguistics, of mysteries finally cracked thanks to someone's dogged tenacity and absolute refusal to give up, are Jean-François Champollion's decoding of Egyptian hieroglyphics and the decipherment of the Linear B script of Crete by Alice Kober and Michael Ventris.
But sometimes -- there simply isn't enough information. And at that point, we have to let it go, and hope that more turns up.
In the case of Victor of Aveyron, though, that's pretty unlikely. So we're left with a mystery: a feral child showed up in late eighteenth-century France, eventually joined society (more or less), grew up, and finally died, and there is probably no way we'll ever know more. Victor is, and shall almost certainly remain, a cipher.
However confounding that is to our natural intellectual curiosity.
The always-hilarious Gary Larson, whose ability to create absurd combinations of cultural references is unparalleled, had a Far Side comic strip showing a typical office waiting room. Sitting in one of the chairs, cross-legged and reading a magazine, is a mummy. The secretary -- with the trademark Larson bouffant hairdo and cat's-eye glasses -- is on the phone to her boss, saying, "Mr. Bailey? There's a gentleman here who claims an ancestor of yours once defiled his crypt, and now you're the last remaining Bailey, and... oh, something about a curse. Shall I send him in?"
The whole "Mummy's Curse" thing usually brings to mind the "Boy King" Tutankhamen, and the claim that twenty members of the expedition that opened the tomb died not long afterward. There are three caveats to this, however: the deaths happened over a decade, suggesting that Tut wasn't in a great hurry to get his vengeance; a statistical study showed that the average age at death of the people who did succumb to "King Tut's Revenge" was no lower than that of the background population; and there is a plausible case to be made that at least two of the deaths (Howard Carter's personal secretary, Richard Bethell, and Bethell's father Lord Westbury, both of whom were murdered) were killed by, or on the orders of, none other than Aleister Crowley.
Whether this last bit is true or not remains very much an open question; in my opinion, the case relies on highly circumstantial evidence, and after a hundred years it's doubtful we'll ever know for certain. What I'm pretty sure of is that a scattered bunch of deaths, over ten years or so, of men who were mostly well into middle age is not really that much of a mystery, and the curse is nothing more than an attempt to give an added frisson to an archaeological find that honestly is interesting enough without all the supernatural trappings.
On the other hand, consider the opening of the tomb of Casimir IV Jagiellon, King of Poland and Grand Duke of Lithuania. Casimir is considered one of the most able Polish kings, and consolidated his territory, won many military victories, and generally was a force to be reckoned with. He died in 1492, and was interred with much pomp and circumstance in Wawel Cathedral in Kraków.
Casimir IV Jagiellon of Poland [Image is in the Public Domain]
In 1973, a team of twelve historians and archaeologists opened his tomb.
Within weeks, ten of the twelve were dead.
So do we have a real-life example of tomb desecration? Oh, and something about a curse?
It turns out that (unsurprisingly) there's nothing supernatural involved here, either. No need to invoke ancient Polish witchcraft. The unfortunate researchers succumbed to infections of Aspergillus flavus, a pathogenic fungus that secretes an especially nasty group of organic compounds called aflatoxins. Fungal infections are notoriously hard to treat -- fungal cells are similar enough to animal cells that chemicals which will kill a fungus often don't do our own tissues any good at all. Fungal spores are also incredibly tough and long-lived; the Aspergillus spores that killed the research team members had likely been there since the tomb was sealed, over 530 years ago.
But Aspergillus isn't all bad. A team at the University of Pennsylvania just published a paper in Nature Chemical Biology looking at a different set of compounds the fungus produces -- and found they target and disrupt cancer cells, especially those in leukemia.
The chemicals are called ribosomally synthesized and post-translationally modified peptides. The biochemists call them RiPPs, even though the actual acronym would be RSaPTMPs, which I have to admit would be a little hard to pronounce, so RiPPs it is. And the scientists found that the RiPPs produced by Aspergillus flavus had as much potency against leukemia cells as cytarabine and daunorubicin, two of the go-to drugs used to treat the disease for decades.
"Nature has given us this incredible pharmacy," said Sherry Gao, senior author of the study. "It's up to us to uncover its secrets. As engineers, we're excited to keep exploring, learning from nature and using that knowledge to design better solutions."
Which I think you will all agree is a better approach than superstition about opening graves.
Still, it's probably best to be cautious in any tomb-raiding you're planning on doing. Curses notwithstanding, aspergillosis is nothing to mess around with. Even if the fungus turns out to have some beneficial features, remember to wear your respirator the next time you investigate the burial sites of fifteenth-century Polish kings.
The school of art begun in 1924 by painter Pavel Jerdanowitch shares some features with Primitivism, in that there is little effort to make the image realistic. Depictions are brash and bold, with dramatic lines and use of primary colors, but flat, with no particular attention given to perspective and depth. Instead, the focus is on emotion. Here's one example, Jerdanowitch's Aspiration:
Aspiration was selected in early 1926 for reproduction in the prestigious journal Chicago Art World, and Lena McCauley, of the Chicago Evening Post, said it was a "delightful jumble of Gauguin, Pop Hart and Negro minstrelsy with a lot of Jerdanowitch individuality."
And perhaps the most famous -- although I was unable to find a color image of it -- Exaltation:
Exaltation won a spot at the Exhibition of the Independents at the New York City Waldorf-Astoria, and the French art magazine Revue du Vrai et du Beau contacted Jerdanowitch to ask for permission to reproduce it, along with a request for an interview, more biographical information, and an essay describing his own interpretation of the painting.
Pretty impressive, considering how competitive the world of fine art can be.
Okay, now let's do this again, shall we?
In 1924, a novelist named Paul Jordan-Smith got good and pissed off because his wife, the talented artist Sarah Bixby Smith, kept getting bad reviews and rejections from shows and museums. Jordan-Smith had never painted before, but grabbed a canvas and some paints and brushes, and slapped together a painting that looked like it had been done either by a four-year-old or a very talented chimp. He signed it "Pavel Jerdanowitch" -- a Russianized version of Paul Jordan -- and took a brooding photograph of himself to accompany the submission:
Jordan-Smith as himself (left), and as Pavel Jerdanowitch (right)
He said that his new school of art was called "Disumbrationism" -- which means, more or less, "removing the shadows" -- and submitted it to a show.
To his amazement and amusement, it got in, winning high praise, and he found himself with multiple requests for more. He was happy to oblige. Other works included a piece called Illumination, which is a bunch of eyes and lightning bolts (this one was accompanied by the text, "It is midnight and the drunken man stumbles home, anticipating a storm from his indignant wife; he sees her eyes and the lightning of her wrath; it is conscience at work") and a piece called Adoration that depicts, I kid you not, a woman bowing before an idol shaped like an enormous erect penis.
All of Jordan-Smith's works were slapdash (to put it mildly); none took longer than an hour to create. He kept thinking that at some point the critics would wise up and realize they were being taken for a ride.
It never happened. He kept getting rave reviews and demands for more. Eventually, he tired of the hoax, and in August of 1927, made a full confession, which appeared on the front page of the Los Angeles Times.
But even after that, Jerdanowitch refused to die. Some of the critics -- perhaps out of an embarrassed attempt to save face -- maintained that Jordan-Smith's paintings did have artistic merit, even if the painter himself had set out to ridicule art snobbery in general. In 1931, Boston's Robert C. Vose Gallery staged an exhibition of Jordan-Smith's work, including a new work called Gination:
About this one, Jerdanowitch/Jordan-Smith wrote:
It depicts the appalling effects of alcohol on Hollywood women of the studios. It is a moral picture. Note the look of corruption on the lady's skin. Everything is unbalanced. While good gin might not have just that effect, boulevard gin brings it about in short time. The picture is painted in bold strokes and with a sure hand. I believe it is the most powerful of my works.
While I think the whole Disumbrationism hoax is fall-out-of-your-chair funny, I also think it points out something more important: art snobs really do need to get off their damn high horses. What someone thinks is good art (or music or writing or any other creation) is a deeply personal thing. It's not that I have any issue with a specific critic saying "Here's what I like/dislike, and here's why"; what I object to is that they append -- sometimes implicitly, sometimes explicitly -- "... and if you disagree with me, you're wrong."
I have zero tolerance for taste-makers and others cut from that same cloth (such as genre snobs, people who say shit like, "Romance books are virtually all poorly-written trash" or "Science fiction is for geeky teenagers and adults who never progressed beyond that stage" -- both statements which, I hasten to point out, I actually saw in print). Who the hell set you up to be the arbiter of worth? As a novelist, I've had to steel myself to accept the fact that not everyone will like what I write, such as the person who said about my novel Sephirot, "This is a sophomoric attempt to blend fantastical fiction with poorly-understood philosophy."
Yeah, that stung at first, but -- okay. You didn't like it. I probably don't like some of the books you adore.
It's why these things are called "opinions."
So I think of Disumbrationism as bursting the bubble not of artists, but of people who appoint themselves as the Gatekeepers of Taste.
Although it is a little ironic that Paul Jordan-Smith lived until 1971, and is more famous for his hoax paintings than he is for any of his novels.
Life's tough for creative types, and the critics and snobs make it tougher -- often without contributing anything of value themselves. To me, the important thing is that we continue to express ourselves through our art, music, and writing. We should work to improve our skills, of course; our ability to convey what we intended will improve if we have better facility with the medium we're working in.
But keep in mind what the brilliant French Impressionist Edgar Degas said: "Art critic! Is that a profession? When I think we are stupid enough, we painters, to solicit those people's compliments and to put ourselves into their hands, I think, what a shame! Should we even accept that they talk about our work?"
His contemporary, Paul Cézanne, put it more succinctly: "Don't be an art critic. Paint. There lies salvation."
I learned a new term yesterday: parasocial relationship.
It means "a strong, one-sided social bond with a fictional character or celebrity." I've never much gotten the "celebrity" side of this; I don't, for example, give a flying rat's ass who is and is not keeping up with the Kardashians. But fictional characters?
Oh, yeah. No question. I have wondered if my own career as a novelist was spurred by the parasocial relationships (now that I know the term, dammit, I'm gonna use it) I formed with fictional characters very early on. In my first two decades, I was deeply invested in what happened to:
The intrepid Robinson family in Lost in Space. This might have been in part because I had a life-threatening crush on Judy Robinson, played by Marta Kristen, who is drop-dead gorgeous even though in retrospect the character she played didn't have much... character.
The crew of the U.S.S. Enterprise. Some of the old Star Trek episodes are almost as cringeworthy as Lost in Space, but when I was ten and I heard Scotty say, "The warp core is gonna blow! I canna stop it, Captain! Ye canna change the laws of physics!", I believed him.
Carl Kolchak from the TV series The Night Stalker. Okay, so apparently I gravitated toward cringeworthy series.
Luke Skywalker and his buddies. I'll admit it, I cried when Obi-Wan died, even though you find out immediately afterward that he's still around in spirit form, if Becoming One With The Force can be considered an afterlife.
Books hooked me as well, sometimes even more powerfully than television and movies. A Wrinkle in Time, The Chronicles of Narnia, Lord of the Rings, The Lathe of Heaven, Something Wicked This Way Comes, The Chronicles of Prydain... I could go on and on. Most of which caused the shedding of considerable numbers of tears over the fate of some character or another.
More recently, my obsession is Doctor Who, which will come as no shock to regular readers of Skeptophilia because I seem to find a way to work some Who reference into every other post. Not only do I spend an inordinate time discussing Doctor Who trivia with other fans, I have found a way to combine this with another hobby:
I made (L-to-R) a ceramic Weeping Angel, Dugga Doo, Dalek, Beep the Meep, and K-9, which sit on my desk watching me as I work. I'm careful not to blink.
The reason this comes up is a paper in The Journal of Social and Personal Relationships that looked at these parasocial relationships -- specifically, whether the COVID-19 pandemic and the uncertain years following had weakened our relationships with actual people, perhaps with a commensurate strengthening of our one-sided relationships with fictional characters.
The heartening result is that there hasn't been a weakening of our bonds to our friends, but our bonds to the fictional characters we love have strengthened. So, real friends of mine, you don't need to worry that my incessant fanboying over the Doctor is going to impact our relationship negatively, unless you get so completely fed up with my obsession that you decide to hang around with someone who wants to discuss something more grounded in reality, like fantasy football teams.
"The development, maintenance, and dissolution of socio-emotional bonds that media audiences form with televised celebrities and fictional characters has long been a scholarly interest of mine," said study author Bradley J. Bond, of the University of San Diego, in an interview with PsyPost. "The social function of our parasocial relationships with media figures has been debated in the literature: do our parasocial relationships supplement our real-life friendships? Can they compensate for deficiencies in our social relationships?... Social distancing protocols and quarantine behaviors that spawned from the global COVID-19 pandemic provided an incredibly novel opportunity to study how our parasocial relationships with media figures function as social alternatives when the natural environment required individuals to physically distance themselves from their real-life friends... [The research suggests that] our friendships are durable, and we will utilize media technologies to maintain our friendships when our opportunities for in-person social engagement are significantly limited. However, our favorite celebrities and fictional characters may become even more important components of our social worlds when we experience severe alterations to our friendships."
Which I find cheering. The events of the last few years have forced us all into coping mode, and it's nice to know that the tendency of many of us to retreat into books, television, and movies isn't jeopardizing our relationships with real people.
So I guess I'm free to throw myself emotionally into fictional relationships. However much they cost me in anguish. For example, I will never forgive Russell T. Davies for what he did to the brilliant and fearless Captain Adelaide Brooke in the last minutes of the episode "The Waters of Mars:"
Dammit, Russell. She (and her entire crew) deserved better.
Be that as it may, it's nice to know I'm not alone in my fanboy tendencies, and that by and large, such obsessions are harmless. Now, y'all'll have to excuse me, because I need to go work on my ceramic replica of the TARDIS. Maybe I can install a little speaker inside it so when I press the button, it'll make the whoosh-whoosh-whoosh noise. How cool would that be?
A recent study found that regardless of how thoroughly AI-powered chatbots are trained on real, sensible text, they still have a hard time recognizing passages that are nonsense.
Given pairs of sentences, one of which makes semantic sense and the other of which clearly doesn't -- in the latter category, "Someone versed in circumference of high school I rambled" was one example -- a significant fraction of large language models struggled with telling the difference.
In case you needed another reason to be suspicious of what AI chatbots say to you.
As a linguist, though, I can confirm how hard it is to detect and analyze semantic or syntactic weirdness. Noam Chomsky's famous example "Colorless green ideas sleep furiously" is syntactically well-formed, but has multiple problems with semantics -- something can't be both colorless and green, ideas don't sleep, you can't "sleep furiously," and so on. How about the sentence, "My brother opened the window the maid the janitor Uncle Bill had hired had married had closed"? This one is both syntactically well-formed and semantically meaningful, but there's definitely something... off about it.
The problem here is called "center embedding," which is when there are nested clauses, and the result is not so much wrong as it is confusing and difficult to parse. It's the kind of thing I look for when I'm editing someone's manuscript -- one of those, "Well, I knew what I meant at the time" kind of moments. (That this one actually does make sense can be demonstrated by breaking it up into two sentences -- "My brother opened the window the maid had closed. She was the one who had married the janitor Uncle Bill had hired.")
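If it helps, here's my own bracketing of the nested relative clauses (just an illustration of the structure, not anything from the study):

My brother opened [the window [that the maid [whom the janitor [whom Uncle Bill had hired] had married] had closed]].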
Then there are "garden-path sentences" -- named for the expression "to lead (someone) down the garden path," to trick them or mislead them -- when you think you know where the sentence is going, then it takes a hard left turn, often based on a semantic ambiguity in one or more words. Usually the shift leaves you with something that does make sense, but only if you re-evaluate where you thought the sentence was headed to start with. There's the famous example, "Time flies like an arrow; fruit flies like a banana." But I like even better "The old man the boat," because it only has five words, and still makes you pull up sharp.
The water gets even deeper than that, though. Consider the strange sentence, "More people have been to Berlin than I have."
This sort of thing is called a comparative illusion, but I like the nickname "Escher sentences" better because it captures the sense of the problem. You've seen the famous work by M. C. Escher, "Ascending and Descending," yes?
The issue with both Escher's staircase and the statement about Berlin is that if you look at smaller pieces of them, everything looks fine; the problem only comes about when you put the whole thing together. And like Escher's trudging monks, it's hard to pinpoint exactly where the problem occurs.
I remember a student of mine indignantly telling a classmate, "I'm way smarter than you're not." And it's easy to laugh, but even the ordinarily brilliant and articulate Dan Rather slipped into this trap when he tweeted in 2019, "I think there are more candidates on stage who speak Spanish more fluently than our president speaks English."
It seems to make sense, and then suddenly you go, "... wait, what?"
An additional problem is that words frequently have multiple meanings and nuances -- which is the basis of wordplay, but would be really difficult to program into a large language model. Take, for example, the anecdote about the redoubtable Dorothy Parker, who was cornered at a party by an insufferable bore. "To sum up," the man said archly at the end of a long diatribe, "I simply can't bear fools." Parker's reply -- "That's strange; your mother could," in most retellings -- turns entirely on the two meanings of "bear."
A great many of Parker's best quips rely on a combination of semantic ambiguity and idiom. Her review of a stage actress that "she runs the gamut of emotions from A to B" is one example, but to me, the best is her stinging jab at a writer -- "His work is both good and original. But the parts that are good are not original, and the parts that are original are not good."
Then there's the riposte from John Wilkes, a famously witty British Member of Parliament in the last half of the eighteenth century. Another MP, John Montagu, 4th Earl of Sandwich, was infuriated by something Wilkes had said, and sputtered out, "I predict you will die either on the gallows or else of some loathsome disease!" And Wilkes calmly responded, "Which it will be, my dear sir, depends entirely on whether I embrace your principles or your mistress."
All of this adds up to the fact that languages contain labyrinths of meaning and structure, and we have a long way to go before AI will master them. (Given my opinion about the current use of AI -- which I've made abundantly clear in previous posts -- I'm inclined to think this is a good thing.) It's hard enough for human native speakers to use and understand language well; capturing that capacity in software is, I think, going to be a long time coming.
It'll be interesting to see at what point a large language model can parse correctly something like "Buffalo buffalo Buffalo buffalo buffalo buffalo Buffalo buffalo." Which is both syntactically well-formed and semantically meaningful.
Have fun piecing together what exactly it does mean.
I was pondering the question of what the hell is wrong with so many of the people in positions of power on our planet, and I've come to the conclusion that part of it is that they've lost the capacity to feel awestruck.
When we're awestruck, in a way, our entire world gets turned on its head. The day-to-day concerns that take up most of our mental and emotional space -- jobs, relationships, paying the bills, keeping up with household chores, the inevitable aches and pains -- suddenly are drowned by a sense that in the grand scheme of things, we are extremely small. It's not (or shouldn't be) a painful experience. It's more that we are suddenly aware that our little cares are just that: little. We live in a grand, beautiful, mysterious, dazzling universe, and at the moments when we are privileged to perceive that, our senses are swept away.
Philosophers have come up with a name for such experiences: numinous. It doesn't imply a connection to a higher power (although it manifests that way, or is interpreted that way, for some people). The German theologian Rudolf Otto described such a state as "a non-rational, non-sensory experience or feeling whose primary and immediate object is outside the self... This mental state presents itself as wholly other, a condition absolutely sui generis and incomparable, whereby the human being finds himself utterly abashed."
What would happen if you couldn't -- or were afraid to -- experience awe? This would trap you in the petty quotidian trivia of life, and very likely magnify their importance in your mind, giving them far more gravitas than they deserve. I suspect it could also magnify your own self-importance.
It'd be interesting to see if there's an inverse correlation between narcissism and our capacity to feel awestruck. After all, how could you simultaneously perceive the glory and grandeur of the universe, and remain convinced that your needs are the most important thing within it? And if you combine narcissism with amorality, you produce an individual who will never admit fault, never look beyond their own desires, and stop at nothing to fulfill them.
We could probably all name a few prominent people this describes.
I think the two things that have the greatest ability to make me feel awe are music and astronomy. Music has had the ability to pick me up by the emotions and swing me around since I was very small; my mom used to tell the story of my being about four and begging her to let me use the record player. She finally relented (one of the few times she ever did) and showed me how, and -- to my credit -- I never damaged a single record. They were simply too important to me.
Just a couple of days ago, I was in the car, and Ralph Vaughan Williams's Fantasia on a Theme by Thomas Tallis came on the classical station I was listening to. If I had to name one piece that has that ability to lift me out of myself, that's the one I'd pick. The first time I heard it, as a teenager, I ended up with tears streaming down my face, and honestly had been unaware of where I was for the piece's entire fifteen-minute duration.
It's astronomy, though, that brings this topic up today. A paper this week in the journal Astronomy and Astrophysics describes a new study of the Silver Coin Galaxy in the constellation Sculptor, a beautiful spiral galaxy about 11.4 million light years away. The study, which required fifty hours of observing time at the European Southern Observatory in Chile, produced an image with unprecedented detail:
The Silver Coin is what's called a "starburst galaxy" -- one undergoing an exceptionally high rate of star formation -- so it's of great interest to astronomers and astrophysicists as we learn more about how galaxies, stars, and planetary systems form and evolve. "Galaxies are incredibly complex systems that we are still struggling to understand," said Enrico Congiu, who led the study. "The Sculptor Galaxy is in a sweet spot. It is close enough that we can resolve its internal structure and study its building blocks with incredible detail, but at the same time, big enough that we can still see it as a whole system."
In that one rectangular photograph is captured the light from billions of stars. From what we know of stars in our own galaxy, it's likely that the majority of those points of light have their own planetary systems. It's not certain -- but many astronomers think it's very likely -- that a good many of those planets host life. Some of that life might be intelligent, and looking back at us through their own telescopes, wondering about us as we do about them.
How could anyone look at this image, think those thoughts, and not be awestruck?
To me, that was part of what I wanted as a science teacher. I honestly couldn't have cared less if my students got to the end of the year and couldn't tell me what the endoplasmic reticulum did. (If they need to know that at some point in their lives, they can look it up.) What I do care deeply about is that they know how to think critically, can distinguish truth from fiction, and have enough basic understanding of biology to be able to make good decisions about their health and the environment. And in addition, I tried to instill in them a sense of wonder at how cool science is.
That I did at least sometimes succeed is supported by a funny incident from not long before I retired. I was having one of our required twice-yearly administrator observations, and the principal was watching me teach a lesson to my AP Biology class. I recall that it was something about genetics -- always a favorite subject -- but I can't remember what exactly the topic was that day. But something I said made one kid's eyes pop open wide, and he said, "Wow, that is so fucking cool."
Then he had the sudden aghast realization that the principal was sitting in the back of the room.
The kid turned around, red-faced, and said, "Oh, my god, Mr. Koeng, I'm sorry."
The principal grinned and said, "No, that's okay. You're right. It is really fucking cool."
I was lucky to work, by and large, for great administrators during my 32-year career, and I often discussed with them my goal as a science teacher of instilling wonder. But I think we all need to land in that space more often. The ability to look around us and say, "Wow. Isn't this amazing?" is incredibly important, and also terribly easy to lose. The morass of daily concerns we're faced with can add up in our minds to something big enough to block out the stars.
And isn't that sad?
So I'll end with an exhortation: find some time this week to look and listen and experience what's around you. Get down and examine the petals of a flower. Go out on a dark, clear night and look up at the stars. Listen to a piece of music -- just listen, don't engage in the "listening while" that most of us do every day. Create the space in your life to experience a little awe.
But don't be surprised if you come out of the experience changed. Being awestruck will do that.