Skeptophilia (skep-to-fil-i-a) (n.) - the love of logical thought, skepticism, and thinking critically. Being an exploration of the applications of skeptical thinking to the world at large, with periodic excursions into linguistics, music, politics, cryptozoology, and why people keep seeing the face of Jesus on grilled cheese sandwiches.
One of the most persuasive pieces of evidence of the common ancestry of all life on Earth is genetic overlap -- and the fact that the percent overlap gets higher when you compare more recently-diverged species.
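At its simplest, the "percent overlap" being compared is percent identity between aligned sequences. Here's a minimal sketch with made-up toy fragments (real comparisons use alignment algorithms like BLAST; these sequences are purely illustrative):

```python
def percent_identity(seq_a: str, seq_b: str) -> float:
    """Naive percent identity of two pre-aligned, equal-length sequences."""
    if len(seq_a) != len(seq_b):
        raise ValueError("sequences must be pre-aligned to equal length")
    matches = sum(a == b for a, b in zip(seq_a, seq_b))
    return 100.0 * matches / len(seq_a)

# Toy (invented) gene fragments: the closer the relationship,
# the higher the identity we expect to find.
human     = "ATGGCCTTAGAC"
chimp     = "ATGGCCTTAGAT"   # 1 difference out of 12 -> ~91.7%
bacterium = "ATGCATTTGGAC"   # 4 differences out of 12 -> ~66.7%

print(percent_identity(human, chimp))
print(percent_identity(human, bacterium))
```

The same arithmetic, scaled up to whole genomes and done with proper alignments, is what produces the familiar figures like the ~98-99% human/chimp overlap.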
What is downright astonishing, though, is that there is genetic overlap between all life on Earth. Yeah, okay, it's easy enough to imagine there being genetic similarity between humans and gorillas, or dogs and foxes, or peaches and plums; but what about more distant relationships? Are there shared genes between humans... and bacteria?
The answer, amazingly, is yes, and the analysis of these universal paralogs was the subject of a fascinating paper in the journal Cell Genomics last week. Pick any two organisms on Earth -- choose them to be as distantly-related to each other as you can, if you like -- and they will still share five groups of genes, used for making the following classes of enzymes:
aminotransferases
imidazole-4-carboxamide isomerase
carbamoyl phosphate synthetases
aminoacyl-tRNA synthetases
initiation factor IF2
The first three are connected with amino acid metabolism; the last two, with the process of translation -- which decodes the message in mRNA and uses it to synthesize proteins.
The fact that all life forms on Earth have these five gene groups suggests something wild: we're looking at genes that were present in LUCA -- the Last Universal Common Ancestor, our single-celled, bacteria-like forebear that lived in the primordial seas an estimated four billion years ago. Since then, two things have happened: the rest of LUCA's genome diverged wildly under the effects of mutation and selection, so that now we have kitties and kangaroos and kidney beans; and those five gene groups were under such extreme stabilizing selection that they haven't changed significantly, in any branch of the tree of life, in millions or billions of generations.
The authors write:
Universal paralog families are an important tool for understanding early evolution from a phylogenetic perspective, offering a unique and valuable form of evidence about molecular evolution prior to the LUCA. The phylogenetic study of ancient life is constrained by several fundamental limitations. Both gene loss across multiple lineages and low levels of conservation in some gene families can obscure the ancient origin of those gene families. Furthermore, in the absence of an extensive diagnostic fossil record, the dependence of molecular phylogenetics on conserved gene sequences means that periods of evolution that predated the emergence of the genetic system cannot be studied. Even so, emerging technologies across a number of areas of computational biology and synthetic biology will expand our ability to reconstruct pre-LUCA evolution using these protein families. As our understanding of the LUCA solidifies, universal paralog protein families will provide an indispensable tool for pushing our understanding of early evolutionary history even further back in time, thereby describing the foundational processes that shaped life as we know it today.
It's kind of mind-boggling that after all that time, there's any commonality left, much less as much as there's turned out to be. "The history of these universal paralogs is the only information we will ever have about these earliest cellular lineages, and so we need to carefully extract as much knowledge as we can from them," said Greg Fournier of MIT, who co-authored the paper, in an interview with Science Daily.
So all life on Earth really is connected, and the biological principle of "unity in diversity" is literally true. Good thing for us; the fact that we have shared metabolic pathways -- and especially, shared genetic transcription and translation mechanisms -- is what allows us to create transgenic organisms, which express a gene from a different species. For example, this technique is the source of most of the insulin used by the world's diabetics -- bacteria that have been engineered to contain a human insulin gene. Bacteria read DNA exactly the same way we do, so they transcribe and translate the human insulin gene just as our own cells would, producing insulin molecules identical to our own.
This is also, conversely, why the idea of an alien/human hybrid would never work. Even assuming that some alien species we met was humanoid, and had all the right protrusions and indentations to allow mating to work, there is essentially zero likelihood that the genetics of two species without a common ancestor would line up well enough to allow hybridization. Consider that most of the time, even relatively closely related terrestrial species can't hybridize and produce fertile offspring; there's no way humans could do so with any presumed alien species.
Star Trek's claims to the contrary notwithstanding.
So that's our mind-blowing science news of the day. The discovery of five gene families that were present in our ancestors four billion years ago, and which are still present today in every life form on Earth. Some people apparently think it's demeaning to consider that we're related to "lower" species; me, I think it's amazingly cool to consider that everything is connected, that I'm just one part of a great continuum that has been around since not long after the early Earth cooled enough to have liquid water. All the more reason to take care of the biosphere -- considering it's made up of our cousins.
If I had to pick the scientific law that is the most misunderstood by the general public, it would have to be the Second Law of Thermodynamics.
The First Law of Thermodynamics says that the total quantity of mass and energy in an isolated system never changes; it's sometimes stated as, "Mass and energy cannot be created or destroyed, only transformed." The Second Law states that in an isolated system, the total disorder (entropy) never decreases. As my long-ago thermodynamics professor put it, "The First Law says you can't win; the Second Law says you can't break even."
Hell of a way to run a casino, that.
So far, there doesn't seem to be anything particularly non-intuitive about this. Even from our day-to-day experience, we can surmise that the amount of stuff seems to remain pretty constant, and that if you leave something without maintenance, it tends to break down sooner or later. But the interesting (and less obvious) side starts to appear when you ask the question, "If the Second Law says that systems tend toward disorder, how can a system become more orderly? I can fling a deck of cards and make them more disordered, but if I want I can pick them up and re-order them. Doesn't that break the Second Law?"
It doesn't, of course, but the reason why is quite subtle, and has some pretty devastating implications. The solution to the question comes from asking how you accomplish re-ordering a deck of cards. Well, you use your sensory organs and brain to figure out the correct order, and the muscles in your arms and hands (and legs, depending upon how far you flung them in the first place) to put them back in the correct order. How did you do all that? By using energy from your food to power the organs in your body. And to get the energy out of those food molecules -- especially glucose, our primary fuel -- you broke them to bits and jettisoned the pieces after you were done with them. (When you break down glucose to extract the energy, a process called cellular respiration, the bits left are carbon dioxide and water. So the carbon dioxide you exhale is actually broken-down sugar.)
Here's the kicker. If you were to measure the entropy decrease in the deck of cards, it would be less -- way less -- than the entropy increase in the molecules you chopped up to get the energy to put the cards back in order. Every time you increase the orderliness of a system, it always (1) requires an input of energy, and (2) increases the disorderliness somewhere else. We are, in fact, little chaos machines, leaving behind a trail of entropy everywhere we go, and the more we try to fix things, the worse the situation gets.
I've heard people arguing that the Second Law disproves evolution because the evolutionary model claims we're in a system that has become more complex over time, which according to the Second Law is impossible. It's not; and in fact, that statement betrays a fundamental lack of understanding of what the Second Law means. The only reason why any increase in order occurs -- be it evolution, or embryonic development, or stacking a deck of cards -- is because there's a constant input of energy, and the decrease in entropy is offset by a bigger increase somewhere else. The Earth's ecosystems have become more complex in the 4.5 billion year history of life because there's been a continuous influx of energy from the Sun. If that influx were to stop, things would break down.
Fast.
The reason all this comes up is because of a paper in Physical Review X that gives another example of trying to make things better, and making them worse in the process. This one has to do with the accuracy of clocks -- a huge deal to scientists who are studying the rate of reactions, where the time needs to be measured to phenomenal precision, on the scale of nanoseconds or better. The problem is, we learn from "Measuring the Thermodynamic Cost of Timekeeping," the more accurate the clock is, the higher the entropy produced by its workings. So, in effect, you can only measure time in a system to the extent you're willing to screw the system up.
All clocks, in some form or another, use the evolution of nature towards higher entropy states to quantify the passage of time. Due to the statistical nature of the second law and corresponding entropy flows, fluctuations fundamentally limit the performance of any clock. This suggests a deep relation between the increase in entropy and the quality of clock ticks... We show theoretically that the maximum possible accuracy for this classical clock is proportional to the entropy created per tick, similar to the known limit for a weakly coupled quantum clock but with a different proportionality constant. We measure both the accuracy and the entropy. Once non-thermal noise is accounted for, we find that there is a linear relation between accuracy and entropy and that the clock operates within an order of magnitude of the theoretical bound.
Study co-author Natalia Ares, of the University of Oxford, summarized their findings succinctly in an article in Science News: "If you want a better clock," she said, "you have to pay for it."
So, a little like the Heisenberg Uncertainty Principle: the more you try to push things in a positive direction, the more the universe pushes back in the negative direction.
Apparently, even if all you want to know is what time it is, you still can't break even.
So that's our somewhat depressing science for the day. Entropy always wins, no matter what you do. Maybe I can use this as an excuse for not doing housework. Hey, if I make things more orderly here, all it does is mess things up elsewhere, so what's the point?
Some of you may have heard of the Sylacauga meteorite -- a 5.5 kilogram, grapefruit-sized piece of rock that gained more notoriety than most because it crashed through a woman's roof on the afternoon of November 30, 1954, and hit her on the hip as she slept on the sofa.
The victim, Ann Hodges of Sylacauga, Alabama, was bruised but otherwise okay.
Here's Hodges with her rock, and an expression that clearly communicates, "A woman can't even take a damn nap around here without this kind of shit happening."
Hodges isn't the only one who's been way too close to falling space rocks. In August of 1992 a boy in Mbale, Uganda was hit by a small meteorite -- fortunately, it had been slowed by passing through the tree canopy, and he was startled but unharmed. Only two months later, a much larger (twelve kilogram) meteorite landed in Peekskill, New York, and clobbered a parked Chevy Malibu:
The deadliest meteorite fall in historical times, though, was likely an airburst and subsequent shower of stones that occurred near Qingyang, in central China, in the spring of 1490. I say "likely" because no meteorites from the incident have survived to be analyzed, but a meteoric airburst -- a "bolide" -- is the explanation that best fits the facts.
Stones fell like rain in the Qingyang district. The larger ones were four to five catties [a catty is a traditional Chinese unit of mass, equal to about a half a kilogram], and the smaller ones were two to three catties. Numerous stones rained in Qingyang. Their sizes were all different. The larger ones were like goose's eggs and the smaller ones were like water-chestnuts. More than ten thousand people were struck dead. All of the people in the city fled to other places.
The magnitude of the event brings up comparisons to the colossal Tunguska airburst of 1908, when a meteorite an estimated fifty meters in diameter exploded above a (fortunately) thinly-populated region of Siberia, creating a shock wave that blew down trees radially outward for miles around, and registered on seismographs in London.
Interestingly, the Qingyang airburst wasn't the only strange astronomical event in 1490; Chinese, Korean, and Japanese astronomers also recorded the appearance of a new comet in December of that year. From their detailed records of its position, modern astronomers have calculated that its orbit is parabolic -- in other words, it won't be back, and is currently on its way out of the Solar System. However, it left a debris trail along the path of its one pass near us which is thought to be the origin of the bright Quadrantid meteor shower, which peaks in early January.
It's likely, however, that the Qingyang airburst and the December comet were unrelated events.
Much has been made of the likelihood of Earth being struck by an asteroid, especially something like the Chicxulub Impactor, which 66 million years ago ended the hegemony of the dinosaurs. Thing is, most of the bigger items in the Solar System's rock collection have been identified, tracked, and pose no imminent threat. (There is, however, a four percent chance that a seventy-meter-wide asteroid will hit the Moon in 2032, triggering a shower of debris, some of which could land on Earth.)
But there are lots of smaller rocks out there that we'd never see coming. The 2013 Chelyabinsk airburst was estimated to be from an eighteen-meter-wide meteor, and created a shock wave that blew in windows, and a fireball that was visible a hundred kilometers away. Our observational ability has improved dramatically, but eighteen meters is still below the threshold of what we could detect before it's too late.
The Double Asteroid Redirection Test (DART) Mission of 2022 showed that if we had enough time, we could theoretically run a spacecraft into an asteroid and change its orbit enough to deflect it, but for smaller meteors, we'd never spot them soon enough.
The good part of all this is that your chance of being hurt or killed by a meteorite is still far lower than for a lot of things we take for granted and do just about every day, like getting into a car. The very rarity, though, is why people tend to over-hype the risk; we do that with stuff that's weird, things that would make the headlines of your local newspaper. (I remember seeing a talk about risk that showed photographs of an erupting volcano, a terrorist bombing, an airplane crash, and a home in-ground swimming pool, and the question was, "Which of these is not like the others?" The answer, of course, was the swimming pool -- because statistically, it's far more likely to kill someone than any of the others.)
So it's nothing to lose sleep over. Unless you're Ann Hodges of Sylacauga, Alabama, who was just trying to take a damn nap for fuck's sake when this stupid rock came crashing through the roof and hit her, if you can believe it.
The connection between a spoken language and its written form, known as its orthography, is seldom straightforward.
Much has been made of the non-intuitive symbol-to-sound correspondence in English -- you've probably seen the old quip that "learning English spelling is rough, but it can be taught through tough, thorough thought, though." There are two main reasons for the often weird pronunciation rules (and their multiple exceptions) in English. First, there's a general rule of thumb that the older a language's writing system is, the more time it's had to diverge from pronunciation. (Put a different way, pronunciation tends to shift faster than written language does.) Second, English is an amalgam of Germanic/Old English and Romance/Norman French, each of which had its own (different) pronunciation rules, with loanwords added in from just about every culture English speakers have contacted.
Honestly, though, for a strange writing-to-pronunciation correspondence, I don't think any language in the world can beat Irish and Scottish Gaelic. In what sensible system would the feminine name Caoimhe be pronounced "kwee-va?"
Now, don't get me wrong, I think Irish and Scottish Gaelic are both gorgeous languages. I just look at the written forms and think, "I can't even make a guess at how that's pronounced."
Of course, there's no problem that arose naturally and organically that humans can't make worse out of sheer cussedness. Deliberate misspellings in (for example) business names make me wonder how any child grows up knowing how to spell correctly. Near my village there used to be a children's dance studio -- now long out of business -- called, I shit you not, "The Shug'r-n-Spyce Skool of Dance."
And I'm with Dave Barry, who said that any business tacking an extra "e" onto the end of words to make them look old and quaint should be taxed at a higher rate. ("Ye Olde Curiositie Shoppe.")
We add another layer of weirdness when there are ill-advised attempts to meld English with non-English alphabets. There's a whole thing called "faux Cyrillic," where Cyrillic letters are thrown in to give something a pseudo-Russian flavor. Just look at the header on the game Tetris -- it's almost always written "TETЯIS." The problem is, "Я" isn't pronounced /r/, it's pronounced /ya/, so the game spelled this way would be "Tetyais."
Then, there's this sign in front of a Greek restaurant that I saw while visiting family in Santa Fe, New Mexico:
"Ooh, my favorite! Grssk Rthtphsssrphs!"
Throw into the mix the recent development of "Textspeak" -- lmfao and brb and ttyl are so commonly used that they don't even flag as misspellings -- and you have the makings of a confused mess. These sorts of conventions aren't only created to speed up communication, however; they can also be used to hide -- like "Leet," an online spelling convention originating in the late 1980s to allow hackers to communicate with each other on message boards without alerting the moderators by using forbidden keywords. (An example of Leet is that "elite hacker" is written "31337 H4XØR" -- the first word using 3, 1, and 7 for the letters E, L, and T, respectively, so the first word is "eleet.")
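The decoding works as a simple character substitution, which makes it trivial to reverse. A minimal sketch (the substitution table is an illustrative subset of common Leet glyphs, not a complete standard):

```python
# A few common Leet substitutions (illustrative subset, not exhaustive)
LEET = str.maketrans({"3": "e", "1": "l", "7": "t", "4": "a",
                      "0": "o", "Ø": "o", "H": "h", "X": "x", "R": "r"})

def deleet(s: str) -> str:
    """Map common Leet glyphs back to plain letters."""
    return s.translate(LEET)

print(deleet("31337 H4XØR"))   # -> "eleet haxor"
```

Run it on the example from the post and "31337 H4XØR" comes out as "eleet haxor" -- exactly the "elite hacker" reading described above.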
It's always a struggle to stay one step ahead of bad actors, and there are scammers who have used this kind of technique to get people to respond to scam emails (or click on their websites), by substituting one similar, but non-English, character in a legitimate-looking website address. "Citibank.com," for example, might turn into "Citibɑnk.com" -- substituting the IPA symbol "ɑ" for the standard "a" -- and unless you were looking closely, you might well not notice the difference, and click your way to a website that is definitely not the real Citibank.
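This particular trick is also easy to catch programmatically: in a supposedly plain-English domain name, any character outside ASCII deserves suspicion. Here's a minimal sketch using Python's standard unicodedata module (the domain strings are illustrative stand-ins, not real phishing samples):

```python
import unicodedata

def suspicious_chars(domain: str) -> list:
    """Return (char, unicode_name) for every non-ASCII character."""
    return [(c, unicodedata.name(c, "UNKNOWN"))
            for c in domain if ord(c) > 127]

real  = "Citibank.com"
spoof = "Citib\u0251nk.com"   # U+0251: LATIN SMALL LETTER ALPHA

print(real == spoof)            # False -- entirely different strings
print(suspicious_chars(real))   # [] -- nothing to flag
print(suspicious_chars(spoof))  # flags the lookalike alpha
```

To your eye the two strings are nearly identical; to the computer they never were, which is both why the scam works on humans and why a simple filter defeats it.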
So what we end up with is a mishmash of problems that arose from a combination of the vagaries of language evolution and deliberate attempts to mess things up further, along with a good measure of pure idiocy:
As Julius Caesar so famously said, "Vspph, vphdph, vphcph."
In the above case, there's also an apparent disregard of what my tattoo artist said to me -- "Be sure it's what you want, because that shit's permanent."
So that's this morning's musings on some weird features of written language. Understand that I'm not one of those types who rails at every alteration from the King's English -- I'm about as far from a prescriptivist as you can get. I can't help but wonder, though, if some of what's happened has actually made it more difficult to be understood.
Of course, if you're a ЯUSSIAИ 31337 H4XØR, that's probably exactly what you wanted.
The Copernican principle is an idea from cosmology that can be summed up as "we're nothing special."
I'm sure you all know that prior to Nicolaus Copernicus's proposal of the heliocentric model, there was a widespread belief that the Earth was at the center of the universe, with everything up to and including the stars traveling in perfect circles around us. Part of this came from observation, given that the Sun and stars and all appear to be circling us; but a large part of the misapprehension was motivated by religion. Not only were there passing mentions in the Bible suggesting geocentrism was correct (such as "Sun, stand thou still over Gibeon" from Joshua chapter 10), but it also seemed that, as the site of the Garden of Eden and the Incarnation and Crucifixion, Earth was of course where God would put the center of the universe.
Then along came Copernicus, followed by Galileo (who, upon discovering four of the moons of Jupiter, demonstrated that at least some celestial bodies don't revolve around us) and Kepler, who -- working from Tycho Brahe's meticulous observations -- showed that astronomical objects don't even demonstrate heavenly perfection by traveling in circles, but move in "imperfect" ellipses.
Since then, we've been pushed farther and farther from the center of things. In 1924 the astronomer Edwin Hubble proved that the Milky Way is not the only galaxy: many of the "nebulae" (from the Latin word for "cloud," since prior to that no telescope was powerful enough to resolve individual stars in them) were "island universes" in their own right, with the nearest -- Andromeda -- at an astonishing 2.5 million light years away.
Hubble also used the strange red shift of light from these distant objects to conjecture that the universe was expanding, the first step toward establishing the Big Bang model of the origin of the universe. Oddly, though, almost everything Hubble looked at was red-shifted; it appeared that the whole universe was rushing away from us, as if we -- once again -- were at the center of things. But a bit of three-dimensional geometry showed that this is exactly what we'd expect if space itself were expanding, carrying objects along with it. No matter where you are, whether here on Earth or on a planet in the Whirlpool Galaxy over thirty million light years away, it looks like everything is moving away from you.
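That bit of geometry is simple enough to verify directly. In uniformly expanding space, every distance gets multiplied by the same scale factor, so from any observer's vantage point every other object appears to recede by the same fraction of its distance -- a Hubble-like law no matter where you stand. A one-dimensional sketch (positions and scale factor are arbitrary, illustrative values):

```python
# Galaxies at arbitrary positions along one dimension (units arbitrary)
positions = [0.0, 1.0, 2.5, 7.0, 30.0]
a = 1.1   # scale factor: every distance in space grows by 10%

expanded = [a * x for x in positions]

# From ANY observer's vantage point, every other galaxy's apparent
# displacement is the same fraction (a - 1) of its distance.
for observer, obs_new in zip(positions, expanded):
    for x, x_new in zip(positions, expanded):
        d_before = x - observer
        if d_before == 0:
            continue   # skip the observer itself
        apparent_shift = (x_new - obs_new) - d_before
        print(round(apparent_shift / d_before, 6))   # 0.1 every time
```

Every ratio comes out the same regardless of which galaxy you pick as "home" -- which is exactly why universal redshift doesn't put us at the center of anything.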
The Whirlpool Galaxy (Messier 51) [Image is in the Public Domain courtesy of NASA and the ESA]
Most of the data we have suggests that the universe is largely homogeneous (any given volume of space is likely to have on average the same amount of matter in it) and isotropic (every direction you aim your telescope looks approximately the same). Not even the region of space we sit in is remarkable in any way.
The Copernican principle is sometimes called the principle of mediocrity; we don't occupy a privileged place in the cosmos. And this same principle has cropped up elsewhere. Genetics and evolution have shown us humans to be part of the Great Continuum of Life, just one branch of the extensive tree that includes all living things. (And our nearest relatives, the great apes, share something like 98-99% of our genetic makeup.) We may be the smartest animals -- although events of the last year have made me question that -- but animals we most certainly are.
And a lot of people really don't like this. I'm not just talking about the creationists, who have a doctrine-based reason for disbelieving all of the above; but there's a certain brand of woo-woo that rebels against the Copernican principle just as hard, only in a different way. And even if they come to different conclusions than the biblical literalists, I find myself wondering if they're not, at their cores, motivated by the same drive.
Case in point: the alien contactee Collier, who's back at it, even bigger and better. Now he's telling us that the human species was created in a lab by superpowerful aliens from the Andromeda Galaxy, who pulled together and melded the DNA of twenty-two diverse alien species to produce us. (I guess the fact of our having a near-perfect genetic overlap with other primates here on Earth is just a strange coincidence.) He also has some insights about what to expect now that this astonishing information has been revealed:
[A] “dimensional collapse” [has] already begun, marked by changes in sound and color. [Collier] mentioned that people would soon start hearing about “rods” — streaks of light captured on video. According to him, these were etheric, fourth- and fifth-dimensional craft moving through space, unaware that they were passing right through our dimension. He explained this as a sign of an ongoing implosion between dimensions...
[M]ore ghosts and apparitions would become visible because souls trapped between the third and fourth densities would appear more frequently as Earth’s frequency rose. Many of these souls, unless healed, would eventually transition out of this plane.
He also apparently said that we should "be cautious about anyone claiming to be an angel," which is good advice, but not for the reason he thinks.
What struck me about all this is not that some wingnut has a crazy idea -- after all, that's what wingnuts do -- but that this is really nothing more than a modern iteration of the "We are too special!" mental set that has been plaguing us pretty much forever. A lot of pseudoscience works this way, doesn't it? Astrology posits that the (apparent) arrangements and movements of astronomical bodies somehow shapes the courses of human lives. Numerology suggests that the chance occurrence of patterns of numbers is because the universe is set up to send us information. Even practices like Tarot divination presuppose that your own life's path is important enough to influence magically what comes up from shuffling and dealing a deck of cards.
I mean, I get that life (way) off-center is a little scary and disorienting sometimes. Bill Watterson's brilliant Calvin & Hobbes captured it perfectly:
But I think it's better to relax into the awe of living in a vast, grand, only-partly-comprehensible cosmos than either succumbing to fear of our own insignificance or else resorting to making shit up to try, futilely, to shove us back toward the center of things.
It's enough that we have, against all odds, begun to take our first tentative steps into understanding how everything works. That's all the self-aggrandizement I need as a human. I'll end with the short but mind-blowing quote from Carl Sagan: "The cosmos is within us. We are made of star-stuff. We are a way for the universe to know itself."
In September of 1931, the Irving family -- James and Margaret, and their thirteen-year-old daughter Voirrey -- started hearing strange noises from the walls. At first it was just furtive scratching and rustling, but soon they could discern words. James and Voirrey made some attempt to speak to whatever-it-was, but were alarmed one evening when James said, "What in the name of God can he be?" and heard a high-pitched, thin voice repeat those words back in a singsong fashion.
I was immediately (and unfortunately) reminded of Brown Jenkin, the mocking, squeaky-voiced demonic familiar of the evil Keziah Mason in the H. P. Lovecraft short story "Dreams in the Witch-House." But unlike Brown Jenkin, who would happily bite your toes off as you slept, the creature in the Irving house apparently intended them no harm. Eventually they were able to coax out a small furry animal that was somehow sentient, and (conveniently) spoke English. It introduced itself as Gef (pronounced "jeff"), and said -- I shit you not -- that it was a mongoose who had been born in New Delhi, India in 1852.
How he got from India to the Isle of Man was never clarified, but after all, that's hardly the only weird thing about this story.
Voirrey reported that Gef was "the size of a rat," but had yellow fur and a bushy tail. She also claimed -- and her father backed her up -- that Gef had told them that he was "an extra extra clever mongoose," but also that he was "an earthbound spirit" and "a ghost in the form of a weasel," although it's hard to see how he could be all three simultaneously. He also told Voirrey, "I am a freak. I have hands and I have feet, and if you saw me you'd faint, you'd be petrified, mummified, turned into stone or a pillar of salt!"
Supposedly she saw him many times, and none of those things happened to her, so I'm inclined to take his pronouncements with a grain of salt.
Once folks found out about the Irvings' claims, naturally the questions started coming. It was nothing to worry about, James insisted; Gef had already shown himself to be helpful, doing things like warning them when strangers were on the property, waking family members when they overslept, and even once putting out the fire in the stove when it had inadvertently been left burning after the family retired for the night. For myself, I'd have been less worried about Gef's usefulness than establishing that he actually existed, but apparently most folks in the area just shrugged and said, "Huh. A magical talking mongoose. How about that," and went on about their business.
A few, though, wanted more evidence (fancy that!), and the Irvings were happy to oblige. More than one visitor heard Gef's voice, and some saw signs of him, like a pair of yellow eyes staring out from underneath a bed. But the Irvings seemed unperturbed, and said they were perfectly happy having Gef live with them, rewarding him by leaving out chocolate, bananas, and biscuits for him to eat.
Then, neighbors began to claim they'd actually seen Gef, too. Two teenagers corroborated the yellow fur and bushy tail, and a villager named George Scott made a drawing of him:
What astonishes me about all this is how seriously people took it. A few people called it out as a hoax -- one claimed that it was thirteen-year-old Voirrey's doing, that she was an accomplished ventriloquist and had hoodwinked her parents (and everyone else). Voirrey heatedly denied this, and in fact was still denying it shortly before she died in 2005 at the age of eighty-seven.
But the reports got the attention of the psychic investigators, and that's when the story really exploded. Harry Price got involved -- you may recall his name from my posts about the haunting of Borley Rectory and the odd story of the Brown Lady of Raynham Hall -- and this brought Gef into the public eye. Price is kind of a notorious figure in the history of psychic investigation, because even the True Believer types have to admit that his approach was a little sketchy, with veracity often taking a back seat to publicity. And even Price was suspicious about Gef. The house, he said, was like "one great speaking-tube, with walls like sound boards. By speaking into one of the many apertures in the panels, it should be possible to convey the voice to various parts of the house." Price had also made plaster casts of pawprints supposedly left behind by Gef, and sent them to zoologist Reginald Innes Pocock of the Natural History Museum, and Pocock came back with the rather unsatisfying answer that the prints may have come from a dog, but they definitely hadn't been made by a mongoose, talking or not.
The fact that the Irvings couldn't even get Price on their side was significant. The somewhat more reliable Nandor Fodor, of the Society for Psychical Research, actually stayed in the Irving house for three weeks and saw no evidence of Gef. He speculated that James Irving may have suffered from dissociative personality disorder, and had orchestrated the hoax, using Gef to give voice to a fragment of his psyche.
Despite all this, the Irvings stuck by their story. Gef was real, they said, not a hoax, regardless of what anyone thought.
James Irving died in 1945, and Margaret and Voirrey were forced to sell the house at a loss -- its reputation for being haunted evidently reduced its appeal to potential buyers. The next owner, one Leslie Graham, reported that he'd shot and killed Gef, and displayed the body of a furry animal -- but it was black-and-white, and larger than Gef's reported size.
"That's not Gef," Voirrey said.
Naturally, I'm inclined to think the whole thing was a hoax right from the start -- whether by James or Voirrey is unclear. But what's striking about the case is how many people bought into it. You would think that if somebody in your town said, "Oh, by the way, I have an eighty-year-old talking yellow mongoose living in my walls, but it's all cool because he does chores for us as long as we feed him biscuits," everyone would kind of back away slowly, not making any sudden moves, and do what they could to get the person professional help.
Oddly, that didn't happen. After the first flurry of investigations and news articles died down, life pretty much continued the same as before. There was some increase in tourism from people who wanted to see Gef's house, but even that waned as the years passed. Voirrey took her connection to the Case of the Talking Mongoose in stride, and seemed, on the whole, unembarrassed by it -- and also, never admitted it was a hoax.
So that's our strange tale for the day. Hopefully a mood-lightener after some of the darker explorations of the week. Since finding out about Gef, I've been listening for rustling in the walls of my own house, and... nothing. Just as well. The last time I heard something like that it turned out to be a family of red squirrels in our attic, which took forever to get rid of. I don't know what I'd do if we had to deal with a talking mongoose.
In Shirley Jackson's eerie gothic novel We Have Always Lived in the Castle, the main character -- an eighteen-year-old named Merricat Blackwood -- lives on the outskirts of an unnamed village in New England that contains echoes of H. P. Lovecraft's Arkham and Dunwich.
But if you're familiar with Jackson's better-known short story "The Lottery," you know that she was a past master at flipping the script when you least expect it, and about a third of the way through the book, you begin to suspect there's more to the story than meets the eye -- in particular, that there may be some justification to how the villagers see the Blackwoods. I won't spoil the end, but suffice it to say that the unsettling truth behind the relationship between the Blackwoods and the villagers shows once again that the world is a complex place, and very few of us have either purely good or purely evil motives.
Reading We Have Always Lived in the Castle left me thinking, though, that it's not just damaged individuals like Merricat, Constance, and Uncle Julian who are unreliable narrators of their own lives; we all are. We view our fellow humans through the lenses of our own experience, and reflect outward to them the parts of us we want them to see.
As Anaïs Nin put it, "We don't see the world as it is. We see the world as we are."
It doesn't always work, though. You can probably think of times that you discovered someone you thought you knew was hiding something you never dreamed of, or -- conversely -- that some part of yourself you'd have preferred to keep well-hidden suddenly came to light. But really, we shouldn't be surprised when this happens. Nearly all of us wear masks with others, showing a particular face at work, another with friends, another with strangers we meet in the market, yet another with our significant others.
To be fair, there's a large measure of this that isn't deliberate deception. When I was a teacher, my professional face in the classroom quite rightly took precedence over any turmoil I was experiencing in my private life. We often choose what to show and what to conceal for good reasons. But the problem is, hiding can become a habit, especially for people who (like myself) suffer from mental illness. When the mask slips for those of us with depression and anxiety, and we unexpectedly show others what we're going through, it's much less likely that we "suddenly went into a tailspin" than that we'd been pretending to be well for months or years.
Which explains why even our nearest and dearest will often say in shock, "I never realized."
The whole thing got me thinking about a conversation between two of my own characters -- the breezy, outgoing Seth Augustine and the introverted, deeply damaged telepath Callista Lee in Poison the Well:
Seth’s mind returned to his earlier thoughts, about Bethany and the few other people who had disliked him, instantly and almost instinctively. “It can be painful to find out the truth.”
“Not nearly as painful as finding out that no one actually knows what the truth is,” Callista said.
When Seth didn’t respond, she continued, with more animation than he’d heard in her voice yet. “Everyone’s just this bundle of desires and emotions and random thoughts, resentment and love and fear and sex and anger and compassion bubbling right beneath the surface—all in conflict, all of the time, only most people aren’t aware of it. They think things, and their mind looks at them and says ‘this is true’—and they don’t realize that they almost always decide that something is true because it soothes the unpleasant parts—the resentment and fear and anger. It’s not because it actually is true. People believe things because their belief makes the demons quieter.”
We're all unreliable narrators of our own lives, aren't we? And that includes those of us -- I count myself amongst them -- who try to be as truthful as we can. Our determination to be as clear-eyed as possible, not only about others but about ourselves, only goes so far. We're not all hiding a secret as dire as the Blackwoods', I hope. But it highlights how important it is to leave our little self-absorbed bubbles and check in on our friends, often.
It's a well-worn saw by now, but I still remember being told this by a family friend when I was something like six years old. It left me gobsmacked then, and I've never forgotten it. It seems as good a place as any to end this. "Always be kinder than you think you need to be, because everyone you meet is fighting a terrible battle that you know nothing about."