Skeptophilia (skep-to-fil-i-a) (n.) - the love of logical thought, skepticism, and thinking critically. Being an exploration of the applications of skeptical thinking to the world at large, with periodic excursions into linguistics, music, politics, cryptozoology, and why people keep seeing the face of Jesus on grilled cheese sandwiches.

Thursday, August 5, 2021

Letters from the home world

In choosing topics for this blog, I try not to have it simply devolve into taking random pot shots at crazies.  Loony ideas are a dime a dozen, and given the widespread access to computers that is now available, just about anyone who wants one can have a website.  Given these two facts, it's inevitable that wacky webpages grow like wildflowers on the fields of the internet.

When an alert reader brought this one to my attention, however, I just couldn't help myself.  Entitled "Did Humans Come From Another Planet?", it represents one of the best examples I've ever seen of adding up a bunch of facts and obtaining a wildly wrong answer.  The only ones who, in my experience, do this even better are the people who write for the Institute for Creation Research, and to be fair, they've had a lot longer to practice being completely batshit crazy, so it's only to be expected.

Anyhow, the contention of the "Did Humans Come From Another Planet?" people can be summed up by, "Yes. Duh."  We are clearly aliens, and I'm not just talking about such dubiously human individuals as Mitch McConnell.  All of us, the article claims, descend from an extraterrestrial race.  But how can we prove it?

Well, here's the argument, if I can dignify it with that term.
  1. Human babies are born completely unable to take care of themselves, and remain that way for a long time.  By comparison, other primate babies, despite similar gestation periods, develop much more rapidly.
  2. In a lower gravitational pull, humans could fall down without hurting themselves, "just like a cat or a dog."
  3. Humans have biological clocks, and in the absence of exposure to the external day/night cycle, they come unlocked from "real time" and become free-running.  So, clearly we came from a planet that had a different rotational period.
  4. Humans don't have much body hair.  At least most of us don't, although I do recall once going swimming and seeing a guy who had so much back hair that he could have singlehandedly given rise to 80% of the Bigfoot sightings in the eastern United States.
  5. Geneticists have found that all of humanity descends from a common ancestor approximately 350,000 years ago; but the first modern humans didn't exist until 100,000 years ago.  So... and this is a direct quote that I swear I am not making up: "In what part of the universe was he [Homo sapiens] wandering for the remaining 250 thousand years?"
Now, take all of this, and add:
  1. Some nonsense about Sirius B and the Dogon tribe, including a bizarre contention that the Sun and Sirius once formed a double-star system, because this "doesn't contradict the laws of celestial mechanics;"
  2. The tired old "we only use 3% of our brains" contention;
  3. Adam and Eve; and
  4. the ancient Egyptians.
Mix well, and bake for one hour at 350 degrees.

The result, of course, is a lovely hash contending that we must come from a planet with a mild climate where we could run around naked all the time, not to mention a lower gravitational pull so we could just sort of bounce when we fall down, plenty of natural food to eat, and "no geomagnetic storms."  I'm not sure why the last one is important, but it did remind me of all of the "cosmic storms" that the folks in Lost in Space used to run into.  And they also came across lots of weird, quasi-human aliens, while they were out there wandering around.  So there you are.


In any case, that's today's example of adding 2 + 2 and getting 439.  All of this just goes to show that even if you have access to a lot of factual information, not to mention the internet, you still need to know how to put that factual information together in order to get the right answer.  For that, you need science, not just a bunch of nutty beliefs, assumptions, and guesses.  So, as usual, science FTW.


Which, of course, applies to a good many more situations than just this one, but as I've already given a nod to the Institute for Creation Research, I'll just end here.

**********************************************

Author and biochemist Camilla Pang was diagnosed with autism spectrum disorder at age eight, and spent most of her childhood baffled by the complexities and subtleties of human interactions.  She once asked her mother if there was an instruction manual on being human that she could read to make it easier.

Her mom said no, there was no instruction manual.

So years later, Pang recalled the incident and decided to write one.

The result, Explaining Humans: What Science Can Teach Us About Life, Love, and Relationships, is the best analysis of human behavior from a biological perspective since Desmond Morris's classic The Naked Ape.  If you're like me, you'll read Pang's book with a stunned smile on your face -- as she navigates through common, everyday behaviors we all engage in, but few of us stop to think about.

If you're interested in behavior or biology or simply agree with the Greek maxim "gnothi seauton" ("know yourself"), you need to put this book on your reading list.  It's absolutely outstanding.

[Note:  if you purchase this book using the image/link below, part of the proceeds goes to support Skeptophilia!]


Wednesday, August 4, 2021

Music on the brain

A pair of new studies last week in The Journal of Neuroscience analyzed the connections between two phenomena related to music listening that I know all too well -- our ability to replay music in our imaginations, and our capacity for anticipating what the next notes will be when we hear the first part of a melody.

The first, which I seem to excel at, is a bit of a mixed blessing.  It's my one and only superpower -- I can essentially remember tunes forever.  In my ten years as flutist in a Celtic dance band, I had just about every tune in our repertoire memorized.  I'm lousy at connecting the names to the tunes, though; so when my bandmate would say, "Next, let's play 'Drummond Castle,'" and I responded, sotto voce, "How the hell does 'Drummond Castle' go?" she'd say, "It's the one that goes, 'deedly-dum, da-deedly-dum, dum-da-deedly-deedly-deedly,'" then I'd say, "Oh, of course," and proceed to play it -- in the correct key.


[Image licensed under the Creative Commons © Nevit Dilmen, Music 01754, CC BY-SA 3.0]

The most striking example of this was a tune that I remembered literally for decades without hearing it once during that time.  When I was about 25 I took a Balkan dance class, and there was one tune I especially liked.  I intended to ask the instructor what the name of it was, but forgot (indicating that my memory in other respects isn't so great).  In those pre-internet days, searching for it was damn near impossible, so I forgot about it... sort of.  Twenty years went by, and my wife and I went to a nine-day music camp in the California redwoods, and I made friends with an awesome accordionist and all-around nice guy named Simo Tesla.  One day, Simo was noodling around on his instrument, and instantaneously, I said, "That's my tune!"  There was no doubt in my mind; this was the same tune I'd heard, a couple of times, two decades earlier.

If you're curious, this is the tune, which is called "Bojerka":


The downside, of course, is that because I never forget a tune, I can't forget one even if I want to.  I'm plagued by what are called earworms -- songs that get stuck in your head, sometimes for days at a time.  There are a few songs that are such bad earworms that if they come on the radio, I'll immediately change the channel, because even a few notes are enough to embed the tune into my brain.  (Unfortunately, sometimes just hearing the name is enough.)

And no, I'm not going to give examples, because then I'll spend the rest of the day humming "Bennie and the Jets," and heaven knows I don't want to... um...

Dammit.

The second bit -- imagining what comes next in a piece of music -- also has a positive and a negative side.  The negative bit is that it is intensely frustrating when I'm listening to a song and it gets cut off, so that I don't get to hear the resolution.  The importance of resolving a musical phrase was demonstrated by my college choral director, Dr. Tiboris, who to illustrate the concept of harmonic resolution played on the piano, "Hark, the herald angels sing, glory to the newborn..."  And stopped.

Three or four of us -- myself included -- sang out "KING!" because we couldn't stand to leave the phrase unresolved.

The positive side, though, happens when I listen to a piece of music for the first time, and it resolves -- but not in the way I expected.  That thwarting of expectations is part of the excitement of music, and when done right, can send a shiver up my spine.  One of my favorite moments in classical music is a point where you think you know what's going to happen, and... the music explodes in a completely different direction.  It occurs in the pair of pieces "Quoniam Tu Solus Sanctus" and "Cum Sancto Spiritu" from J. S. Bach's Mass in B Minor.  (If you don't have time to listen to the whole thing, go to about 5:45 and listen for the moment you get lifted bodily off the ground.)


All of which is a long-winded way to get around to last week's papers, which look at both phenomena -- imagining music and anticipating what will happen next -- through the use of EEG recordings to determine what the brain is actually doing.  What the researchers found is that when you are imagining a piece of music, your brain responds in exactly the same way as it does when you're actually listening to the piece.  When there's a silent bit in the music, your brain is filling in what's coming next -- whether the music is real or imagined.

What was more interesting is the brain's response to the notes themselves.  Imagined notes generate a negative change in voltage in the relevant neurons; real notes generate a positive voltage change.  This may be why when our expectations and the reality of what phrase comes next match up, we can often tune it out completely -- the two voltage changes, in essence, cancel each other out.  But when there's a mismatch, it jolts our brains into awareness -- just like what happens at the end of "Quoniam Tu Solus Sanctus."

I find the whole thing fascinating, as it ties together music and neuroscience, two subjects I love.  I've often wondered about why some pieces resonate with me and others don't; why, for example, I love Stravinsky's music and dislike Brahms.  These studies don't answer that question, of course, but they do get at our ability both to remember (and replay) music in our minds, and also why we have such a strong response when music does something contrary to our expectations. 

But I think I'll wind this up, and just add one more musical track that is pure fun -- the "Polka" from Shostakovich's The Age of Gold.  This is Shostakovich letting loose with some loony light-heartedness, and I defy anyone to anticipate what this piece is gonna do next.  Enjoy!



**********************************************



Tuesday, August 3, 2021

The voices of our ancestors

One of the (many) reasons I love science is that as a process, it opens up avenues to knowledge that were previously thought closed.  Couple that with the vast improvements in technological tools, and you have a powerful combination for exploring realms that once were not considered "science" at all.

Take, for example, historical linguistics, the discipline that studies the languages spoken by our ancestors.  It is a particular fascination of mine -- in fact, it is the field I studied for my MA.  (Yes, I know I spent 32 years teaching biology.  It's a long story.)  I can attest to the fact that it's a hard enough subject, even when you have a plethora of written records to work with, as I did (my thesis was on the effects of the Viking invasions on Old English and Old Gaelic).  When records are scanty, or worse yet, non-existent, the whole thing turns into a highly frustrating, and highly speculative, topic.

This is the field of "reconstructive linguistics" -- trying to infer the characteristics of the languages spoken by our distant ancestors, for the majority of which we have not a single written remnant.  If you look in an etymological dictionary, you will see a number of words that have starred ancestral root words, such as *tark, an inferred verb stem from Proto-Indo-European that means "to twist."  (A descendant word that has survived until today is torque.)  The asterisk means that the word is "unattested" -- i.e., there's no proof that this is what the word actually was, in the original ancestor language, because there are no written records of Proto-Indo-European.  And therein, of course, lies the problem.  Because it's an unattested word, no one can ever be sure if it's correct (which the linguists will tell you straight up; they're not trying to claim more than they should -- thus the asterisks). 

So if you think a particular Proto-Indo-European root reconstructs as *lug and your colleague thinks it's *wuk, you can argue about it till next Sunday and you still will never be certain who's right, as there are very few Proto-Indo-Europeans around these days who could tell you for sure.

Okay, then how do the linguists even come up with a speculative ancestral root?  The inferred words in etymological dictionaries come mainly from the application of one of the most fundamental rules of linguistics: Phonetic changes are regular.




As a quick illustration of this -- and believe me, I could write about this stuff all day -- we have Grimm's Law, which describes how stops in Proto-Indo-European became fricatives in Germanic languages, but they remained stops in other surviving (non-Germanic) Indo-European languages.  One example is the shift of /p/ to /f/, which is why we have foot (English), fod (Norwegian), Fuss (German), fótur (Icelandic), and so on, but poús (Greek), pes (Latin), peda (Lithuanian), etc.  These sorts of sound correspondences allowed us to make guesses about what the original word sounded like.

Note the use of the past tense in the previous sentence.  Because now linguists have a tool that takes a bit of the guesswork out of reconstructive linguistics -- one that shows promise of bringing the field into the realm of a true science.

An article in Science World Report, entitled "Ancient Languages Reconstructed by Linguistic Computer Program," describes how a team of researchers at the University of British Columbia and the University of California, Berkeley has developed software that uses inputted lexicons to reconstruct languages.  (Read their original paper here.)  This tool automates a process that once took huge amounts of painstaking research, and even this first version has had tremendous success -- in its first run, using data from 637 Austronesian languages currently spoken in Asia and the South Pacific, it generated proto-Austronesian roots, 85% of which matched the roots derived by experts in that language family to within one phoneme.

What I'm curious about, of course, is how good the software is at deriving root words for which we do have written records -- in other words, how it fares when checked against something other than the unverifiable derivations that historical linguists were already producing.  For example, would the software be able to take lexicons from Spanish, French, Portuguese, Italian, Catalan, Provençal, and so on, and correctly infer the Latin stems?  To me, that would be the true test; to see what the shortcomings are, you have to have something real to check the results against.
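A natural way to score such a test is edit distance over phoneme sequences -- presumably what "to within one phoneme" means in the Austronesian results.  Here's a minimal sketch of that idea; the "predicted" form is invented for illustration, though pes really is the attested Latin for "foot":

```python
# Levenshtein (edit) distance over phoneme sequences -- one way to score
# a reconstructed root against an attested form "to within one phoneme."
def phoneme_distance(a: list[str], b: list[str]) -> int:
    """Minimum number of insertions, deletions, or substitutions
    needed to turn phoneme sequence a into phoneme sequence b."""
    prev = list(range(len(b) + 1))  # classic dynamic-programming table, row by row
    for i, pa in enumerate(a, 1):
        cur = [i]
        for j, pb in enumerate(b, 1):
            cur.append(min(prev[j] + 1,                  # deletion
                           cur[j - 1] + 1,               # insertion
                           prev[j - 1] + (pa != pb)))    # substitution (0 if equal)
        prev = cur
    return prev[-1]

# Hypothetical check: an invented "software prediction" vs. the attested Latin stem.
predicted = ["p", "e", "d"]   # made-up program output
attested  = ["p", "e", "s"]   # attested Latin pes "foot"
print(phoneme_distance(predicted, attested))  # -> 1, i.e. within one phoneme
```

This is a toy, of course -- the actual paper's scoring is more sophisticated -- but it shows how "matched to within one phoneme" can be made into a checkable, quantitative claim rather than a linguist's judgment call.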

But even so, it's a pretty nifty new tool.  Just the idea that we can make some guesses at what language our ancestors spoke six-thousand-odd years ago is stunning, and the fact that someone has written software that reduces the effort to accomplish this is cool enough to set my little Language Nerd Heart fluttering.  It is nice to see reconstructive linguistics using the tools of science, thus bringing together two of my favorite things.  Why, exactly, I find it so exciting to know that *swey may have meant "to whistle" to someone six millennia ago, I'm not sure.  But the fact that we now have a computer program that can check our guesses is pretty damn cool.

**********************************************



Monday, August 2, 2021

Sponges, bird brains, and ugly plants

There's a story about Socrates, who was asked what he thought about his reputation for being the smartest man in the world.

The great philosopher thought for a moment, and responded, "If I am, it is only because I alone realize how little I know."

I think there's something to this.  Ignorance confers a kind of cockiness sometimes; another great thinker, Bertrand Russell, once said, "The whole problem with the world is that fools and fanatics are always so certain of themselves, and wiser people so full of doubts."  It's inevitable that learning generates some level of humility, because one is always reminded of how much there is left to learn.

This is probably why I was so damn cocky as a college freshman.  Once I got to be a junior, I realized how foolish that was, as I got an inkling of how much I didn't know.  (Of course, nearly failing Classical Mechanics also had a dampening effect on my ego.  That was the moment I realized I didn't have the brains to be a physicist.)

Whenever I start perusing scientific journals -- a common occupation, as I'm looking for topics for Skeptophilia -- I'm amazed at what we've uncovered about the world we live in, and also how much there is left to learn.  That was one of my main takeaways from three scientific papers I came across last week; that, and a sense of wonder at how cool science is.

The first was a link sent to me by my buddy (and fellow writer) Gil Miller.  A paper in Nature by Elizabeth Turner, a paleontologist at the Harquail School of Earth Sciences at Laurentian University, describes a find at a dig site in northwestern Canada that seems to contain fossils of one of the earliest and simplest animal groups -- sponges.

What's mind-boggling about this discovery is that the rocks of the Stone Knife Formation, where the fossils were discovered, are about 890 million years old.  So if confirmed, this would predate the next-oldest undisputed sponge fossils by 350 million years.  This might just get a shoulder shrug, because most people -- myself included, unless I force myself to stop and think about it -- get lost when the numbers get large, so a 350 million year gap falls into the "it's big, but I can't visualize how big" category.  Let me put this number in perspective for you: if you went back 350 million years from today, you'd be in a world where there were no dinosaurs -- the earliest dinosaurs wouldn't appear for another 90 million years or so.

That's how far back Turner's discovery pushes the earliest animals.

If confirmed, this would place the origin of animals prior to the Cryogenian Period (also called the "Snowball Earth") of between 720 and 635 million years ago, one of the most massive worldwide glaciation events known.

The second paper, in Science Advances, is about the evolution of modern dinosaurs -- or, as we usually call them, "birds."  It's striking that the ancestors of today's birds survived a catastrophic bottleneck at the end of the Cretaceous Period 66 million years ago, caused by the double whammy of a massive meteorite collision and a near-simultaneous flood basalt eruption in what is now India.  (Scientists have yet to determine if the two events were connected -- if, perhaps, the collision destabilized the crust and caused the eruption.)

The paper centers on the discovery of a fantastically well-preserved fossil of Ichthyornis, an aquatic bird species of about 70 million years ago.  Picture a  gull with teeth, and you have a pretty good idea of what Ichthyornis looked like.  

Reconstruction of Ichthyornis dispar [Image licensed under the Creative Commons El fosilmaníaco, Ichthyornis restoration, CC BY-SA 4.0]

What is remarkable about this fossil is the preservation of the skull, which gives the researchers a good look at the structure of the brain it once enclosed.  What they found is that the likelihood of a bird lineage surviving the bottleneck was largely due to one thing -- brain size.  Put simply, when the extinction came, the big dumb species tended to die out, and the small smart species survived.  

"Living birds have brains more complex than any known animals except mammals," said study lead investigator Christopher Torres, of the University of Texas and Ohio University.  "This new fossil finally lets us test the idea that those brains played a major role in their survival...  If a feature of the brain affected survivorship, we would expect it to be present in the survivors but absent in the casualties, like Ichthyornis.  That's exactly what we see here."

The third paper, in Nature, is about one of the world's weirdest plants -- Welwitschia mirabilis, of the deserts of Namibia.  This plant's bizarre features are too many to list, but include:
  • The plant can live thousands of years, but only ever has two leaves.  (The Afrikaans name for the plant, tweeblaarkanniedood, means "two leaves, doesn't die.")  The leaves are strap-like and can grow to four meters in length, eventually getting shredded by the wind into what looks like a giant pile of seaweed.
  • The root is also about four meters in length, and looks like a giant carrot.
  • Despite its appearance, its closest familiar relatives are conifers, like pines, spruces, and firs.
To me it falls into the "ugly but fascinating" category.

[Image licensed under the Creative Commons Muriel Gottrop, Welwitschia at Ugab River basin, CC BY-SA 3.0]

The current paper is about the Welwitschia genome, which has a number of interesting features.  First, it seems to have originated when there was a spontaneous duplication of the DNA about 85 million years ago that led to its being genetically isolated from its near relatives, after which it continued to develop along its own lines.  Duplication of the genome has an advantage -- providing extra copies of vital genes, so if mutation knocks out a copy, there's still a functional one available -- but it has the disadvantage of overproduction of gene products (too much of a protein can be as bad as not enough; this is why chromosomal duplications, as in Down syndrome, lead to developmental problems).

Welwitschia solved the disadvantage by a process called methylation, which chemically ties up and shuts down genes.  This is done during normal development in many species, where turning genes on and off at the right times is absolutely critical, and it also knocks out genetic parasites called transposons (a transposon is a segment of DNA that is able to copy itself and splice those copies elsewhere in the DNA -- a sort of copy-and-paste function gone haywire).  So Welwitschia ended up with a huge genome, of which a lot -- the researchers found about 55% -- is composed of shut-down transposons and other methylated (i.e. non-functional) sequences.

Also very weird is the balance between the different nitrogenous bases in Welwitschia's DNA.  You probably know that the "alphabet" of DNA is made up of four bases -- adenine, thymine, cytosine, and guanine -- and that they pair together like puzzle pieces, A with T, C with G.  So in normal DNA, there will always be exactly as much A as T and exactly as much C as G.

But the other ratios -- A to C, for example -- vary by species.  Still, the number of A/T pairs and C/G pairs is usually fairly close.  Unsurprisingly, this plant, which is an exception to so many rules, is an exception to this one as well -- only 29% of its DNA is made up of C/G pairs.

The upshot: this paper shows that an ugly but fascinating plant is even more interesting than we'd realized.

All of this, published just in the last week.  Which brings me back to Socrates.  I'm not claiming to be anywhere near as smart as he was, but I do share one belief with him.

So much to learn, so little time.

**********************************************



Saturday, July 31, 2021

Fast modules, slow modules, and ghost photographs

Yesterday, a friend of mine sent me a YouTube video link about the frightening paranormal legends from the Superstition Mountains in Arizona.  The video doesn't provide much in the way of evidence, but I have to admit it was pretty atmospheric.  Well, one thing led to another, and soon I was looking at photographs of alleged ghosts, and completely creeping myself out.

Just so I can share the experience with you, here are a few that I found especially shiver-inducing.

First, from a security camera in a library in Evansville, Indiana, comes this image of a hunched, shadowy creature creeping across the floor... of the Children's Reading Room:


Or how about this one, an old photograph from the 1940s that shows a screaming ghost reaching out towards an unsuspecting young couple:


Or this shot of a stern man standing behind an elderly woman -- a man who supposedly wasn't there when the photograph was taken:


Or the shadow in the kitchen -- a shadow cast by no object visible in the photograph.  This one immediately reminded me of the episode "Identity Crisis" from Star Trek: The Next Generation -- one of the flat-out scariest episodes they ever did.  If you've seen it, you probably recall the moment Geordi is in the Holodeck, one by one removing the shadows of all of the individuals in the simulation he's standing in -- and ending up with one shadow left over:


So, anyway, there I am, getting more and more weirded out (and still, for some reason, not simply switching to a website with cute pictures of puppies, or something).  And I thought, "Why am I freaking out about all of this?  Not only have I never had a single experience of anything supernatural, I don't even believe in any of this stuff.  I am morally certain that all of these photographs were either deliberate hoaxes, or were camera malfunctions/artifacts, or are examples of pareidolia -- some completely natural explanation must be responsible.  So why am I scared?"

And my mind returned to a book that was a Skeptophilia book-of-the-week a while back, Thinking, Fast and Slow by Daniel Kahneman, the psychologist who won the Nobel Prize in economics in 2002.  Kahneman's specialty is why humans make irrational decisions; his research into how that applies to economic decision-making is why he won the Nobel.  More interesting to me, though, is the facet of his research that shows that human thinking is split into two discrete modules -- a fast module and a slow one.  And those two modules are frequently at odds with one another.

The fast module is what allows us to take quick stock of what's around us.  It is, for example, what allows us to do an immediate assessment of the following photograph:


No "rational thinking" is needed to come to the conclusion that this woman is angry.  On the other hand, the slow module is invoked when doing a math problem, like what is 223 x 1,174?  The vast majority of us could solve that problem, but it would take time and concentration.  (The fact that there are savants who can solve problems like that nearly instantaneously makes me wonder if their brains are somehow wired to do math with the fast module of the brain; merely a speculation, but it's suggestive.)

As an example of how the two modules can be at odds, consider the "Linda Problem."  Participants in a study were told a story about Linda, a single woman, intelligent and outspoken, who was very concerned with issues of social justice.  The participants were then asked which of the following possibilities was more likely: (1) Linda is a bank teller; or (2) Linda is a bank teller and is active in the feminist movement.  By a vast majority, participants chose option 2.  (Did you?)

The problem is, option 2 is wrong.  Not just maybe wrong, it's completely wrong, as in impossible.  How could the likelihood of Linda's being a feminist bank teller exceed the likelihood of her being a bank teller?  All feminist bank tellers are bank tellers; adding an extra detail to the description can only have the effect of decreasing the probability.  (To make this clearer, how can there be more brown dogs than there are dogs?)  But the fast module's quick assessment of the situation was that from the information given, she was very likely to be a feminist; the likelihood that she was a bank teller was equal in both possibilities; so it jumped to the (incorrect) conclusion that the combined probability was higher.
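The conjunction rule behind this is worth making concrete: P(A and B) = P(A) × P(B given A), and since a probability can never exceed 1, the product can never exceed P(A) alone.  A tiny sketch, with probabilities invented purely for illustration:

```python
# The conjunction rule: P(A and B) can never exceed P(A), no matter how
# likely B seems.  The numbers below are invented for illustration only.
p_teller = 0.05                  # assumed P(Linda is a bank teller)
p_feminist_given_teller = 0.95   # assumed P(feminist | bank teller) -- near-certain...

# P(teller AND feminist) = P(teller) * P(feminist | teller)
p_both = p_teller * p_feminist_given_teller

# ...yet the conjunction is still less probable than "bank teller" alone,
# because multiplying by any probability <= 1 can only shrink the result.
print(p_both)               # -> 0.0475
print(p_both <= p_teller)   # -> True, always, for any choice of numbers
```

Plug in any values you like for the two inputs: the second line prints True every time, which is exactly why option 2 in the Linda problem cannot be the more likely one.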

So, you can see how the fast module, however useful it is in making the snap judgments that are essential in getting us through the day, is not, at its basis, rational.  It is primed by previous experience, and is inherently biased toward finding the quickest answer possible, even if that answer is completely contrary to rationality.

And that, I think, explains why a diehard skeptic can still be completely weirded out by ghost pictures.  The slow module in my brain thinks, "Okay, pareidolia.  Or the photo was doctored.  No way is this real." My fast module, on the other hand, is thinking, "Good lord, that's terrifying!  Time to dump a liter or two of adrenaline into my bloodstream!"  And no amount of soothing talk from my slow module seems to make any difference.

Especially the photo with the creeping thing in the library.  That one is freakin' scary.

**************************************

One of the characteristics which is -- as far as we know -- unique to the human species is invention.

Given a problem, we will invent a tool to solve it.  We're not just tool users; lots of animal species, from crows to monkeys, do that.  We're tool innovators.  Not that all of these tools have been unequivocal successes -- the internal combustion engine comes to mind -- but our capacity for invention is still astonishing.

In The Alchemy of Us: How Humans and Matter Transformed One Another, author Ainissa Ramirez takes eight human inventions (clocks, steel rails, copper telegraph wires, photographic film, carbon filaments for light bulbs, hard disks, scientific labware, and silicon chips) and looks not only at how they were invented, but how those inventions changed the world.  (To take one example -- consider how clocks and artificial light changed our sleep and work schedules.)

Ramirez's book is a fascinating lens into how our capacity for innovation has reflected back and altered us in fundamental ways.  We are born inventors, and that ability has changed the world -- and, in the end, changed ourselves along with it.

[Note: if you purchase this book using the image/link below, part of the proceeds goes to support Skeptophilia!]


Friday, July 30, 2021

Working titles

An author friend of mine recently posted a dilemma; she had come up with a killer title for her work-in-progress only to find out that another author had grabbed it first.  What to do?

Well, except for very famous, high monetary-value stories -- such as the ones owned by the Mouse Who Shall Not Be Named -- few titles are actually trademarked, which means that legally, you can publish a book under a title that's already been used.  In terms of common courtesy, however, the best answer comes from Wile E. Coyote: "Back to the old fiasco hatchery."

Myself, I think titles are critical.  They're one of the first things a potential reader sees (the first is most likely the cover illustration).  I find it intriguing to consider what people choose for titles, especially in cases where the choice is highly un-memorable.  Consider the formulaic approach, used most commonly in spaceship-and-alien science fiction: "The" + "alien sounding word" + one of the following words: "Maneuver, Gambit, Strategy, Solution, Encounter, Factor, Machine, Incident, Syndrome."   The Sqr'll'nutz Factor. The Bäbu'shkä Maneuver.  That sort of thing.
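Just for fun, the formula is mechanical enough to automate.  A throwaway sketch (the word lists are the invented examples above, plus one more of my own):

```python
import random

# The formula: "The" + alien-sounding word + stock suffix.
ALIEN_WORDS = ["Sqr'll'nutz", "Bäbu'shkä", "Zz'glorp"]  # all invented
SUFFIXES = ["Maneuver", "Gambit", "Strategy", "Solution", "Encounter",
            "Factor", "Machine", "Incident", "Syndrome"]

def formulaic_title(rng=random):
    """Generate one instantly forgettable sci-fi title."""
    return f"The {rng.choice(ALIEN_WORDS)} {rng.choice(SUFFIXES)}"

print(formulaic_title())  # e.g. "The Bäbu'shkä Gambit"
```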

[Image: a fake book cover in exactly this formulaic style]

This book isn't real, but it definitely should be, because I would read the hell out of it.  (For other amazing examples, visit the page "Fake Book Titles Extravaganza!"  Do not try to drink anything while looking at this website.  You have been warned.)

The problem is, formulaic titles are often so ridiculously uncreative that they will promptly blend in with all of the other Encounters and Gambits and Maneuvers you've read about, and as a writer, that's definitely not the impression you want to create.  Memorable titles are short, pithy, and intriguing.  I tend to like metaphorical titles -- ones which provoke curiosity ("What on earth could that be referring to?") coupled with an "Aha!" moment when you read the story and actually figure it out.

As some examples, here are some of my favorite titles I've run across:
  • All Hallows' Eve (Charles Williams)
  • A Murder is Announced (Agatha Christie)
  • Closet Full of Bones (A. J. Aalto)
  • The Lathe of Heaven (Ursula K. Le Guin)
  • The Eyes of the Amaryllis (Natalie Babbitt)
  • Among the Dolls (William Sleator)
  • The Ocean at the End of the Lane (Neil Gaiman)
  • Everything is Illuminated (Jonathan Safran Foer) - and interestingly, I didn't particularly like this book.  But the title is awesome.
  • Something Wicked This Way Comes (Ray Bradbury)
  • Midnight in the Garden of Good and Evil (John Berendt)
  • Things Fall Apart (Chinua Achebe)
  • Their Eyes Were Watching God (Zora Neale Hurston)
  • The Girl Who Loved Tom Gordon (Stephen King)
  • The Stupidest Angel (Christopher Moore)
  • The Fifth Elephant (Terry Pratchett)
  • Wolves in the Walls (Neil Gaiman)
And a few that I think don't work so well:
  • "Oh, Whistle, and I'll Come to You, My Lad" (M. R. James) - a brilliant, and terrifying, short story with a title that's way too long and cumbersome.
  • A Wind in the Door (Madeleine L'Engle) - an interesting title, but what the hell is the relevance?  At the end of the story, a door blows shut for no apparent reason, and I presume we're supposed to raise an eyebrow and say, "Ahhhh, now I see"?
  • Dorothy Sayers's novels are kind of a mixed bag.  Busman's Honeymoon is really clever and intriguing, but Unnatural Death is generic and boring (aren't all murder mysteries about unnatural deaths?).  Interestingly, the latter started out as The Dawson Pedigree -- a much better title, in my opinion -- then for some reason she chose to go with the bland.
  • Brandy of the Damned (Colin Wilson) - oh, come on.  I doubt the damned will get brandy, frankly.
  • Postern of Fate (Agatha Christie) - my opinion may be colored by the fact that I think this is far and away the worst book she ever wrote -- rambling, incoherent, with long passages of supposed-to-be-witty repartee, and after reading it I still have no clue why the title is relevant to the plot.
  • The Island of the Sequined Love Nun (Christopher Moore) - okay, I love Moore's novels and I know he was trying to give it a campy title.  Actually it's an awesome book - but the title is just goofy.
So, anyway, that gives you an idea of what I shoot for with titles.  Here are a few titles I've come up with that I think work pretty well.  I'll leave it to you to decide if you think they're intriguing or dreadful.
  • The Dead Letter Office
  • Slings & Arrows
  • The Shambles
  • We All Fall Down (novella)
  • Whistling in the Dark
  • Kári the Lucky
  • Descent into Ulthoa
  • "The Pool of Ink" (short story)
  • "The Germ Theory of Disease" (short story)

Thursday, July 29, 2021

The cost of personal courage

I have been following, from some distance, the hue-and-cry over Simone Biles's removing herself from competition on the U.S. Olympic gymnastics team.  Biles was completely up-front about why.  "You have to be there 100%," she told reporters.  "If not, you get hurt.  Today has been really stressful.  I was shaking.  I couldn't nap.  I have never felt like this going into a competition, and I tried to go out and have fun.  But once I came out, I was like, 'No.  My mental is not there.'  It's been a long year, and I think we are too stressed out.  We should be out here having fun.  Sometimes that's not the case."

Well, immediately the pundits started weighing in.  Charlie Kirk called her a "selfish sociopath" and bemoaned the fact that "we are raising a generation of weak people like Simone Biles."  Clay Travis suggested she be removed from future competition because she couldn't be relied on.  Piers Morgan was perhaps the worst -- not surprising given his ugly commentary in the past.  "Are 'mental health issues' now the go-to excuse for any poor performance in elite sport?  What a joke...  Sorry Simone Biles, but there's nothing heroic or brave about quitting because you're not having 'fun' – you let down your team-mates, your fans and your country."

And so on.  The criticism came fast and furious.  There were voices who spoke up in support of her decision, but it seemed to me the nastiness was a lot louder.

[Image licensed under the Creative Commons Agência Brasil Fotografias, Simone Biles Rio 2016e, CC BY 2.0]

Or maybe I'm just sensitive.  Other writers have spoken with more authority about the rigors of Olympic training and gymnastics in particular, not only the physical aspects but the mental, topics which I am unqualified to discuss.  But whatever the context, there is one thing I'm dead certain about.

If someone says they're struggling mentally and/or emotionally, you fucking well believe them.

I have fought mental illness all my life.  I've been open about this here before; I have come to realize it is no more shameful than any other chronic condition.  I do know, however, first-hand how debilitating anxiety can be.  I've also suffered from moderate-to-severe depression, fortunately now ameliorated by medications and a family who is understanding and supportive.  So at present, I'm doing okay.

But it hasn't always been that way.  For much of my life, I was in a situation where "suck it up and deal" and "be tough, be a man" and "you should be thankful for what you have" were the consistent messages.  Therapy was for the weak; psychiatric care (and meds) were for people who were crazy.  There's nothing wrong with you, I was told.  You just spend too much time feeling sorry for yourself and worrying about things you can't control.

The result?  Twice I was suicidal, once at age seventeen and once at age twenty, to the point that I had a plan and a method and was ready to go for it.  That I didn't -- fortunately -- is really only due to one thing: I was scared.  I spent a good bit of my first marriage haunted by suicidal ideation, and there the only thing that kept me alive was my commitment to my students, and later, to my children.

But I thought about it.  Every.  Single.  Damn.  Day.

That a bunch of self-appointed arbiters of proper behavior have told this remarkable young woman "No, I don't care how you feel or what you're going through, get back in there and keep performing for us" is somewhere beyond reprehensible.  I don't even have a word strong enough for it.  If you haven't experienced the hell of anxiety, panic attacks, and depression, you have zero right to criticize someone else, especially when she's doing what people in a bad mental space should be doing -- advocating for herself, setting her limits, and admitting when she can't manage to do something.

I wish I had known how to do that when I was twenty-four (Simone Biles's age).  But I was still a good fifteen years from understanding the mental illness I have and seeking out help -- and unashamedly establishing my own personal boundaries.

So to all the critics out there who think they know what Simone Biles should do better than she does -- shut the fuck up.  I presume you wouldn't go up to a person with a serious physical illness and have the temerity to tell them what they can and can't do, and to pass judgment on them if they don't meet your standards.  This is no different.  We have a mental health crisis in this country; skyrocketing incidence of diagnosed mental illnesses and uncounted numbers who go undiagnosed and unaided, and a health care system that is unable (or unwilling) to address these problems effectively.  What Simone Biles did was an act of bravery, and she deserves unequivocal support for it.  The cost of personal courage shouldn't be nasty invective from a bunch of self-appointed authorities who have never set foot on the road she has walked.

And those who can't understand that should at least have the good grace to keep their damn opinions to themselves.
