Skeptophilia (skep-to-fil-i-a) (n.) - the love of logical thought, skepticism, and thinking critically. Being an exploration of the applications of skeptical thinking to the world at large, with periodic excursions into linguistics, music, politics, cryptozoology, and why people keep seeing the face of Jesus on grilled cheese sandwiches.

Tuesday, January 26, 2021

The cost of regret

"But what would have been the good?"

Aslan said nothing.

"You mean," said Lucy rather faintly, "that it would have turned out all right – somehow?  But how?  Please, Aslan!  Am I not to know?"

"To know what would have happened, child?" said Aslan.  "No.  Nobody is ever told that."

"Oh dear," said Lucy.

"But anyone can find out what will happen," said Aslan.  "If you go back to the others now, and wake them up; and tell them you have seen me again; and that you must all get up at once and follow me – what will happen?  There is only one way of finding out."
This passage, from C. S. Lewis's novel Prince Caspian, has always struck me with particular poignancy, because one of the most consistent themes of my life has been regret at not having made different decisions.  People I dearly wish I had not hurt.  Opportunities I passed up because of my shyness and risk-aversion.  More specific ones, like my (all things considered) terrible decision to live at home while going to college.  My (at the time) barely-acknowledged choice to keep my bisexuality hidden for decades.

It's not, mind you, that I'm unhappy with my life as it is.  I have a wonderful wife and two sons I'm proud of, and I spent 32 years in a rewarding career that I discovered quite by accident, as a consequence of other, seemingly unrelated decisions.  I have seventeen books in print, something I have dreamed about since elementary school.  I live in a wonderful part of the world, and have had the good fortune to travel and see dozens of other wonderful places.

And I'm aware of the fact that things could have turned out far worse.  Whatever else you can say about the decision, my choice to live at home during college, with conservative, strait-laced parents who kept close tabs on me, kept me out of all sorts of trouble I might otherwise have gotten into.  If I'd come out as bisexual in college, it would have been in around 1980 -- and this was right at the beginning of the AIDS epidemic, when the disease was still poorly understood, and a diagnosis was tantamount to a death sentence.

There's any number of ways the course of my life could have been deflected onto an alternate path, leading somewhere very different.  Big decisions -- where to go to college, who to marry, what career to pursue.  Tiny actions with big effects, like Donna Noble's choice of which direction to turn at an intersection in the mind-blowing Doctor Who episode "Turn Left" -- choices of which, in my own case, I'm almost certainly unaware, because looking back they seem entirely insignificant.


As I said, I like my life just fine.  Even so, I've never been able to shuck the regret, nor, more than that, the knowledge that -- like Lucy Pevensie in Prince Caspian -- I'll never know what would have happened had I done otherwise.

The topic comes up because of a fascinating paper in the journal Psychological Science called "The Lure of Counterfactual Curiosity: People Incur a Cost to Experience Regret," by Lily FitzGibbon and Kou Murayama (of the University of Reading) and Asuka Komiya (of Hiroshima University).  They ran a risk/choice/reward assessment task with 150 adults; after the task was completed, the volunteers were allowed to pay for information about how they would have fared had they chosen differently.

It turns out, people are willing to pay a lot, even when they find out that they chose poorly (i.e. they would have had a greater reward had they made a different choice), and even though knowledge of their poor decision causes regret, self-doubt, and worse performance on subsequent tasks.  The authors write:
After one makes a decision, it is common to reflect not only on the outcome that was achieved but also on what might have been.  For example, one might consider whether going to a party would have been more fun than staying home to work on a manuscript.  These counterfactual comparisons can have negative emotional consequences; they can lead to the experience of regret.  In the current study, we examined a commonly observed yet understudied aspect of counterfactual comparisons: the motivational lure of counterfactual information—counterfactual curiosity.  Specifically, we found that people are so strongly seduced to know counterfactual information that they are willing to incur costs for information about how much they could have won, even if the information is likely to trigger negative emotions (regret) and is noninstrumental to obtaining rewards.
Why would people seek out information when they know ahead of time it is likely to make them feel bad?  The authors write:
One explanation for seeking negative information is that people may also find it interesting to test their emotional responses—a mechanism that might also underlie so-called morbid curiosity.  Counterfactual information of the kind sought in the current experiments may be desirable because it has high personal relevance—it relates to decisions that one has made in the recent past.  People’s desire for information about their own performance is known to be strong enough to overcome cognitive biases such as inequality aversion.  Thus, opportunities to learn about oneself and the actual and counterfactual consequences of one’s decisions may have powerful motivational status.
Chances are, if I were able to do what Donna did in "Turn Left" and see the outcome had I chosen differently, I'd find the results for my life's path better in some respects and worse in others.  Like everything, it's a mixed bag.  Given the opportunity to go back in time and actually change something -- well, tempting as it would be, I'd be mighty hesitant to take that step and risk everything I currently have and have accomplished.

But still -- I'd like to know.  Even if in some cases I'd find I'd have done far better making a different choice, and would then have to add the certainty of a bad decision on top of the more diffuse regret I already have, the temptation to find out would be almost irresistible.

Maybe it's better, honestly, that we don't see the long-term consequences of our actions.  Fortunate, to put it in Aslan's words, that "Nobody is ever told that."  It's hard enough living with the knowledge that you fell short or behaved badly; how much worse would it be to see that things could have turned out far better if you'd only chosen differently.

****************************************

Just last week, I wrote about the internal voice most of us live with, babbling at us constantly -- sometimes with novel or creative ideas, but most of the time (at least in my experience) with inane nonsense.  The fact that this internal voice is nearly ubiquitous, and what purpose it may serve, is the subject of psychologist Ethan Kross's wonderful book Chatter: The Voice in Our Head, Why It Matters, and How to Harness It, released this month and already winning accolades from all over.

Chatter not only analyzes the inner voice in general terms, but looks at specific case studies where the internal chatter brought spectacular insight -- or short-circuited the individual's ability to function entirely.  It's a brilliant analysis of something we all experience, and gives some guidance not only on how to quiet the voice when it gets out of hand, but on how to harness it to boost our creativity and mental agility.

If you're a student of your own inner mental workings, Chatter is a must-read!

[Note: if you purchase this book using the image/link below, part of the proceeds goes to support Skeptophilia!]



Monday, January 25, 2021

The shifting sands

In H. P. Lovecraft's wildly creepy story "The Shadow Out of Time," we meet a superintelligent alien race called the Yith who have a unique way of gathering information.

The Yith, who lived in what is now Australia's Great Sandy Desert some 250 million years ago, are capable of temporarily switching personalities with other intelligent beings anywhere in the cosmos and from any time period.  While the consciousness of the kidnapped individual is residing in its temporary Yith body, it enjoys the freedom to learn anything it wants from the extensive library of information the Yith have gleaned -- as long as the individual is willing to contribute his or her own knowledge to the library.  The main character, early-twentieth-century professor Nathaniel Peaslee, undergoes such a switch, and while living among the Yith he meets a number of luminaries whose personalities have also been swiped, including:

  • Titus Sempronius Blaesus: a Roman official from 80 B.C.E.
  • Bartolomeo Corsi: a twelfth-century Florentine monk
  • Crom-Ya: a Cimmerian chief who lived circa 15,000 B.C.E. 
  • Khephnes: a Fourteenth Dynasty (circa 1700 B.C.E.) Egyptian pharaoh
  • Nevil Kingston-Brown: an Australian physicist who would die in 2518 C.E.
  • Pierre-Louis Montagny: an elderly Frenchman from the time of Louis XIII (early seventeenth century)
  • Nug-Soth: a magician from a race of conquerors in 16,000 C.E.
  • S'gg'ha: a member of the star-headed "Great Race" of Antarctica, from a hundred million years ago
  • Theodotides: a Greco-Bactrian official of 200 B.C.E.
  • James Woodville: a Suffolk gentleman from the mid-seventeenth century
  • Yiang-Li: a philosopher from the empire of Tsan-Chan, circa 5000 C.E.
Compared to the gory dismemberments most other Lovecraftian entities were fond of, the Yith's approach is remarkably genteel.  Of course, it's not without its downside for the kidnapped individuals; not only do they lose control over their own bodies for a period of up to a couple of years, they experience serious disorientation (bordering on insanity in some cases) upon their return to their own bodies.

Nevertheless, it's a fantastic concept for a story, and I remember, when I first read it (at about age sixteen), how taken I was with the idea of being able to meet and talk with individuals from both past and future, not to mention other species.  But what struck me most viscerally was the passage in which Peaslee, in his borrowed Yith body, describes what he sees surrounding the library.

It's a tropical rain forest.  What now is a barren desert, with barely a scrap of vegetation, was a lush jungle:

The skies were almost always moist and cloudy, and sometimes I would witness tremendous rains.  Once in a while, though, there would be glimpses of the Sun -- which looked abnormally large -- and the Moon, whose markings held a touch of difference from the normal that I could never fathom.  When -- very rarely -- the night sky was clear to any extent, I beheld constellations which were nearly beyond recognition.  Known outlines were sometimes approximated, but seldom duplicated; and from the position of the few groups I could recognize, I felt I must be in the Earth's southern hemisphere, near the Tropic of Capricorn.

The far horizon was always steamy and indistinct, but I could see that great jungles of unknown tree ferns, Calamites, Lepidodendron, and Sigillaria lay outside the city, their fantastic fronds waving mockingly in the shifting vapors...  I saw constructions of black or iridescent stone in glades and clearings where perpetual twilight reigned, and traversed long causeways over swamps so dark I could tell but little of their towering, moist vegetation.

[Image licensed under the Creative Commons Carl Malamud, Cretaceous Diorama 2, CC BY 2.0]

I think it's the first time I'd really gotten hit square between the eyes with how different the Earth is now from what it once was -- and with the fact that those changes haven't halted.  In the time of Lovecraft's Yith, 250 million years ago, where I am now (upstate New York) was underneath a shallow saltwater ocean.  Only some twenty thousand years ago, the spot where my house stands was covered with a thick layer of ice, near the southern terminus of the enormous Laurentide Ice Sheet.  (In fact, the long, narrow lakes that give the Finger Lakes Region its name were carved out by that very glacier.)

I was immediately reminded of that moment of realization when I read a paper a couple of days ago in Nature called "Temperate Rainforests Near the South Pole During Peak Cretaceous Warmth," by a huge team led by Johann Klages of the Alfred-Wegener-Institut Helmholtz-Zentrum für Polar- und Meeresforschung, of Bremerhaven, Germany.  Klages's team made a spectacular find that demonstrates that a hundred million years ago, Antarctica wasn't the windswept polar desert it currently is, but something more like Lovecraft's vision of the site of the prehistoric library of Yith.  The authors write:

The mid-Cretaceous period was one of the warmest intervals of the past 140 million years, driven by atmospheric carbon dioxide levels of around 1,000 parts per million by volume.  In the near absence of proximal geological records from south of the Antarctic Circle, it is disputed whether polar ice could exist under such environmental conditions.  Here we use a sedimentary sequence recovered from the West Antarctic shelf—the southernmost Cretaceous record reported so far—and show that a temperate lowland rainforest environment existed at a palaeolatitude of about 82° S during the Turonian–Santonian age (92 to 83 million years ago).   This record contains an intact 3-metre-long network of in situ fossil roots embedded in a mudstone matrix containing diverse pollen and spores.  A climate model simulation shows that the reconstructed temperate climate at this high latitude requires a combination of both atmospheric carbon dioxide concentrations of 1,120–1,680 parts per million by volume and a vegetated land surface without major Antarctic glaciation, highlighting the important cooling effect exerted by ice albedo under high levels of atmospheric carbon dioxide.

It's a stunning discovery from a number of perspectives.  First, there's just the wonderment of realizing that the climate could change so drastically.  Note that this wasn't, or at least wasn't entirely, because of tectonic movement; the site of the find was still only eight degrees shy of the South Pole even back then.  Despite that, the warmth supported a tremendous assemblage of life, including hypsilophodontid dinosaurs, labyrinthodontid amphibians, and a diverse flora of conifers, cycads, and ferns.  (And given that at this point Antarctica and Australia were still connected, Lovecraft's vision of the home of the Yith was remarkably accurate.)

So, if it wasn't latitude that caused the warm climate, what was it?  The other thing that jumps out at me is the high carbon dioxide content of the atmosphere back then -- around 1,000 parts per million.  Our current level is 410 parts per million, and going up a steady 2.5 ppm per year.  I know I've rung the changes on this topic often enough, but I'll say it again -- this is not a natural warm-up, like the Earth experienced during the mid-Cretaceous.  This is due to our out-of-control fossil fuel use, returning to the atmosphere carbon dioxide that has been locked up underground for hundreds of millions of years.  When the tipping point will occur -- the point past which we can no longer stop the warm-up from continuing -- is still a matter of debate.  Some scientists think we may already have passed it, and that a catastrophic increase in temperature is inevitable, leading to a complete melting of the polar ice caps and a consequent rise in sea level of ten meters or more.
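Just to put that rate in perspective, here's a back-of-the-envelope linear extrapolation -- my own crude figuring, emphatically not a climate model, and real carbon-cycle feedbacks could well make things worse, faster:

```python
# Crude linear extrapolation: how long until atmospheric CO2 reaches the
# mid-Cretaceous level, if the current rate of increase simply holds steady?
# (Illustrative only; actual climate dynamics are nonlinear.)

current_ppm = 410.0       # approximate level as of this writing
cretaceous_ppm = 1000.0   # mid-Cretaceous estimate cited in the paper
rate_ppm_per_year = 2.5   # recent average annual increase

years = (cretaceous_ppm - current_ppm) / rate_ppm_per_year
print(f"About {years:.0f} years to reach {cretaceous_ppm:.0f} ppm")  # ~236
```

A couple of centuries, give or take, to recreate greenhouse conditions that natural processes took millions of years to produce -- an eyeblink in geological terms.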

What no informed and responsible person doubts any more is that the warm-up is happening, and that we are the cause.  People who are still "global warming doubters" (I'm not going to dignify them by calling them skeptics; a skeptic respects facts and evidence) are either woefully uninformed or else in the pockets of the fossil fuel interests.

I don't mean to end on a depressing note.  The Klages et al. paper is wonderful, and gives us a vision of an Earth that was a very different place than the one we now inhabit, and highlights that what we have now is different yet from what the Earth will look like a hundred million years in the future.  It brings home the evocative lines from Alfred, Lord Tennyson's wonderful poem "In Memoriam:"

There rolls the deep where grew the tree.
O earth, what changes hast thou seen!
There where the long street roars hath been
The stillness of the central sea.

The hills are shadows, and they flow
From form to form, and nothing stands;
They melt like mist, the solid lands,
Like clouds, they shape themselves and go.

Saturday, January 23, 2021

The voice of nature

Yesterday I wrote about my difficulty with maintaining concentration.  My mind's tendency to wander has been with me all my life, and after sixty years of fighting with it I'm beginning to think it always will be.  This, coupled with an unfortunate history of not sticking with things long if I don't see quick results, is why my attempts to make a practice of meditation have, all things considered, been failures.

I've had more than one person recommend meditation and mindfulness training as a means for combatting depression, anxiety, and insomnia, all of which I struggle with.  I even did a six-week mindfulness training course three years ago, thinking that if perhaps I learned some strategies for dealing with my errant brain, I might be more successful.  But even training didn't seem to be able to fix the fact that when I meditate, I nearly always veer off either into an anxiety attack or else fall asleep.  Steering a middle course -- being relaxed and tranquil enough to gain some benefit from it, but not so relaxed and tranquil that I lose consciousness -- just never seemed to be within my grasp.

Part of my problem is that I have a loud internal voice.  I know we all deal with internal chatter, but mine has the volume turned up to eleven.  And it's not even interesting chatter, most of the time.  I sometimes have looped snippets of songs, usually songs I hate.  (Last week, I woke up at two AM with the song "Waterloo" by Abba running through my head.  God alone knows why.  I don't even like that song during the day.)  Sometimes it's just completely random musings, like while I was running yesterday and pondering how weird the word "aliquot" is.  (For you non-science folks, it means "a sample" -- as in, "transfer a 3.5 ml aliquot of the solution to a test tube."  I also found out, because I was still thinking about it later and decided to look it up, that it comes from a Latin word meaning "some.")

So most of the time, my brain is like a horse that's always on the verge of spooking, throwing its rider, and then running off a cliff.

The topic comes up because of a paper that appeared this week in the journal Psychomusicology: Music, Mind, and Brain, which found that the time-honored technique for combatting distraction during meditation -- focusing on your breath -- simply doesn't work well for some people.  Not only is it an ongoing battle, but a lot of people have the same problem I did: difficulty taking those mindfulness skills and applying them during the ordinary activities of the day.

In "Exploring Mindfulness Attentional Skills Acquisition, Psychological and Physiological Functioning and Well-being: Using Mindful Breathing or Mindful Listening in a Nonclinical Sample," by Leong-Min Loo, Jon Prince, and Helen Correia, we read about a study of 79 young adults who were trained in mindfulness and meditation techniques -- but for some of them, they were instructed in the traditional "return to your breath if you get distracted" method, and others were told to focus on external sounds like quiet recorded music or sounds of nature.  Interestingly, the ones who were told to focus on external sounds not only reported fewer and shorter episodes of distraction during meditation, they reported greater ease in using those techniques during their ordinary daily activities -- and also reported lower symptoms of depression and anxiety afterward than the group who mediated in silence.

What's funny is I was just thinking about the idea of soothing sounds a couple of days ago, when I participated in one of those silly online quizzes.  One of the questions was, "What are your favorite sounds?" -- and after I rattled off a few, I realized that all but one of them were natural sounds.  Thunder.  Wind in the trees.  The dawn chorus of birds in spring.  A hard rain striking the roof.  Ocean waves.  (The only one on my list that wasn't natural was "distant church bells at night" -- a sound that reminds me of when I was nine and lived with my grandma for a year, and every evening heard the beautiful and melancholy sound of the bells of Sacred Heart Catholic Church in Broussard, Louisiana, rising and falling with the breeze.)

So maybe it's time to try meditation again, but using some recordings of natural sounds to aid my focus.  I know I'll still have to combat my brain's tendency to yell absurd and random stuff at me, and also my unfortunate penchant for giving up on things too easily.  But something external to focus on seems like it might help a bit, at least with the attentional part of it.

And lord help me, if it purges "Waterloo" from my brain, it'll be worthwhile regardless.

***********************************

I'm always amazed by the resilience we humans can sometimes show.  Knocked down again and again, in circumstances that "adverse" doesn't even begin to describe, we rise above and move beyond, sometimes accomplishing great things despite catastrophic setbacks.

In Why Fish Don't Exist: A Story of Love, Loss, and the Hidden Order of Life, journalist Lulu Miller looks at the life of David Starr Jordan, a taxonomist whose fascination with aquatic life led him to the discovery of a fifth of the species of fish known in his day.  But to say the man had bad luck is a ridiculous understatement.  He lost his collections, drawings, and notes repeatedly, first to lightning, then to fire, and finally and catastrophically to the 1906 San Francisco Earthquake, which shattered just about every specimen bottle he had.

But Jordan refused to give up.  After the earthquake he set about rebuilding one more time, and -- as founding president of Stanford University, a post he had held since 1891 -- he kept living and working until his death in 1931 at the age of eighty.  Miller's biography of Jordan looks at his scientific achievements and incredible tenacity -- but doesn't shy away from his darker side as an early proponent of eugenics, or from the allegations that he might have been complicit in the coverup of a murder.

She paints a picture of a complex, fascinating man, and her vivid writing style brings him and the world he lived in to life.  If you are looking for a wonderful biography, give Why Fish Don't Exist a read.  You won't be able to put it down.

[Note: if you purchase this book using the image/link below, part of the proceeds goes to support Skeptophilia!]



Friday, January 22, 2021

The mental walkabout

I don't know about you, but I have a real problem with my mind wandering.

It's not a new thing.  I can remember getting grief for daydreaming back when I was in grade school.  I'd be sitting in class, trying my damnedest to concentrate on transitive verbs or the Franco-Prussian War or whatnot, but my gaze would drift off to some point in the middle distance, my auditory processing centers would switch from "external input" to "internal input" mode, and in under a minute I'd be out in interstellar space or roaming around Valhalla with Odin and the Boys or rowing a boat down the Amazon River.

Until the teacher would interrupt my reverie with some irrelevant comment like, "Gordon!  Pay attention!  Why don't you tell us how to find x in the equation 5x - 9 = 36?"  I was usually able to refrain from saying what came to mind -- namely, that she was the one who lost x in the first place, and it was hardly my responsibility to find it -- and to pull myself together enough to take a shot at playing along and giving her a real answer.

I never outgrew the tendency (either to daydreaming or to giving authority figures sarcastic retorts).  It plagued me all through college and beyond, and during my teaching career I remember dreading faculty meetings, because I knew that five minutes in I'd be doodling on the agenda despite my vain attempt to be a Good Boy and pay attention.  It's part of how I developed my own teaching style; a mentor teacher told me early on that teaching was 25% content knowledge and 75% theater, and I took that to heart.  I tried to lecture in a way that kept students wondering what the hell I was going to say or do next, because I know that's about the only thing that kept me engaged when I was the one sitting in the student's desk and someone else was in front of the room.

One amusing case in point -- Dr. Cusimano, who taught a British History elective I took as a senior in college.  He was notorious for working puns and jokes into his lectures, and doing it so smoothly and with such a straight face that if you weren't paying attention, it could slip right past you.  I recall early in the course, when he was talking about the fall of the Roman Empire, Dr. Cusimano said, "During that time, what was left of the Roman Empire was invaded by a series of Germanic tribal leaders -- there was Alaric, King of the Visigoths; Gunderic, King of the Vandals; Oscar Mayer, King of the Franks..."

I'd bet cold hard cash there were students in the class who wrote that down and only erased it when one by one, their classmates caught on and started laughing.

I never daydreamed in Dr. Cusimano's class.

Edward Harrison May, Daydreaming (1876) [Image is in the Public Domain]

Anyhow, all of this comes up because of a study out of the University of California - Berkeley that appeared this week in Proceedings of the National Academy of Sciences.  Entitled "Distinct Electrophysiological Signatures of Task-Unrelated and Dynamic Thoughts," by Julia W. Y. Kam, Zachary C. Irving, Caitlin Mills, Shawn Patel, Alison Gopnik, and Robert T. Knight, the paper takes the fascinating angle of analyzing the electroencephalogram (EEG) output of test subjects when they were focused on the task at hand, when they were focusing on something unrelated, or when they were simply wandering from topic to topic -- what the authors call "dynamic thought," much like the game of random free association my brain spends a significant portion of its time playing.

The authors write:

Humans spend much of their lives engaging with their internal train of thoughts.  Traditionally, research focused on whether or not these thoughts are related to ongoing tasks, and has identified reliable and distinct behavioral and neural correlates of task-unrelated and task-related thought.  A recent theoretical framework highlighted a different aspect of thinking—how it dynamically moves between topics.  However, the neural correlates of such thought dynamics are unknown. The current study aimed to determine the electrophysiological signatures of these dynamics by recording electroencephalogram (EEG) while participants performed an attention task and periodically answered thought-sampling questions about whether their thoughts were 1) task-unrelated, 2) freely moving, 3) deliberately constrained, and 4) automatically constrained...  Our findings indicate distinct electrophysiological patterns associated with task-unrelated and dynamic thoughts, suggesting these neural measures capture the heterogeneity of our ongoing thoughts.

"If you focus all the time on your goals, you can miss important information," said study co-author Zachary Irving, in an interview with Science Direct.  "And so, having a free-association thought process that randomly generates memories and imaginative experiences can lead you to new ideas and insights."

Yeah, someone should have told my elementary school teachers that.

"Babies' and young children's minds seem to wander constantly, and so we wondered what functions that might serve," said co-author Allison Gopnik.  "Our paper suggests mind-wandering is as much a positive feature of cognition as a quirk and explains something we all experience."

So my tendency to daydream might be a feature, not a bug.  Still, it can be inconvenient at times.  I know there are a lot of things that would be a hell of a lot easier if I could at least control it, like when I'm reading something that's difficult going but that I honestly want to pay attention to and understand.  Even when my intention is to concentrate, it usually doesn't take long for me to realize that my eyes are still tracking across the lines, my fingers are turning pages, but I stopped taking anything in four pages ago and since that time have been imagining what it'd be like to pilot a spaceship through the Great Red Spot.  Then I have to go back and determine when my brain went AWOL -- and start over from there until the next time I go on mental walkabout.

I guess there's one advantage to being an inveterate daydreamer; it's how I come up with a lot of the plots to my novels.  Sometimes my internal imaginary worlds are more vivid than the real world.  However, I do need to re-enter the real world at least long enough to get the story down on paper, and not end up being too distracted to write down the idea I came up with while I was distracted last time.

In any case, I guess I'd better wrap this up, because I'm about at the limits of my concentration.  I'd like to finish this post before my brain goes on walkies and I end up staring out of my office window and wondering if there's life on Proxima Centauri b.  Which I guess is an interesting enough topic, but hardly the one at hand.


Thursday, January 21, 2021

World enough and time

Because I'm writing this in the last hours of the Trump presidency, and my other alternative is to become so anxious about what his followers might still do to fuck things up that I chew my fingernails till they bleed, today I'm going to focus on things that are very, very far from planet Earth.

Let's begin with the closest-to-home, three thousand light years away, which seems like it might be almost far enough for safety.

A new study of planetary nebulae -- the glowing shells of gas and dust shed by dying sun-like stars -- was the subject of a talk at the meeting of the American Astronomical Society last Friday.  Using the Hubble Space Telescope's Wide Field Camera 3, astronomers were able to photograph these amazing stellar remnants panchromatically (across the frequency spectrum of light).  And what they're learning is changing a lot of what we thought we understood.

Take, for example, NGC 6302, better known as the Butterfly Nebula.  It got its name from the symmetrical "wings" of debris thrown out by the dying central star.  This strange symmetry is probably due to the central star's magnetic field, but what's most surprising is that what astronomers thought was the central star doesn't seem to be; it's simply a white dwarf much closer to the Earth that happens to lie between us and the nebula.  Wherever the actual central star is, it's a doozy; from the spectral lines of the nebula, created when light from the star is absorbed and then re-emitted by the dust plumes, its surface is one of the hottest known, at a staggering 250,000 C.  (By comparison, the surface of our own Sun is a paltry 6,000 C or so.)

The Butterfly Nebula [Image is in the Public Domain courtesy of the Hubble Space Telescope and NASA/JPL]

Then there's NGC 7027, the Jewel Bug Nebula, which is also remarkable because of its symmetry -- depending on what feature you're looking at, it shows spherical symmetry (symmetry around the center, like a basketball), axis symmetry (symmetry around a line, like the letter T), or point symmetry (symmetry across a central point, like the letter N).  It's simultaneously one of the brightest planetary nebulae and one of the smallest, and the new study confirms that it's a recently-formed object -- it's only six hundred years old.  (Of course, since it's three thousand light years away, the structure is actually 3,600 years old; but what we're seeing is what it looked like when it was a mere six hundred.)
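That bit of bookkeeping is simple enough to put in a couple of lines of code -- a toy illustration of my own, not anything from the study:

```python
# Light from an object D light-years away took D years to reach us, so what
# we see is a D-year-old snapshot; the object's true age right now is its
# apparent age plus the light-travel time.

def true_age(apparent_age_years: float, distance_ly: float) -> float:
    return apparent_age_years + distance_ly

print(true_age(600, 3000))  # 3600.0 -- the Jewel Bug Nebula's actual age
```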

"We're dissecting [planetary nebulae]," said Joel Kastner, a professor in the Rochester Institute of Technology's Chester F. Carlson Center for Imaging Science and School of Physics and Astronomy.  "We're able to see the effect of the dying central star in how it's shedding and shredding its ejected material.  We're able to see that material that the central star has tossed away is being dominated by ionized gas, where it's dominated by cooler dust, and even how the hot gas is being ionized, whether by the star's UV or by collisions caused by its present, fast winds."

Moving farther afield, another paper presented at the AAS meeting concerns a weird object in NGC 253, the Sculptor Galaxy, 11.4 million light years away.  It's called a magnetar -- a highly magnetized neutron star, the collapsed remnant of a supergiant star.  The Fermi Gamma-ray Space Telescope and the Mars Odyssey orbiter both picked up a 140-millisecond-long pulse of gamma rays that seems to have been caused by a starquake on the surface of this object, a cosmic shudder that in one burst released one thousand trillion trillion (a 1 followed by 27 zeroes) times more energy than the largest earthquake ever recorded on Earth.  The quake ejected a blob of plasma at nearly the speed of light, and that acceleration is what produced the gamma rays.

The new study gives us a lens into the behavior of some of the oddest objects in the universe -- objects that may also be responsible for "fast radio bursts," quick pulses of radio waves whose source has been a mystery up until now.  "The apparent frequency of magnetar flares in other galaxies is similar to the frequency of fast radio bursts," said astrophysicist Victoria Kaspi of the McGill Space Institute.  "That argues that actually, most or all fast radio bursts could be magnetars."

Last, we go out an astonishing thirteen billion light years, only seven hundred million light years shy of the edge of the observable universe (in light-travel terms).  Another paper at the AAS meeting describes a quasar -- an ancient supermassive black hole radiating prodigious energy from infalling material, making it one of the brightest objects known -- that lies at the center of a galaxy and now holds the record for the oldest black hole ever observed.

Like all good scientific discoveries, this one raises almost as many questions as it answers, especially about how such a massive object could have formed so early in the life of the universe.  "A gargantuan seed black hole may have formed through the direct collapse of vast amounts of primordial hydrogen gas," said study co-author Xiaohui Fan, of the University of Arizona in Tucson.  "Or perhaps J0313-1806’s seed started out small, forming through stellar collapse, and black holes can grow a lot faster than scientists think.  Both possibilities exist, but neither is proven.  We have to look much earlier [in the universe] and look for much less massive black holes to see how these things grow."

So that leaves us all the way across the universe, which is a nice comfortable distance to put between myself and the Proud Boys.  It'd be better still to have me stay here and send the Proud Boys out to the farthest reaches of interstellar space, so that their inevitable tweets about what a god-figure Trump is and what a libtard snowflake I am will take thirteen billion years to get here.

But I guess that's not gonna happen.  We all have to stay here and solve our own problems, quasars and magnetars and nebulae notwithstanding.  I'll end with a quote from Doctor Who, which seems apt somehow given the voyage through time and space we just took: "I do think there’s always a way to put things right.  If I didn’t believe that, I wouldn’t get out of bed in the morning; I wouldn’t eat breakfast; I wouldn’t leave the TARDIS, ever.  I would never have left home.  There is always something we can do."


Wednesday, January 20, 2021

The illusion of causality

Fighting bad thinking is an uphill battle, sometimes.  Not only, or even primarily, because there's so much of it out there; the real problem is that our brains are hard-wired to make poor connections, and once those connections are made, to hang on to them like grim death.

A particularly difficult one to overcome is our tendency to fall for the post hoc, ergo propter hoc fallacy -- "after this, therefore because of this."  We assume that if two events are in close proximity in time and space, the first one must have caused the second one.  Dr. Paul Offit, director of the Vaccine Education Center at Children's Hospital of Philadelphia, likes to tell a story about his wife, who is a pediatrician, preparing to give a child a vaccination.  The child had a seizure as she was drawing the vaccine into the syringe.  If the seizure had occurred only a minute later, right after the vaccine was administered, the parents would undoubtedly have thought that the vaccination caused the seizure -- and after that, no power on Earth would have likely convinced them otherwise.

[Image is in the Public Domain courtesy of the NIH]

Why do we do this?  The most reasonable explanation is that in our evolutionary history, forming such connections had significant survival value.  Since it's usual that causes and effects are close together in time and space, wiring in a tendency to decide that all such correspondences are causal is still going to be right more often than not.  But it does lead us onto some thin ice, logic-wise.

Which is bad enough, but consider the study by three researchers -- Ion Yarritu (Deusto University), Helena Matute (University of Bilbao), and David Luque (University of New South Wales) -- showing that our tendency to fall for what they call the "causal illusion" is so powerful that even evidence to the contrary can't fix the error.

In a paper called "The dark side of cognitive illusions: When an illusory belief interferes with the acquisition of evidence-based knowledge," published in the British Journal of Psychology, Yarritu et al. have demonstrated that once we've decided on an explanation for something, it becomes damn near impossible to change.

Their experimental protocol was simple and elegant.  The authors write:
During the first phase of the experiment, one group of participants was induced to develop a strong illusion that a placebo medicine was effective to treat a fictitious disease, whereas another group was induced to develop a weak illusion.  Then, in Phase 2, both groups observed fictitious patients who always took the bogus treatment simultaneously with a second treatment which was effective.  Our results showed that the group who developed the strong illusion about the effectiveness of the bogus treatment during Phase 1 had more difficulties in learning during Phase 2 that the added treatment was effective.
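To get a feel for how a causal illusion can arise from perfectly honest observation, here's a toy simulation in the spirit of these contingency-learning experiments -- my own sketch, with made-up probabilities, not the authors' actual protocol:

```python
import random

random.seed(42)

# Toy simulation of a contingency-learning task: recovery happens at the same
# rate whether or not a (useless) treatment is taken, yet frequent treatment
# plus a high base rate of recovery piles up "treated and recovered" cases
# that *feel* like evidence.  All probabilities are made up for illustration.

n_trials = 1000
p_treated = 0.8   # treatment given on most trials (high cue density)
p_recover = 0.8   # recovery rate, identical in both groups (zero real effect)

counts = {(t, o): 0 for t in ("treated", "untreated") for o in ("yes", "no")}
for _ in range(n_trials):
    treated = "treated" if random.random() < p_treated else "untreated"
    recovered = "yes" if random.random() < p_recover else "no"
    counts[(treated, recovered)] += 1

# The normative measure of contingency:
#   delta-P = P(recovered | treated) - P(recovered | untreated)
a, b = counts[("treated", "yes")], counts[("treated", "no")]
c, d = counts[("untreated", "yes")], counts[("untreated", "no")]
delta_p = a / (a + b) - c / (c + d)

print(f"treated-and-recovered trials: {a} (looks like the drug works)")
print(f"delta-P = {delta_p:+.3f} (hovers near zero: it doesn't)")
```

Run it, and the pile of "treated and recovered" cases looks like compelling evidence, while the statistic that actually measures causal efficacy sits at zero.  That's the trap in miniature.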
The strength of this illusion explains why bogus "alternative medicine" therapies gain such traction.  All it takes is a handful of cases where people use "deer antler spray" and find they have more energy (and no, I'm not making this up) to get the ball rolling.  A friend just told me about someone she knows who has stage four breast cancer.  Asked how her chemo treatment was going, the friend said cheerfully, "Oh, I'm not doing chemo.  I'm treating it with juicing and coffee enemas!  And I feel fine!"

Sadly, she'll "feel fine" until she doesn't anymore, and at that point it'll probably be too late for chemo to help her.

Homeopathy owes a lot to this flaw in our reasoning ability; any symptom abatement that occurs after taking a homeopathic "remedy" clearly would have happened even if the patient had taken nothing -- which is, after all, what (s)he did.

And that's not even considering the placebo effect as a further complicating factor.

Helena Matute, one of the researchers in the recent study, has written extensively about the difficulty of battling causal illusions.  In an article for the online journal Mapping Ignorance, she writes:
Alternative medicine is often promoted on the argument that it can do no harm.  Even though its advocates are aware that its effectiveness has not been scientifically demonstrated, they do believe that it is harmless and therefore it should be used.  "If not alone, you should at least use it in combination with evidence-based treatments," they say, "just in case."  
But this strategy is not without risk... even treatments which are physically innocuous may have serious consequences in our belief system, sometimes with fatal consequences.  When people believe that a bogus treatment works, they may not be able to learn that another treatment, which is really effective, is the cause of their recovery. This finding is important because it shows one of the mechanisms by which people might decide to quit an efficient treatment in favor of a bogus one.
I think this same effect contributes to errors in thinking in a great many other areas.  Consider, for instance, the fact that belief in anthropogenic climate change rises in the summer and falls in the winter.  After being told that human activity is causing the global average temperature to rise, our brains are primed to look out of the window at the snow falling and say, "Nah.  Can't be."

Post hoc, ergo propter hoc.  To quote Stephen Colbert, "Global warming isn't real, because I was cold today.  Also great news: world hunger is over because I just ate."

The study by Yarritu et al. highlights not only the difficulty of fighting incorrect causal connections, but why it is so essential that we do so.  The decision that two things are causally connected is powerful and difficult to reverse; so it's critical that we be aware of this bias in thinking, and watch our own tendency to leap to conclusions.  But even more critical is that we are given reliable evidence to correct our own errors in causality, and that we listen to it.  Like any cognitive bias, we can combat it -- but only if we're willing to admit that we might get it wrong sometimes.

Or as James Randi was fond of saying, "Don't believe everything you think."


Tuesday, January 19, 2021

Ghost wardrobe

Debating silly conjectures endlessly is nothing new.  The claim has long been circulated that the medieval scholastics, for example, conducted learned arguments over how many angels could dance on the head of a pin.  Whether they actually argued over the issue is itself the subject of debate; the earliest iteration of the idea for which we have written evidence seems to be in The Reasons of the Christian Religion by the seventeenth-century Puritan theologian Richard Baxter, wherein he writes:
And Schibler with others, maketh the difference of extension to be this, that Angels can contract their whole substance into one part of space, and therefore have not Extra Partes.  Whereupon it is that the Schoolmen have questioned how many Angels may fit upon the point of a Needle?
Which I think we can agree is equally silly.  Given that no one has actually conducted a scientific examination of an angel, determining whether they have Extra Partes is kind of a waste of time.

Although you may recall that Alan Rickman as the Angel Metatron in Dogma made a significant point about angels not having genitalia.  Whether that's admissible as evidence, however, is dubious at best.



So there's a good bit of precedent for people wasting inordinate amounts of time arguing over questions there's no way to settle.  Which is why I have to admit to rolling my eyes more than once over the article by Stephen Wagner, "Paranormal Phenomena Expert," called "Why Are Ghosts Seen Wearing Clothes?"

It was, I will admit, a question I'd never considered.  If the soul survives, and some souls decide not to go on to their Eternal Reward but to hang around here on Earth to bother the living, you have to wonder why their clothes came along with them.  Clothes, I would imagine, have no souls themselves, so the idea that you're seeing the Undying Spirit of grandpa's seersucker jacket is kind of ridiculous.

Be that as it may, most ghosts are seen fully clothed.  There are exceptions; in 2011 a woman in Cleveland claimed to have captured video of two naked ghosts having sex.  But I think we have to admit that such afterlife in flagrante delicto is pretty uncommon.

Wagner spoke with some ghost hunters, and it turns out that a variety of explanations have been offered.  Troy Taylor, of the American Ghost Society (did you know there was an American Ghost Society?  I didn't), said that ghosts are seen clothed because a haunting is the replaying of a deceased spirit's visualization of itself, and we usually don't picture ourselves in the nude.

On the other hand, Stacey Jones, who calls herself the "Ghost Cop," says that ghosts can project themselves any way they want to.  So what they're doing is creating an image of themselves that has the effect they're after, whether it is eliciting fear, pity, sympathy, or a desire for revenge.  Does that mean that Anne Boleyn, for example, could wander around the Tower of London wearing a bunny suit if she wanted to?  You'd think that she'd be mighty bored after nearly five centuries of stalking around with her Head Tucked Underneath Her Arm, and would be ready for a change.

Ghost hunters Richard and Debbie Senate were even more terse about the whole thing.  It's a "gotcha question," they say.  But if pressed, they'd have to say that "Ghosts appear as wearing clothes because that's how they appear to us."  Which I think we can all agree is unimpeachable logic.

I find it pretty amusing that this is even a topic for debate.  Shouldn't we be more concerned about finding scientifically-sound evidence that ghosts exist, rather than fretting over whether we get to take our wardrobe with us into the next world?  As I've said more than once, I am completely agnostic about the afterlife; I simply don't know.  I find some stories of near-death experiences and hauntings intriguing, but I've never found anything that has made me come down on one or the other side of the debate with any kind of certainty.  I'll find out one way or the other at some point no matter what, and if I haven't figured it out before then, I'm content to wait.

So I suppose this falls into the "No Harm If It Amuses You" department.  But it does raise the question of what kind of clothes I want to bring with me if it turns out you do get to choose.  If I end up haunting somewhere nice and tropical -- certainly my preference -- all I'll need is a pair of swim trunks.  On the other hand, if I'm stuck here in upstate New York, which seems more likely, I want my winter jacket, wool scarf, hat, and gloves.

Unless my spirit getting stuck here in perpetuity, with no cold-weather gear, is because I've been sent to hell by the powers-that-be.  Which unfortunately also seems fairly likely.
