Skeptophilia (skep-to-fil-i-a) (n.) - the love of logical thought, skepticism, and thinking critically. Being an exploration of the applications of skeptical thinking to the world at large, with periodic excursions into linguistics, music, politics, cryptozoology, and why people keep seeing the face of Jesus on grilled cheese sandwiches.

Monday, May 8, 2023

The dying of the light

In the brilliant, funny, thought-provoking, and often poignant television series The Good Place, a character named Simone, who is an Australian neuroscientist, ends up in heaven (the titular "Good Place") and flatly refuses to believe it.

The whole thing, she claims, is merely a hallucination cooked up by her dying, oxygen-starved brain.  That she died (or was in the process of it), she could believe; but knowing what she does about neurophysiology, it is simply impossible for her to accept that what she is seeing is real.

The more you know about the brain and its sensory/perceptual system, the easier it is to understand how an actual neuroscientist would come to that conclusion.  As we've seen here at Skeptophilia a good many times, what we perceive is fragmentary and inaccurate, and that's even while we're alive, wide awake, and all the relevant organs are in good working order.  As astrophysicist Neil deGrasse Tyson put it, all too accurately, "The human brain is rife with ways of getting it wrong."

Oh, it works well enough most of the time.  We wouldn't have survived long otherwise.  But to assume that what you're perceiving, and (even worse) what you remember perceiving, is at all complete and accurate is simply false.

It gets even dicier when things start to go wrong.  Which was why I was so fascinated by a study from the University of Michigan, published a couple of weeks ago in Proceedings of the National Academy of Sciences, that looked at EEG traces from comatose patients who had experienced cardiac arrest and died.  The researchers found that as the patients died, their brains showed a surge of activity in the regions associated with consciousness and perception.

Gamma wave activity -- associated with awareness -- spiked, as did signaling at the junction of the temporal, occipital, and parietal lobes of the cerebrum.  This area is correlated with dreaming, hallucination, and other altered states of consciousness, and the high activity there might be an explanation for the commonalities in near-death experiences, like the familiar "tunnel of light" that has been reported hundreds of times.

This story was reported in a lot of popular media as providing support for claims that "your life flashes before your eyes" as you die, but that seems to me to be a significant stretch.  For one thing, the study was small -- only four individuals -- which is understandable given the specificity of the criteria.  For another, the spike of activity in the temporal-occipital-parietal junction is correlated with altered states of consciousness, but it doesn't tell us what these people were actually experiencing.  And we can't ask them about it, because they're dead.

[Image from Punch, 1858, is in the Public Domain]

So what this says about the experience of dying is in the category of "interesting but very preliminary," and what it says about the possibility of an afterlife is "nothing."  My guess is that people who already disbelieve in an afterlife will, like Simone, add this to the evidence against, and people who already believe in it will add it to the evidence in favor.  In reality, of course, the new study only looks at the threshold of death, not what happens after it occurs.  I'm still agnostic about an afterlife, myself.  I recently read an article written by Stafford Betty, professor emeritus of religious studies at California State University - Bakersfield, who stated that survival after death was "a near certainty" and that doubters are simply ignoring a mountain of evidence.  "They are so dug into their materialist worldview," Betty writes, "that they refuse to investigate research that contradicts it.  They are afraid of getting entangled in a worldview, often religiously based, that belongs to a past they 'outgrew.'"

Well, maybe.  I've read a lot of the research, and I don't think it's as clear-cut as all that, nor is my skepticism due to my clinging to materialism or a fear of getting trapped in religion.  In fact, I can say without hesitation that if I found out there was an afterlife, I'd be pretty thrilled about it.  (Some afterlives, anyway.  I'm not so fond of the ones where you're tortured for eternity.  But Valhalla, for example, sounds badass.)  It's more that the evidence I've seen doesn't reach a level of rigor I find convincing.

But I'm certainly open to the idea.  Like I said, the other option, which is simply ceasing to be, isn't super appealing.

Anyhow, the University of Michigan paper is fascinating, and gives us a unique lens into the experience of dying.  It's the one thing that unites us all, isn't it?  We'll all go through it eventually.  It reminds me of the passage from my novel The Communion of Shadows, where the main characters are discussing the fear of death:

“Aren’t you scared?” came T-Joe’s voice from behind him, after a moment’s silence.

“Scared? A little.”  Leandre paused.  “It’s like when I was a child, and I used to climb an oak tree that leaned out over the bayou.  You’re there, hunched on the branch, nothing but the empty air between your naked body and the water’s surface.  It looks like it’s a hundred feet down.  You think, ‘I can’t do it.  I can’t jump.’  Your hands cling to the branch, your heart is pounding, you’re dripping sweat.  You know once you jump it’ll be all right, you’ll swim to shore and in a moment be ready to do it again.  But in that instant, it seems impossible.”  He paused, giving a lazy swat at a mosquito.  “I’m once again that skinny little boy in the tree, looking down at the bayou, and thinking I’ll never have the courage to leap.  I know I can do it, and that it’ll be okay.  Think of all the people who have passed these gates, endured whatever death is and gone on to what awaits us beyond this world.”  He turned around with a broad smile on his face.  “If they can do it, so can I.”

****************************************



Saturday, May 6, 2023

Resurrecting a fossil

A year ago I wrote about linguistic isolates -- single languages, or small clusters of related languages, that have no apparent relation to any other language on Earth.  The problem, of course, is that being spoken by only a small number of people, these are some of the most endangered languages.  There are many of them for which the last native speakers are already elderly, and there's a high risk of their going extinct without ever being thoroughly studied.

And as I pointed out in my post, the sad part of that is that each one of those is a lens into a specific culture and a particular way of thinking.  Once lost, they're gone forever, or only exist in scattered remnants, like the fossils of extinct animals.  What you can reconstruct from these relics is perhaps better than nothing, but still, there's always an elegiac sense of what we've lost, and what we're still losing.

This topic comes up because of an article in Smithsonian sent to me by a friend and frequent contributor of topics to Skeptophilia, about some linguists who are trying to reconstruct the extinct indigenous Timucuan language of northern Florida.  Timucuan was a linguistic isolate, apparently unrelated to the languages spoken by neighboring groups (such as the Seminole, Muscogee, and Choctaw).  The Timucua people, who at the time of European contact in 1595 numbered an estimated 200,000 across 35 chiefdoms, each speaking a different dialect, were decimated by war and by diseases like smallpox.  By 1700, there were only about a thousand Timucuans left, and the slave trade eradicated those few survivors.  A genetic study is currently underway to see if some populations in Cuba might be descendants of the Timucuans, but so far the results are inconclusive.

This would just be another in the long list of complete and irretrievable cultural loss, if it weren't for the efforts of linguists Alejandra Dubcovsky and Aaron Broadwell.  Working with a handful of letters written in Timucuan (using the Latin alphabet), and a rather amazing bilingual document by Spanish missionary Francisco Pareja called Confessions in the Castilian and Timucua Language, With Some Tips to Encourage the Penitent, they have assembled the first Timucuan dictionary and grammar, and reconstructed how a long-gone people spoke.

A page from the Confessions, with Spanish on the left and Timucuan on the right [Image courtesy of the John Carter Brown Library, Brown University]

Which is incredibly cool, but there's also a wryly amusing side to it, because with Dubcovsky's and Broadwell's knowledge of the Timucuan language, they're able to compare what Pareja wanted the translators to say with what they actually did say.  "Our favorite is the description of marriage," Dubcovsky said.  "The Spanish side asks very clearly, 'Have the man and a woman been joined together in front of a priest?'  And the Timucua version of that sentence is, 'Did you and another person consent to be married?'  The Timucua translation not only takes out any mention of gender, but it also removes any mention of a religious officiant.  A priest did not write this, because a priest does not forget to include himself in the story."

So the Confessions document is not only a Rosetta Stone for Timucuan, it gives us a fascinating window into how the Timucuan translators saw the Spanish Catholic culture that was being imposed upon them.

It's tragic that this language and its people were so thoughtlessly (and ruthlessly) eradicated; worse still that such tragedies are all too common.  So it's all the more important that people like Dubcovsky and Broadwell work to resurrect these extinct languages from the scant fossils they left behind.  It can't ever repair the damage that was done, but it at least allows us to glimpse the minds of an extinct culture -- and to honor their memory in whatever way we can.

****************************************



Friday, May 5, 2023

Rough neighborhood

In keeping with the stargazing topics that have been our focus this week, today we're going to start with my favorite naked-eye astronomical object: the Pleiades.

[Image is in the Public Domain courtesy of NASA/JPL]

It's also known as the Seven Sisters; in Greek mythology, the seven brightest stars (about all you can see without a telescope, even if you have good vision) represented the seven daughters of the Titan Atlas and the Oceanid nymph Pleione.  Where I live they're visible in the winter; I love seeing them glittering in the black sky on cold, clear nights.

The Pleiades are mostly hot type-B stars, and the whole group is about 444 light years from Earth, making it one of the closest star clusters.  Stars of this class are so energetic that they have relatively short life spans.  It's estimated that the Pleiades formed about a hundred million years ago from a cloud of gas and dust similar to the Orion Nebula; already the energy output of the individual stars is blowing away the shroud of material from which they were formed, resulting in the halo-like "reflection nebulae" you see surrounding them.

They're also moving away from each other, leaving the "stellar nursery" in which they were born.  In another couple of hundred million years, they will have separated widely enough that future astronomers (assuming there are any around) will have no obvious way to know they started out in the same region of space.  By then, the biggest and brightest of them will already be approaching the ends of their lives, each destined to explode in the violent cataclysm of a supernova, leaving behind a rapidly rotating stellar remnant called a neutron star, spinning like a lighthouse beacon to mark the spot where a star died.

The reason all this comes up is some recent research into the composition of the stellar nursery where the Sun formed.  Because it, after all, was born the same way; along with a number of siblings, it coalesced in a massive cloud of hydrogen and helium, with a few heavier elements thrown in as well.  When you look up into the night sky, any of the stars you see could be one of the Sun's sibs.  It's impossible, from where science currently stands, to tell which ones.  They've all undoubtedly traveled a long way away from their point of origin in the 4.6 billion years since they formed.

But the research, which appeared in the Monthly Notices of the Royal Astronomical Society, uncovered a bit more about what our star's stellar nursery was like.  These nurseries do differ significantly -- some are small and quiet, with only enough material to form a few stars, while others are enormous and violently active (such as the aforementioned Orion Nebula).  In particular, models of stellar formation suggest that these two kinds of environments would leave behind different quantities of heavier elements like aluminum and iron.  By measuring the amounts of these elements in meteorite fragments thought to be leftover material from the formation of the Solar System, the researchers concluded that the Sun formed in an intense, high-energy environment like the Orion Nebula, swept by gales of dust and hammered by the shock waves of supernovae.

What a sight that would have been.  (From a safe distance.)

So next time you see the Pleiades or Orion's Belt, think about the fact that our calm and stable home star was born in a rough neighborhood.  Lucky for us, it's grown up and settled down a little.  As beautiful as the Pleiades are, I don't think I'd fancy living there.

****************************************



Thursday, May 4, 2023

Blowing bubbles

After Monday's post, about the bizarre hypergiant star Stephenson 2-18, a reader commented, "If you think that's weird, look up 'Fermi bubbles.'"

So I did.  And... yeah.

Discovered back in 2010, the Fermi bubbles -- so named because they were discovered by NASA's Fermi Gamma-ray Telescope -- are a pair of nearly perfectly symmetrical bubbles of high-intensity gamma rays positioned above and below the galactic plane of the Milky Way.  They're huge; each one has a diameter of about 23,000 light years.

False-color image of the Fermi bubbles.  The Milky Way is seen edge-on, running across the middle of the photograph.  [Image is in the Public Domain courtesy of NASA/Goddard Space Flight Center]

Back in 2015, the Fermi bubbles were still completely unexplained, and in fact made #1 in Astronomy magazine's list of "The Fifty Weirdest Objects in the Universe."  That they had something to do with Sagittarius A*, the enormous black hole at the center of the galaxy, seemed like a reasonable guess; but what could create something with such a peculiar figure-eight shape was unknown.

A team led by astrophysicist Rongmon Bordoloi of the Massachusetts Institute of Technology, however, has a model to explain them.  Something around nine million years ago -- not really that far back, in the grand scheme of things -- Sagittarius A* pulled in an enormous cloud of gas and dust.  The origin of that dust cloud is uncertain, but what happened after it got caught is all too clear.  Most of it undoubtedly took the one-way trip past the event horizon, but some of it was spun so fast by the black hole's rotation and the resultant twisting of space-time that it gained enough momentum to escape along Sagittarius A*'s spin axis -- i.e., perpendicular to the galactic plane.

This not only accelerated the gas to an unimaginable two million miles an hour, it heated it -- at its edges to just shy of ten thousand degrees C, and near the point of outflow to almost ten million degrees.  It's this heating that caused it to produce gamma rays, which is how the structure was detected.

Not a phenomenon you'd want to be standing in the way of.

"We have traced the outflows of other galaxies, but we have never been able to actually map the motion of the gas," Bordoloi said, somehow resisting adding, and holy shit, this thing is amazing.  "The only reason we could do it here is because we are inside the Milky Way.  This vantage point gives us a front-row seat to map out the kinematic structure of the Milky Way outflow."

And, along the way, to figure out what's going on with the number one Weirdest Object in the Universe.  Having an explanation doesn't make it any less impressive, of course.  Gas at a temperature of ten million degrees being flung about at two million miles per hour by a ginormous black hole isn't exactly a cause for a shoulder-shrug.

Besides, there are forty-nine more weird objects (at least) left to explain.  If you're into science, it means you'll never be bored.

****************************************



Wednesday, May 3, 2023

The mind readers

In Isaac Asimov's deservedly famous short story "All the Troubles of the World," the megacomputer Multivac has so much data on each person in the world (including detailed brain scans) that it can predict ahead of time if someone is going to commit a crime.  This allows authorities to take appropriate measures -- defined, of course, in their own terms -- to prevent it from happening.

We took a step toward Asimov's dystopian vision, in which nothing you think is secret, with a paper this week in Nature Neuroscience about a new invention called a "brain activity decoder."

Developed by a team of researchers at the University of Texas at Austin, the software uses an fMRI machine to measure the neural activity in a person's brain, and is able to convert that neural activity into a continuous stream of text -- i.e., the output is what the person was thinking.

The researchers had volunteers listening to podcasts over headphones while the fMRI watched how their brains responded.  This allowed them to compare the actual text the test subjects were hearing with what the brain activity decoder picked up from them.  After only a short span of training the software, the results were scary good.  One listener heard, "I don't have my driver's license yet," and the decoder generated the output "She has not even started to learn to drive yet."  Another had the input, "I didn’t know whether to scream, cry or run away. Instead, I said, 'Leave me alone!'", which resulted in the output, "Started to scream and cry, and then she just said, 'I told you to leave me alone.'"

Not perfect, but as a proof-of-concept, it's jaw-dropping.

[Image licensed under the Creative Commons © Nevit Dilmen, Brain MRI 131058 rgbca, CC BY-SA 3.0]

The researchers touted its possible use for people who have lost the ability to communicate, in situations like locked-in syndrome.  However, I don't think it takes an overactive imagination to come up with ways such a device could be abused.  What would happen to the concept of privacy, if a machine could read your thoughts?  What about the Fifth Amendment right against self-incrimination?  Like in Asimov's story, how could the authorities separate what a person had done from what they were contemplating doing?

Or would they?

Jerry Tang, who led the research, emphasizes that the decoder had to be trained on the person whose thoughts were going to be read; if it were trained on me, it couldn't immediately be used to figure out what you were thinking.  My response to that is: yet.  This is already leaps and bounds past previous attempts at thought-reading, which were only able to output single words and short sentences.  Given more time and further refinements, this technique will only get better.

Or scarier, as the case may be.

Tang also pointed out that even with improvements, the software would be defeated by someone putting up resistance (e.g., deliberately thinking other things to block the fMRI from getting the correct output).  He also is aware of the possibility of abuse.  "We take very seriously the concerns that it could be used for bad purposes and have worked to avoid that," he said.  "We want to make sure people only use these types of technologies when they want to and that it helps them."

Well, maybe.  I'm not a conspiracy-theory type, nor someone who thinks that all government is inherently bad.  Here, though, it seems like the potential for Orwellian thought-crime is a short step away.

Keep in mind, too, how generally inaccurate our brain's storage system is.  As we've seen over and over here at Skeptophilia, what we remember is an amalgam of what actually happened, what we were told happened, what we imagine happened, and a good dollop of falsehood.  False memories can be as convincingly real as accurate ones.  If the brain activity decoder were used on an unwilling person to extract his/her thoughts, there is no guarantee that the output would be at all reflective of reality.  In fact, it's almost certain not to be.

But since eyewitness testimony -- in other words, recall -- is considered one of the highest forms of evidence in a court of law, it's no stretch to wonder if a person's thoughts would be given the same undeserved weight.

I'm not sure what the right step is, honestly.  There are some who believe that a potential for misuse shouldn't stop scientific progress; anything, they argue, can be used for harm.  Others feel like the hazards can sometimes outweigh the benefits, and trusting the powers-that-be to do the right thing with technology this powerful is foolish.

I don't have an answer.  But I will say that my mind was forced back to the prescient quote from another seminal science fiction writer, Michael Crichton: "Scientists are preoccupied with accomplishment.  So they are focused on whether they can do something.  They never stop to ask if they should."

****************************************



Tuesday, May 2, 2023

Off the chart

Way back around 1910, Danish astronomer Ejnar Hertzsprung and American astronomer Henry Norris Russell independently noticed a curious pattern when they plotted stars' luminosities against their temperatures.

The graph, now called the Hertzsprung-Russell Diagram in their honor, looks like this:

[Image licensed under the Creative Commons Richard Powell, HRDiagram, CC BY-SA 2.5]

Most stars fall on the bright swath running from the hot, bright stars in the upper left to the cool, dim stars in the lower right; the overall trend for these stars is that the lower the temperature, the lower the luminosity.  Stars like this are called main-sequence stars.  (If you're curious, the letter designations along the top -- O, B, A, F, G, K, and M -- refer to the spectral class the star belongs to.  These classifications were the invention of the brilliant astronomer Annie Jump Cannon, whose work in stellar spectroscopy revolutionized our understanding of stellar evolution.)

There is also a sizable cluster of stars off to the upper right -- relatively low temperatures but very high luminosities.  These are giants and supergiants.  In the other corner are white dwarfs, the exposed cores of dead stars, with very high temperatures but low luminosity; as they gradually cool, they slip downward and to the right, and finally go dark.

So there you have it; just about every star in the universe is either a main-sequence star, in the cluster with the giants and supergiants, or in the curved streak of dwarf stars at the bottom of the diagram.

Emphasis on the words "just about."

One star that challenges what we know about stellar evolution is the bizarre Stephenson 2-18, which lies in the small, dim constellation Scutum ("the shield"), between Aquila and Sagittarius.  At an apparent magnitude of +15, it is only visible through a powerful telescope; it wasn't discovered until 1990, by American astronomer Charles Bruce Stephenson, after whom it is named.

Its appearance, a dim red point of light, hides how weird this thing actually is.

When Stephenson first analyzed it, he initially thought what he was coming up with couldn't possibly be correct.  For one thing, it is insanely bright, estimated at a hundred thousand times the Sun's luminosity.  Only its distance (19,000 light years) and some intervening dust clouds make it look dim.  Secondly, it's enormous.  No, really, you have no idea how big it is.  If you put Stephenson 2-18 where the Sun is, its outer edge would be somewhere near the orbit of Saturn.  You, right now, would be inside the star.  Ten billion Suns would fit inside Stephenson 2-18. 

If a photon of light circumnavigated the surface of the Sun, it would take a bit less than fifteen seconds.  To circle Stephenson 2-18 would take nine hours.
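Those figures are easy to sanity-check with back-of-the-envelope arithmetic.  Here's a minimal sketch in Python -- the roughly 2,150-solar-radius figure for Stephenson 2-18 is an assumption on my part (the commonly quoted estimate; exact values vary by study):

```python
import math

C = 299_792_458        # speed of light, m/s
R_SUN = 6.957e8        # solar radius, m
N = 2150               # assumed radius of Stephenson 2-18, in solar radii

# Time for light to travel one full circumference (ignoring any relativistic effects)
t_sun = 2 * math.pi * R_SUN / C               # seconds
t_star = 2 * math.pi * N * R_SUN / C / 3600   # hours

print(f"Around the Sun: {t_sun:.1f} s")            # ~14.6 s -- "a bit less than fifteen seconds"
print(f"Around Stephenson 2-18: {t_star:.1f} h")   # ~8.7 h -- the "nine hours" in the text

# Volume ratio: how many Suns would fit inside?
print(f"Suns that fit inside: {N**3:.2e}")         # ~9.9e9 -- about ten billion
```

The cube of the radius ratio is what makes the "ten billion Suns" claim work: 2,150 cubed is just shy of ten billion.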

This puts Stephenson 2-18 almost off the Hertzsprung-Russell Diagram -- it sits in the extreme upper right corner.  In fact, it's larger than what stellar evolution says should be possible; the current model predicts the largest stars to have radii no more than 1,500 times that of the Sun, and this behemoth's radius is over 2,000 times the Sun's.

Astronomers admit that this could have a simple explanation -- it's possible that the measurements of Stephenson 2-18 are overestimates.  But if not, there's something significant about stellar evolution we're not understanding.

Either way, this is one interesting object.

There's also a question about what Stephenson 2-18 will do next.  Astrophysicists suspect it might be about to blow off its outer layers and turn either into a luminous blue variable or a Wolf-Rayet star (the latter are so weird and violent I wrote about them here a while back).  So it may not be done astonishing us.

Puts me in mind of the quote from Richard Dawkins: "The feeling of awed wonder that science can give us is one of the highest experiences of which the human psyche is capable."

****************************************



Monday, May 1, 2023

The kludge factory

Know what a kludge is?

Coined by writer Jackson Granholm in 1962, a kludge is "an ill-assorted collection of poorly-matching parts, forming a distressing whole."  Usually created when a person is faced with fixing something and lacks (1) the correct parts, (2) the technical expertise to do it right, or (3) both, kludges fall into the "it works well enough for the time being" category.

[Image licensed under the Creative Commons Zoedovemany, Screen Shot 2015-11-19 at 11.54.48 AM, CC BY-SA 4.0]

Evolution is essentially a giant kludge factory.

At its heart, it's the "law of whatever works."  It's why the people who advocate Intelligent Design Creationism always give me a chuckle -- because if you know anything about biology, "intelligently designed" is the last thing a lot of it is.  Here are a few examples:

  • Animals without hind legs -- notably whales and many snakes -- that have vestigial hind leg bones.
  • Primates are some of the only mammals that cannot synthesize their own vitamin C -- yet we still carry the gene for making it.  It just doesn't work because it has a busted promoter.
  • Human sinuses.  Yeah, you allergy sufferers know exactly what I'm saying.
  • The recurrent laryngeal nerve in fish follows a fairly direct path, from the brain past the heart to the gills.  However, when fish evolved into land-dwelling forms and their anatomy changed -- their necks lengthening and their hearts moving lower into the body -- the recurrent laryngeal nerve got snagged on the circulatory system and had to lengthen as its path became more and more circuitous.  Now, in giraffes (for example), rather than going from the brain directly to the larynx, it goes right past its destination, loops under the heart, and then back up the neck to the larynx -- a distance of almost five meters.
  • Our curved lower spines were clearly not "designed" to support a vertically-oriented body.  Have you ever seen a weight-bearing column with an s-bend?  No wonder so many of us develop lower back issues.
  • One of the kludgiest of kludges is the male genitourinary tract.  Not only does the vas deferens loop way upward from the testicles (not quite as far as the giraffe's laryngeal nerve, admittedly), along the way it joins the urethra to form a single tube through the penis, something about which a friend of mine quipped, "There's intelligent design for you.  Routing a sewer pipe through a playground."  It also passes right through the prostate, a structure notorious for getting enlarged in older guys.  C'mon, God, you can do better than that.

The reason all this comes up is that the kludging goes all the way down to the molecular level.  A study by a team at Yale, Harvard, and MIT, which appeared last week in the journal Science, found that when you compare the human genome to those of our nearest relatives, one of the most significant differences is that our DNA has deleted sections.

That's right; some of why humans are human comes from genes that got knocked out in our ancestors.

The researchers found that there are about ten thousand bits of DNA, a lot of them consisting only of a couple of base pairs, that chimps and bonobos have and we don't.  A lot of these genetic losses were in regions involved in cognition, speech, and the development of the nervous system, all areas in which our differences are the most obvious.

The reason seems to have to do with gene switching.  Deleting a bit of switch that is intended to shut a gene off can leave the gene functioning for longer, with profound consequences.  Often these consequences are bad, of course.  There are some types of cancer (notably retinoblastoma) that are caused by a developmental gene having a faulty set of brakes.

But sometimes these changes in developmental patterns have a positive result, and therefore a selective advantage -- and we may owe our large brains and capacity for speech to kludgy switches.

"Often we think new biological functions must require new pieces of DNA, but this work shows us that deleting genetic code can result in profound consequences for traits [that] make us unique as a species," said Steven Reilly, senior author of the paper.  "The deletion of this genetic information can have an effect that is the equivalent of removing three characters -- n't -- from the word isn't to create the new word is...  [Such deletions] can tweak the meaning of the instructions of how to make a human slightly, helping explain our bigger brains and complex cognition."

So yet another nail in the coffin of Intelligent Design Creationism, if you needed one.  Of course, I doubt it will convince anyone who wasn't already convinced; as I've observed more than once, you can't logic your way out of a belief you didn't logic your way into.

But at least it's good to know the science is unequivocal.  And, as astrophysicist Neil deGrasse Tyson said, "The wonderful thing about science is that it's true whether or not you believe in it."

****************************************