Skeptophilia (skep-to-fil-i-a) (n.) - the love of logical thought, skepticism, and thinking critically. Being an exploration of the applications of skeptical thinking to the world at large, with periodic excursions into linguistics, music, politics, cryptozoology, and why people keep seeing the face of Jesus on grilled cheese sandwiches.

Thursday, February 29, 2024

The dying of the light

In July of 2004, my father died.  I was at his bedside in Our Lady of Lourdes General Hospital in Lafayette, Louisiana when it happened.  He'd been declining for a while -- his once razor-sharp mental faculties slipping into a vague cloudiness, his gait becoming halting and cautious, his former rapier wit completely gone.  The most heartbreaking thing was his own awareness of what he had lost and would continue to lose.  It looked like a slow slide into debility.

Then, in June, he had what the doctors described as a mini-stroke.  Afterward, he was still fairly lucid, but was having trouble walking.  It had long been his deepest fear (one I share) that he'd become completely dependent on others for his care, and it was obvious to us (and probably to him as well) that this was the direction things were going.

What happened next was described in three words by my mother: "He gave up."

Despite the fact that the doctors could find no obvious direct cause of it, his systems one by one started to shut down.  Three weeks after the mini-stroke and fall that precipitated his admission into the hospital, he died at age 83.

I had never been with someone as they died before (and haven't since).  I was out of state when my beloved grandma died in 1986; and when my mother died, eight months after my father, it was so sudden I didn't have time to get there.  But I was by my father's side as his breathing slowed and finally stopped.  The event itself wasn't at all dramatic; the transition between life and death was subtle, gentle, and peaceful.  However wrenching it was on my mother and me, for him there seemed to be hardly a boundary between "here" and "not here."

Of course, I'm judging that from the outside.  No one knows -- no one can know -- what the experience was like for him.  It's funny, really; death is one of the experiences that unites us as human, and one which we all will ultimately share, but none of us knows what it actually is.

Noël LeMire, La Mort et le Mourant (ca. 1770) [Image is in the Public Domain]

A study in the journal Frontiers in Aging Neuroscience, though, may give us the first clue as to what the experience is like.  An 87-year-old Canadian epilepsy patient was set up for an electroencephalogram to try to pinpoint what was causing his seizures when he unexpectedly had a severe heart attack.  The man was under a DNR (Do Not Resuscitate) order, so when his heart stopped beating, they let him die...

... but he was still hooked up to the EEG.

This gave doctors a first glimpse into what happens in the brain as someone dies.  What they found was a sudden increase in activity in the parts of the brain involved in memory, recall, and dreaming -- activity that lasted for thirty seconds after his heart stopped, then gradually faded.

"Through generating oscillations involved in memory retrieval, the brain may be playing a last recall of important life events just before we die, similar to the ones reported in near-death experiences," said Ajmal Zemmar, a neurosurgeon who was the study's lead author.  "As a neurosurgeon, I deal with loss at times.  It is indescribably difficult to deliver the news of death to distraught family members.  Something we may learn from this research is that although our loved ones have their eyes closed and are ready to leave us to rest, their brains may be replaying some of the nicest moments they experienced in their lives."

Which is a pleasant thought.  Many of us -- even, for some reason, the devoutly religious, who you'd think would be positively eager for the experience -- are afraid of death.  Me, I'm not looking forward to it; I rather like being alive, and as a de facto atheist I have no particular expectation that there'll be anything afterwards.  Being with my father as he died did, however, have the effect of making me less afraid of death.  The usual lead-up, with its frequent pain and debility and illness, is still deeply terrifying to me, but crossing the boundary itself seemed fairly peaceful.

And the idea that our brains give us one last go-through of our pleasant memories is kind of nice.  I know that this single patient's EEG is hardly conclusive -- and it's unlikely there'll be many other people hooked up to a brain scanner as they die -- but it does give some comfort that perhaps, this experience we will all share someday isn't as awful as we might fear.

****************************************



Friday, January 22, 2021

The mental walkabout

I don't know about you, but I have a real problem with my mind wandering.

It's not a new thing.  I can remember getting grief for daydreaming back when I was in grade school.  I'd be sitting in class, trying my damnedest to concentrate on transitive verbs or the Franco-Prussian War or whatnot, but my gaze would drift off to some point in the middle distance, my auditory processing centers would switch from "external input" to "internal input" mode, and in under a minute I'd be out in interstellar space or roaming around Valhalla with Odin and the Boys or rowing a boat down the Amazon River.

Until the teacher would interrupt my reverie with some irrelevant comment like, "Gordon!  Pay attention!  Why don't you tell us how to find x in the equation 5x - 9 = 36?"  I was usually able to refrain from saying what came to mind -- namely, that she was the one who lost x in the first place, and it was hardly my responsibility to find it -- and to pull myself together enough to play along and give her a real answer.

I never outgrew the tendency (either to daydreaming or to giving authority figures sarcastic retorts).  It plagued me all through college and beyond, and during my teaching career I remember dreading faculty meetings because I knew that five minutes in I'd be doodling on the agenda despite my vain attempt to be a Good Boy and pay attention.  It's part of how I developed my own teaching style; a mentor teacher told me early on that teaching was 25% content knowledge and 75% theater, and I took that to heart.  I tried to lecture in a way that kept students wondering what the hell I was going to say or do next, because I know that's about the only thing that kept me engaged when I was the one sitting at a student's desk and someone else was in front of the room.

One amusing case in point -- Dr. Cusimano, who taught a British History elective I took as a senior in college.  He was notorious for working puns and jokes into his lectures, and doing it so smoothly and with such a straight face that if you weren't paying attention, it could slip right past you.  I recall early in the course, when he was talking about the fall of the Roman Empire, Dr. Cusimano said, "During that time, what was left of the Roman Empire was invaded by a series of Germanic tribal leaders -- there was Alaric, King of the Visigoths; Gunderic, King of the Vandals; Oscar Mayer, King of the Franks..."

I'd bet cold hard cash there were students in the class who wrote that down and only erased it when one by one, their classmates caught on and started laughing.

I never daydreamed in Dr. Cusimano's class.

Edward Harrison May, Daydreaming (1876) [Image is in the Public Domain]

Anyhow, all of this comes up because of a study out of the University of California - Berkeley that appeared this week in Proceedings of the National Academy of Sciences.  Entitled "Distinct Electrophysiological Signatures of Task-Unrelated and Dynamic Thoughts," by Julia W. Y. Kam, Zachary C. Irving, Caitlin Mills, Shawn Patel, Alison Gopnik, and Robert T. Knight, the paper takes the fascinating angle of analyzing the electroencephalogram (EEG) output of test subjects when they were focused on the task at hand, focused on something unrelated, or simply wandering from topic to topic -- what the authors call "dynamic thought," the sort of random free association my brain spends a significant portion of its time playing at.

The authors write:

Humans spend much of their lives engaging with their internal train of thoughts.  Traditionally, research focused on whether or not these thoughts are related to ongoing tasks, and has identified reliable and distinct behavioral and neural correlates of task-unrelated and task-related thought.  A recent theoretical framework highlighted a different aspect of thinking—how it dynamically moves between topics.  However, the neural correlates of such thought dynamics are unknown. The current study aimed to determine the electrophysiological signatures of these dynamics by recording electroencephalogram (EEG) while participants performed an attention task and periodically answered thought-sampling questions about whether their thoughts were 1) task-unrelated, 2) freely moving, 3) deliberately constrained, and 4) automatically constrained...  Our findings indicate distinct electrophysiological patterns associated with task-unrelated and dynamic thoughts, suggesting these neural measures capture the heterogeneity of our ongoing thoughts.

"If you focus all the time on your goals, you can miss important information," said study co-author Zachary Irving, in an interview with Science Direct.  "And so, having a free-association thought process that randomly generates memories and imaginative experiences can lead you to new ideas and insights."

Yeah, someone should have told my elementary school teachers that.

"Babies' and young children's minds seem to wander constantly, and so we wondered what functions that might serve," said co-author Alison Gopnik.  "Our paper suggests mind-wandering is as much a positive feature of cognition as a quirk and explains something we all experience."

So my tendency to daydream might be a feature, not a bug.  Still, it can be inconvenient at times.  I know there are a lot of things that would be a hell of a lot easier if I could at least control it, like when I'm reading something that's difficult going but that I honestly want to pay attention to and understand.  Even when my intention is to concentrate, it usually doesn't take long for me to realize that my eyes are still tracking across the lines, my fingers are turning pages, but I stopped taking anything in four pages ago and since that time have been imagining what it'd be like to pilot a spaceship through the Great Red Spot.  Then I have to go back and determine when my brain went AWOL -- and start over from there until the next time I go on mental walkabout.

I guess there's one advantage to being an inveterate daydreamer; it's how I come up with a lot of the plots to my novels.  Sometimes my internal imaginary worlds are more vivid than the real world.  However, I do need to re-enter the real world at least long enough to get the story down on paper, and not end up being too distracted to write down the idea I came up with while I was distracted last time.

In any case, I guess I'd better wrap this up, because I'm about at the limits of my concentration.  I'd like to finish this post before my brain goes on walkies and I end up staring out of my office window and wondering if there's life on Proxima Centauri b.  Which I guess is an interesting enough topic, but hardly the one at hand.

***********************************

I'm always amazed by the resilience we humans can sometimes show.  Knocked down again and again, in circumstances that "adverse" doesn't even begin to describe, we rise above and move beyond, sometimes accomplishing great things despite catastrophic setbacks.

In Why Fish Don't Exist: A Story of Love, Loss, and the Hidden Order of Life, journalist Lulu Miller looks at the life of David Starr Jordan, a taxonomist whose fascination with aquatic life led him to the discovery of a fifth of the species of fish known in his day.  But to say the man had bad luck is a ridiculous understatement.  He lost his collections, drawings, and notes repeatedly, first to lightning, then to fire, and finally and catastrophically to the 1906 San Francisco Earthquake, which shattered just about every specimen bottle he had.

But Jordan refused to give up.  After the earthquake he set about rebuilding one more time, and -- by then the founding president of Stanford University -- he lived and worked until his death in 1931 at the age of eighty.  Miller's biography of Jordan looks at his scientific achievements and incredible tenacity -- but doesn't shy away from his darker side as an early proponent of eugenics, and the allegations that he might have been complicit in the coverup of a murder.

She paints a picture of a complex, fascinating man, and her vivid writing style brings him and the world he lived in to life.  If you are looking for a wonderful biography, give Why Fish Don't Exist a read.  You won't be able to put it down.




Tuesday, May 2, 2017

Aesthetic synchrony

Probably most of you have had the fortunate experience of being completely engaged in what you were doing.  This can be especially powerful when you're experiencing something novel -- listening to a lecture by a truly masterful speaker, attending a performance of music or theater, visiting a place of great natural beauty.  It's what writer Sir Ken Robinson (speaking of masterful lecturers) calls, in his talk "Changing Education Paradigms," "an aesthetic experience, when your senses are operating at their peak, when you're present in the current moment, when you're resonating with the excitement of this thing you're experiencing, when you are fully alive."

When this happens, we often say we are "on the same wavelength" with others who are sharing the experience with us.   And now, a team led by Suzanne Dikker of New York University has shown that this idiom might literally be true.

Dikker's team had thirteen test subjects -- twelve high school students and their teacher -- wear portable electroencephalogram headsets for an entire semester of biology classes.  Naturally, some of the topics and activities were more engaging than others, and the researchers had students self-report daily on such factors as how focused they were, how much they enjoyed their teacher's presentation, how much they enjoyed the students they interacted with, and their satisfaction levels about the activities they were asked to take part in.

[image courtesy of the Wikimedia Commons]

Dikker et al. write:

The human brain has evolved for group living.  Yet we know so little about how it supports dynamic group interactions that the study of real-world social exchanges has been dubbed the "dark matter of social neuroscience."  Recently, various studies have begun to approach this question by comparing brain responses of multiple individuals during a variety of (semi-naturalistic) tasks.  These experiments reveal how stimulus properties, individual differences, and contextual factors may underpin similarities and differences in neural activity across people...  Here we extend such experimentation drastically, beyond dyads and beyond laboratory walls, to identify neural markers of group engagement during dynamic real-world group interactions.  We used portable electroencephalogram (EEG) to simultaneously record brain activity from a class of 12 high school students over the course of a semester (11 classes) during regular classroom activities.  A novel analysis technique to assess group-based neural coherence demonstrates that the extent to which brain activity is synchronized across students predicts both student class engagement and social dynamics.  This suggests that brain-to-brain synchrony is a possible neural marker for dynamic social interactions, likely driven by shared attention mechanisms.  This study validates a promising new method to investigate the neuroscience of group interactions in ecologically natural settings.

Put simply, what the researchers found is that when the students reported feeling the most engaged, their brain activity actually synced with that of their classmates.  It squares with our subjective experience, doesn't it?  I know when I'm bored, irritated, or angered by something I'm being required to participate in, I tend to unhook my awareness from where I am -- including being less aware of those around me who are suffering through the same thing.

It's no wonder we call this kind of response "disengaging," is it?

So apparently misery doesn't love company; what loves company is engagement, appreciation, and a sense of belonging.  "The central hub seems to be attention," Dikker says.  "But whatever determines how attentive you are can stem from various sources, from personality to state of mind.  So the picture that seems to emerge is that it's not just that we pay attention to the world around us; it's also what our social personalities are, and who we're with."

All the more reason we teachers should focus as much on getting our students hooked on learning as we do on the actual content of the course.  My experience is that if you can get students to "buy in" -- if (in my case) they come away thinking biology is cool, fun, and interesting -- it doesn't matter so much if they can't remember what ribosomes do.  They can fit the facts in later, these days with a thirty-second lookup on Wikipedia.

What can't be looked up is being engaged to the point that you care what ribosomes do.

Unfortunately, in the educational world we've tended to go the other direction.  The flavor of the month is micromanagement from the top down, a set syllabus full of factlets that each student must know, an end product that can fit on a bubble sheet, "quantifiable outcomes" that generate data that the b-b stackers in the Department of Education can use to see if our teachers are teaching and our students learning.  A pity that, as usual, the people who run the business of educating children are ignoring what the research says -- that the most fundamental piece of the puzzle is student engagement.

If you have that, everything else will follow.

Saturday, March 11, 2017

Brain waves in the afterlife

It's understandable how much we cling to the hope that there's life after death.  Ceasing to exist is certainly not a comforting prospect.  Heaven knows (pun intended) I'm not looking forward to death myself, although I have to say that I'm more worried about the potential for debility and pain leading up to it than about death itself.  Being an atheist, I figure that afterwards I won't experience much of anything at all, which isn't scary so much as it is inconceivable.

Of course, if the orthodox view of Christianity is correct, I'll have other things to worry about than simple oblivion.

It's this tendency toward wishful thinking that pushes us in the direction of confirmation bias on the subject of survival of the soul.  Take, for example, a paper that came out just this week (indexed in PubMed) called "Electroencephalographic Recordings During Withdrawal of Life-Sustaining Therapy Until 30 Minutes After Declaration of Death."  The paper was based upon studies of four patients who died after being removed from life support, in which electroencephalogram (EEG) readings were taken as their vital signs faded away.  In one case, a particular type of brain waveform -- delta waves, which are associated with deep sleep -- continued for five minutes after cardiac arrest and a drop in arterial blood pressure to zero.

[image courtesy of the Wikimedia Commons]

The authors were cautious not to over-conclude; they simply reported their findings without making any kind of inference about what the person was experiencing, much less saying that this had any implications about his/her immortal soul.  In fact, it is significant that only one of the four patients showed any sort of brain wave activity following cardiac arrest; if there really was some sort of spirit-related phenomenon going on here, you'd think all four would have shown it.

That hasn't stopped the life-after-death crowd from jumping on this as if it were unequivocal proof of soul survival.  "One more piece of scientific evidence for an afterlife," one person appended to a link to the article.  "This can't be explained by ordinary brain science," said another.

The whole thing reminds me of the furor that erupted when the paper "Electrocortical Activity Associated With Subjective Communication With the Deceased," by Arnaud Delorme et al., showed up in Frontiers in Psychology four years ago.  The paper had some serious issues -- confirmation bias among the researchers, all of whom were connected in one way or another to groups more or less desperate to prove an afterlife, being only one.  The gist is that the researchers did brain scans of alleged mediums while they were attempting to access information about the dead.

To call the results equivocal is a compliment.  There were brain scans done of six mediums; of them, three scored above what you'd expect by chance.  In other words, half scored above what chance would predict, and half below -- pretty much the spread you'd expect if chance was all that was involved.  The sample size is tiny, and if you look at the questions the mediums were asked about the deceased people, you find that they include questions such as:
  • Was the discarnate more shy or more outgoing?
  • Was the discarnate more serious or more playful?
  • Was the discarnate more rational or more emotional?
  • Did death occur quickly or slowly?
Not only are these either/or questions -- meaning that even someone who was guessing would have fifty-fifty odds at getting an answer correct -- they're pretty subjective.  I wonder, for example, whether people would say I was "more rational" or "more emotional."  Being a science teacher and skeptic blogger, people who didn't know me well would probably say "rational;" my closest friends know that I'm a highly emotional, anxious bundle of nerves who is simply adept at covering it up most of the time.

Then there's this sort of thing:
  • Provide dates and times of year that were important to the discarnate.
Not to mention:
  • Does the discarnate have any messages specifically for the sitter?
Which is impossible to verify one way or the other.

Add that to the small sample size, and you have a study that is (to put it mildly) somewhat suspect.  But that didn't stop the wishful thinkers from leaping on this as if it was airtight proof of an afterlife.
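In fact, a three-and-three split is the single most likely outcome if guessing is all that's going on.  Here's a quick back-of-the-envelope binomial sketch (my own illustration, not anything from the paper) of the odds that exactly k of six mediums land above chance when each one is effectively flipping a coin:

```python
from math import comb

def p_exactly(k, n=6, p=0.5):
    """Probability that exactly k of n mediums score above chance,
    assuming each has an independent 50/50 shot -- a simple binomial
    model I'm using for illustration, not the paper's own analysis."""
    return comb(n, k) * p**k * (1 - p)**(n - k)

# The observed result -- 3 of 6 above chance -- is the most probable
# outcome under pure guessing, at 20/64, or about 31%.
for k in range(7):
    print(k, round(p_exactly(k), 4))
```

Under that coin-flip assumption, the "half above, half below" result the researchers reported is exactly what you'd expect to see most often from chance alone.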

Like I said, it's not that I don't understand the desire to establish the survival of the spirit.  No one would be happier than me if it turned out to be true (as long as the aforementioned hellfire and damnation isn't what is awaiting me).  But as for the 2013 paper that set out to demonstrate the existence of an afterlife, and this week's paper that some folks are (unfairly) using for the same purpose -- neither one is doing it for me.

Be that as it may, I still have an open mind about the whole thing.  When there's good hard evidence available -- I'm listening.  Unless it happens after I have personally kicked the bucket, at which point I'll know one way or the other regardless.