Skeptophilia (skep-to-fil-i-a) (n.) - the love of logical thought, skepticism, and thinking critically. Being an exploration of the applications of skeptical thinking to the world at large, with periodic excursions into linguistics, music, politics, cryptozoology, and why people keep seeing the face of Jesus on grilled cheese sandwiches.

Wednesday, August 4, 2021

Music on the brain

A pair of new studies last week in The Journal of Neuroscience analyzed the connections between two phenomena related to music listening that I know all too well -- our ability to replay music in our imaginations, and our capacity for anticipating what the next notes will be when we hear the first part of a melody.

The first, which I seem to excel at, is a bit of a mixed blessing.  It's my one and only superpower -- I can essentially remember tunes forever.  In my ten years as flutist in a Celtic dance band, I had just about every tune in our repertoire memorized.  I'm lousy at connecting the names to the tunes, though; so when my bandmate would say, "Next, let's play 'Drummond Castle,'" and I'd respond, sotto voce, "How the hell does 'Drummond Castle' go?" she'd say, "It's the one that goes, 'deedly-dum, da-deedly-dum, dum-da-deedly-deedly-deedly,'" then I'd say, "Oh, of course," and proceed to play it -- in the correct key.


[Image licensed under the Creative Commons © Nevit Dilmen, Music 01754, CC BY-SA 3.0]

The most striking example of this was a tune that I remembered literally for decades without hearing it once during that time.  When I was about 25 I took a Balkan dance class, and there was one tune I especially liked.  I intended to ask the instructor what the name of it was, but forgot (indicating that my memory in other respects isn't so great).  In those pre-internet days, searching for it was damn near impossible, so I forgot about it... sort of.  Twenty years went by, and my wife and I went to a nine-day music camp in the California redwoods, and I made friends with an awesome accordionist and all-around nice guy named Simo Tesla.  One day, Simo was noodling around on his instrument, and instantly I said, "That's my tune!"  There was no doubt in my mind; this was the same tune I'd heard, a couple of times, two decades earlier.

If you're curious, this is the tune, which is called "Bojerka":


The downside, of course, is that because I never forget a tune, I can't forget one even if I want to.  I'm plagued by what are called earworms -- songs that get stuck in your head, sometimes for days at a time.  There are a few songs that are such bad earworms that if they come on the radio, I'll immediately change the channel, because even a few notes are enough to embed the tune into my brain.  (Unfortunately, sometimes just hearing the name is enough.)

And no, I'm not going to give examples, because then I'll spend the rest of the day humming "Benny and the Jets," and heaven knows I don't want to... um...

Dammit.

The second phenomenon -- anticipating what comes next in a piece of music -- also has a positive and a negative side.  The negative is that it's intensely frustrating when a song I'm listening to gets cut off, so that I don't get to hear the resolution.  The importance of resolving a musical phrase was demonstrated by my college choral director, Dr. Tiboris, who, to illustrate the concept of harmonic resolution, played on the piano, "Hark, the herald angels sing, glory to the newborn..."  And stopped.

Three or four of us -- myself included -- sang out "KING!" because we couldn't stand to leave the phrase unresolved.

The positive side, though, happens when I listen to a piece of music for the first time, and it resolves -- but not in the way I expected.  That thwarting of expectations is part of the excitement of music, and when done right, can send a shiver up my spine.  One of my favorite moments in classical music is a point where you think you know what's going to happen, and... the music explodes in a completely different direction.  It occurs in the pair of pieces "Quoniam Tu Solus Sanctus" and "Cum Sancto Spiritu" from J. S. Bach's Mass in B Minor.  (If you don't have time to listen to the whole thing, go to about 5:45 and listen for the moment you get lifted bodily off the ground.)


All of which is a long-winded way of getting around to last week's papers, which look at both phenomena -- imagining music and anticipating what will happen next -- using EEG to determine what the brain is actually doing.  What the researchers found is that when you're imagining a piece of music, your brain responds in exactly the same way it does when you're actually listening to the piece.  When there's a silent bit in the music, your brain is functionally imagining what's coming next -- whether the music is real or imagined.

What was more interesting is the brain's response to the notes themselves.  Imagined notes generate a negative change in voltage in the relevant neurons; real notes generate a positive voltage change.  This may be why, when our expectations match what actually comes next, we can often tune the music out completely -- the two voltage changes, in essence, cancel each other out.  But when there's a mismatch, it jolts our brains into awareness -- just like what happens at the end of "Quoniam Tu Solus Sanctus."
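Just to make that cancellation idea concrete, here's a toy numerical sketch of my own -- not the model from the papers; the pitch names, the one-number-per-note simplification, and the Python framing are all made up purely for illustration:

import numpy as np

PITCHES = ["C", "D", "E", "F", "G", "A", "B"]

def net_response(expected, heard):
    """Crude sketch of the cancellation idea: an imagined (predicted) note is
    modelled as a negative deflection at its pitch, a heard note as a positive
    deflection.  When prediction and reality line up, the deflections cancel;
    a mismatch leaves a residual 'surprise' signal."""
    signal = np.zeros(len(PITCHES))
    if expected is not None:
        signal[PITCHES.index(expected)] -= 1.0   # the note the brain predicts
    if heard is not None:
        signal[PITCHES.index(heard)] += 1.0      # the note actually played
    return np.abs(signal).sum()                  # size of the leftover response

print(net_response("G", "G"))    # 0.0 -- expectation met, easy to tune out
print(net_response("G", "E"))    # 2.0 -- the jolt of an unexpected resolution
print(net_response("G", None))   # 1.0 -- silence where a note was expected

A real evoked response is a time series across many electrodes, of course, but the bookkeeping is the same: matched predictions subtract away, and mismatches leave something for the brain to notice.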

I find the whole thing fascinating, as it ties together music and neuroscience, two subjects I love.  I've often wondered about why some pieces resonate with me and others don't; why, for example, I love Stravinsky's music and dislike Brahms.  These studies don't answer that question, of course, but they do get at our ability both to remember (and replay) music in our minds, and also why we have such a strong response when music does something contrary to our expectations. 

But I think I'll wind this up, and just add one more musical track that is pure fun -- the "Polka" from Shostakovich's The Age of Gold.  This is Shostakovich letting loose with some loony light-heartedness, and I defy anyone to anticipate what this piece is gonna do next.  Enjoy!



**********************************************

Author and biochemist Camilla Pang was diagnosed with autism spectrum disorder at age eight, and spent most of her childhood baffled by the complexities and subtleties of human interactions.  She once asked her mother if there was an instruction manual on being human that she could read to make it easier.

Her mom said no, there was no instruction manual.

So years later, Pang recalled the incident and decided to write one.

The result, Explaining Humans: What Science Can Teach Us About Life, Love, and Relationships, is the best analysis of human behavior from a biological perspective since Desmond Morris's classic The Naked Ape.  If you're like me, you'll read Pang's book with a stunned smile on your face -- as she navigates through common, everyday behaviors we all engage in, but few of us stop to think about.

If you're interested in behavior or biology or simply agree with the Greek maxim "gnothi seauton" ("know yourself"), you need to put this book on your reading list.  It's absolutely outstanding.

[Note:  if you purchase this book using the image/link below, part of the proceeds goes to support Skeptophilia!]


Friday, January 22, 2021

The mental walkabout

I don't know about you, but I have a real problem with my mind wandering.

It's not a new thing.  I can remember getting grief for daydreaming back when I was in grade school.  I'd be sitting in class, trying my damnedest to concentrate on transitive verbs or the Franco-Prussian War or whatnot, but my gaze would drift off to some point in the middle distance, my auditory processing centers would switch from "external input" to "internal input" mode, and in under a minute I'd be out in interstellar space or roaming around Valhalla with Odin and the Boys or rowing a boat down the Amazon River.

Until the teacher would interrupt my reverie with some irrelevant comment like, "Gordon!  Pay attention!  Why don't you tell us how to find x in the equation 5x - 9 = 36?"  I was usually able to refrain from saying what came to mind, namely, that she was the one who lost x in the first place and it was hardly my responsibility to find it, and I could generally pull myself together enough to play along and take a shot at giving her a real answer.

I never outgrew the tendency (either to daydreaming or to giving authority figures sarcastic retorts).  It plagued me all through college and beyond, and during my teaching career I remember dreading faculty meetings because I knew that five minutes in I'd be doodling on the agenda despite my vain attempt to be a Good Boy and pay attention.  It's part of how I developed my own teaching style; a mentor teacher told me early on that teaching was 25% content knowledge and 75% theater, and I took that to heart.  I tried to lecture in a way that kept students wondering what the hell I was going to say or do next, because I know that's about the only thing that kept me engaged when I was sitting in the student's desk and someone else was in front of the room.

One amusing case in point -- Dr. Cusimano, who taught a British History elective I took as a senior in college.  He was notorious for working puns and jokes into his lectures, and doing it so smoothly and with such a straight face that if you weren't paying attention, it could slip right past you.  I recall early in the course, when he was talking about the fall of the Roman Empire, Dr. Cusimano said, "During that time, what was left of the Roman Empire was invaded by a series of Germanic tribal leaders -- there was Alaric, King of the Visigoths; Gunderic, King of the Vandals; Oscar Mayer, King of the Franks..."

I'd bet cold hard cash there were students in the class who wrote that down and only erased it when one by one, their classmates caught on and started laughing.

I never daydreamed in Dr. Cusimano's class.

Edward Harrison May, Daydreaming (1876) [Image is in the Public Domain]

Anyhow, all of this comes up because of a study out of the University of California, Berkeley that appeared this week in Proceedings of the National Academy of Sciences.  Entitled "Distinct Electrophysiological Signatures of Task-Unrelated and Dynamic Thoughts," by Julia W. Y. Kam, Zachary C. Irving, Caitlin Mills, Shawn Patel, Alison Gopnik, and Robert T. Knight, the paper takes the fascinating angle of analyzing the electroencephalogram (EEG) output of test subjects when they're focused on the task at hand, when they're thinking about something unrelated, and when their thoughts are simply wandering from topic to topic -- what the authors call "dynamic thought," like the game of random free association my brain spends a significant portion of its time playing.

The authors write:

Humans spend much of their lives engaging with their internal train of thoughts.  Traditionally, research focused on whether or not these thoughts are related to ongoing tasks, and has identified reliable and distinct behavioral and neural correlates of task-unrelated and task-related thought.  A recent theoretical framework highlighted a different aspect of thinking—how it dynamically moves between topics.  However, the neural correlates of such thought dynamics are unknown. The current study aimed to determine the electrophysiological signatures of these dynamics by recording electroencephalogram (EEG) while participants performed an attention task and periodically answered thought-sampling questions about whether their thoughts were 1) task-unrelated, 2) freely moving, 3) deliberately constrained, and 4) automatically constrained...  Our findings indicate distinct electrophysiological patterns associated with task-unrelated and dynamic thoughts, suggesting these neural measures capture the heterogeneity of our ongoing thoughts.
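To give a sense of how data like that gets analyzed, here's a bare-bones sketch of my own -- not the authors' actual pipeline -- that groups EEG epochs by the thought-sampling answer given at that moment and compares spectral power in a frequency band across categories.  The sampling rate, the choice of the alpha band, and the random stand-in "data" are all assumptions for the sake of illustration:

import numpy as np
from scipy.signal import welch

FS = 256  # sampling rate in Hz (a stand-in value)

def band_power(epoch, fs=FS, band=(8, 12)):
    """Average spectral power of one EEG epoch within a frequency band
    (8-12 Hz, the alpha band, chosen here purely as an example)."""
    freqs, psd = welch(epoch, fs=fs, nperseg=fs * 2)
    mask = (freqs >= band[0]) & (freqs <= band[1])
    return psd[mask].mean()

# Fake data: a few ten-second epochs, each labelled with the thought-sampling
# answer the participant gave at that point in the attention task.
rng = np.random.default_rng(0)
epochs = {
    "task-unrelated":           [rng.standard_normal(FS * 10) for _ in range(5)],
    "freely moving":            [rng.standard_normal(FS * 10) for _ in range(5)],
    "deliberately constrained": [rng.standard_normal(FS * 10) for _ in range(5)],
}

for label, eps in epochs.items():
    powers = [band_power(e) for e in eps]
    print(f"{label:25s} mean band power: {np.mean(powers):.4f}")

The interesting part in the real study is which EEG features actually differ between categories; the point of the sketch is just the bookkeeping of pairing each stretch of recording with the participant's own report of what their mind was doing.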

"If you focus all the time on your goals, you can miss important information," said study co-author Zachary Irving, in an interview with Science Direct.  "And so, having a free-association thought process that randomly generates memories and imaginative experiences can lead you to new ideas and insights."

Yeah, someone should have told my elementary school teachers that.

"Babies' and young children's minds seem to wander constantly, and so we wondered what functions that might serve," said co-author Allison Gopnik.  "Our paper suggests mind-wandering is as much a positive feature of cognition as a quirk and explains something we all experience."

So my tendency to daydream might be a feature, not a bug.  Still, it can be inconvenient at times.  I know there are a lot of things that would be a hell of a lot easier if I could at least control it, like when I'm reading something that's difficult going but that I honestly want to pay attention to and understand.  Even when my intention is to concentrate, it usually doesn't take long for me to realize that my eyes are still tracking across the lines, my fingers are turning pages, but I stopped taking anything in four pages ago and since that time have been imagining what it'd be like to pilot a spaceship through the Great Red Spot.  Then I have to go back and determine when my brain went AWOL -- and start over from there until the next time I go on mental walkabout.

I guess there's one advantage to being an inveterate daydreamer; it's how I come up with a lot of the plots to my novels.  Sometimes my internal imaginary worlds are more vivid than the real world.  However, I do need to re-enter the real world at least long enough to get the story down on paper, and not end up being too distracted to write down the idea I came up with while I was distracted last time.

In any case, I guess I'd better wrap this up, because I'm about at the limits of my concentration.  I'd like to finish this post before my brain goes on walkies and I end up staring out of my office window and wondering if there's life on Proxima Centauri b.  Which I guess is an interesting enough topic, but hardly the one at hand.

***********************************

I'm always amazed by the resilience we humans can sometimes show.  Knocked down again and again, in circumstances that "adverse" doesn't even begin to describe, we rise above and move beyond, sometimes accomplishing great things despite catastrophic setbacks.

In Why Fish Don't Exist: A Story of Love, Loss, and the Hidden Order of Life, journalist Lulu Miller looks at the life of David Starr Jordan, a taxonomist whose fascination with aquatic life led him to the discovery of a fifth of the species of fish known in his day.  But to say the man had bad luck is a ridiculous understatement.  He lost his collections, drawings, and notes repeatedly, first to lightning, then to fire, and finally and catastrophically to the 1906 San Francisco Earthquake, which shattered just about every specimen bottle he had.

But Jordan refused to give up.  After the earthquake he set about rebuilding one more time.  Jordan -- who had become the founding president of Stanford University back in 1891 -- kept living and working until his death in 1931 at the age of eighty.  Miller's biography of Jordan looks at his scientific achievements and incredible tenacity -- but doesn't shy away from his darker side as an early proponent of eugenics, and the allegations that he might have been complicit in the coverup of a murder.

She paints a picture of a complex, fascinating man, and her vivid writing style brings him and the world he lived in to life.  If you are looking for a wonderful biography, give Why Fish Don't Exist a read.  You won't be able to put it down.

[Note: if you purchase this book using the image/link below, part of the proceeds goes to support Skeptophilia!]



Saturday, November 9, 2019

Poisoned by preconceived notions

If you needed something else to make you worry about our capacity to make decisions based on facts, go no further than a study that came out this week from the University of Texas at Austin.

Entitled "Fake News on Social Media: People Believe What They Want to Believe When it Makes No Sense At All," the study was conducted by Patricia L. Moravec, Randall K. Minas, and Alan R. Dennis of the McCombs School of Business.  And its results should be seriously disheartening for just about everyone.

What they did was a pair of experiments using students who were "social media literate" -- i.e., they should know social media's reputation for playing fast and loose with the truth.  In the first, the students evaluated fifty headlines as true or false; in the second, they evaluated headlines with "Fake News" flags appended.  In each case, there was an even split -- in the first experiment between true and false headlines, and in the second between true and false headlines flagged as "Fake," meaning half of those flags were simply wrong.

In both experiments, the subjects were hooked up to an electroencephalogram (EEG) machine, to monitor their brain activity as they performed the task.

In the first experiment, it was found -- perhaps unsurprisingly -- that people are pretty bad at telling truth from lies when presented only with a headline.  But the second experiment is the more interesting, and also the more discouraging.  What the researchers found is that when a true headline is flagged as false, or a false headline is flagged as true, there's a huge spike in activity in the prefrontal cortex -- a sign of cognitive dissonance as the subject tries desperately to figure out how this can be so -- but only if the flag disagrees with what the subject already believed.
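Spelling that pattern out as a little truth table (my own toy restatement, not anything from the paper itself) makes the point starker: the predicted dissonance response depends only on whether the flag clashes with the reader's prior belief, and not at all on whether the headline is actually true.

from itertools import product

def dissonance(flag_says_fake, reader_believes_it):
    # Toy restatement of the finding described above: the prefrontal "jolt"
    # shows up when the flag's verdict clashes with the reader's prior
    # belief -- not when it clashes with the headline's actual truth.
    return flag_says_fake == reader_believes_it

print("actually_true  flagged_fake  reader_believes  dissonance?")
for actually_true, flagged_fake, believes in product([True, False], repeat=3):
    print(f"{actually_true!s:13}  {flagged_fake!s:12}  {believes!s:15}  "
          f"{dissonance(flagged_fake, believes)!s}")

Run it and you'll see the last column ignores the first one entirely -- which is precisely the discouraging part.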


[Image is in the Public Domain]

So we're perfectly ready to believe the truth is a lie, or a lie is the truth, if it fits our preconceived notions.  And worse still, the researchers saw that even though subjects had an uncomfortable amount of cognitive processing going on when confronted with something that contradicted what they thought was true, all that processing had little influence on what they believed once the experiment was over.

In other words, you can label the truth a lie, or a lie the truth, but it won't change people's minds if they already believed the opposite.  Our ability to discern fact from fiction, and use that information to craft our view of the world, is poisoned by our preconceived notions of what we'd like to be true.

Before you start pointing fingers, the researchers also found that there was no good predictor of how well subjects did on this test.  They were all bad -- Democrats and Republicans, higher IQ and lower IQ, male and female.

"When we’re on social media, we’re passively pursuing pleasure and entertainment," said Patricia Moravec, who was lead author of the study, in an interview with UT News.  "We’re avoiding something else...  The fact that social media perpetuates and feeds this bias complicates people’s ability to make evidence-based decisions.  But if the facts that you do have are polluted by fake news that you truly believe, then the decisions you make are going to be much worse."

This is insidious because even if we are just going on social media to be entertained, the people posting political advertisements on social media aren't.  They're trying to change our minds.  And what the Moravec et al. study shows is that we're not only lousy at telling fact from fiction, we're very likely to get suckered by a plausible-sounding lie (or, conversely, to disbelieve an inconvenient truth) if it fits with our preexisting political beliefs.

Which makes it even more incumbent on the people who run social media platforms (yeah, I'm lookin' at you, Mark Zuckerberg) to have on-staff fact checkers who are empowered to reject ads on both sides of the political aisle that are making false claims.  It's not enough to cite free speech rights as an excuse for abdicating your duty to protect people from immoral and ruthless politicians who will say or do anything to gain or retain power.  The people in charge of social media are under no obligation to run any ad someone's willing to pay for.  It's therefore their duty to establish criteria for which ads are going to show up -- and one of those criteria should surely be whether the ad is telling the truth.

The alternative is that our government will continue to be run by whoever has the cleverest, most attractive propaganda.  And as we've seen over the past three years, this is surely a recipe for disaster.

**********************************

This week's Skeptophilia book recommendation is a fun book about math.

Bet that's a phrase you've hardly ever heard uttered.

Jordan Ellenberg's amazing How Not to Be Wrong: The Power of Mathematical Thinking looks at how critical it is for people to have a basic understanding and appreciation for math -- and how misunderstandings can lead to profound errors in decision-making.  Ellenberg takes us on a fantastic trip through dozens of disparate realms -- baseball, crime and punishment, politics, psychology, artificial languages, and social media, to name a few -- and how in each, a comprehension of math leads you to a deeper understanding of the world.

As he puts it: math is "an atomic-powered prosthesis that you attach to your common sense, vastly multiplying its reach and strength."  Which is certainly something that is drastically needed lately.

[Note: if you purchase this book using the image/link below, part of the proceeds goes to support Skeptophilia!]





Saturday, March 11, 2017

Brain waves in the afterlife

It's understandable how much we cling to the hope that there's life after death.  Ceasing to exist is certainly not a comforting prospect.  Heaven knows (pun intended) I'm not looking forward to death myself, although I have to say that I'm more worried about the potential for debility and pain leading up to it than I am about death itself.  Being an atheist, I'm figuring that afterwards, I won't experience much of anything at all, which isn't scary so much as it is inconceivable.

Of course, if the orthodox view of Christianity is correct, I'll have other things to worry about than simple oblivion.

It's this tendency toward wishful thinking that pushes us in the direction of confirmation bias on the subject of survival of the soul.  Take, for example, a paper that came out just this week, indexed in PubMed, called "Electroencephalographic Recordings During Withdrawal of Life-Sustaining Therapy Until 30 Minutes After Declaration of Death."  The paper was based on studies of four patients who died after being removed from life support, in which electroencephalogram (EEG) readings were taken as their life signs faded away.  In one case, a particular type of brain waveform -- delta waves, which are associated with deep sleep -- continued for five minutes after cardiac arrest and the drop in arterial blood pressure to zero.

[Image courtesy of the Wikimedia Commons]

The authors were careful not to over-interpret; they simply reported their findings without making any kind of inference about what the person was experiencing, much less claiming that it had any implications for his or her immortal soul.  In fact, it's significant that only one of the four patients showed any sort of brain wave activity following cardiac arrest; if there really were some sort of spirit-related phenomenon going on here, you'd think all four would have shown it.

That hasn't stopped the life-after-death crowd from jumping on this as if it were unequivocal proof of soul survival.  "One more piece of scientific evidence for an afterlife," one person appended to a link to the article.  "This can't be explained by ordinary brain science," said another.

The whole thing reminds me of the furor that erupted when the paper "Electrocortical Activity Associated With Subjective Communication With the Deceased," by Arnaud Delorme et al., showed up in Frontiers in Psychology four years ago.  The paper had some serious issues, not least confirmation bias: all of the researchers were connected in one way or another to groups more or less desperate to prove an afterlife.  The gist is that the researchers did brain scans of alleged mediums while the mediums attempted to access information about the dead.

To call the results equivocal is a compliment.  There were brain scans done of six mediums; of them, three scored above what you'd expect by chance.  In other words, half scored above what chance would predict, and half below -- pretty much the spread you'd expect if chance was all that was involved.  The sample size is tiny, and if you look at the questions the mediums were asked about the deceased people, you find that they include questions such as:
  • Was the discarnate more shy or more outgoing?
  • Was the discarnate more serious or more playful?
  • Was the discarnate more rational or more emotional?
  • Did death occur quickly or slowly?
Not only are these either/or questions -- meaning that even someone who was guessing would have fifty-fifty odds of getting an answer correct -- they're pretty subjective.  I wonder, for example, whether people would say I was "more rational" or "more emotional."  Since I'm a science teacher and skeptic blogger, people who didn't know me well would probably say "rational"; my closest friends know that I'm a highly emotional, anxious bundle of nerves who is simply adept at covering it up most of the time.
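Just to put a rough number on how unimpressive "three of six above chance" is: with either/or questions, each answer is a coin flip for a guesser, so whether any one medium ends up above or below chance level is itself roughly a coin flip -- and three of six is then the single most likely split.  A quick back-of-the-envelope check (my own, and a simplification that ignores ties and the exact number of questions, which isn't given here):

from math import comb

def binom_pmf(k, n, p=0.5):
    """Probability of exactly k successes in n independent fifty-fifty trials."""
    return comb(n, k) * p**k * (1 - p)**(n - k)

# If each medium is effectively guessing, treat "scores above chance" as a
# fair coin flip per medium and ask how surprising the reported split is.
print(f"P(exactly 3 of 6 guessers land above chance) = {binom_pmf(3, 6):.2f}")   # ~0.31
print(f"P(at least 3 of 6 land above chance)         = "
      f"{sum(binom_pmf(k, 6) for k in range(3, 7)):.2f}")                        # ~0.66

In other words, the reported split is almost exactly what you'd expect if nothing but guessing were going on.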

Then there's this sort of thing:
  • Provide dates and times of year that were important to the discarnate.
Not to mention:
  • Does the discarnate have any messages specifically for the sitter?
Which is impossible to verify one way or the other.

Add that to the small sample size, and you have a study that is (to put it mildly) somewhat suspect.  But that didn't stop the wishful thinkers from leaping on this as if it was airtight proof of an afterlife.

Like I said, it's not that I don't understand the desire to establish the survival of the spirit.  No one would be happier than me if it turned out to be true (as long as the aforementioned hellfire and damnation isn't what's awaiting me).  But as for the 2013 paper that set out to demonstrate the existence of an afterlife, and this week's paper that some folks are (unfairly) using for the same purpose -- neither one is doing it for me.

Be that as it may, I still have an open mind about the whole thing.  When there's good hard evidence available -- I'm listening.  Unless it happens after I have personally kicked the bucket, at which point I'll know one way or the other regardless.