Skeptophilia (skep-to-fil-i-a) (n.) - the love of logical thought, skepticism, and thinking critically. Being an exploration of the applications of skeptical thinking to the world at large, with periodic excursions into linguistics, music, politics, cryptozoology, and why people keep seeing the face of Jesus on grilled cheese sandwiches.

Thursday, July 27, 2023

The face in the mirror

Like many people, I've at times been in the position of having to interact with narcissists.

I'll not name names, but two, in particular, stand out.  One of them frequently said things like, "What I say is the law of the land" (without any apparent awareness of irony, because this is also an excellent example of someone being a Big Fish in a Little Pond).  This individual did have "advisors" -- for a time I was one of them -- while in point of fact never actually taking a single piece of advice or admitting to being wrong about anything.  Ever.  Worse, every interaction became about being perceived as the most knowledgeable, smart, funny, edgy, savvy person in the room, so every conversation turned into a battle for dominance, unless you refused to play (which, eventually, is what I did).

The second had a different strategy, albeit one that still secured the role of Center of the Entire Fucking Universe.  For this person, negative attention caused a complete emotional breakdown, which meant everyone had to circle the wagons simply to restore order.  Worse still was when something this individual said upset me, because then the focus shifted to someone else's needs, which was completely unacceptable.  My expression of annoyance, anger, or frustration was turned around into my having unreasonable expectations, which precipitated another emotional breakdown and returned me to the role of caregiver and he-who-pours-oil-on-troubled-waters.

It's a relief that neither of these two is part of my life anymore, because being around narcissists is, among other things, absolutely exhausting.  The incessant focus on self means that no one else's needs, and often no one else's opinions, ever get heard.  Both of these people did considerable damage to others around them, without ever showing any sign of concern for the chaos they were sowing or the emotional scars they were inflicting.  (There was plenty of deflection of the blame toward the ones who were hurt, however; "it's their own fault" was another phrase I heard numerous times.)  Worst of all, neither one had any apparent awareness of being narcissistic.  I heard both expressing, at one time or another, how puzzling and unfair it was that they couldn't keep friends or maintain good relationships with business associates.

Funny how that happens when you don't consider anyone but yourself, and funnier still that neither one ever seemed to realize what the common factor in all of their difficulties was.

This lack of self-awareness makes narcissism difficult to study, because it's hard to analyze a condition that the patient doesn't know (s)he's got.  But a team at the University of Graz (Austria), led by psychologist Emanuel Jauk, has not only looked at what it means to be narcissistic -- they've done neuroimaging studies to see what's going on in a narcissist's brain.  The result was an eye-opening paper that appeared in Nature.

"Narcissism is a topic of increasing interest to science and the public, probably because cultural changes in the past decades favor narcissistic behavior," Jauk says.  "Our study was aimed at taking a closer look at the self-image of narcissistic individuals using neuroscience, which might help to unveil its less conscious aspects."

The results were fascinating.  In the authors' words:
Subclinical narcissism is a personality trait with two faces: According to social-cognitive theories it is associated with grandiosity and feelings of superiority, whereas psychodynamic theories emphasize vulnerable aspects like fluctuating self-esteem and emotional conflicts...  While social-cognitive theory would predict that self-relevant processing should be accompanied by brain activity in reward-related areas in narcissistic individuals, psychodynamic theory would suggest that it should be accompanied by activation in regions pointing to negative affect or emotional conflict.  In this study, extreme groups of high and low narcissistic individuals performed a visual self-recognition paradigm during fMRI.  Viewing one’s own face (as compared to faces of friends and strangers) was accompanied by greater activation of the dorsal and ventral anterior cingulate cortex (ACC) in highly narcissistic men.  These results suggest that highly narcissistic men experience greater negative affect or emotional conflict during self-relevant processing and point to vulnerable aspects of subclinical narcissism that might not be apparent in self-report research.
The upshot is that this study suggests narcissism doesn't result in feelings of pleasure when you think of or view yourself; it increases your anxiety.  "Narcissism," Jauk explains, "in terms of an inflated self-view, goes along with negative affect towards the self on an involuntary level."

Which certainly makes sense given my interactions with narcissists.  Above all, neither of the individuals I mentioned ever seemed all that happy.  It appeared that the returning focus on self came out of insecurity, fear, and anxiety rather than conceit -- that it was more about reassurance than it was about praise.

So the condition itself is a little misnamed, isn't it?  The word "narcissism" comes from the Greek myth of Narcissus, a young man so beautiful that he fell in love with his own reflection and couldn't tear his eyes away -- he eventually pined away and died, and the gods took pity on him and turned him into the flower that now bears his name.

Narcissus by Caravaggio (1598)  [Image is in the Public Domain]

The reality is sadder.  Narcissists, apparently, think of themselves not out of self-love, but out of a constant uneasy sense that they aren't actually beautiful, intelligent, competent, or desirable.

Which is kind of a miserable way to live.  Knowing this defuses a lot of the anger I harbor from my experiences with the narcissists I described earlier.  For all of their desperation for attention, at their core they were unhappy, deeply fearful people.

The authors make reference to an alternate version of the Narcissus myth that is more in line with what true narcissists experience.  They write:
In another prominent version by Pausanias, the myth has a different ending: Narcissus is gazing at himself, when suddenly a leaf falls into the water and distorts the image.  Narcissus is shocked by the ugliness of his mirror image, which ultimately leads him to death.

 This more tragic ending is much closer to what the study found:

Considering the two versions of the ancient myth of Narcissus, our results are in favor of the less prominent version, in which Narcissus is shocked to death by the ugliness of his mirror image when a leaf drops into the water.  This myth can be seen to metaphorically reflect the ongoing critical self-monitoring that narcissists display when confronted with self-relevant material, presumably due to a lowered intrinsic coupling between self-representation and self-reward/liking.
Which makes me feel like narcissists, despite the considerable harm they can do, are more to be pitied than scorned.

****************************************



Wednesday, May 3, 2023

The mind readers

In Isaac Asimov's deservedly famous short story "All the Troubles of the World," the megacomputer Multivac has so much data on each person in the world (including detailed brain scans) that it can predict ahead of time if someone is going to commit a crime.  This allows authorities to take appropriate measures -- defined, of course, in their own terms -- to prevent it from happening.

We took a step toward Asimov's dystopian vision, in which nothing you think is secret, with a paper this week in Nature Neuroscience about a new invention called a "brain activity decoder."

Developed by a team of researchers at the University of Texas at Austin, the software uses an fMRI machine to measure the neural activity in a person's brain, and is able to convert that neural activity into a continuous stream of text -- i.e., the output is what the person was thinking.

The researchers had volunteers listening to podcasts over headphones while the fMRI watched how their brains responded.  This allowed them to compare the actual text the test subjects were hearing with what the brain activity decoder picked up from them.  After only a short span of training the software, the results were scary good.  One listener heard, "I don't have my driver's license yet," and the decoder generated the output "She has not even started to learn to drive yet."  Another had the input, "I didn’t know whether to scream, cry or run away. Instead, I said, 'Leave me alone!'", which resulted in the output, "Started to scream and cry, and then she just said, 'I told you to leave me alone.'"

Not perfect, but as a proof-of-concept, it's jaw-dropping.

[Image licensed under the Creative Commons © Nevit Dilmen, Brain MRI 131058 rgbca, CC BY-SA 3.0]

The researchers touted its possible use for people who have lost the ability to communicate, in situations like locked-in syndrome.  However, I don't think it takes an overactive imagination to come up with ways such a device could be abused.  What would happen to the concept of privacy, if a machine could read your thoughts?  What about the Fifth Amendment right not to self-incriminate?  Like in Asimov's story, how could the authorities separate what a person had done from what they were contemplating doing?

Or would they?

Jerry Tang, who led the research, emphasizes that the decoder had to be trained on the person whose thoughts were going to be read; if it were trained on me, it couldn't immediately be used to figure out what you were thinking.  My response to that is: yet.  This is already leaps and bounds past previous attempts at thought-reading, which were only able to output single words and short sentences.  Given more time and further refinements, this technique will only get better.

Or scarier, as the case may be.

Tang also pointed out that even with improvements, the software would be defeated by someone putting up resistance (e.g., deliberately thinking other things to block the fMRI from getting the correct output).  He also is aware of the possibility of abuse.  "We take very seriously the concerns that it could be used for bad purposes and have worked to avoid that," he said.  "We want to make sure people only use these types of technologies when they want to and that it helps them."

Well, maybe.  I'm not a conspiracy-theory type, nor someone who thinks that all government is inherently bad.  Here, though, it seems like the potential for Orwellian thought-crime is a short step away.

Keep in mind, too, how generally inaccurate our brain's storage system is.  As we've seen over and over here at Skeptophilia, what we remember is an amalgam of what actually happened, what we were told happened, what we imagine happened, and a good dollop of falsehood.  False memories can be as convincingly real as accurate ones.  If the brain activity decoder were used on an unwilling person to extract his/her thoughts, there is no guarantee that the output would be at all reflective of reality.  In fact, it's almost certain not to be.

But since eyewitness testimony -- in other words, recall -- is considered one of the highest forms of evidence in a court of law, it's no stretch to wonder if a person's thoughts would be given the same undeserved weight.

I'm not sure what the right step is, honestly.  There are some who believe that a potential for misuse shouldn't stop scientific progress; anything, they argue, can be used for harm.  Others feel like the hazards can sometimes outweigh the benefits, and trusting the powers-that-be to do the right thing with technology this powerful is foolish.

I don't have an answer.  But I will say that my mind was forced back to the prescient quote from another seminal science fiction writer, Michael Crichton: "Scientists are preoccupied with accomplishment.  So they are focused on whether they can do something.  They never stop to ask if they should."

****************************************



Saturday, December 26, 2020

Purging memory

Last night I had a good bit of trouble sleeping.

This isn't all that uncommon.  I've had issues with insomnia ever since I was a teenager.  Sometimes the issue is physical restlessness; sometimes it's anxiety, either over something real or something imagined.

Last night, it's because my brain was shrieking, over and over, "FELIZ NAVIDAD, FELIZ NAVIDAD, FELIZ NAVIDAD, PRÓSPERO AÑO Y FELICIDAD."

At least its timing was reasonably good, being that yesterday was Christmas.  What I wonder is why it couldn't choose a song that I don't hate.  I'm not one of those Bah-Humbug curmudgeons who dislikes all Christmas music; some of it I actually rather enjoy.

I'm of the opinion, however, that listening to "Feliz Navidad" over and over would have been ruled out as a torture device by Tomás de Torquemada on the basis of being too cruel.

Leaving aside my brain's questionable choice of which song to holler at me, a more interesting question is how to get rid of it once it's stuck there.  I've found that for me, the best thing is to replace it with something less objectionable, which in this case would have been just about anything.  There are a couple of pieces of sedate classical music and a slow Irish waltz or two that I can usually use to shove away whatever gawdawful song is on repeat in my skull, if I concentrate on running them mentally in a deliberate fashion.  It eventually worked, but it did take much longer than usual.

José Feliciano is nothing if not persistent.

The reason all of this comes up -- besides my Christmas-music-based bout of insomnia -- is some research out of a team from the University of Colorado - Boulder and the University of Texas - Austin that appeared in Nature Communications this month.  Entitled, "Changes to Information in Working Memory Depend on Distinct Removal Operations," by Hyojeong Kim, Harry Smolker, Louisa Smith, Marie Banich, and Jarrod Lewis-Peacock, this research shows that the kind of deliberate pushing away I use to purge bad music from my brain works in a lot of other situations as well -- and may have applications in boosting creativity and in relieving anxiety, obsessive-compulsive disorder, and PTSD.

What they did was to put a thought into their test subjects' heads -- a photograph of a face, a bowl of fruit, or an outdoor scene -- instruct them to think about it for four seconds, and then have them deliberately stop thinking about it, all the while watching what happened in their brains using an fMRI machine.  Ceasing to think about something without replacing it with something else is remarkably hard; it brings to mind my dad's recommended cure for hiccups, which is to run around the house three times without thinking of an elephant.

The three different sorts of things they asked the subjects to try -- to replace the thought with something else, to clear all thoughts completely, or to suppress that one thought without replacing it -- all resulted in different patterns on the fMRI.  "Replace" and "clear" both worked fairly rapidly, but both left a trace of the original thought pattern behind -- a "ghost image," as the researchers called it.  "Suppress" took longer, and subjects described it as being more difficult, but once accomplished, the original pattern had faded completely.

"We found that if you really want a new idea to come into your mind, you need to deliberately force yourself to stop thinking about the old one," said study co-author Marie Banich, in a press release from the University of Colorado.

Co-author Jarrod Lewis-Peacock concurred.  "Once we’re done using that information to answer an email or address some problem, we need to let it go so it doesn’t clog up our mental resources to do the next thing."

[Image © Michel Royon / Wikimedia Commons; used with permission]

This explains another phenomenon I've noted: that when I'm trying to think of something I've forgotten, or come up with a solution to a problem that's stumping me, it often helps if I deliberately set it aside.  Just a couple of days ago, I was working on my fiction work-in-progress, and found I'd written myself into a corner.  I'd created a situation that called for some as-yet-undreamed-of plot twist, or else rewriting a big section of it to eliminate the necessity (something I didn't want to do).  After basically beating it with a stick for an hour or two, I gave up in frustration, and went to clean up my garage.

And while working on this chore, and not thinking about my writing at all, a clever solution to the problem simply popped into my head, seemingly out of nowhere.

This is far from the first time this sort of thing has happened to me, and the Kim et al. paper at least gives a first-order approximation as to how this occurs.  Pushing aside what you're thinking about, either consciously and deliberately or else (as in my garage-cleaning example) by replacing it with something unrelated, clears the cognitive thought patterns and gives your brain room to innovate.

Now, where exactly the creative solution comes from is another matter entirely.  I've described before how often my ideas for writing seem to originate from outside my own head.  I don't subscribe to a belief in any sort of Jungian collective unconscious, but sometimes it sure feels that way.

In any case, all of this gives us a lens into how to make our own thought processes more efficient -- in cases of clogged creativity as well as situations where errant thoughts are themselves causing problems, as in PTSD.  What the Kim et al. research suggests is that the first thing to work on is consciously purging the brain in order to create space for more positive and beneficial thoughts.

It's not necessarily easy, of course.  For example, my brain has finally stopped screaming "Feliz Navidad" at me, but has replaced it with "Let it Snow, Let it Snow, Let it Snow," which is only fractionally less annoying.  My considered opinion is that whoever wrote "Let it Snow, Let it Snow, Let it Snow" should be pitched, bare-ass naked, head-first into a snowbank.

Okay, so maybe I am a Bah-Humbug curmudgeon.  God bless us every one anyhow, I suppose.

****************************************

Not long ago I was discussing with a friend of mine the unfortunate tendency of North Americans and Western Europeans to judge everything based upon their own culture -- and to assume everyone else in the world sees things the same way.  (An attitude that, in my opinion, is far worse here in the United States than anywhere else, but since the majority of us here are the descendants of white Europeans, that attitude didn't come out of nowhere.)  

What that means is that people like me, who live somewhere WEIRD -- Western, educated, industrialized, rich, and democratic -- automatically have blinders on.  And these blinders affect everything, up to and including things like supposedly variable-controlled psychological studies, which are usually conducted by WEIRDs on WEIRDs, and so interpret results as universal when they might well be culturally-dependent.

This is the topic of a wonderful new book by anthropologist Joseph Henrich called The WEIRDest People in the World: How the West Became Psychologically Peculiar and Particularly Prosperous.  It's a fascinating lens into a culture that has become so dominant on the world stage that many people within it staunchly believe it's quantifiably the best one -- and some act as if it's the only one.  It's an eye-opener, and will make you reconsider a lot of your baseline assumptions about what humans are and the ways we see the world -- of which science historian James Burke rightly said, "there are as many different versions of that as there are people."

[Note:  If you purchase this book using the image/link below, part of the proceeds goes to support Skeptophilia!]




Monday, August 3, 2020

The writing brain

As a writer of fiction, I have wondered for years where creative ideas come from.  Certainly a great many of the plots I've written have seemed to spring fully-wrought from my brain (although as any writer will tell you, generating an idea is one thing, and seeing it to fruition quite another).

What has always struck me as odd about all of this is how... unconscious it all feels.  Oh, there's a good bit of front-of-the-brain cognition that goes into it -- background knowledge, visualization of setting, and sequencing, not to mention the good old-fashioned ability to construct solid prose.  But at its base, there's always seemed to me something mysterious about creativity, something ineffable and (dare I say it?) spiritual.  It is no surprise, even to me, that many have ascribed the source of creativity to divine inspiration or, at least, to a collective unconscious.

Take, for example, the origin of the novel I just completed two weeks ago (well, the first draft, anyhow).  Descent into Ulthoa is a dark, Lovecraftian piece about a haunted forest and a man obsessed with finding out what happened to his identical twin brother, who vanished ten years earlier on a hiking trip, but the inspiration for it seemed to come out of nowhere.  In fact, at the time, I wasn't even thinking about writing at all -- but was suddenly hit by a vivid, powerful image that seemed to beg for a story.  (If you want to read more about my experience of having that idea wallop me over the head, I did a post about it over at my fiction blog last August.)

So something is going on neurologically when stuff like this happens, but what?  Martin Lotze, a neuroscientist at the University of Greifswald (Germany), has taken the first steps toward understanding what is happening in the brains of creative writers -- and the results that he and his team have uncovered are fascinating.

One of the difficulties in studying the creative process is that during any exercise of creativity, the individual generally has to be free to move around.  Writing, especially, would be hard to do in an fMRI machine, where your head has to be perfectly still, and your typical writing device, a laptop, would be first wiped clean and then flung across the room by the electromagnets.  But Lotze and his team rigged up a setup wherein subjects could lie flat, with their heads encased in the fMRI tube, and have their arms supported so that they could write with the tried-and-true paper-and-pencil method, using a set of mirrors to see what they were doing.

[Image courtesy of Martin Lotze and the University of Greifswald]

Each subject was given a minute to brainstorm, and then two minutes to write.  While all of the subjects activated their visual centers and hippocampus (a part of the brain involved in memory and spatial navigation) during the process, there was a striking difference between veteran and novice writers.  Novice writers tended to activate their visual centers first; brainstorming, for them, started with thinking of images.  Veteran writers, on the other hand, started with their speech production centers.

"I think both groups are using different strategies,” Lotze said.  "It’s possible that the novices are watching their stories like a film inside their heads, while the writers are narrating it with an inner voice."

The other contrast between veterans and novices was in the level of activity of the caudate nucleus, a part of the brain involved in the coordination of activities as we become more skilled.  The higher the level of activity in the caudate nucleus, the more fluent we have become at the task, and the less conscious effort it takes -- leading to the conclusion (no surprise to anyone who is a serious writer) that writing, just like anything, becomes better and easier the more you do it.  Becoming an excellent writer, like becoming a concert pianist or a star athlete, requires practice.

All of this is also interesting from the standpoint of artificial intelligence -- because if you don't buy the Divine Inspiration or Collective Unconscious Models, or something like them (which I don't), then any kind of creative activity is simply the result of patterns of neural firings -- and therefore theoretically should be able to be emulated by a computer.  I say "theoretically," because our current knowledge of AI is in its most rudimentary stages.  (As a friend of mine put it, "True AI is ten years in the future, and always will be.")  But just knowing what is happening in the brains of writers is the first step toward both understanding it, and perhaps generating a machine that is capable of true creativity.

All of that, of course is far in the future (maybe even more than ten years), and Lotze himself is well aware that this is hardly the end of the story.  As for me, I find the whole thing fascinating, and a little humbling -- that something so sophisticated is going on in my skull when I think up a scene in a story.  It brings to mind something one of my neurology students once said, after a lecture on the workings of the brain: "My brain is so much smarter than me, I don't know how I manage to think at all!"

Indeed.

************************************

This week's Skeptophilia book recommendation is a fun and amusing discussion of a very ominous topic: how the universe will end.

In The End of Everything (Astrophysically Speaking) astrophysicist Katie Mack takes us through all the known possibilities -- a "Big Crunch" (the Big Bang in reverse), the cheerfully-named "Heat Death" (the material of the universe spread out at uniform density and a uniform temperature of only a few degrees above absolute zero), the terrifying -- but fortunately extremely unlikely -- Vacuum Decay (where the universe tears itself apart from the inside out), and others even wilder.

The cool thing is that all of it is scientifically sound.  Mack is a brilliant theoretical astrophysicist, and her explanations take cutting-edge research and bring it to a level a layperson can understand.  And along the way, her humor shines through, bringing a touch of lightness and upbeat positivity to a subject that will take the reader to the edges of the known universe and the end of time.

[Note: if you purchase this book using the image/link below, part of the proceeds goes to support Skeptophilia!]




Wednesday, July 22, 2020

The rhythmic brain

Regular readers of Skeptophilia know that I've been a musician for a very long time.  I started on the flute when I was a teenager, more or less self-taught.  In my twenties, for several years while I lived in Seattle, I was fortunate enough to study the classical flute repertoire with a brilliant flutist and teacher named Margaret Vitus, who did wonders for my technique.  Shortly after that I became fascinated with Celtic music (due in no small measure to the wonderful radio program The Thistle & Shamrock, which thirty-some-odd years later is still going strong), and was for years part of a Celtic music quartet called Taradiddle that performed at the Seattle Folklife Festival four years running.

Along the way, though, I fell in love with Balkan music.  I'm not sure why it was such a draw for me -- I don't have a drop of eastern European blood and certainly hadn't heard it growing up.  But something about Bulgarian, Serbian, and Macedonian music was absolutely magnetic, and still is.

[Image is licensed under the Creative Commons Walter from Tampa/St Petersburg, Florida, Serb Fest DSC 1084 pp (31031644011), CC BY 2.0]

A lot of it was the asymmetrical rhythms.  People who are unfamiliar with this style of music often can't figure out how to count it or tap their feet to it, because it has rhythms you almost never hear in western music.  For example, what time signature would you say this tune is in?  (Listen to it before reading further, and see if you can figure it out.)


Ready for the answer?

It, like all kopanicas (pronounced ko-pa-neetsa; the kopanica is a Balkan dance, so they all have the same time signature), is in 11/16.  But you don't have to count up to 11 and then start over from 1; Balkan music is in combinations of 2s and 3s.  The 2s are the short, quick dance steps, and the 3s the longer steps; and a kopanica has the form quick-quick-slow-quick-quick.  An easy way to count it out is to use a two-syllable word (I use apple) and a three-syllable word (I use cinnamon) to represent the 2s and 3s respectively.

So a kopanica is apple-apple-cinnamon-apple-apple.  Pretty tasty.  You might want to go back and listen again, and see if you can count it out.

11/16 isn't the craziest it gets, though.  The Macedonian tune and dance "Dvajspetorka" is in 25/16.  (Broken up 3-2-2, 3-2-2, 2-2-3-2-2.)

Okay, it's not as hard as it sounds.  Really it isn't.
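
If you'd like to see the bean-counting laid out explicitly, here's a tiny Python sketch -- purely my own illustration, nothing from any of the research discussed in these posts -- that checks that the groupings of 2s and 3s really do add up to the stated time signature, and spells out the apple-and-cinnamon mnemonic for each:

# Check that the groupings of 2s and 3s add up to the stated time signature,
# and print the "apple/cinnamon" mnemonic for each.  The groupings are the
# ones described above.
WORDS = {2: "apple", 3: "cinnamon"}

dances = {
    "kopanica (11/16)":     [2, 2, 3, 2, 2],
    "Dvajspetorka (25/16)": [3, 2, 2, 3, 2, 2, 2, 2, 3, 2, 2],
}

for name, groups in dances.items():
    mnemonic = "-".join(WORDS[g] for g in groups)
    print(f"{name}: {' + '.join(map(str, groups))} = {sum(groups)} -> {mnemonic}")

Run it and the kopanica comes out as apple-apple-cinnamon-apple-apple -- exactly the way the dancers count it.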

This weird rhythmic stuff comes up because of a study that came out last week in the journal Neuropsychologia that looked at fMRI studies of three groups of people -- musicians trained in the western European classical tradition, musicians trained in the Japanese classical tradition, and non-musicians.  In particular they were looking for responses in a part of the left hemisphere that is associated with processing auditory rhythm.

Unsurprisingly, both groups of trained musicians showed greater responsiveness in that part of the brain than non-musicians.  What was more interesting, though, was that the western and Japanese musicians didn't respond the same way, especially to asymmetrical beat patterns.  Most western classical music has until recently confined itself to symmetrical 2-, 3-, or 4-beat patterns; this is why the exceptions stand out, like the movement "Mars" from Gustav Holst's The Planets, which is in 5/4; and brilliant lunacy like Igor Stravinsky's The Rite of Spring.  (The joke amongst musicians is that when Stravinsky was asked what time signature The Rite of Spring was in, he responded, "Yes.")

Classical Japanese music, however, such as the music used in Noh and Kabuki, often uses beat-lengthening, called ma (間), which sounds pretty peculiar to a lot of western ears, as if the stretched beats were random.  They're not, of course, any more than the 25/16 rhythm of "Dvajspetorka" is; it's just not rhythmically like what most of us are used to.  And the musicians trained in classical Japanese music responded to that as a natural, comprehensible rhythm, whereas the musicians trained in western classical music did not.

"We expected that musicians would exhibit strong statistical learning of unfamiliar rhythm sequences compared to non-musicians," said study lead author Tatsuya Daikoku, of the University of Tokyo, in a press release.  "This has been observed in previous studies which looked at responses to unfamiliar melodies. So this in itself was not such a surprise.  What is really interesting, however, is that we were able to pick out differences in the neural responses between those trained in Japanese or Western classical music."

So our brains really do respond differently to music that we grok.  I'd love to see if the same holds true for Balkan musicians as compared to musicians from other cultures; I have talented musician friends who listen to Balkan music with a puzzled frown, and never can quite catch hold of what the rhythm is doing, even once it's explained.  But it does show that learning can modulate what's happening all the way down to the neural level.

All of which makes me want to get my flute out and play some wacky Bulgarian tunes.  Like a neat one called "Nenadova Igra," which is in 9/8.  But it's not three 3s; no, that'd be too simple.  It's 2-2-2-3.  Apple-apple-apple-cinnamon.

Because of course it is.

*************************************

This week's Skeptophilia book recommendation of the week is about as cutting-edge as you can get, and is as scary as it is fascinating.  A Crack in Creation: Gene Editing and the Unthinkable Power to Control Evolution, by Jennifer Doudna and Samuel Sternberg, is a crash course in the new genetic technology called CRISPR-Cas9 -- the gene-editing protocol that Doudna herself discovered.  This technique allows increasingly precise cut-and-paste of DNA, offering promise in not just treating, but curing, deadly genetic diseases like cystic fibrosis and Huntington's disease.

But as with most new discoveries, it is not without its ethical impact.  The cautious are already warning us about "playing God," manipulating our genes not to eliminate disease, but to enhance intelligence or strength, to change personal appearance -- or personality.

A Crack in Creation is an unflinching look at the new science of gene editing, and tries to tease out how much of what we're hearing is unwarranted fear-talk, and how much represents a genuine ethical minefield.  Doudna and Sternberg give the reader a clear understanding of what CRISPR-Cas9 is likely to be able to do, and what it won't, and map out a direction for the discussion to take based on actual science -- neither panic and alarmism, nor a Panglossian optimism that everything will sort itself out.  It's a wonderful introduction to a topic that is sure to be much in the news over the next few years.

[Note: if you purchase this book using the image/link below, part of the proceeds goes to support Skeptophilia!]




Monday, January 27, 2020

Jump scare preparation

When I was about twelve, I was lying on the sofa in my living room one evening watching a horror movie called Gargoyles.

From the perspective of a few more decades of living, I can say now that Gargoyles was a pretty derpy movie.  The general gist was that the people who put gargoyle statues on Gothic cathedrals were sculpting from life, and that all over the world there were caves occupied by the great-great-great-etc. grandchildren of those medieval monsters.  So of course there's the intrepid scientist character who is convinced that gargoyles exist but can't get his supervisors to believe him, but he goes and investigates them anyhow, and in the process hits one of them with his pickup truck.  (The gargoyles, not his supervisors.)

[Image licensed under the Creative Commons Florian Siebeck, Paris Gargoyle, CC BY-SA 3.0]

Well, the scientist is just thrilled by this development.  He thinks, "Wow, now I have proof!", loads the deceased gargoyle into his truck, and then stops at a motel for the night.  Then he does what you would do if you had never ever ever watched a horror movie in your life, namely: he decides that he can't leave a gargoyle corpse in the open bed of his pickup truck in the parking lot of a motel, so he drags it into the room with him.

He gets undressed for bed, turns out the lights, and -- of course -- it turns out the gargoyle isn't dead.  There's a soft, stealthy noise, and then a vaguely humanoid-shaped shadow rises, looming over the foot of the sleeping scientist's bed.

This was when my father, who was sitting in the recliner next to the couch, reached out and grabbed my shoulder and yelled, "THERE'S ONE NOW!"

After he peeled me off the ceiling with a spatula and my heart rate began to return to normal, I at least was thankful that I hadn't pissed my pants.  It was a close-run thing.

It's a wonder that I actually watch horror movies at all, because I am seriously suggestible.  When the movie The Sixth Sense first was released on DVD, my girlfriend (now wife) and I watched it at her house.  Then I had to make a forty-five minute drive, alone in my car at around midnight, then go (still alone) into my cold, dark, empty house.  I might actually have jumped into bed from four feet away so the evil little girl ghost wouldn't reach out from underneath and grab my ankle.  I also might have pulled the blankets up as high over me as I could without suffocating, following the time-tested rule that monsters' claws can't pierce a down comforter.

So yeah.  I might be a skeptic, but I am also a great big coward.

This was why I found some research that was published in the journal Neuroimage last week so fascinating.  It comes out of the University of Turku (Finland), where a team led by neuroscientist Lauri Nummenmaa had people watching movies like The Devil's Backbone and The Conjuring while hooked to an fMRI scanner.

They had participants (all of whom said they watched at least one horror movie every six months) rate the movies they watched for suspense and scariness, count the number of "jump scares," and evaluate their overall quality.  The scientists then looked at the fMRI results to see what parts of the brain were active when, and found some interesting patterns.

As the tension is increasing -- points where you're thinking, "Something scary is going to happen soon" -- the parts of the brain involved in visual and auditory processing ramp up activity.  Makes sense; if you were in a situation with real threats, and were worried about some imminent danger, you would begin to pay more attention to your surroundings, looking for cues to whether your fears were justified.  Then at the moment of jump scares, the parts of the brain involved in decision-making and fight-or-flight response spike in activity, as you make the split-second decision whether to run, fight the monster, or (most likely in my case) just have a stroke and drop dead on the spot.

Nummenmaa and his team found, however, that all through the movie, the sensory processing and rapid-response parts of the brain were in continuous cross-talk.  Apparently the brain is saying, "Okay, we're in a horror movie, so something terrifying is bound to happen sooner or later.  May as well prepare for it now."

What I still find fascinating, though, is why people actually like this sensation.  Even me.  I mean, my favorite Doctor Who episode -- the one that got me hooked on the series in the first place -- is the iconic episode "Blink," featuring the terrifying Weeping Angels, surely one of the scariest fictional monsters ever invented.


Maybe it's so when the movie's over, we can reassure ourselves that we might have problems in our lives, but at least we're not being disemboweled by a werewolf or abducted by aliens or whatnot.  I'm not sure if this is true for me, though.  Because long after the movie's over, I'm still convinced that whatever horrifying creature was rampaging through the story, it's still out there.

And it's looking for me.

So maybe I shouldn't watch scary movies.  It definitely takes a toll on me.  And that's even without my practical joker father scaring me out of five years of my life expectancy when the monster appears.

**********************************

The brilliant, iconoclastic physicist Richard Feynman was a larger-than-life character -- an intuitive and deep-thinking scientist, a prankster with an adolescent sense of humor, a world traveler, a wild-child with a reputation for womanizing.  His contributions to physics are too many to list, and he also made a name for himself as a suspect in the 1950s "Red Scare" despite his work the previous decade on the Manhattan Project.  In 1986 -- two years before his death at the age of 69 -- he was still shaking the world, demonstrating to the inquiry into the Challenger disaster that the whole thing could have happened because of an O-ring that stiffened and failed in the cold winter temperatures.

James Gleick's Genius: The Life and Science of Richard Feynman gives a deep look at the man and the scientist, neither glossing over his faults nor denying his brilliance.  It's an excellent companion to Feynman's own autobiographical books Surely You're Joking, Mr. Feynman! and What Do You Care What Other People Think?  It's a wonderful retrospective of a fascinating person -- someone who truly lived his own words, "Nobody ever figures out what life is all about, and it doesn't matter.  Explore the world.  Nearly everything is really interesting if you go into it deeply enough."

[Note: if you purchase this book using the image/link below, part of the proceeds goes to support Skeptophilia!]





Friday, November 8, 2019

To see ourselves

The brilliant Scottish poet Robert Burns packed a lot of truth in these four lines:
O, would some power the giftie gi'e us
To see ourselves as others see us;
It would frae many a blunder free us,
And foolish notion.
It's almost a cliché that we don't see ourselves very accurately, both in the positive and negative sense.  We sometimes overestimate our own capacities (resulting in the infamous Dunning-Kruger effect, the tendency of people to think they understand things way better than they actually do).  At the same time, we often undersell our own abilities, lacking confidence in areas where we really are talented -- sometimes through false modesty, but sometimes because we really, honestly don't realize that we have an unusual skill.

I remember this last bit happening to me.  I have one ability my wife calls my "superpower" -- I remember melodies, essentially indefinitely.  The craziest example of this happened when I was taking a Balkan dance class in my early twenties, and heard a tune I really liked.  I was going to ask the instructor what the name of the tune was, but clean forgot (so remembering other things is not really my forte).  But I remembered the tune itself, after hearing it only a couple of times while we were learning the dance that went with it.

Fast forward thirty years.  I was at Lark Camp, a week-long folk music gathering in the Mendocino Redwoods, and I was heading to lunch when I heard a fiddler and an accordion player playing a tune.  My ears perked up immediately.

There was no doubt in my mind.  That was "my" dance tune.

Turns out it's a Serbian melody called Bojarka.  (If you want to hear it, here's a lovely live performance of it by flutist Bora Dugić.)  I had remembered it, without trying or even playing it again, for thirty years.

What's funny is that I never thought there was anything particularly unusual about this.  With no context, I always simply assumed everyone could do it.  It was only when I started playing with other musicians that I found that my musical memory was pretty uncommon.  (It bears mention, however, that my remembering a tune doesn't mean I can play it perfectly.  Technically, I'm an average musician at best.)

This all comes up because of a recent study that looked at how our close friends think of us -- and even more interestingly, what their brains look like when they're doing it -- and suggests that our pals are way more aware of our core strengths, flaws, talents, and personalities than we might have thought.

[Image licensed under the Creative Commons FOTO:FORTEPAN / Korenchy László, Portrait, woman, mirror, reflection, smile, headscarf Fortepan 29523, CC BY-SA 3.0]

In "The Neural Representation of Self is Recapitulated in the Brains of Friends: A Round-Robin fMRI Study," which appeared this week in The Journal of Personality and Social Psychology, psychologists Robert Chavez and Dylan Wagner of Ohio State University took a group of eleven close friends and had each of them think about first themselves then the ten others, one at a time, evaluating each on the degree of accuracy of forty-eight different descriptors (including lonely, sad, cold, lazy, overcritical, trustworthy, enthusiastic, clumsy, fashionable, helpful, smart, punctual, and nice), and while they were doing this task an fMRI machine was recording how their brains responded.  The results were nothing short of fascinating.  The authors write:
Using functional MRI and a multilevel modeling approach, we show that multivoxel brain activity patterns in the MPFC [medial prefrontal cortex] during a person’s self-referential thought are correlated with those of friends when thinking of that same person.  Moreover, the similarity of neural self–other patterns was itself positively associated with the similarity of self–other trait judgments ratings as measured behaviorally in a separate session.  These findings suggest that accuracy in person perception may be predicated on the degree to which the brain activity pattern associated with an individual thinking about their own self-concept is similarly reflected in the brains of others.
So while everyone doesn't see you completely accurately, in aggregate your friends have a pretty clear picture of you.

"Each one of your friends gets to see a slightly different side of you," said study lead author Robert Chavez.  "When you put them all together, it is a better approximation of how you see yourself than any one person individually."

So Robert Burns's famous quip is both true and misleading; any one person's view of us is partial, but take a large enough sample of friends and their collective picture agrees pretty well with how we see ourselves.  We may not be so unaware of our own foibles and unusual skills as it might appear at first, and it seems like our attempts to hide who we truly are from our friends aren't quite as successful as we like to think.

**********************************

This week's Skeptophilia book recommendation is a fun book about math.

Bet that's a phrase you've hardly ever heard uttered.

Jordan Ellenberg's amazing How Not to Be Wrong: The Power of Mathematical Thinking looks at how critical it is for people to have a basic understanding and appreciation for math -- and how misunderstandings can lead to profound errors in decision-making.  Ellenberg takes us on a fantastic trip through dozens of disparate realms -- baseball, crime and punishment, politics, psychology, artificial languages, and social media, to name a few -- and how in each, a comprehension of math leads you to a deeper understanding of the world.

As he puts it: math is "an atomic-powered prosthesis that you attach to your common sense, vastly multiplying its reach and strength."  Which is certainly something that is drastically needed lately.

[Note: if you purchase this book using the image/link below, part of the proceeds goes to support Skeptophilia!]





Wednesday, September 18, 2019

The most beautiful brain network

A couple of weeks ago, I wrote a piece here at Skeptophilia about some fascinating new research suggesting that there are links between our perceptions of artistic, musical, and mathematical beauty, and expressed some puzzlement about how those could possibly connect.  In one of those lovely near-synchronicities that happen sometimes, today I happened upon some new(er) research showing what the underlying connection might be -- in one single region of the brain.

In a paper published this week in the Proceedings of the National Academy of Sciences, a team made up of Edward A. Vessel and Ayse Ilkay Isik (of the Max Planck Institute), Amy M. Belfi (of the Missouri University of Science and Technology), Jonathan L. Stahl (of Ohio State University), and G. Gabrielle Starr (of Pomona College) showed that with different sorts of visual stimuli, our sense of aesthetic pleasure comes from activation of a part of the brain called the default-mode network.  The authors write:
Despite being highly subjective, aesthetic experiences are powerful moments of interaction with one’s surroundings, shaping behavior, mood, beliefs, and even a sense of self.  The default-mode network (DMN), which sits atop the cortical hierarchy and has been implicated in self-referential processing, is typically suppressed when a person engages with the external environment.  Yet not only is the DMN surprisingly engaged when one finds a visual artwork aesthetically moving, here we present evidence that the DMN also represents aesthetic appeal in a manner that generalizes across visual aesthetic domains, such as artworks, landscapes, or architecture.  This stands in contrast to ventral occipitotemporal cortex (VOT), which represents the content of what we see, but does not contain domain-general information about aesthetic appeal.
Using fMRI studies, the researchers compared the responses of the brains of volunteers to three types of visual stimuli: art, architecture, and photographs of natural landscapes.  The responses of the visual cortices of the test subjects showed great variation between these three different types -- evidently the brain's effort to categorize and interpret what it's seeing, so it's no great surprise that you'd respond differently while seeing the Mona Lisa than you would looking at Chartres Cathedral.

What was surprising, though, is that while viewing visual stimuli the test subjects found aesthetically pleasing, all of them had a high response in the default-mode network, which is usually associated with contemplation, imagination, self-reflection, and inward thought.  It's uncertain if the DMN actually encodes the basics of aesthetic response, but this certainly suggests a critical role.  "We don't know yet if DMN actually computes this representation," said Edward Vessel, lead author of the paper, in an interview in EurekAlert.  "But it clearly has access to abstract information about whether we find an experience aesthetically appealing or not."

This suggests to me a couple of interesting directions this research could go.  Obviously, it'd be intriguing to find out if the DMN is also active with other types of aesthetic appreciation (such as musical and mathematical aesthetics, the subject of the previous research).  What I'd find even more fascinating, though, is to see if there's a difference in the activity of the DMN depending upon how strongly the individual is aesthetically moved.  Those responses are so highly individual that finding a biological underpinning would be amazingly cool.  Why, for example, was my wife moved to tears while looking at paintings in a Van Gogh exhibition we attended a couple of years ago in New York City?  Why do I find Édouard Manet's 1882 masterpiece A Bar at the Folies-Bergère so emotionally evocative, while a lot of other art from the same period doesn't really grab me one way or the other?

[Image is in the Public Domain]

So this could be a window into finding out -- at least from a neurological standpoint -- how our brain modulates our aesthetic response.  The "why," of course, is more inscrutable -- demonstrating in an fMRI that I go into rapture hearing Stravinsky's Firebird isn't telling me anything I didn't already know, after all, and doesn't answer why I don't have the same response hearing Rachmaninoff's Piano Concerto #2.


But at least finding a neurological basis for such judgments would be a step forward.  The Vessel et al. research is a fascinating start toward understanding the sweetest of human behaviors -- our perception of beauty in the world around us.

**********************************

This week's Skeptophilia book recommendation made the cut more because I'd like to see what others think of it than because it bowled me over: Jacques Vallée's Passport to Magonia.

Vallée is an interesting fellow, and certainly comes with credentials; he has an M.S. in astrophysics from the University of Lille and a Ph.D. in computer science from Northwestern University.  He's at various times been an astronomer, a computer scientist, and a venture capitalist, and apparently was quite successful at all three.  But if you know his name, it's probably because of his connection to something else -- UFOs.

Vallée became interested in UFOs early, when he was 16 and saw one in his home town of Pontoise, France.  After earning his degree in astrophysics, he veered off into the study of the paranormal, especially allegations of alien visitation, associating himself with some pretty reputable folks (J. Allen Hynek, for example) and some seriously questionable ones (like the fraudulent Israeli spoon-bender, Uri Geller).

Vallée didn't really get the proof he was looking for (of course, because if he had we'd probably all know about it), but his decades of research compile literally hundreds -- perhaps thousands -- of alleged sightings and abductions.  And that's what Passport to Magonia is about.  To Vallée's credit, he doesn't try to explain them -- he doesn't have a favorite hypothesis he's trying to convince you of -- he simply says that two things are significant: (1) the number of claims from otherwise reliable and sane folks is too high for there not to be something to it; and (2) the similarity between the claims, going all the way back to medieval claims of abductions by spirits and "elementals," is great enough to be significant.

I'm not saying I necessarily agree with him, but his book is lucid and fascinating, and the case studies he cites make for pretty interesting reading.  I'd be curious to see what other Skeptophiles think of his work.

[Note: if you purchase this book using the image/link below, part of the proceeds goes to support Skeptophilia!]






Wednesday, July 24, 2019

Meaning in music

As someone fascinated by neuroscience, language, and music, you can imagine how excited I was to find some new research that combined all three.

A link sent to me by a loyal reader of Skeptophilia describes a study that is the subject of a paper in Nature Neuroscience last week with the rather intimidating title "Divergence in the Functional Organization of Human and Macaque Auditory Cortex Revealed by fMRI Responses to Harmonic Tones."  Written by Sam V. Norman-Haignere (Columbia University), Nancy Kanwisher (MIT), Josh H. McDermott (MIT), and Bevil R. Conway (National Institutes of Health), the paper shows evidence that even our close primate relatives don't have the capacity for discriminating harmonic tones that humans have -- that our perception of music may well be a uniquely human capacity.

"We found that a certain region of our brains has a stronger preference for sounds with pitch than macaque monkey brains," said Bevil Conway, senior author of the study.  "The results raise the possibility that these sounds, which are embedded in speech and music, may have shaped the basic organization of the human brain."

Monkeys, apparently, respond about equally to harmonic tones and to toneless noise, while humans have a specific neural module that lights up on an fMRI scan when the sounds they hear are tonal in nature.  "These results suggest the macaque monkey may experience music and other sounds differently," Conway said.  "In contrast, the macaque's experience of the visual world is probably very similar to our own.  It makes one wonder what kind of sounds our evolutionary ancestors experienced."

[Image is in the Public Domain]

It immediately put me in mind of tonal languages (such as Thai and Chinese) where the same syllable spoken with a rising, falling, or steady tone completely changes its denotative meaning.  Even non-tonal languages (like English) express connotation with tone, such as the rising tone at the end of a question.  And subtleties like stress patterns can substantially change the meaning.  For example, consider the sentence "She told me to give you the money today?"  Now, read it aloud while stressing the words as follows:
  • SHE told me to give you the money today?
  • She TOLD me to give you the money today?
  • She told ME to give you the money today?
  • She told me to GIVE you the money today?
  • She told me to give YOU the money today?
  • She told me to give you the MONEY today?
  • She told me to give you the money TODAY?
No two of these connote the same idea, do they?

I'm reminded of how the brilliant neuroscientist David Eagleman describes the concept of the umwelt of an organism:
In 1909, the biologist Jakob von Uexküll introduced the concept of the umwelt.  He wanted a word to express a simple (but often overlooked) observation: different animals in the same ecosystem pick up on different environmental signals.  In the blind and deaf world of the tick, the important signals are temperature and the odor of butyric acid. For the black ghost knifefish, it's electrical fields.  For the echolocating bat, it's air-compression waves.  The small subset of the world that an animal is able to detect is its umwelt... 
The interesting part is that each organism presumably assumes its umwelt to be the entire objective reality "out there."  Why would any of us stop to think that there is more beyond what we can sense?
So tone, apparently, is part of the human umwelt, but not that of macaques (and probably other primate species).  Perhaps other animals include tone in their umwelt, but that point is uncertain.  I'd guess that these would include many bird species, which communicate using (often very complex) songs.  Echolocating cetaceans and bats, maybe.  Other than that, probably not many.

"This finding suggests that speech and music may have fundamentally changed the way our brain processes pitch," Conway said.  "It may also help explain why it has been so hard for scientists to train monkeys to perform auditory tasks that humans find relatively effortless."

I wonder what music sounds like to my dogs?  I get a curious head-tilt when I play the piano or flute, and I once owned a dog who would curl up at my feet while I practiced.  Both my dogs, however, immediately remember other pressing engagements and leave the premises as soon as I take out my bagpipes.

Although most humans do the same thing, so maybe that part's not about tonal perception per se.

************************************

The subject of Monday's blog post gave me the idea that this week's Skeptophilia book recommendation should be a classic -- Konrad Lorenz's Man Meets Dog.  This book, written back in 1949, is an analysis of the history and biology of the human/canine relationship, and is a must-read for anyone who owns, or has ever owned, a doggy companion.

Given that it's seventy years old, some of the factual information in Man Meets Dog has been superseded by new research -- especially about the genetic relationships between various dog breeds, and between domestic dogs and other canid species in the wild.  But his behavioral analysis is impeccable, and is written in his typical lucid, humorous style, with plenty of anecdotes that other dog lovers will no doubt relate to.  It's a delightful read!

[Note: if you purchase this book using the image/link below, part of the proceeds goes to support Skeptophilia!]