Skeptophilia (skep-to-fil-i-a) (n.) - the love of logical thought, skepticism, and thinking critically. Being an exploration of the applications of skeptical thinking to the world at large, with periodic excursions into linguistics, music, politics, cryptozoology, and why people keep seeing the face of Jesus on grilled cheese sandwiches.
Showing posts with label brain. Show all posts

Saturday, August 16, 2025

Facing facts

"I'm sorry, but I have no idea who you are."

I can't tell you how many times I've had to utter that sentence.  Regular readers of Skeptophilia know why; I have a peculiar disability called prosopagnosia, or "face blindness."  I have a nearly complete inability to recognize faces, even of people I've known for some time.

Well, that's not exactly true.  I recognize people differently than other people do.  I remember the people I know as lists of features.  I know my wife has curly brown hair and freckles and an infectious smile, but I honestly have no mental image of her.  I can't picture my own face, although -- like with my wife -- I could list some of my features.

That system doesn't have a high success rate, however, and a lot of the time I have no idea who the people around me are, especially in a place where there are few clues from context.  I have pretty serious social anxiety, and my condition makes it worse; it has put me in the following actual situations:
  • introducing myself twice to the same person at a party
  • getting a big, enthusiastic hug and an "it's been so long!" from someone in our local gym, and never figuring out who I was talking to
  • having two of my students switch seats and not realizing it for three weeks, until finally they 'fessed up
  • going to see a movie, and not knowing until the credits rolled that the main characters were played by Kenneth Branagh, Penelope Cruz, Judi Dench, Derek Jacobi, Michelle Pfeiffer, Daisy Ridley, and Johnny Depp
  • countless incidents of my fishing for clues ("so, how's your, um... spouse, parents, kids, pets, job..."), sometimes fruitlessly
My anxiety has made me really good at paying attention to, and recalling, other cues like voice, manner of dress, posture, walk, hair style, and so on.  But when one or more of those change -- such as with the student I had one year who cut her hair really short during the summer, and whom I didn't recognize when she showed up in one of my classes on the first day of school the following year -- it doesn't always work.

One upside to the whole thing is that I do get asked some funny questions.  One student asked me if when I looked at people, their faces were invisible.  Another asked me if when I look in the bathroom mirror in the morning, I don't know that's me.  (It's a pretty shrewd guess that it is me, since there's generally no one else in there at the time.)

But at least it's not as bad as the dumb questions that my former students who are identical triplets sometimes get.  One of them was once asked by a friend how she kept track of which triplet she was.

No, I'm not kidding.  Neither, apparently, was the person who asked the question.

[Image licensed under the Creative Commons Randallbritten, FaceMachine screenshots collage, CC BY-SA 3.0]

In any case, all of this comes up because of some research that came out in the journal Cortex that tried to parse what's happening (or what's not happening) in the brains of people like me.  Some level of prosopagnosia affects about one person in fifty; some of them lose their facial recognition ability because of a stroke or other damage to the fusiform gyrus, the part of the brain that seems to be a dedicated face-memory module.  Others, like me, were born this way.  Interestingly, a lot of people who have lifelong prosopagnosia take a while to figure it out; for years, I just thought I was unobservant, forgetful, or a little daft.  (All three of those might be true as well, of course.)  It was only after I had enough embarrassing incidents occur, and -- most importantly -- saw an eye-opening piece about face blindness by Lesley Stahl on 60 Minutes, that I realized what was going on.

The paper in Cortex tried to figure out why people who are face-blind often do just fine on visual perception tests, then fail utterly when it comes to remembering photographs of faces.  The researchers specifically tried to parse whether the difference was coming from an inability to connect context cues to the face you're seeing (e.g., looking at someone and thinking, "She's the woman who was behind the counter at the library last week") versus simple familiarity (the more nebulous and context-free feeling of "I've seen that person before").  They showed each test subject (some of whom weren't face-blind) a series of 120 faces, then a second series of 60 faces, some of which were new and some of which had appeared in the first series.  The researchers were looking not only for whether the subjects could correctly pick out the old faces, but how confident they were in their answers -- the surmise being that low confidence on correct answers was an indicator of relying on familiarity rather than context memory.
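The logic of that old/new design can be sketched with a toy signal-detection model.  (This is my illustration, not the researchers' actual analysis; the signal strengths and the decision criterion below are invented for the example.)  Each face evokes a noisy familiarity signal, the subject says "old" when the signal crosses a threshold, and confidence tracks how far the signal lands from that threshold.

```python
import random

def simulate_recognition(d_prime, n_old=60, n_new=60, criterion=0.5, seed=42):
    """Toy signal-detection model of an old/new recognition test.

    Studied ("old") faces yield familiarity signals centred on d_prime;
    unstudied ("new") faces are centred on zero.  The subject answers
    "old" when the signal exceeds the criterion, and confidence is the
    distance between the signal and the criterion.
    Returns (hit_rate, false_alarm_rate, mean_confidence_when_correct).
    """
    rng = random.Random(seed)
    hits = false_alarms = n_correct = 0
    confidence_sum = 0.0
    for is_old in [True] * n_old + [False] * n_new:
        signal = rng.gauss(d_prime if is_old else 0.0, 1.0)
        says_old = signal > criterion
        if says_old and is_old:
            hits += 1
        elif says_old:
            false_alarms += 1
        if says_old == is_old:
            n_correct += 1
            confidence_sum += abs(signal - criterion)
    return hits / n_old, false_alarms / n_new, confidence_sum / n_correct

# A strong memory trace versus a weak one: the weak trace lowers the
# hit rate AND the confidence on trials that are answered correctly.
control = simulate_recognition(d_prime=2.0)
prosopagnosic = simulate_recognition(d_prime=0.4)
```

In a model like this, weakening the familiarity signal lowers accuracy and simultaneously drags confidence toward the threshold even on correct trials -- the same qualitative pattern the study reports.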

The prosopagnosics in the test group were not only bad at identifying which faces were new and which they'd seen before; their confidence was also really low, even on the ones they got right.  Normally sighted people showed a great deal more certainty in their answers.  What occurs to me, though, is that knowing they're face-blind would skew the results, in that we prosopagnosics are always doubtful we're recalling correctly.  So these data could be a result of living with the condition, not some kind of underlying mechanism at work.  I almost never greet someone first, because even if I think I might know them, I'm never certain.  A lot of people think I'm aloof because of this, but the reality is that I honestly don't know which of the people I'm seeing are friends and which are total strangers.

One thing about the researchers' conclusion does ring true, however.  The subconscious "feeling of familiarity" is definitely involved.  My experience of face blindness isn't that I feel like I'm surrounded by strangers; it's more that everyone looks vaguely familiar.  The problem is, that feeling is no stronger when I see a close friend than when I see someone I've never met before, so the intensity of that sense -- what apparently most people rely on -- doesn't help me.

So that's the view of the world through the eyes of someone who more often than not doesn't know who he's looking at.  Fortunately for me, (1) at this point in my life I'm unembarrassed by my condition, and (2) most of the people in my little village know I'm face-blind and will say, "Hi, Gordon, it's Steve..." when they walk up, and spare me the awkwardness of fishing for clues.  (Nota bene: This only works if it actually is Steve.  Otherwise it would be even more awkward.)  But hopefully some good will come from this research, because face blindness is kind of a pain in the ass.

"Our results underscore that prosopagnosia is a far more complex disorder that is driven by more than deficits in visual perception," said study first author Anna Stumps, a researcher in the Boston Attention Learning Laboratory at VA Boston.  "This finding can help inform the design of new training approaches for people with face blindness."

Which would be really, really nice.

****************************************


Saturday, February 1, 2025

Remembrance of things past

"The human brain is rife with all sorts of ways of getting it wrong."

This quote is from a talk by eminent astrophysicist Neil deGrasse Tyson, and is just about spot on.  Oh, sure, our brains work well enough, most of the time; but how many times have you heard people say things like "I remember that like it was yesterday!" or "Of course it happened that way, I saw it with my own eyes"?

Anyone who knows something about neuroscience should immediately turn their skepto-sensors up to 11 as soon as they hear either of those phrases.

fMRI scan of a human brain during working memory tasks [Image is in the Public Domain courtesy of the Walter Reed National Military Medical Center]

Our memories and sensory-perceptual systems are selective, inaccurate, heavily dependent on what we're doing at the time, and affected by whether we're tired or distracted or overworked or (even mildly) inebriated.  Sure, what you remember might have happened that way, but -- well, let's just say it's not as much of a given as we'd like to think.  An experiment back in 2005 out of the University of Portsmouth looked at memories of the Tavistock Square (London) bus bombing, and found that a full forty percent of the people questioned had "memories" of the event that were demonstrably false -- including a number of people who said they recalled details from CCTV footage of the explosion, down to what people were wearing, who showed up to help the injured, when police arrived, and so on.

Oddly enough, there is no CCTV footage of the explosion.  It doesn't exist and has never existed.

Funny thing that eyewitness testimony is considered some of the most reliable evidence in courts of law, isn't it?

There are a number of ways our brains can steer us wrong, and the worst part of it all is that they leave us simultaneously convinced that we're remembering things with cut-crystal clarity.  Here are a few interesting memory glitches that commonly occur in otherwise mentally healthy people, that you might not have heard of:

  • Cryptomnesia.  Cryptomnesia occurs when something from the past recurs in your brain, or arises in your external environment, and you're unaware that you've already experienced it.  This has resulted in several probably unjustified accusations of plagiarism; the author in question undoubtedly saw the text they were accused of plagiarizing some time earlier, but honestly didn't remember they'd read it and thought that what they'd come up with was entirely original.  It can also result in some funnier situations -- while the members of Aerosmith were taking a break from recording their album Done With Mirrors, they had a radio going, and the song "You See Me Crying" came on.  Steven Tyler said he thought that was a pretty cool song, and maybe they should record a cover of it.  Joe Perry turned to him in incredulity and said, "That's us, you fuckhead."
  • Semantic satiation.  This is when a word you know suddenly looks unfamiliar to you, often because you've seen it repeatedly over a fairly short time.  Psychologist Chris Moulin of Leeds University did an experiment where he had test subjects write the word door over and over, and found that after a minute of this, 68% of the subjects began to feel distinctly uneasy, with a number of them saying they were doubting that "door" was a real word.  I remember being in high school writing an exam in an English class, and staring at the word were for some time because I was convinced that it was spelled wrong (but couldn't, of course, remember how it was "actually" spelled).
  • Confabulation.  This is the recollection of events that never happened -- along with a certainty that you're remembering correctly.  (The people who claimed false memories of the Tavistock Square bombing were suffering from confabulation.)  The problem with this is twofold; the more often you think about the false memory or tell your friends and family about it, the more sure you are of it; and often, even when presented with concrete evidence that you're recalling incorrectly, somehow you still can't quite believe it.  A friend of mine tells the story of trying to help her teenage son find his car keys, and that she was absolutely certain that she'd seen them that day lying on a blue surface -- a chair, tablecloth, book, she wasn't sure which, but it was definitely blue.  They turned the house upside down, looking at every blue object they could find, and no luck.  Finally he decided to walk down to the bus stop and take the bus instead, and went to the garage to get his stuff out of the car -- and the keys were hanging from the ignition, where he'd left them the previous evening.  "Even after telling me this," my friend said, "I couldn't accept it.  I'd seen those keys sitting on a blue surface earlier that day, and remembered it as clearly as if they were in front of my face."
  • Declinism.  This is the tendency to remember the past as more positive than it actually was, and is responsible both for the "kids these days!" thing and "Make America Great Again."  There's a strong tendency for us to recall our own past as rosy and pleasant as compared to the shitshow we're currently immersed in, irrespective of the fact that violence, bigotry, crime, and general human ugliness are hardly new inventions.  (A darker aspect of this is that some of us -- including a great many MAGA types -- are actively longing to return to the time when straight White Christian men were in charge of everything; whether this is itself a mental aberration I'll leave you to decide.)  A more benign example is what I've noticed about travel -- that after you're home, the bad memories of discomfort and inconveniences and delays and questionable food fade quickly, leaving behind only the happy feeling of how much you enjoyed the experience.
  • The illusion of explanatory depth.  This is a dangerous one; it's the certainty that you understand deeply how something works, when in reality you don't.  This effect was first noted back in 2002 by psychologists Leonid Rozenblit and Frank Keil, who took test subjects and asked them to rank from zero to ten their understanding of how common devices worked, including zippers, bicycles, electric motors, toasters, and microwave ovens, and found that hardly anyone gave themselves a score lower than five on anything.  Interestingly, the effect vanished when Rozenblit and Keil asked the volunteers actually to explain how the devices worked; after trying to describe in writing how a zipper works, for example, most of the test subjects sheepishly realized they actually had no idea.  This suggests an interesting strategy for dealing with self-styled experts on topics like climate change -- don't argue, ask questions, and let them demonstrate their ignorance on their own.
  • Presque vu.  Better known as the "tip-of-the-tongue" phenomenon -- the French name means "almost seen" -- this is when you know you know something, but simply can't recall it.  It's usually accompanied by a highly frustrating sense that it's right there, just beyond reach.  Back in the days before The Google, I spent an annoyingly long time trying to recall the name of the Third Musketeer (Athos, Porthos, and... who???).  I knew the memory was in there somewhere, but I couldn't access it.  It was only after I gave up and said "to hell with it" that -- seemingly out of nowhere -- the answer (Aramis) popped into my head.  Interestingly, neuroscientists are still baffled as to why this happens, and why turning your attention to something else often makes the memory reappear.

So be a little careful about how vehemently you argue with someone over whether your recollection of the past or theirs is correct.  Your version might be right, or theirs -- or it could easily be that both of you are remembering things incompletely or incorrectly.  I'll end with a further quote from Neil deGrasse Tyson: "We tend to have great confidence in our own brains, when in fact we should not.  It's not that eyewitness testimony by experts or people in uniform is better than that of the rest of us; it's all bad....  It's why we scientists put great faith in our instruments.  They don't care if they've had their morning coffee, or whether they got into an argument with their spouse -- they get it right every time."

****************************************

Saturday, December 28, 2024

Face forward

Life with prosopagnosia is peculiar sometimes.

Better known as "face blindness," it's a partial or complete inability to recognize people's faces.  I'm not sure where I fall on the spectrum -- I'm certainly nowhere near as bad as neurologist and author Oliver Sacks, who didn't recognize his own face in the mirror.  Me, I'm hampered by it, but have learned to compensate by being very sensitive to people's voices and how they move.  (I've noticed that I'm often more certain who someone is if I see them walking away than I am if they're standing right in front of me.)

Still, it results in some odd situations.  I volunteer once a week as a book sorter at our local Friends of the Library book sale, and there's this one guy named Rich who is absolutely a fixture -- he always seems to be there.  I've seen him and spoken with him at least a hundred times.  Well, a couple of weeks ago, I was working, and there was this guy behind the counter, messing with stuff.  I was about to ask who he was and what he was doing, when he said something, and I realized it was Rich -- who had shaved off his facial hair.

Until he opened his mouth, I honestly had no idea I'd ever seen him in my life.

Then, a couple of nights ago, my wife and I were watching the Doctor Who Christmas episode "Joy to the World," and afterward got to see a thirty-second teaser trailer for season two, which is being released next spring.  Well, in season one, there was this mysterious recurring character named Mrs. Flood (played by British actress Anita Dobson) whose role we have yet to figure out, and who has the Who fandom in quite the tizzy.  And in the trailer, there's a quick clip of an old woman in formal attire watching a theater performance through opera glasses, and until another fan said, "What did you think about the appearance of Mrs. Flood in the trailer?" I had no clue -- not the least suspicion -- that it was her.

So it's kind of inconvenient, sometimes.  When people post still shots from movies or television shows on social media, I usually not only don't know who the actors are, I have no idea what film it's from (unless there's an obvious clue from the setting).  And as I've related before, there are times when even my voice-recognition strategy hasn't worked, and I've had entire conversations with people and then left still not knowing who it was I'd been talking to.

The reason the topic comes up (again) is some research out of Toyohashi University of Technology that was the subject of a paper in the Journal of Vision last week.  The researchers were trying to figure out if humans have a better innate ability to filter out extraneous visual distractions when it comes to facial recognition than they do for recognizing other objects.  Using a technique called "continuous flash suppression" (CFS), they presented volunteers with fast-moving high-contrast images in one eye and a target image in the other, then used fMRI to measure how long it took the brain's visual recognition centers to "break through" the distraction and recognize the target image.

If the target image was a face -- or "face-like" -- that breakthrough happened much faster than it did with any other sort of image.  And, interestingly, the breakthrough time was significantly slowed for faces that were upside-down.
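To give a feel for how a breakthrough-time comparison like this can be tested, here's a sketch of a simple permutation test on the difference in mean breakthrough time between upright and inverted faces.  (The times below are invented for illustration -- they are not the study's data, and this is not necessarily the analysis the authors used.)

```python
import random

# Hypothetical breakthrough times in seconds (illustrative only):
# upright faces escape suppression faster than inverted ones.
upright  = [1.2, 0.9, 1.4, 1.1, 1.0, 1.3, 0.8, 1.2]
inverted = [1.9, 1.6, 2.2, 1.8, 1.7, 2.0, 1.5, 2.1]

def mean(xs):
    return sum(xs) / len(xs)

def permutation_p(a, b, n_iter=10000, seed=1):
    """Two-sided permutation test on the difference of group means.

    Repeatedly shuffles the pooled data into two pseudo-groups and
    counts how often the shuffled difference is at least as large as
    the observed one.
    """
    rng = random.Random(seed)
    observed = abs(mean(a) - mean(b))
    pooled = a + b
    extreme = 0
    for _ in range(n_iter):
        rng.shuffle(pooled)
        diff = abs(mean(pooled[:len(a)]) - mean(pooled[len(a):]))
        if diff >= observed:
            extreme += 1
    return extreme / n_iter

p = permutation_p(upright, inverted)  # small p: the gap is unlikely by chance
```

With fully separated groups like these, almost no random reshuffling reproduces the observed gap, so the test declares the upright/inverted difference highly unlikely to be a fluke.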

We're wired, apparently, to recognize right-side-up human faces faster than just about anything else.

"Our study shows that even vague, face-like images can trigger subconscious processing in the brain, demonstrating how deeply rooted facial recognition is in our visual system," said Makoto Michael Martinsen, who co-authored the study.  "This ability likely evolved to help us prioritize faces, which are critical for social interaction, even when visual information is scarce...  [However] we didn’t consider factors like emotion or attractiveness, which can affect facial perception...  Despite this, our study highlights the brain’s incredible ability to extract important information from minimal cues, especially when it comes to faces.  It emphasizes the importance of facial features in both conscious and subconscious perception and raises interesting questions about how this mechanism evolved."

Naturally, I found myself wondering how face-blind people like myself would do in this task.  After all, it's not that we can't tell something is a face; it's that the visual information in a face doesn't trigger the same instantaneous recall it does in other people.  When I do recognize someone visually, it's more that I remember a list of their features -- he's the guy with square plastic frame glasses and curly gray hair, she's the woman with a round face and dark brown eyes who favors brightly-colored jewelry.  This, of course, only takes me so far.  When someone changes their appearance -- like Rich shaving off his beard and mustache -- it confounds me completely.

So I'm curious whether I'd be like the rest of the test subjects and have faster recognition times for faces than for non-face objects, or if perhaps my peculiar wiring means my brain weights all visual stimuli equally.  I'd be happy to volunteer to go to Japan to participate, if anyone wants to find out the answer badly enough to spot me for a plane ticket.

No?  Oh, well, perhaps that'll be the next phase of Martinsen et al.'s research.  I'm willing to wait.

Until then -- if I know you, and happen to run into you in the local café, keep in mind I may have no idea who you are.  It helps if you start the conversation with, "I'm _____" -- I'm not embarrassed by my odd neurological condition, and it's better than spending the day wondering who the person was who came up and gave me a hug and asked about my wife and kids and dogs and whatnot.

****************************************

Saturday, December 7, 2024

Talking in your sleep

A little over a year ago, I decided to do something I've always wanted to do -- learn Japanese.

I've had a fascination with Japan since I was a kid.  My dad lived there for a while during the 1950s, and while he was there collected Japanese art and old vinyl records of Japanese folk and pop music, so I grew up surrounded by reminders of the culture.  As a result, I've always wanted to learn more about the country and its people and history, and -- one day, perhaps -- visit.

So in September of 2023 I signed up for Duolingo, and began to inch my way through learning the language.

[Image is in the Public Domain]

It's a challenge, to say the least.  Japanese usually shows up on lists of "the five most difficult languages to learn."  Not only are there three different scripts you have to master in order to be literate; the grammatical structure is also really different from English.  The trickiest part, at least thus far, is managing particles -- little words that follow nouns and indicate how they're being used in the sentence.  They're a bit like English prepositions, but there's a subtlety to them that is hard to grok.  Here's a simple example:

Watashi wa gozen juuji ni toshokan de ane ni aimasu.

(I) (particle indicating the subject of the sentence) (A.M.) (ten o'clock) (particle indicating movement or time) (library) (particle indicating where something is happening) (my sister) (particle indicating whom one is meeting) (am meeting with) = "I am meeting my sister at ten A.M. at the library."

Get the particles wrong, and the sentence ends up somewhere between grammatically incorrect and completely incomprehensible.

So I'm coming along.  Slowly.  I have a reasonably good affinity for languages -- I grew up bilingual (English/French) and have a master's degree in linguistics -- but the hardest part for me is simply remembering the vocabulary.  The grammar patterns take some getting used to, but once I see how they work, they tend to stick.  The vocabulary, though?  Over and over again I'll run into a word, and I'm certain I've seen it before and at one point knew what it meant, and it will not come back to mind.  So I look it up...

... and then go, "Oh, of course.  Duh.  I knew that."

But according to a study this week out of the University of South Australia, apparently what I'm doing wrong is simple: I need more sleep.

Researchers in the Department of Neuroscience took 35 native English speakers and taught them "Mini-Pinyin" -- an invented pseudolanguage that has Mandarin Chinese vocabulary but English sentence structure.  (None of them had prior experience with Mandarin.)  They were sorted into two groups; the first learned the language in the morning and returned twelve hours later to be tested, and the second learned it in the evening, slept overnight in the lab, and were tested the following morning.

The second group did dramatically better than the first.  Significantly, during sleep their brains showed a higher-than-average level of brain wave patterns called slow oscillations and sleep spindles, which are thought to be connected with memory consolidation -- uploading short-term memories from the hippocampus into long-term storage in the cerebral cortex.  Your brain, in effect, talks in its sleep, routing information from one location to another.

"This coupling likely reflects the transfer of learned information from the hippocampus to the cortex, enhancing long-term memory storage," said Zachariah Cross, who co-authored the study.  "Post-sleep neural activity showed unique patterns of theta oscillations associated with cognitive control and memory consolidation, suggesting a strong link between sleep-induced brainwave co-ordination and learning outcomes."

So if you're taking a language class, or if -- like me -- you're just learning another language for your own entertainment, you're likely to have more success in retention if you study in the evening, and get a good night's sleep before you're called upon to use what you've learned.

Of course, many of us could use more sleep for a variety of other reasons.  Insomnia is a bear, and poor sleep is linked with a whole host of health-related woes.  But a nice benefit of dedicating yourself to getting better sleep duration and quality is an improvement in memory.

And hopefully for me, better scores on my Duolingo lessons.

****************************************

Monday, March 18, 2024

Memory boost

About two months ago I signed up with Duolingo to study Japanese.

I've been fascinated with Japan and the Japanese culture pretty much all my life, but I'm a total novice with the language, so I started out from "complete beginner" status.  I'm doing okay so far, although the fact that it's got three writing systems is a challenge, to put it mildly.  Like most Japanese programs, it's beginning with the hiragana system -- a syllabic script that allows you to work out the pronunciation of words -- but I've already seen a bit of katakana (used primarily for words borrowed from other languages) and even a couple of kanji (the ideographic script, where a character represents an entire word or concept).

[Image licensed under the Creative Commons 663highland, 140405 Tsu Castle Tsu MIe pref Japan01s, CC BY-SA 3.0]

While Duolingo focuses on getting you listening to spoken Japanese right away, my linguistics training has me already looking for patterns -- such as the fact that wa after a noun seems to act as a subject marker, and ka at the end of a sentence turns it into a question.  I'm still perplexed by some of the pronunciation patterns -- why, for example, vowel sounds sometimes don't get pronounced.  The first case of this I noticed is that the family name of the brilliant author Akutagawa Ryūnosuke is pronounced /ak'tagawa/ -- the /u/ in the second syllable virtually disappears.  I hear it happening fairly commonly in spoken Japanese, but I haven't been able to deduce what the pattern is.  (If there is one.  If there's one thing my linguistics studies have taught me, it's that all languages have quirks.  Try explaining to someone new to English why, for instance, the -ough combination is pronounced differently in cough, rough, through, bough, and thorough.)

Still and all, I'm coming along.  I've learned some useful phrases like "Sushi and water, please" (Sushi to mizu, kudasai) and "Excuse me, where is the train station?" (Sumimasen, eki wa doko desu ka?), as well as less useful ones like "Naomi Yamaguchi is cute" (Yamaguchi Naomi-san wa kawaii desu), which is only critical to know if you have a cute friend who happens to be named Naomi Yamaguchi.

The memorization, however, is often taxing to my 63-year-old brain.  Good for it, I have no doubt -- a recent study found that being bi- or multi-lingual can delay the onset of dementia by four years or more -- but it definitely is a challenge.  I go through my hiragana flash cards at least once a day, and have copious notes for what words mean and for any grammatical oddness I happen to notice.  Just the sheer amount of memorization, though, is kind of daunting.

Maybe what I should do is find a way to change the context in which I have to remember particular words, phrases, or characters.  That seems to be the upshot of a paper I ran into a couple of days ago in Proceedings of the National Academy of Sciences, by a group from Temple University and the University of Pittsburgh, about how to improve retention.

I'm sure all of us have experienced the effects of cramming for a test -- studying like hell the night before, and then you do okay on the test but a week later barely remember any of it.  This practice does two things wrong: it stuffs all the studying into a single session, and it does it all the same way.

The study identified two factors that significantly improved long-term memory.  One was spacing out study sessions -- doing shorter sessions more often definitely helped.  I'm already approaching Duolingo this way, usually doing a lesson or two over my morning coffee, then hitting it again for a few more after dinner.  The other interesting variable was context: test subjects' memories improved substantially when the context was changed -- when, for example, you're trying to remember as much as you can of what a specific person is wearing, but instead of being shown the same photograph over and over, you're given photographs of the person wearing the same clothes but in a different setting each time.
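The spacing half of that result is often modeled with a forgetting curve whose decay slows each time a memory survives a longer retrieval gap.  Here's a deliberately simple sketch of that idea (the update rule and every number in it are my invention for illustration -- this is not the model from the paper):

```python
import math

def recall_probability(stability, elapsed_days):
    # Exponential forgetting curve: recall decays with time,
    # more slowly when the memory's stability is high.
    return math.exp(-elapsed_days / stability)

def review(stability, elapsed_days, gain=2.0):
    # Toy spacing effect: a review strengthens the memory more
    # when it comes after a longer gap (harder retrieval).
    difficulty = 1.0 - recall_probability(stability, elapsed_days)
    return stability * (1.0 + gain * difficulty)

def run_schedule(gaps_in_days):
    stability = 1.0
    for gap in gaps_in_days:
        stability = review(stability, gap)
    return stability

massed = run_schedule([0.1] * 5)         # five reviews crammed together
spaced = run_schedule([1, 2, 4, 8, 16])  # same count, spread out
```

Under this toy rule, five reviews spread over increasing gaps leave the memory far more durable than the same five reviews crammed into one sitting -- the spacing effect in miniature.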

"We were able to ask how memory is impacted both by what is being learned -- whether that is an exact repetition or instead, contains variations or changes -- as well as when it is learned over repeated study opportunities," said Emily Cowan, lead author of the study.  "In other words... we could examine how having material that more closely resembles our experiences of repetition in the real world -- where some aspects stay the same but others differ -- impacts memory if you are exposed to that information in quick succession versus over longer intervals, from seconds to minutes, or hours to days."

I can say that this is one of the things Duolingo does right.  Words are repeated, but in different combinations and in different ways -- spoken, spelled out using the English transliteration, or in hiragana only.  Rather than always presenting the same word in the same context, it strikes a balance between the repetition we all need when learning a new language and pushing your brain to generalize to slightly different usages or contexts.

So all things considered, Duolingo had it figured out even before the latest research came out.  I'm hoping it pays off, because my son and I would like to take a trip to Japan at some point and be able to get along, even if we don't meet anyone cute named Naomi Yamaguchi.  But I should wind this up, so for now I'll say ja ne, mata ashita (goodbye, see you tomorrow).

****************************************



Thursday, March 14, 2024

In memoriam

I want you to recall something simple.  A few to choose from:
  • your own middle name
  • the street you grew up on
  • your best friend in elementary school
  • the name of your first pet
  • your second-grade teacher's name
Now, I'm presuming that none of you were actively thinking about any of those before I asked.  So, here are a couple of questions:

Where was that information before I asked you about it?  And how did you retrieve it from wherever that was?

The simple answer is, "we don't know."  Well, we have a decent idea about where in the brain specific kinds of information are stored, mostly from looking at what gets lost when people have strokes or traumatic brain injury.  (A technique my Anatomy and Physiology professor described as "figuring out how a car functions by smashing parts of it with a hammer, and then seeing what doesn't work anymore.")

But how exactly is that information encoded?  That's an ongoing area of research, and one we're only beginning to see results from.  The prevailing idea for a long time has been that interactions between networks of neurons in the brain allow the storage and retrieval of memories -- for example, you have networks that encode memory of faces, ones that involve familiarity, ones that activate when you feel positive emotions, possibly ones that fire for particular stimuli like gray hair, glasses, being female, being elderly, or tone of voice -- and the intersection of these activates to retrieve the memory of your grandmother.

The problem is, all attempts to find a Venn-diagram-like cross-connected network in the brain have failed.  Even so, the idea that there could be a much smaller and more specific neural cluster devoted to a particular memory was ridiculed as the "grandmother cell model" -- a term coined by neuroscientist Jerome Lettvin in the 1960s.  It was thought to be nonsense that we could have anything like a one-to-one correlation between memories and neurons.  As neuroscientist Charles Edward Connor put it, the grandmother cell model had "become a shorthand for invoking all of the overwhelming practical arguments against a one-to-one object coding scheme.  No one wants to be accused of believing in grandmother cells."

[Image is in the Public Domain courtesy of photographer Michel Royon]

The problem came roaring back, though, when neurosurgeons Itzhak Fried and Rodrigo Quian Quiroga were working with an epileptic patient who had electrical brain-monitoring implants, and found that when he was shown a photograph of Jennifer Aniston, a specific neuron fired in his brain.  Evidently, we do encode specific memories in only a tiny number of neurons -- but how it works is still unknown.  

We have over eighty billion neurons in the brain -- so even discounting the ones involved in autonomic functioning, you'd think there are plenty left over to encode specific memories.  But -- and this is a huge but -- there's no clear evidence that when you learn something new, you're doing any kind of wholesale neural rewiring, much less growing new neurons.

The upshot is that we still don't know.

This comes up because of a study at Columbia University, published last week in Nature Human Behaviour, that looked at a newly discovered type of brain wave -- the traveling wave -- which sweeps across the cerebrum during certain activities.  And what the researchers, led by biomedical engineer Joshua Jacobs, found is that when memories are formed, traveling waves tend to move from the back of the cerebrum toward the front, and in the opposite direction when memories are retrieved.

Of course, nothing in the brain is quite that simple.  Some people's brain waves went the other direction; it seems like the change in direction is what was critical.  "I implemented a method to label waves traveling in one direction as basically 'good for putting something into memory,'" said Uma Mohan, who co-authored the paper.  "Then we could see how the direction switched over the course of the task.  The waves tended to go in the participant’s encoding direction when that participant was putting something into memory and in the opposite direction right before they recalled the word.  Overall, this new work links traveling waves to behavior by demonstrating that traveling waves propagate in different directions across the cortex for separate memory processes."

Another limitation of the study is that it can't discern whether the traveling waves, and the change in direction, are a cause or an effect -- whether the change in direction causes recall, or whether the shift in wave direction is caused by some other process that is the actual trigger for recall, making the direction change merely a byproduct.  But it certainly is an intriguing start on a vexing question in neuroscience.

Me, I want to know what's going on with the "tip of the tongue" phenomenon.  Just about everyone experiences it -- you know the memory is in there somewhere, you can almost get it, but... nope.  Most puzzling (and frustrating), I find that giving up and going to The Google often triggers the memory to appear before I have the chance to look it up.  This happened not long ago -- for some reason I was trying to come up with the name of the third Musketeer.  Athos, Porthos, and... who?  I pondered on it, and then finally went, "to hell with it," and did a search, but before I could even hit "return" my brain said, "Aramis."

What the fuck, brain?  Do you do this just to taunt me?

At least I comfort myself in knowing that we don't really understand how any of this works.  Which is slim consolation -- but at least it means that my own brain is no more baffling than anyone else's.

****************************************



Monday, March 11, 2024

Turning the focus knob

I am really distractible.

To say I have "squirrel brain" is a deep injustice to squirrels.  At least squirrels have the focus to accomplish their purpose every day, which is to make sure our bird feeders are constantly empty.  If I were a squirrel, I'd probably clamber my way up the post and past the inaccurately-named "squirrel baffle" and finally get to the feeder, and then just sit there with a puzzled look, thinking, "Why am I up here, again?"

My "Oh, look, something shiny" approach to life has at least a few upsides.  I tend to make weird connections between things really fast, which long-time readers of Skeptophilia probably know all too well.  If someone mentions something -- say, an upcoming visit to England -- in about 3.8 milliseconds my brain goes, England > Cornwall > Tintagel > King Arthur > Monty Python > the "bring out yer dead" scene > the Black Death > mass burials > a weird study I read a while back about how nettle plants need high calcium and phosphorus soils, so they're often found where skeletons have decomposed, and I'll say, cheerfully, "Did you know that nettles are edible?  You can cook 'em like spinach," and it makes complete sense to me even though everyone else in the room is giving me a look like this:


Talking to me is the conversational equivalent of riding the Tilt-A-Whirl.

Which, now that I come to think of it, is not really an upside after all.

A more significant downside, though, is that my inability to focus makes things really hard for me in noisy or chaotic environments.  When I'm in a crowded restaurant or bar, I can pay attention for a while to what the people I'm with are saying, but there comes a moment -- and it usually does happen quite suddenly -- when my brain just goes, "Nope.  Done," and the entire thing turns into a wall of white noise in which I'm unable to pick out a single word.

All of the above perhaps explains why I don't have much of a social life.

However, as a study published last week in Nature Human Behaviour shows, coordinating all the inputs and outputs the brain has to manage is an exceedingly complex task, and one a lot of us find daunting.  And, most encouragingly, that capacity for focus is not related to intelligence.  "When people talk about the limitations of the mind, they often put it in terms of, 'humans just don't have the mental capacity' or 'humans lack computing power,'" said Harrison Ritz, of Brown University, who led the study, in an interview with Science Daily.  "[Our] findings support a different perspective on why we're not focused all the time.  It's not that our brains are too simple, but instead that our brains are really complicated, and it's the coordination that's hard."

The researchers ran volunteers through a battery of cognitive tests while they were hooked up to fMRI machines, to observe which parts of the brain were involved in mental coordination and filtering.  In one test, the volunteers had to estimate the percentage of purple dots in a swirling maelstrom of mixed purple and green dots -- a task that makes me anxious just thinking about it.  The researchers found two parts of the brain, the intraparietal sulcus and the anterior cingulate cortex, that seemed to be involved in the task, but each functioning in a different way.

"You can think about the intraparietal sulcus as having two knobs on a radio dial: one that adjusts focusing and one that adjusts filtering," Ritz said.  "In our study, the anterior cingulate cortex tracks what's going on with the dots.  When the anterior cingulate cortex recognizes that, for instance, motion is making the task more difficult, it directs the intraparietal sulcus to adjust the filtering knob in order to reduce the sensitivity to motion.

"In the scenario where the purple and green dots are almost at 50/50, it might also direct the intraparietal sulcus to adjust the focusing knob in order to increase the sensitivity to color.  Now the relevant brain regions are less sensitive to motion and more sensitive to the appropriate color, so the participant is better able to make the correct selection."

The applications to understanding disorders like ADHD are obvious, although of course identifying the parts of the brain that are responsible is only the beginning.  The question then becomes, "But what do you do about it?", and the truth is that current treatments for ADHD are a crapshoot at best.  Even so, it'd have been nice if this understanding had come sooner -- it might have saved me from being told by my third grade teacher, unkindly if accurately, "You have the attention span of a gnat."

I apparently haven't changed much, because recalling this comment made me go, gnats > a scene in one of Carlos Castaneda's books where the main character was high on mushrooms and hallucinated a giant man-eating gnat > edible mushrooms, which my wife hates > food preferences > licorice, another thing a lot of people hate > a study I read about using licorice extract to treat psoriasis.

Hey, did you know that the word psoriasis comes from the Greek word ψώρα, meaning "itch"?  I bet you didn't know that.

****************************************



Thursday, February 29, 2024

The dying of the light

In July of 2004, my father died.  I was at his bedside in Our Lady of Lourdes General Hospital in Lafayette, Louisiana when it happened.  He'd been declining for a while -- his once razor-sharp mental faculties slipping into a vague cloudiness, his gait slowing and becoming halting and cautious, his former rapier wit completely gone.  The most heartbreaking thing was his own awareness of what he had lost and would continue to lose.  It looked like a slow slide into debility.

Then, in June, he had what the doctors described as a mini-stroke.  Afterward, he was still fairly lucid, but was having trouble walking.  It had long been his deepest fear (one I share) that he'd become completely dependent on others for his care, and it was obvious to us (and probably to him as well) that this was the direction things were going.

What happened next was described in three words by my mother: "He gave up."

Though the doctors could find no obvious direct cause, his systems started, one by one, to shut down.  Three weeks after the mini-stroke and fall that precipitated his admission to the hospital, he died at age 83.

I had never been with someone as they died before (and haven't since).  I was out of state when my beloved grandma died in 1986; and when my mother died, eight months after my father, it was so sudden I didn't have time to get there.  But I was by my father's side as his breathing slowed and finally stopped.  The event itself wasn't at all dramatic; the transition between life and death was subtle, gentle, and peaceful.  However wrenching it was on my mother and me, for him there seemed to be hardly a boundary between "here" and "not here."

Of course, I'm judging that from the outside.  No one knows -- no one can know -- what the experience was like for him.  It's funny, really; death is one of the experiences that unites us as human, and one which we all will ultimately share, but none of us knows what it actually is.

Noël LeMire, La Mort et le Mourant (ca. 1770) [Image is in the Public Domain]

A study in the journal Frontiers in Aging Neuroscience, though, may be the first clue as to what the experience is like.  An 87-year-old Canadian epilepsy patient was set up for an electroencephalogram to try and get a picture of what was causing his seizures, when he unexpectedly had a severe heart attack.  The man was under a DNR (Do Not Resuscitate) order, so when his heart stopped beating, they let him die...

... but he was still hooked up to the EEG.

This gave his doctors -- and the rest of us -- a first glimpse into what happens in the brain of someone as they die.  And what they found was a sudden increase in activity in the parts of the brain involved in memory, recall, and dreaming -- activity which lasted for thirty seconds after his heart stopped, then gradually faded.

"Through generating oscillations involved in memory retrieval, the brain may be playing a last recall of important life events just before we die, similar to the ones reported in near-death experiences," said Ajmal Zemmar, a neurosurgeon who was the study's lead author.  "As a neurosurgeon, I deal with loss at times.  It is indescribably difficult to deliver the news of death to distraught family members.  Something we may learn from this research is that although our loved ones have their eyes closed and are ready to leave us to rest, their brains may be replaying some of the nicest moments they experienced in their lives."

Which is a pleasant thought.  Many of us -- even, for some reason, the devoutly religious, who you'd think would be positively eager for the experience -- are afraid of death.  Me, I'm not looking forward to it; I rather like being alive, and as a de facto atheist I have no particular expectation that there'll be anything afterwards.  Being with my father as he died did, however, have the effect of making me less afraid of death.  The usual lead-up, with its frequent pain and debility and illness, is still deeply terrifying to me, but crossing the boundary itself seemed fairly peaceful.

And the idea that our brains give us one last go-through of our pleasant memories is kind of nice.  I know that this single patient's EEG is hardly conclusive -- and it's unlikely there'll be many other people hooked up to a brain scanner as they die -- but it does give some comfort that perhaps, this experience we will all share someday isn't as awful as we might fear.

****************************************



Monday, January 15, 2024

An MRI built for two

Some years ago, I injured my left knee doing martial arts, and a couple of weeks later found myself inside an MRI machine.  The technician, who would be the odds-on favorite for the least personable medical professional I've ever met, started out by telling me "strip down to your underwear" in tones that would have done a drill sergeant proud, then asked me if I had any metal items on my person.

"I don't think so," I said, as I shucked shirt, shoes, socks, and pants.  "Why?"

His eyes narrowed.  "Because when I turn these magnets on, anything made of metal will be ripped from your body, along with any limbs to which they might be attached."

I decided to check a second time for metal items.

After reassuring myself I was unlikely to get my leg torn off because I had forgotten I was wearing a toe ring, or something, I got up on a stretcher, and he cinched my leg down with straps.  Then he said, "Would you like to listen to music?"

Surprised at this unexpected gentle touch, I said, "Sure."

"What style?"

"Something soothing.  Classical, maybe."  So he gave me some headphones, tuned the radio to a classical station, and the dulcet tones of Mozart floated across me.

Then, he turned the machine on, and it went, and I quote:

BANG BANG BANG CRASH CRASH CRASH CRASH *whirrrrrr* BANG BANG BANG BANG BANG BANG BANG BANG BANG etc.

It was deafening.  The nearest thing I can compare it to is being inside a jackhammer.  It lasted a half-hour, during which time I heard not a single note of Mozart.  Hell, I doubt I'd have heard it if he'd tuned in to the Rage Against the Machine station and turned the volume up to eleven.

The upshot of it was that I had a torn meniscus, and ended up having surgery on it, and after a long and frustrating recovery period I'm now mostly back to normal.

But the MRI process still strikes me as one of those odd experiences that are entirely painless and still extremely unpleasant.  I'm not claustrophobic, but loud noises freak me out, especially when I'm wearing nothing but my boxer briefs and have one leg tied down with straps and am being watched intently by someone who makes the T-1000 from Terminator 2 seem huggable.  I mean, call me hyper-sensitive, but there you are.

So it was rather a surprise when I found out courtesy of the journal Science that the latest thing is...

... an MRI scanner built to accommodate two people.

My first thought was that hospitals were trying to double their profits by processing through patients in pairs, and that I might be there getting my leg scanned while old Mrs. Hasenpfeffer was being checked for slipped discs in her neck.  But no, it turns out it's actually for a good -- and interesting -- reason, entirely unconnected with money and efficiency.

They want to see how people's brains react when they interact with each other.

Among other things, the scientists had people talk to each other, make sustained eye contact, and even tap each other on the lips, all the while watching what was happening in each of their brains and even on their faces.  This is certainly a step up from previous solo MRI studies of emotional reactions; when the person is in the tube alone, any stand-in for interpersonal interaction -- such as looking at a photo or video clip -- is bound to be incomplete and inaccurate.

Still, I can't help but think that the circumstance of being locked into a tube, nose to nose with someone, for an hour or more is bound to create data artifacts on its own.  I mean, look at the thing:


One of the hardest things for me at the men's retreat I attended a couple of years ago, and about which I wrote a while back, was an exercise where we made sustained eye contact at close quarters -- so you're basically standing there, staring into a stranger's eyes, from only six inches or so away.  I'm not exactly an unfriendly person, per se, but locking gazes with a guy I'd only met hours earlier was profoundly uncomfortable.

And we weren't even cinched down to a table with a rigid collar around our necks, with a noise like a demolition team echoing in our skulls.

So as much as I'm for the advancement of neuroscience, I am not volunteering for any of these studies.  I wish the researchers the best of luck, but... nope.

Especially since I wouldn't only be anxious about whether I'd removed all my metal items, I'd have to worry whether my partner had, too.  Although I do wonder what would show up on my brain MRI if I was inside a narrow tube and was suddenly smacked in the face by a detached arm.

****************************************



Tuesday, August 15, 2023

Out of your mind

One of the most striking pieces from neuroscientist David Eagleman's brilliant TED Talk "Can We Create New Senses for Humans?" centers around what is really happening when we experience something.

Regardless of what it feels like, all that's going on -- the internal reality, as it were -- is some fairly weak voltage changes bouncing around in the brain.  The brain is locked inside the skull, and on its own is blind and deaf.  It needs the sense organs (Eagleman calls them our "peripherals") to send electrical signals via input nerves to the right places in the brain, and that stimulates changes in the voltage in those areas.

That's it.  Everything you've ever experienced -- good and bad, pleasant and unpleasant -- boils down to that.  And if something messes around with any step in that process, that altered electrical state in the brain becomes the basis of what you see, hear, feel, and think.  If the wiring is faulty (thought by some researchers to be the cause of the peculiar disorder synesthesia), if there's a problem with the levels of neurotransmitters, the chemicals that either pass signals along or else block them (probably involved in schizophrenia, depression, and anxiety, among others), or if you've taken drugs that change the electrical activity of the brain -- that becomes your reality.

I was reminded of this sobering observation when I read an article sent to me by a friend and loyal reader of Skeptophilia.  Entitled "Have Scientists Found the Source of Out-of-Body Experiences?", it describes research into a part of the cerebrum called the anterior precuneus, which appears to be involved in our sensations of conscious awareness.  Neuroscientist Josef Parvizi of Stanford University was working with epilepsy patients who were experiencing drug-resistant seizures, and found that when the anterior precuneus was electrically stimulated (the patients already had electrodes implanted in their brains to try to reduce the frequency and severity of their seizures), they had sensations of floating, and of dissociation and disorientation.

"All of them reported something weird happening to their sense of physical self," Parvizi said in an interview in Scope, Stanford Medicine’s blog.  "In fact, three of them reported a clear sense of depersonalization, similar to taking psychedelics."

Luigi Schiavonetti, The Soul Leaving the Body (1808) [Image is in the Public Domain]

What it made me wonder is if the anterior precuneus might be involved in other types of dissociation.  It's one thing when you artificially trigger a part of the brain to malfunction (or at least, alter its function) using electrodes or chemicals; but what about when it just kind of... happens?  I know I've had this experience while listening to music.  When I was about twelve, my grandma gave me a little portable radio, and I listened to it constantly.  One evening, I happened upon a radio station playing classical music, and just as I tuned in, I heard the wild, joyous trumpets and violins of the overture to J. S. Bach's Magnificat in D.

Then the chorus came in.

Three minutes later, I remembered where (and who) I was.  My face was wet with tears.  I don't know where I'd been during that time, but it wasn't in my attic bedroom in my grandma's house, with its creaky wood-plank floors and pervasive smell of dust and old books.

It was such a powerful and overwhelming event in my life that I wrote it into one of my novels, The Hand of the Hunter -- with setting and character changes, of course -- but to this day when someone says they had a "spiritual experience," this is what I think of.  It's happened to me more than once since then, always associated with music (the first hearings of Ralph Vaughan Williams's Fantasia on a Theme by Thomas Tallis, Stravinsky's Firebird, Debussy's The Drowned Cathedral, Arvo Pärt's Spiegel im Spiegel, and Mozart's Requiem had similar effects on me), but that first encounter was by far the most striking.

I wonder if the mental and physical sensations that accompanied it had something to do with the anterior precuneus?  And if, by extension, it might be the source of all such transcendent experiences?

If so, what possible purpose could this serve?

Figuring that out is considerably above my pay grade, but considering the similarities -- a loss of awareness of where your body is, dissociation, the feeling of a "time slip" -- it did bring the question up.

In any case, finding a part of the brain that, when stimulated, makes you lose connection to the outside world is pretty staggering.  I recall one of my mentors, Cornell University Professor Emeritus Rita Calvo of the Department of Human Genetics, saying that if she were going into biology today, she'd choose neuroscience instead of genetics.  "With respect to the brain, we're right now where we were with the gene a hundred years ago.  We have an idea of some of the 'wheres' and 'hows,' but little understanding of the mechanisms behind them.  Think of what was on the horizon for geneticists in 1923 -- that's what the neuroscientists have to look forward to."

****************************************