Skeptophilia (skep-to-fil-i-a) (n.) - the love of logical thought, skepticism, and thinking critically. Being an exploration of the applications of skeptical thinking to the world at large, with periodic excursions into linguistics, music, politics, cryptozoology, and why people keep seeing the face of Jesus on grilled cheese sandwiches.

Wednesday, January 21, 2026

Remembrance of things past

Almost all of us implicitly trust our own memories.

Experiment after experiment, however, has shown that this trust is misplaced.  Even if you leave out people with obvious memory deficits -- victims of dementia, for example -- the rest of us give far too much credence to our brain's version of the past.  In truth, what we remember is a conglomerate of what actually did happen, what we were told happened, what we imagine happened based upon the emotions associated with the event, and pure (if inadvertent) fabrication.  And the scariest part is that absent hard evidence (a video, for example), there's no way to tell which parts are what.

It all feels true.

If you don't believe this, consider what happened to cognitive researcher Elizabeth Loftus, of the University of California - Irvine, whose experiments establishing the unreliability of memory are described in neuroscientist David Eagleman's wonderful book The Brain: The Story of You:
We're all susceptible to this memory manipulation -- even Loftus herself.  As it turned out, when Elizabeth was a child, her mother had drowned in a swimming pool.  Years later, a conversation with a relative brought out an extraordinary fact: that Elizabeth had been the one to find her mother's body in the pool.  That news came as a shock to her; she hadn't known that, and in fact didn't believe it.  But, she describes, "I went home from that birthday and I started to think: maybe I did.  I started to think about other things that I did remember -- like when the firemen came, they gave me oxygen.  Maybe I needed the oxygen because I was so upset I found the body?"  Soon, she could visualize her mother in the swimming pool.

But then, her relative called to say he had made a mistake.  It wasn't the young Elizabeth after all who had found the body.  It had been Elizabeth's aunt.  And that's how Loftus experienced what it was like to possess her own false memory, richly detailed and deeply felt.
So it's not like false memories seem vague or tentative.  They're so vivid you can't tell them from real ones.

Which brings us to the strange story of an arcade video game called "Polybius."

In the early 1980s, a rumor began to circulate that there was an arcade game that combined some very frightening effects.  Its visuals and sounds were dark, surreal, and suggestive.  Children who played the game sometimes had seizures or hallucinations, and afterwards experienced periods of amnesia and night terrors.  Worse, there was something about it that was strangely addictive.  People who played it more than two or three times were likely to become obsessed by it, and keep coming back over and over.  Some, they said, finally could think of nothing else and went incurably mad.  Some committed suicide.

Some simply... vanished.

The FBI launched an investigation, removing Polybius from arcades wherever they could find it.  The "Men in Black" got involved, and there were reports of mysterious strangers showing up and demanding that arcade owners provide lists of the names (or at least initials) of high scorers in the game.  Those unfortunates were rounded up for psychological testing -- and some of them never returned, either.

There are webpages and subreddits devoted to people's memories of Polybius, their experiences of playing it, and scary things that happened subsequently.  There's just one problem with all this, and you've guessed it:

Polybius never existed.  Despite many, many people searching, there has never been a single Polybius cabinet found, nor even a photograph from the time period showing one.  Oh, sure, we have mock-ups people made long after the fact:

[Image licensed under the Creative Commons DocAtRS, Polybius Arcade 1 cropped, CC BY-SA 3.0]

But hard evidence of the real deal?  Zero.  Nada.  Zip.  Zilch.

So what happened here?

Part of it, of course, was a deliberate hoax -- an "urban legend."  Part of it was confabulation of memory with a real event: an arcade in Portland, Oregon removed a game that had triggered seizures in a couple of kids.  There was also an incident in 1981 in which the FBI raided arcades that had converted game stations into illegal gambling machines.  There was a 1980 New York Times article citing research (later largely called into question) suggesting that playing violent video games predisposes kids to commit violence themselves.  And in 1982, there was a widely-reported incident in which a teenager died while playing the game Berzerk in a Calumet City, Illinois arcade -- the story was true, but his heart failure was caused by a physical defect and had nothing to do with playing the game.

Put all that together, and there are still people now -- forty-some-odd years later -- who are certain they remember Polybius, and what it was like to play it.

It's another example of the "Mandela Effect," isn't it?  The phenomenon got its name from some people's memories that Nelson Mandela died in jail -- when in reality he survived, eventually became president of South Africa, and died peacefully at his home in Johannesburg in 2013.  Other examples are the memories that the "Berenstain Bears" -- the annoyingly moralistic cartoon characters who preach such eternal truths as Be Nice To Your Siblings Even When You Feel Like Punching The Shit Out Of Them and Your Parents Are Always Right About Everything and Pay Attention In School Or Else You Are Bad -- were originally the Berenstein Bears (with an "e," not an "a"), that the Fruit of the Loom logo originally had a cornucopia (not just a bunch of fruit), and (I shit you not) that Sri Lanka and New Zealand "should be" in different places.

Almost no one who experiences the Mandela Effect, though, laughs it off and says, "Wow, memory sure is unreliable, isn't it?"  Those memories feel completely real, just as real as memories of stuff you know occurred, that you have incontrovertible hard evidence for.  The idea that you could be so certain of something that never happened is profoundly disconcerting, to the extent that people have looked for some explanation, any explanation, for how their memories ended up with information that is demonstrably false.  Some have even cited the "Many-Worlds" interpretation of quantum mechanics, and posited that there really is a timeline where Mandela died in prison, the cartoon bears were the "Berenstein Bears," Fruit of the Loom had a cornucopia in its logo, and Sri Lanka and New Zealand were somewhere other than where they now are.  It's just that we've side-slipped into a parallel universe, bringing along our memories of the one where we started -- where all those things were dramatically different.

That's how certain people are that their memories are flawless.  They'd rather believe that the entire universe bifurcated than that they're simply remembering wrong. 

How many times have you been in an argument with a friend, relative, or significant other, and one of you has said, "I know what happened!  I was there!", often in a self-righteous tone, as if to ask how dare anyone suggest they might be recalling things incorrectly?  Well, the truth is that none of us remembers things entirely correctly; what remains in our minds is a partial record, colored by emotions and second-hand contamination and imagination, blended so well there's no way to tease apart the accurate parts from the inaccurate.  What our memories assuredly are not is a factual, blow-by-blow account, a mental video of the past that misses nothing and mistakes nothing.

I know this is kind of a terrifying thing.  Our memories are a huge part of our sense of self; if you want a brilliant (fictional) example of the chaos that happens when our memories become unmoored from reality, watch the fantastic movie Memento, in which the main character (played to perfection by Guy Pearce) has anterograde amnesia, a cognitive disorder that leaves him unable to form any new long-term memories.  To compensate for this, he takes Polaroid photographs of stuff he thinks is important, and if it's really important he tattoos it onto his skin.  But then the problem is, how does he know the contents of the photos and tattoos are true?  He has no touchstone for what the truth about the past actually is.

Although Pearce's character has an extreme form of this problem, in reality, all of us have the same issue.  Those neural firings in the memory centers of our brain are all we have left of the past -- that, and certain fragmentary records, objects, and writings.  

So, how accurate is our view of the past?

No way to tell.  Better than zero, but certainly far less than one hundred percent.

And there's not even any need for a cursed arcade game to screw around with your perception.  We're built like this -- like it or not.

****************************************


Saturday, November 22, 2025

Mental maps

Picture a place you know well.  Your house, your apartment, a park, a church, a school.  You can probably imagine it, remember what it's like to wander around in it, maybe even visualize it to a high level of detail.

Now, let's change the perspective to one you probably have never taken.  Would you be able to draw a map of the layout -- as seen from above?  An aerial view?

Here's a harder task.  In a large room, there are various obstacles, all fairly big and obvious.  Tables, chairs, sofas, the usual things you might find in a living room or den.  You're standing in one corner, and from that perspective are allowed to study it for as long as you like.

Once you'd finished studying it, could you walk from that corner to the diagonally opposite one without running into anything -- while blindfolded?

Both of these tasks require the use of a part of your brain called the hippocampus.  The name of the structure comes from the Greek word ἱππόκαμπος -- literally, "seahorse" -- because of its shape.  The hippocampus has a role in memory formation, conflict avoidance... and spatial navigation.

Like the other structures in the brain, the hippocampus seems to be better developed in some people than others.  My wife, for example, has something I can only describe as an internal GPS.  To my knowledge, she has never been lost.  When we took a trip to Spain and Portugal a few years ago, we rented a car in Madrid and she studied a map -- once.  After that, she navigated us all over the Iberian Peninsula with only very infrequent checks to make sure we were taking the correct turns, which because of her navigational skills, we always were.

I, on the other hand, get lost walking around a tree.

[Image licensed under the Creative Commons Edward Betts, Bloomsbury - map 1, CC BY-SA 2.0]

The topic comes up because of a paper I came across in the journal Cell that showed something absolutely fascinating.  It's called "Targeted Activation of Hippocampal Place Cells Drives Memory-Guided Spatial Behavior," and was written by a team led by Nick T. M. Robinson of University College London.  But to understand what they did, you have to know about something called optogenetics.

Back in 2002, a pair of geneticists, Boris Zemelman and Gero Miesenböck, developed an amazing technique.  They genetically modified mammalian nerve tissue to express a protein called rhodopsin, which is one of the light-sensitive chemicals in the retina of your eye.  By hitching the rhodopsin to ion-sensitive gateway channels in the neural membrane, they created neurons that literally could be turned on and off using a beam of light.

Because the brain is encased in bone, animals that express this gene don't respond any time the lights are on; you have to shine light directly on the neurons that contain rhodopsin.  This involves inserting fiber optics into the brain of the animal -- but once you do that, you have a set of neurons that fire when you shine a light down the fibers.  Result: remote-control mice.

Okay, if you think that's cool, wait till you hear what Robinson et al. did.

So you create some transgenic mice that express rhodopsin in the hippocampus.  Fit them out with fiber optics.  Then let the mice learn how to run a maze for a reward, in this case sugar water in a feeder bottle.  Record which hippocampal neurons are firing when they learn -- and especially when they recall -- the layout of the maze.

Then take the same mice, put them in a different maze.  But switch the lights on in their brain to activate the neurons you saw firing when they were recalling the map of the first maze.

The result is that the mice picture the first maze, and try to run that pattern even though they can see they're now in a different one.  The light activation switched on a memory of the layout of the maze they'd learned, and that memory overrode all the other sensory information they had access to.

It's as if you moved from Tokyo to London, and then tried to use your knowledge of the roads of Tokyo to find your way from St. Paul's Cathedral to the Victoria & Albert Museum.

This is pretty astonishing from a number of standpoints.  First, the idea that you can switch a memory on and off like that is somewhere between fascinating and freaky.  Second, that the neural firing pattern is so specific -- that pattern corresponds to that map, and no other.  And third, that the activation of the map made the mice doubt the information coming from their own eyes.

So once again, we have evidence of how plastic our brains are, and how easy they are to fool.  What you're experiencing right now is being expressed in your brain as a series of neural firings; in a way, the neural firing pattern is the experience.  If you change the pattern artificially, you experience something different.

More disturbing still is that our sense of self is also deeply tied to our neural links (some would say that our sense of self is nothing more than neural links; to me, the jury's still out on where consciousness comes from, so I'm hesitant to go that far).  So not only what you perceive, but who you are can change if you alter the pattern of neural activation.

We're remarkable, complex, amazing, and fragile beasts, aren't we?

So that's today's contribution from the Not Science Fiction department.  I'm wondering if I might be able to get one of those fiber optics things to activate my hippocampus.  Sounds pretty extreme, but I am really tired of getting lost all the time.  There are trees everywhere around here.

****************************************


Thursday, October 2, 2025

Color my world

When you think about it, color perception is really strange.

Just about all of us have wondered whether we all see colors the same way -- if, for example, what you see as blue is the same as what I see as yellow, but we both identify them using the same word because there's no way to know we're not seeing them the same way.  I've always thought that unlikely.  After all, with few exceptions (genetic or structural abnormalities, about which more below), our eyes and brains are all built on the same basic plan.  I guess it's possible that we each see the world's colors differently, but the most parsimonious explanation is that because the underlying structures are the same, we're all pretty much perceiving identical color palettes.

Of course, there's no way to know for certain, and I ran into two things just in the last couple of days that leave me wondering.

The first is a curious conversation I had with my friend, the awesome writer Andrew Butters, whose books -- especially the staggeringly good Known Order Girls -- should be on everyone's TBR list.  It started out with an amusing discussion of words that sound like they should mean something else.  One of Andrew's was ambulatory, which to him sounds like "someone who is so incapacitated they need an ambulance."  I personally believe that pulchritude should mean "something that makes you want to puke," and not what it actually does, which is "beauty."  And then Andrew mentioned that he always thought the color words vermilion and chartreuse were wrong, and in fact backwards -- that vermilion should mean a light green and chartreuse a bright orangey-red.

This struck me as really weird, because those two words have never given me that sense.  This may be because I've known them both since I was little.  I knew vermilion because I grew up a mile away from Vermilion Bayou, so named because the red mud of southern Louisiana stains the water reddish brown.  Chartreuse I knew because my grandma's employer, Father John Kemps, was an eccentric, bookish, cigar-smoking Dutch expat who was very fond of a post-meal tipple and loved chartreuse, the pale green herbal French liqueur from which the color got its name.

So I asked Andrew where his misapprehension came from.  He said he wasn't sure, but that perhaps the vermilion one came from the French vert (green); Andrew, like most Canadians, is English-French bilingual.  But where his thinking chartreuse should mean "red" came from, he had no idea.

What baffled me further, though, was when he pointed out that he's not alone in this.  There's a whole page on Reddit about thinking that vermilion and chartreuse are backwards, and an astonishing number of people chimed in to say, "Yeah, me too!"  So why those particular words, and not another pair?  Why not citron and azure, or something?

The second is that I'm finally getting around to reading Oliver Sacks's book An Anthropologist on Mars, which has to do with the intersection between neurological disorders and creativity.  The very first chapter is about a painter who was in a car accident that resulted in brain damage causing cerebral achromatopsia -- complete colorblindness due not to abnormalities in the cones of the retina, but to damage to a region of the brain called the V4 prestriate cortex.  Afterward, he saw the world in shades of gray -- but with some distinct oddities: pure white surfaces looked "dirty" or "smudged" to him, red looked black, and blue looked pale gray.

This brought up an interesting discussion about how we see color in the first place, and that color perception (even within a single, normally-sighted individual) isn't absolute, but comparative; we assess the color value of a region by comparison to the entire visual field.  If the whole "what color is this dress?" thing that was going around a few years ago didn't convince you of that, try this one out:


Every one of these spheres is exactly the same color; they were, in fact, cut-and-pasted from a single image.  The only thing that differs is the color of the foreground stripes that cross each one.  But since your eyes judge color based on context, it's impossible to see them as the same even once you cognitively know what's going on.

Don't believe it?  The article's author (the wonderful Phil Plait) created an animation, at the link provided, that cycles between the image with and without the stripes.  It's mind-blowing.
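If you want to tinker with the effect yourself, here's a rough sketch of the same idea -- identical circles that read as different colors once stripes cross them.  This is my own toy code, not Plait's image or anything from the article; it assumes you have the Pillow imaging library installed, and the particular colors and geometry are arbitrary.

```python
# Rough sketch (mine, not the original illusion): draw three circles with
# *identical* RGB values, then overlay each with stripes of a different hue.
# Viewed side by side, the circles tend to read as different colors even
# though their pixels are the same. Requires Pillow (pip install pillow).

from PIL import Image, ImageDraw

W, H = 600, 220
img = Image.new("RGB", (W, H), "white")
draw = ImageDraw.Draw(img)

circle_color = (150, 130, 100)        # the one color shared by all three circles
stripe_colors = [(220, 60, 60), (60, 180, 60), (60, 60, 220)]
centers = [(110, 110), (300, 110), (490, 110)]
radius = 80

# Draw the identical circles.
for cx, cy in centers:
    draw.ellipse((cx - radius, cy - radius, cx + radius, cy + radius),
                 fill=circle_color)

# Overlay horizontal stripes of a different color across each circle's region.
stripe_height, gap = 8, 16
for (cx, cy), color in zip(centers, stripe_colors):
    for y in range(cy - radius, cy + radius, gap):
        draw.rectangle((cx - radius, y, cx + radius, y + stripe_height), fill=color)

img.save("striped_circles.png")
print("Wrote striped_circles.png -- the circles share a single RGB value.")
```

It's crude compared to the real thing, but even this quick version makes the point: the pixels inside the circles never change, only their surroundings do.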

All of this circles around to the weird topic of synesthesia, a still-unexplained sensory phenomenon in which people have a sort of cross-wiring between two senses.  Russian composer Alexander Scriabin was a synesthete who experienced sensations of color when he heard chords; C# minor, for example, appeared to him as a bright emerald green.  (If you want to find out more, Richard Cytowic's amazing book The Man Who Tasted Shapes is still considered one of the seminal works on this odd disorder.)

I wonder if what Andrew (and the others with the vermilion-chartreuse switch) are experiencing is a form of synesthesia.  A former student of mine is a synesthete for whom printed letters (and whole words) evoke sensations of color, so his word choices while writing took into account whether the colors were harmonious, not just whether the words made semantic sense.  (I hasten to add that he was and is one of the most brilliant people I've ever known, so his synesthesia never caused his writing to lack clarity for non-synesthetes like myself -- although it did sometimes slow him down as he struggled to find words that satisfied both meaning and appearance.)  So perhaps the "vermilion = light green" thing comes from the fact that for Andrew and the others on the Reddit page, the word itself looks green, irrespective of its association with an actual (different) color.

What I still find odd, though, is that so many people have those two particular color words backwards.  Synesthesia is remarkably individual; while one of its hallmarks is complete consistency within a particular person (Scriabin always saw C# minor as green, for example), it varies greatly from person to person.  The fact that vermilion and chartreuse are reversed for so many people is just plain peculiar.

So there's still a lot we don't know about how exactly we perceive color, and maybe my "parsimonious" explanation that (other than those with colorblindness, synesthesia, and other visual disorders) we're all seeing colors more or less the same way fails to capture the complexity of the real world.  Wouldn't be the first time I've thought things were simpler than they turned out to be.  Maybe it's just my perception because I'm a non-synesthete with intact color vision.

But until we're somehow able to see things through someone else's eyes and brain, that's a limitation I can't escape except in my imagination.

****************************************


Thursday, September 18, 2025

Mechanical brain transplant

New from the "Well, I can't see any way that could go wrong, can you?" department, we have: scientists growing Neanderthal brain fragments in petri dishes and then connecting them to crab-like robots.

My first thought was, "Haven't you people ever watched a science fiction movie?"  This feeling may have been enhanced by the fact that just a couple of days ago I watched the Doctor Who episode "The End of the World," wherein the Doctor and his companion are damn near killed (along with everyone else on a space station) when a saboteur makes the shields malfunction using little scuttling metallic bugs.


The creator of the Neanderthal brain bits is Alysson Muotri, a geneticist at the University of California - San Diego School of Medicine.  He and his team isolated genes that belonged to our closest cousins, Homo sapiens neanderthalensis, and transferred them into stem cells.  Then they allowed the cells to grow into proto-brains to see what sorts of connections would form.

Muotri says, "We're trying to recreate Neanderthal minds."  So far, they've noticed an abnormally low number of synapses (as compared to modern humans), and have speculated that this may indicate a lower capacity for sophisticated social behavior.

But Muotri and his team are going one step further.  They are taking proto-brains (he calls them "organoids") with no Neanderthal genes, and wiring both them and his "neanderthalized" versions into robots, to compare how the two learn.  Simon Fisher, a geneticist at the Max Planck Institute for Psycholinguistics, said, "It's kind of wild.  It's creative science."

That it is.

I have to admit there's a cool aspect to this.  I've always wondered about the Neanderthals.  At the peak of their population, they actually had a brain capacity larger than that of modern humans.  They clearly had culture -- they ceremonially buried their dead, probably had language (they had the same variant of the "linguistic gene" FOXP2 that we do), and may even have made music, to judge by what appears to be a piece of a 43,000-year-old bone flute found in Slovenia.


All that said, I'm not sure how smart it would be to stick a Neanderthal brain inside a metallic crab.  If this were a science fiction movie, the next thing that happened would be that Muotri would be in his lab late at night working with his Crab Cavemen, and he'd turn his back and they'd swarm him, and the next morning all that would be found is his skeleton, minus his femur, which would have been turned into a clarinet.

Okay, I know I'm probably overreacting here.  But it must be admitted that our track record of thinking through our decisions is not exactly unblemished.  Muotri assures us that these little "organoids" have no blood supply and therefore no potential for developing into an actual brain, but still.  I hope he knows what he's doing.  As for me, I'm going to go watch Doctor Who.

Let's see, what's the next episode?  "Dalek."  *reads description*  "A superpowerful mutant intelligence controlling a mechanical killing device goes on a rampage and attempts to destroy humanity."

Um, never mind.  *switches channel to Looney Tunes*

****************************************


Saturday, August 16, 2025

Facing facts

"I'm sorry, but I have no idea who you are."

I can't tell you how many times I've had to utter that sentence.  Regular readers of Skeptophilia know why; I have a peculiar disability called prosopagnosia, or "face blindness."  I have a nearly complete inability to recognize faces, even of people I've known for some time.

Well, that's not exactly true.  I recognize people differently than other people do.  I remember the people I know as lists of features.  I know my wife has curly brown hair and freckles and an infectious smile, but I honestly have no mental image of her.  I can't picture my own face, although -- like with my wife -- I could list some of my features.

That system doesn't have a high success rate, however, and a lot of the time I have no idea who the people around me are, especially in a place where there are few clues from context.  I have pretty serious social anxiety, and my condition makes it worse, having put me in the following actual situations:
  • introducing myself twice to the same person at a party
  • getting a big, enthusiastic hug and an "it's been so long!" from someone in our local gym, and never figuring out who I was talking to
  • having two of my students switch seats and not realizing it for three weeks, until finally they 'fessed up
  • going to see a movie, and not knowing until the credits rolled that the main characters were played by Kenneth Branagh, Penelope Cruz, Judi Dench, Derek Jacobi, Michelle Pfeiffer, Daisy Ridley, and Johnny Depp
  • countless incidents of my fishing for clues ("so, how's your, um... spouse, parents, kids, pets, job..."), sometimes fruitlessly
My anxiety has made me really good at paying attention to, and recalling, other cues like voice, manner of dress, posture, walk, hair style, and so on.  But when one or more of those change -- such as with the student I had one year who cut her hair really short during the summer, and whom I didn't recognize when she showed up in one of my classes on the first day of school the following year -- it doesn't always work.

One upside to the whole thing is that I do get asked some funny questions.  One student asked me if, when I looked at people, their faces were invisible.  Another asked if, when I look in the bathroom mirror in the morning, I don't know that it's me.  (It's a pretty shrewd guess that it is me, since there's generally no one else in there at the time.)

But at least it's not as bad as the dumb questions that my former students who are identical triplets sometimes get.  One of them was once asked by a friend how she kept track of which triplet she was.

No, I'm not kidding.  Neither, apparently, was the person who asked the question.

[Image licensed under the Creative Commons Randallbritten, FaceMachine screenshots collage, CC BY-SA 3.0]

In any case, all of this comes up because of some research in the journal Cortex that tried to parse what's happening (or not happening) in the brains of people like me.  Some level of prosopagnosia affects about one person in fifty; some lose their facial recognition ability because of a stroke or other damage to the fusiform gyrus, the part of the brain that seems to be a dedicated face-memory module.  Others, like me, were born this way.  Interestingly, a lot of people with lifelong prosopagnosia take a while to figure it out; for years, I just thought I was unobservant, forgetful, or a little daft.  (All three of those might be true as well, of course.)  It was only after enough embarrassing incidents -- and, most importantly, after seeing an eye-opening piece about face blindness by Lesley Stahl on 60 Minutes -- that I realized what was going on.

Specifically, the paper tried to figure out why people who are face-blind often do just fine on visual perception tests, then fail utterly when it comes to remembering photographs of faces.  The researchers tried to parse whether the difference comes from an inability to connect context cues to the face you're seeing (e.g., looking at someone and thinking, "She's the woman who was behind the counter at the library last week") versus simple familiarity (the more nebulous and context-free feeling of "I've seen that person before").  They showed each test subject (some of whom weren't face-blind) a series of 120 faces, then a second series of 60 faces, some new and some drawn from the first series.  The researchers looked not only at whether the subjects could correctly pick out the old faces, but at how confident they were in their answers -- the surmise being that low confidence on correct answers was an indicator of relying on familiarity rather than context memory.
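Just to make the logic of that design concrete, here's a toy sketch of how you might score such an old/new test.  This is entirely my own illustration, not the authors' analysis code, and the one-to-five confidence scale is an assumption.

```python
# Toy sketch (mine, not the authors' analysis code) of scoring an old/new
# recognition test where each answer comes with a confidence rating.
# Assumption for illustration: confidence is self-rated from 1 (guessing) to 5 (certain).

from dataclasses import dataclass

@dataclass
class Trial:
    is_old: bool      # was this face in the original series of 120?
    said_old: bool    # did the subject call it "old"?
    confidence: int   # 1 (guessing) ... 5 (certain)

def score(trials):
    old_faces = [t for t in trials if t.is_old]
    hits = [t for t in old_faces if t.said_old]
    correct = [t for t in trials if t.said_old == t.is_old]
    return {
        "hit_rate": len(hits) / max(1, len(old_faces)),
        "accuracy": len(correct) / max(1, len(trials)),
        # Low confidence on *correct* answers is the pattern read as
        # familiarity-based recognition rather than true recollection.
        "mean_confidence_when_correct":
            sum(t.confidence for t in correct) / max(1, len(correct)),
    }

# Example: a subject who is mostly right but never feels sure.
trials = [Trial(True, True, 2), Trial(False, False, 1), Trial(True, True, 2),
          Trial(False, True, 1), Trial(True, False, 1), Trial(False, False, 2)]
print(score(trials))
```

The point is simply that accuracy and confidence are separate measurements, which is what lets you ask whether someone is recognizing faces through genuine recollection or through that vaguer sense of familiarity.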

The prosopagnosics in the test group not only were bad at identifying which faces were old and which were new; their confidence was really low even on the ones they got right.  Subjects with typical face recognition showed a great deal more certainty in their answers.  What occurs to me, though, is that knowing you're face-blind would itself skew the results, in that we prosopagnosics are always doubtful that we're recalling correctly.  So these data could be a result of living with the condition, not some underlying mechanism at work.  I almost never greet someone first, because even if I think I might know them, I'm never certain.  A lot of people think I'm aloof because of this, but the reality is that I honestly don't know which of the people I'm seeing are friends and which are total strangers.

One thing about the researchers' conclusion does ring true, however.  The subconscious "feeling of familiarity" is definitely involved.  My experience of face blindness isn't that I feel like I'm surrounded by strangers; it's more that everyone looks vaguely familiar.  The problem is, that feeling is no stronger when I see a close friend than when I see someone I've never met before, so the intensity of that sense -- what apparently most people rely on -- doesn't help me.

So that's the view of the world through the eyes of someone who more often than not doesn't know who he's looking at.  Fortunately for me, (1) at this point in my life I'm unembarrassed by my condition, and (2) most of the people in my little village know I'm face-blind and will say, "Hi, Gordon, it's Steve..." when they walk up, and spare me the awkwardness of fishing for clues.  (Nota bene: This only works if it actually is Steve.  Otherwise it would be even more awkward.)  But hopefully some good will come from this research, because face blindness is kind of a pain in the ass.

"Our results underscore that prosopagnosia is a far more complex disorder that is driven by more than deficits in visual perception," said study first author Anna Stumps, a researcher in the Boston Attention Learning Laboratory at VA Boston.  "This finding can help inform the design of new training approaches for people with face blindness."

Which would be really, really nice.

****************************************


Friday, March 14, 2025

In the blink of an eye

One of the things I love about science is how it provides answers to questions that are so ordinary that few of us appreciate how strange they are.

I remember how surprised I was when I first heard a question about our vision that had honestly never occurred to me.  You know how images jump around when you're filming with a hand-held videocamera?  Even steady-handed people make videos that are seriously nausea-inducing, and when the idea is to make it look like it's filmed by amateurs -- such as in the movie The Blair Witch Project -- the result looks like it was produced by strapping a camera to the head of a kangaroo on crack.

What's a little puzzling is why the world doesn't appear to jump around like that all the time.  I mean, think about it; if you walk down the hall holding a videocamera on your shoulder, and watch the video and compare it to the way the hall looked while you were walking, you'll see the image bouncing all over the place on the video, but won't have experienced that with your eyes.  Why is that?

The answer certainly isn't obvious.  One guess scientists have is that we stabilize the images we see, and compensate for small movements of our head, by using microsaccades -- tiny, involuntary, constant jitters of the eyes.  The thought is that those little back-and-forth movements allow your brain to smooth out the image, keeping us from seeing the world as jumping around every time we move.

Another question about visual perception that I had never thought about was the subject of some research out of New York University and the University Medical Center of Göttingen that was published in the journal Current Biology.  Why don't you have the perception of the world going dark for a moment when you blink?  After all, most of us blink about once every five seconds, and we don't have the sense of a strobe effect.  In fact, most of us are unaware of any change in perception whatsoever.

[Image licensed under the Creative Commons Mcorrens, Iris of the Human Eye, CC BY-SA 3.0]

By studying patients who had lesions in the cerebrum, and comparing them to people without such damage, the scientists were not only able to answer this question, but to pinpoint exactly where this phenomenon happens -- the dorsomedial prefrontal cortex (dmPFC), a part of the brain immediately behind the forehead.  What they found was that individuals with an intact dmPFC store a perceptual memory of what they've just seen, and use it to shape the perception of what they're currently seeing, so the interval during which no light falls on the retina -- when you blink -- doesn't even register.  A patient with a lesion in the dmPFC, on the other hand, lost that ability, and didn't store immediate perceptual memories.  The result?  Every time she blinked, it was as if a shutter closed on the world.

"We were able to show that the prefrontal cortex plays an important role in perception and in context-dependent behavior," said neuroscientist Caspar Schwiedrzik, who was lead author of the study.  "Our research shows that the medial prefrontal cortex calibrates current visual information with previously obtained information and thus enables us to perceive the world with more stability, even when we briefly close our eyes to blink...  This is not only true for blinking but also for higher cognitive functions.  Even when we see a facial expression, this information influences the perception of the expression on the next face that we look at."

All of which highlights that all of our perceptual and integrative processes are way more sophisticated than they seem at first.  It also indicates something that's a little scary; that what we're perceiving is partly what's really out there, and partly what our brain is telling us it thinks is out there.  Which is right more often than not, of course.  If that weren't true, natural selection would have finished us off a long time ago.  But that fraction of the times that it's wrong, it can create some seriously weird sensations -- or make us question things that we'd always taken for granted.

****************************************


Wednesday, March 5, 2025

Watch your tone!

You probably know that there are many languages -- the most commonly-cited are Mandarin and Thai -- that are tonal.  The pitch, and pitch change across a syllable, alter its meaning.  For example, in Mandarin, the syllable "ma" spoken with a high steady tone means "mother;" with a falling then rising tone, it means "horse."

If your mother is anything like mine was, confusing these is not a mistake you'd make twice.

A pitch vs. time graph of the five tones in Thai [Image licensed under the Creative Commons Thtonesen.jpg: Lemmy Laffer derivative from Bjankuloski06en, Thai tones, CC BY-SA 3.0]

English is not tonal, but there's no doubt that pitch and stress change can communicate meaning.  The difference is that pitch alterations in English don't change the denotative (explicit) meaning, but can drastically change the connotative (implied) meaning.  Consider the following sentence:

He told you he gave the package to her?

Spoken with a neutral tone, it's simply an inquiry about a person's words and actions.  Now, one at a time, change which word is stressed (capitalized below):

  • HE told you he gave the package to her?  (Implies the speaker was expecting someone else to do it.)
  • He TOLD you he gave the package to her?  (Implies surprise that you were told about the action.)
  • He told YOU he gave the package to her?  (Implies surprise that you were the one told about it.)
  • He told you he GAVE the package to her?  (Implies the speaker expected the package should have been paid for.)
  • He told you he gave the PACKAGE to her?  (Implies that some different item was expected to be given.)
  • He told you he gave the package to HER?  (Implies surprise at the recipient of the package.)

Differences in word choice can also create sentences with identical denotative meanings and drastically different connotative meanings.  Consider "Have a nice day" vs. "I hope you manage to enjoy your next twenty-four hours," and "Forgive me, Father, for I have sinned" vs. "I'm sorry, Daddy, I've been bad."

You get the idea.

All of this is why mastery of a language you weren't born to is a long, fraught affair.

The topic comes up because of some new research out of Northwestern University that identified the part of the brain responsible for recognizing and abstracting meaning from pitch and inflection -- what linguists call the prosody of a language.  A paper this week in Nature Communications showed that Heschl's gyrus, a small structure in the superior temporal lobe, actively analyzes spoken language for subtleties of rhythm and tone and converts those perceived differences into meaning.

"Our study challenges the long-standing assumptions how and where the brain picks up on the natural melody in speech -- those subtle pitch changes that help convey meaning and intent," said G. Nike Gnanataja, who was co-first author of the study.  "Even though these pitch patterns vary each time we speak, our brains create stable representations to understand them."

"The results redefine our understanding of the architecture of speech perception," added Bharath Chandrasekaran, the other co-first author.  "We've spent a few decades researching the nuances of how speech is abstracted in the brain, but this is the first study to investigate how subtle variations in pitch that also communicate meaning are processed in the brain."

It's fascinating that we have a brain area dedicated to discerning alterations in the speech we hear, and curious that similar research on other primates shows that while they have a Heschl's gyrus, it doesn't respond to changes in prosody.  (What exact role it does have in other primates is still a subject of study.)  This makes me wonder if it's yet another example of exaptation -- where a structure, enzyme system, or gene evolves in one context, then gets co-opted for something else.  If so, our ancestors' capacity for using their Heschl's gyri to pick up on subtleties of speech drastically enriched their ability to encode meaning in language.

But I should wrap this up, because I need to go do my Japanese language lessons for the day.  Japanese isn't tonal, but word choice strongly depends on the relative status of the speaker and the listener, so which words you use is critical if you don't want to be looked upon as either boorish on the one hand, or putting on airs on the other.

I wonder how the brain figures all that out?

****************************************


Saturday, February 1, 2025

Remembrance of things past

"The human brain is rife with all sorts of ways of getting it wrong."

This quote is from a talk by eminent astrophysicist Neil deGrasse Tyson, and is just about spot on.  Oh, sure, our brains work well enough, most of the time; but how many times have you heard people say things like "I remember that like it was yesterday!" or "Of course it happened that way, I saw it with my own eyes"?

Anyone who knows something about neuroscience should immediately turn their skepto-sensors up to 11 as soon as they hear either of those phrases.

fMRI scan of a human brain during working memory tasks [Image is in the Public Domain courtesy of the Walter Reed National Military Medical Center]

Our memories and sensory-perceptual systems are selective, inaccurate, heavily dependent on what we're doing at the time, and affected by whether we're tired or distracted or overworked or (even mildly) inebriated.  Sure, what you remember might have happened that way, but -- well, let's just say it's not as much of a given as we'd like to think.  An experiment out of the University of Portsmouth looked at memories of the 2005 Tavistock Square (London) bus bombing, and found that a full forty percent of the people questioned had "memories" of the event that were demonstrably false -- including a number of people who said they recalled details from CCTV footage of the explosion, down to what people were wearing, who showed up to help the injured, when police arrived, and so on.

Oddly enough, there is no CCTV footage of the explosion.  It doesn't exist and has never existed.

Funny thing that eyewitness testimony is considered some of the most reliable evidence in courts of law, isn't it?

There are a number of ways our brains can steer us wrong, and the worst part of it all is that they leave us simultaneously convinced that we're remembering things with cut-crystal clarity.  Here are a few interesting memory glitches that commonly occur in otherwise mentally healthy people, that you might not have heard of:

  • Cryptomnesia.  Cryptomnesia occurs when something from the past recurs in your brain, or arises in your external environment, and you're unaware that you've already experienced it.  This has resulted in several probably unjustified accusations of plagiarism; the author in question undoubtedly saw the text they were accused of plagiarizing some time earlier, but honestly didn't remember they'd read it and thought that what they'd come up with was entirely original.  It can also result in some funnier situations -- while the members of Aerosmith were taking a break from recording their album Done With Mirrors, they had a radio going, and the song "You See Me Crying" came on.  Steven Tyler said he thought that was a pretty cool song, and maybe they should record a cover of it.  Joe Perry turned to him in incredulity and said, "That's us, you fuckhead."
  • Semantic satiation.  This is when a word you know suddenly looks unfamiliar to you, often because you've seen it repeatedly over a fairly short time.  Psychologist Chris Moulin of Leeds University did an experiment where he had test subjects write the word door over and over, and found that after a minute of this 68% of the subjects began to feel distinctly uneasy, with a number of them saying they were doubting that "door" was a real word.  I remember being in high school writing an exam in an English class, and staring at the word were for some time because I was convinced that it was spelled wrong (but couldn't, of course, remember how it was "actually" spelled).
  • Confabulation.  This is the recollection of events that never happened -- along with a certainty that you're remembering correctly.  (The people who claimed false memories of the Tavistock Square bombing were suffering from confabulation.)  The problem with this is twofold: the more often you think about the false memory or tell your friends and family about it, the more sure you are of it; and often, even when presented with concrete evidence that you're recalling incorrectly, somehow you still can't quite believe it.  A friend of mine tells the story of trying to help her teenage son find his car keys; she was absolutely certain that she'd seen them that day lying on a blue surface -- a chair, tablecloth, book, she wasn't sure which, but it was definitely blue.  They turned the house upside down, looking at every blue object they could find -- no luck.  Finally he decided to walk down to the bus stop and take the bus instead, and went to the garage to get his stuff out of the car -- and the keys were hanging from the ignition, where he'd left them the previous evening.  "Even after telling me this," my friend said, "I couldn't accept it.  I'd seen those keys sitting on a blue surface earlier that day, and remembered it as clearly as if they were in front of my face."
  • Declinism.  This is the tendency to remember the past as more positive than it actually was, and is responsible both for the "kids these days!" thing and "Make America Great Again."  There's a strong tendency for us to recall our own past as rosy and pleasant as compared to the shitshow we're currently immersed in, irrespective of the fact that violence, bigotry, crime, and general human ugliness are hardly new inventions.  (A darker aspect of this is that some of us -- including a great many MAGA types -- are actively longing to return to the time when straight White Christian men were in charge of everything; whether this is itself a mental aberration I'll leave you to decide.)  A more benign example is what I've noticed about travel -- that after you're home, the bad memories of discomfort and inconveniences and delays and questionable food fade quickly, leaving behind only the happy feeling of how much you enjoyed the experience.
  • The illusion of explanatory depth.  This is a dangerous one; it's the certainty that you understand deeply how something works, when in reality you don't.  This effect was first noted back in 2002 by psychologists Leonid Rozenblit and Frank Keil, who took test subjects and asked them to rank from zero to ten their understanding of how common devices worked, including zippers, bicycles, electric motors, toasters, and microwave ovens, and found that hardly anyone gave themselves a score lower than five on anything.  Interestingly, the effect vanished when Rozenblit and Keil asked the volunteers actually to explain how the devices worked; after trying to describe in writing how a zipper works, for example, most of the test subjects sheepishly realized they actually had no idea.  This suggests an interesting strategy for dealing with self-styled experts on topics like climate change -- don't argue, ask questions, and let them demonstrate their ignorance on their own.
  • Presque vu.  Better known as the "tip-of-the-tongue" phenomenon -- the French name means "almost seen" -- this is when you know you know something, but simply can't recall it.  It's usually accompanied by a highly frustrating sense that it's right there, just beyond reach.  Back in the days before The Google, I spent an annoyingly long time trying to recall the name of the Third Musketeer (Athos, Porthos, and... who???).  I knew the memory was in there somewhere, but I couldn't access it.  It was only after I gave up and said "to hell with it" that -- seemingly out of nowhere -- the answer (Aramis) popped into my head.  Interestingly, neuroscientists are still baffled as to why this happens, and why turning your attention to something else often makes the memory reappear.

So be a little careful about how vehemently you argue with someone over whether your recollection of the past or theirs is correct.  Your version might be right, or theirs -- or it could easily be that both of you are remembering things incompletely or incorrectly.  I'll end with a further quote from Neil deGrasse Tyson: "We tend to have great confidence in our own brains, when in fact we should not.  It's not that eyewitness testimony by experts or people in uniform is better than that of the rest of us; it's all bad....  It's why we scientists put great faith in our instruments.  They don't care if they've had their morning coffee, or whether they got into an argument with their spouse -- they get it right every time."

****************************************

Friday, January 3, 2025

Word search

I've always wondered why words have the positive or negative connotations they do.

Ask people what their favorite and least-favorite sounding words are, and you'll find some that are easily explicable (vomit regularly makes the "least-favorite" list), but others are kind of weird.  A poll of linguists identified the phrase cellar door as being the most beautiful-sounding pair of words in the English language -- and look at how many names from fantasy novels have the same cadence (Erebor, Aragorn, Celeborn, Glorfindel, Valinor, to name just a handful from the Tolkien mythos).  On the other hand, I still recall passing a grocery store with my son one day and seeing a sign in the window that said, "ON SALE TODAY: moist, succulent pork."

"There it is," my son remarked.  "A single phrase made of the three ugliest words ever spoken."

Moist, in fact, is one of those universally loathed words; my surmise is that it's the rather oily sound of the /oi/ combination, but that's hardly a scholarly analysis.  The brilliant British comedian Miranda Hart had her own unique take on it:


Another question is why some words are easier to bring to mind than others.  This was the subject of a fascinating paper in Nature Human Behaviour titled "Memorability of Words in Arbitrary Verbal Associations Modulates Memory Retrieval in the Anterior Temporal Lobe," by neuroscientists Weizhen Xie, Wilma A. Bainbridge, Sara K. Inati, Chris I. Baker, and Kareem A. Zaghloul of the National Institutes of Health.  Spurred by a conversation at a Christmas party about why certain faces are memorable and others are not, lead author Weizhen Xie wondered if the same was true for words -- and if so, whether it could lead to more accuracy in cognitive testing for patients showing memory loss or incipient dementia.

"Our memories play a fundamental role in who we are and how our brains work," Xie said in an interview with Science Daily.  "However, one of the biggest challenges of studying memory is that people often remember the same things in different ways, making it difficult for researchers to compare people's performances on memory tests.  For over a century, researchers have called for a unified accounting of this variability.  If we can predict what people should remember in advance and understand how our brains do this, then we might be able to develop better ways to evaluate someone's overall brain health."

What the team did is as fascinating as it is simple: they showed test subjects pairs of functionally-unrelated words (say, "hand" and "apple"), and afterward tested them by giving them one word and asking them to recall the word it was paired with.  What they found is that some words were easy to recall regardless of what they were paired with and whether they came first or second in the pair; others were more difficult, again irrespective of position or pairing.

"We saw that some things -- in this case, words -- may be inherently easier for our brains to recall than others," said study senior author Kareem Zaghloul.  "These results also provide the strongest evidence to date that what we discovered about how the brain controls memory in this set of patients may also be true for people outside of the study."

[Image licensed under the Creative Commons Mandeep Singh, Emotions words, CC BY 4.0]

Neither the list of easy-to-remember words nor the list of harder-to-remember ones shows any obvious commonality (such as abstract versus concrete nouns, or long words versus short ones) that would explain the difference.  Each list included some extremely common words and some less common ones -- tank, doll, and pond showed up on the memorable list, and street, couch, and cloud on the less-memorable one.  It was remarkable how consistent the pattern was; the results were unequivocal even when the researchers controlled for factors such as educational level, age, and gender.

"We thought one way to understand the results of the word pair tests was to apply network theories for how the brain remembers past experiences," Xie said.  "In this case, memories of the words we used look like internet or airport terminal maps, with the more memorable words appearing as big, highly trafficked spots connected to smaller spots representing the less memorable words.  The key to fully understanding this was to figure out what connects the words."

The surmise is that it has to do with the way our brains network information.  Certain words might act as "nodes" -- memory points that connect functionally to a great many different concepts -- so the brain more readily lands on those words when searching.  Others, however familiar and common they might be, act more as "dead-ends" in brain networking, making only a few conceptual links.  Think of it as trying to navigate through a city -- some places are easy to get to because there are a great many paths that lead there, while others require a specific set of roads and turns.  In the first case, you can get to your destination even if you make one or two directional goofs; in the second, one wrong turn and you're lost.
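If you want to see that hub-versus-dead-end intuition in action, here's a toy sketch -- my own invention, not the study's model -- that runs a random walk over a small, completely made-up word-association graph and counts how often each word gets visited.

```python
# Toy sketch (my own invention, not the study's model): a random walk over a
# made-up word-association graph. Well-connected "hub" words get visited far
# more often than sparsely connected "dead-end" words -- the intuition behind
# why some words might be easier for a memory search to land on.

import random
from collections import Counter

# Hypothetical associations; the words and links are invented for illustration.
graph = {
    "tank":  ["water", "war", "fish", "army", "metal"],   # hub: many links
    "water": ["tank", "pond", "cloud"],
    "war":   ["tank", "army"],
    "fish":  ["tank", "pond"],
    "army":  ["tank", "war"],
    "metal": ["tank"],
    "pond":  ["water", "fish"],
    "cloud": ["water"],                                    # dead-end: one link
}

def random_walk(graph, steps=100_000, seed=42):
    rng = random.Random(seed)
    node = "tank"
    visits = Counter()
    for _ in range(steps):
        node = rng.choice(graph[node])   # hop to a random associated word
        visits[node] += 1
    return visits

for word, count in random_walk(graph).most_common():
    print(f"{word:>6}: {count}")
```

Run it and the heavily linked words dominate the visit counts, while the one-link words barely show up -- a crude stand-in for why a memory search might land on some words far more readily than others.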

All of which is fascinating. I know as I've gotten older I've had the inevitable memory slowdown, which most often manifests as my trying to recall a word I know that I know. I often have to (with some degree of shame) resort to googling something that's a synonym and scanning down the list until I find the word I'm looking for, but it makes me wonder why this happens with some words and not with others.  Could it be that in my 64-year-old brain, bits of the network are breaking down, and this affects words with fewer working functional links than ones with a great many of them?

All speculation, of course. I can say that whatever it is, it's really freakin' annoying.  But I need to wrap up this post, because it's time for lunch.  Which is -- I'm not making this up -- leftover moist, succulent pork.

I'll try not to think about it.

****************************************