Skeptophilia (skep-to-fil-i-a) (n.) - the love of logical thought, skepticism, and thinking critically. Being an exploration of the applications of skeptical thinking to the world at large, with periodic excursions into linguistics, music, politics, cryptozoology, and why people keep seeing the face of Jesus on grilled cheese sandwiches.

Saturday, July 5, 2025

Out of time

A friend of mine recently posted, "And poof!  Just like that, 1975 is fifty years ago."

My response was, "Sorry.  Wrong.  1975 is 25 years ago.  In five years, 1975 will still be 25 years ago.  That's my story, and I'm stickin' to it."

I've written here before about how plastic human memory is, but mostly I've focused on the content -- how we remember events.  But equally unreliable is how we remember time.  It's hard for me to fathom the fact that it's been six years since I retired from teaching.  On the other hand, the last overseas trip I took -- to Iceland, in 2022 -- seems like it was a great deal longer ago than that.  And 1975... well....  My own sense of temporal sequencing is, in fact, pretty faulty, and there have been times I've had to look up a time-stamped photograph, or some other certain reference point, to be sure when exactly some event had occurred.

Turns out, though, that just about all of us have inaccurate mental time-framing.  And the screw-up doesn't even necessarily work the way you'd think.  The assumption was -- and it makes some intuitive sense -- that memories of more recent events would be stronger than those from longer ago, and that's how your brain keeps track of when things happened.  It's analogous to driving at night, and judging the distance to a car by the brightness of its headlights; dimmer lights = the oncoming car is farther away.

But just as this sense can be confounded -- a car with super-bright halogen headlights might be farther away than it seems to be -- your brain's time sequencing can be muddled by the simple expedient of repetition.  Oddly, though, repetition has the unexpected effect of making an event seem like it happened further in the past than it actually did.

[Image licensed under the Creative Commons Isabelle Grosjean ZA, MontreGousset001, CC BY-SA 3.0]

A new study out of Ohio State University, published this week in the journal Psychological Science, shows that when test subjects were presented with the same stimulus multiple times, their estimates of when they had first seen it were skewed by as much as twenty-five percent.  It was a robust result -- holding across the majority of the hundreds of volunteers in the study -- and it came as a surprise to the researchers.

"We all know what it is like to be bombarded with the same headline day after day after day," said study co-author Sami Yousif.  "We wondered whether this constant repetition of information was distorting our mental timelines...  Images shown five times were remembered as having occurred even further back than those shown only two or three times.  This pattern persisted across all seven sets of image conditions...  We were surprised at how strong the effects were.  We had a hunch that repetition might distort temporal memory, but we did not expect these distortions to be so significant."

So when someone says "I know it happened that way, I remember it," the claim should be treated as just as suspect with respect to timing as it is with respect to content.

"People should take away two things," Yousif said.  "(1) Time perception is illusory.  That is, our sense of when things occurred is systematically distorted in predictable ways.  (2) These distortions can be substantial, even if their causes are simple (i.e., the mere repetition of information)."

More and more it's seeming like what we think of as our rock-solid memory is an elaborate but rickety house of cards, composed of bits of accurate recollections mixed in with partial truths (real memories in the wrong sequence, or correctly sequenced memories that are being remembered imprecisely), along with a heaping helping of complete fiction.  Add to that the unsettling truth that unless you have a fixed, factual reference point, there's no way to tell the difference.

Makes you wonder how eyewitness testimony can still be used as the sine qua non of evidence in courts of law.

****************************************


Tuesday, February 18, 2025

Misremembering the truth

There are two distinct, but similar-sounding, cognitive biases that I've written about many times here at Skeptophilia because they are such tenacious barriers to rational thinking.

The first, confirmation bias, is our tendency to uncritically accept claims when they fit with our preconceived notions.  It's why a lot of conservative viewers of Fox News and liberal viewers of MSNBC sit there watching and nodding enthusiastically without ever stopping and saying, "... wait a moment."

The other, dart-thrower's bias, is more built-in.  It's our tendency to notice outliers (because of their obvious evolutionary significance as danger signals) and ignore, or at least underestimate, the ordinary as background noise.  The name comes from the thought experiment of being in a bar while there's a darts game going on across the room.  You'll tend to notice the game only when there's an unusual throw -- a bullseye, or perhaps impaling the bartender in the forehead -- and not even be aware of it otherwise.

Well, we used to think that dart-thrower's bias was built into our cognitive processing system, while confirmation bias operated more "on the surface" -- and that the latter was therefore more culpable, conscious, and/or controllable.  Now, it appears that confirmation bias might be just as hard-wired into our brains as dart-thrower's bias is.

I recently read a paper in Human Communication Research that sheds some light on this rather troubling question, describing a study conducted by a team led by Jason Coronel of Ohio State University.  In "Investigating the Generation and Spread of Numerical Misinformation: A Combined Eye Movement Monitoring and Social Transmission Approach," Coronel, along with Shannon Poulsen and Matthew D. Sweitzer, did a fascinating series of experiments showing that we not only tend to accept information that agrees with our previous beliefs without question, we honestly misremember information that disagrees -- and we misremember it in such a way that, in our memories, it further confirms our beliefs!

The location of memories (from Memory and Intellectual Improvement Applied to Self-Education and Juvenile Instruction, by Orson Squire Fowler, 1850) [Image is in the Public Domain]

What Coronel and his team did was to present 110 volunteers with passages containing true numerical information on social issues (such as support for same-sex marriage and rates of illegal immigration).  In some cases, the passages agreed with what (according to polls) most people believe to be true, such as that the majority of Americans support same-sex marriage.  In other cases, the passages contained information that (while true) is widely thought to be untrue -- such as the fact that illegal immigration across the Mexican border has been dropping for years and in the last five years has been at its lowest rates since the mid-1990s.

Across the board, people tended to recall the information that aligned with the conventional wisdom correctly, and the information that didn't incorrectly.  Further -- and this is what makes the experiment even more fascinating -- when people read the unexpected information, data that contradicted the general opinion, eye-tracking monitors recorded that they hesitated while reading, as if they recognized that something was strange.  In the immigration passage, for example, they read that the number of immigrants had decreased from 12.8 million in 2007 to 11.7 million in 2014, and the readers' eyes bounced back and forth between the two numbers as if their brains were saying, "Wait, am I reading that right?"

So they spent longer on the passage that conflicted with what most people think -- and still tended to remember it incorrectly.  In fact, the majority of the people who misremembered got the numbers themselves right -- 12.8 million and 11.7 million -- showing that they'd paid attention and hadn't just scoffed and glossed over the passage when they hit something they thought was incorrect.  But when questioned afterward, they remembered the numbers backwards, as if the passage had actually supported what they'd believed prior to the experiment!

If that's not bad enough, Coronel's team then ran a second experiment, in which the test subjects read the passage, then had to repeat the gist to another person, who then passed it to another, and so on.  (Remember the elementary school game of "Telephone"?)  Not only did the data get flipped -- usually in the first transfer -- but the difference between the two numbers then got greater and greater with each retelling (thus bolstering the false, but popular, opinion even more strongly).  In the case of the immigration statistics, the gap between the 2007 and 2014 figures not only changed direction, but by the end of the game it had widened from 1.1 million to 4.7 million.

This gives you an idea what we're up against in trying to counter disinformation campaigns.  And it also illustrates that I was wrong in one of my preconceived notions: that people falling for confirmation bias are somehow guilty of locking themselves deliberately into an echo chamber.  Apparently, both dart-thrower's bias and confirmation bias are somehow built into the way we process information.  We become so certain we're right that our brain subconsciously rejects any evidence to the contrary.

Why our brains are built this way is a matter of conjecture.  I wonder if perhaps it might be our tribal heritage at work; that conforming to the norm, and therefore remaining a member of the tribe, has a greater survival value than being the maverick who sticks to his/her guns about a true but unpopular belief.  That's pure speculation, of course.  But what it illustrates is that once again, our very brains are working against us in fighting Fake News -- which these days is positively frightening, given how many powerful individuals and groups are, in a cold and calculated fashion, disseminating false information in an attempt to mislead us, frighten us, or anger us, and so maintain their positions of power.

****************************************

Saturday, February 1, 2025

Remembrance of things past

"The human brain is rife with all sorts of ways of getting it wrong."

This quote is from a talk by eminent astrophysicist Neil deGrasse Tyson, and is just about spot on.  Oh, sure, our brains work well enough, most of the time; but how many times have you heard people say things like "I remember that like it was yesterday!" or "Of course it happened that way, I saw it with my own eyes"?

Anyone who knows something about neuroscience should immediately turn their skepto-sensors up to 11 as soon as they hear either of those phrases.

fMRI scan of a human brain during working memory tasks [Image is in the Public Domain courtesy of the Walter Reed National Military Medical Center]

Our memories and sensory-perceptual systems are selective, inaccurate, heavily dependent on what we're doing at the time, and affected by whether we're tired or distracted or overworked or (even mildly) inebriated.  Sure, what you remember might have happened that way, but -- well, let's just say it's not as much of a given as we'd like to think.  An experiment out of the University of Portsmouth looked at memories of the 2005 Tavistock Square (London) bus bombing, and found that a full forty percent of the people questioned had "memories" of the event that were demonstrably false -- including a number of people who said they recalled details from CCTV footage of the explosion, down to what people were wearing, who showed up to help the injured, when police arrived, and so on.

Oddly enough, there is no CCTV footage of the explosion.  It doesn't exist and has never existed.

Funny thing that eyewitness testimony is considered some of the most reliable evidence in courts of law, isn't it?

There are a number of ways our brains can steer us wrong, and the worst part of it all is that they leave us simultaneously convinced that we're remembering things with cut-crystal clarity.  Here are a few interesting memory glitches that commonly occur in otherwise mentally healthy people, that you might not have heard of:

  • Cryptomnesia.  Cryptomnesia occurs when something from the past recurs in your brain, or arises in your external environment, and you're unaware that you've already experienced it.  This has resulted in several probably unjustified accusations of plagiarism; the author in question undoubtedly saw the text they were accused of plagiarizing some time earlier, but honestly didn't remember they'd read it and thought that what they'd come up with was entirely original.  It can also result in some funnier situations -- while the members of Aerosmith were taking a break from recording their album Done With Mirrors, they had a radio going, and the song "You See Me Crying" came on.  Steven Tyler said he thought that was a pretty cool song, and maybe they should record a cover of it.  Joe Perry turned to him in incredulity and said, "That's us, you fuckhead."
  • Semantic satiation.  This is when a word you know suddenly looks unfamiliar to you, often because you've seen it repeatedly over a fairly short time.  Psychologist Chris Moulin of Leeds University did an experiment where he had test subjects write the word door over and over, and found that after a minute of this 68% of the subjects began to feel distinctly uneasy, with a number of them saying they were doubting that "door" was a real word.  I remember being in high school writing an exam in an English class, and staring at the word were for some time because I was convinced that it was spelled wrong (but couldn't, of course, remember how it was "actually" spelled).
  • Confabulation.  This is the recollection of events that never happened -- along with a certainty that you're remembering correctly.  (The people who claimed false memories of the Tavistock Square bombing were suffering from confabulation.)  The problem with this is twofold; the more often you think about the false memory or tell your friends and family about it, the more sure you are of it; and often, even when presented with concrete evidence that you're recalling incorrectly, somehow you still can't quite believe it.  A friend of mine tells the story of trying to help her teenage son find his car keys, and that she was absolutely certain that she'd seen them that day lying on a blue surface -- a chair, tablecloth, book, she wasn't sure which, but it was definitely blue.  They turned the house upside down, looking at every blue object they could find, and no luck.  Finally he decided to walk down to the bus stop and take the bus instead, and went to the garage to get his stuff out of the car -- and the keys were hanging from the ignition, where he'd left them the previous evening.  "Even after telling me this," my friend said, "I couldn't accept it.  I'd seen those keys sitting on a blue surface earlier that day, and remembered it as clearly as if they were in front of my face."
  • Declinism.  This is the tendency to remember the past as more positive than it actually was, and is responsible both for the "kids these days!" thing and "Make America Great Again."  There's a strong tendency for us to recall our own past as rosy and pleasant as compared to the shitshow we're currently immersed in, irrespective of the fact that violence, bigotry, crime, and general human ugliness are hardly new inventions.  (A darker aspect of this is that some of us -- including a great many MAGA types -- are actively longing to return to the time when straight White Christian men were in charge of everything; whether this is itself a mental aberration I'll leave you to decide.)  A more benign example is what I've noticed about travel -- that after you're home, the bad memories of discomfort and inconveniences and delays and questionable food fade quickly, leaving behind only the happy feeling of how much you enjoyed the experience.
  • The illusion of explanatory depth.  This is a dangerous one; it's the certainty that you understand deeply how something works, when in reality you don't.  This effect was first noted back in 2002 by psychologists Leonid Rozenblit and Frank Keil, who took test subjects and asked them to rank from zero to ten their understanding of how common devices worked, including zippers, bicycles, electric motors, toasters, and microwave ovens, and found that hardly anyone gave themselves a score lower than five on anything.  Interestingly, the effect vanished when Rozenblit and Keil asked the volunteers actually to explain how the devices worked; after trying to describe in writing how a zipper works, for example, most of the test subjects sheepishly realized they actually had no idea.  This suggests an interesting strategy for dealing with self-styled experts on topics like climate change -- don't argue, ask questions, and let them demonstrate their ignorance on their own.
  • Presque vu.  Better known as the "tip-of-the-tongue" phenomenon -- the French name means "almost seen" -- this is when you know you know something, but simply can't recall it.  It's usually accompanied by a highly frustrating sense that it's right there, just beyond reach.  Back in the days before The Google, I spent an annoyingly long time trying to recall the name of the Third Musketeer (Athos, Porthos, and... who???).  I knew the memory was in there somewhere, but I couldn't access it.  It was only after I gave up and said "to hell with it" that -- seemingly out of nowhere -- the answer (Aramis) popped into my head.  Interestingly, neuroscientists are still baffled as to why this happens, and why turning your attention to something else often makes the memory reappear.

So be a little careful about how vehemently you argue with someone over whether your recollection of the past or theirs is correct.  Your version might be right, or theirs -- or it could easily be that both of you are remembering things incompletely or incorrectly.  I'll end with a further quote from Neil deGrasse Tyson: "We tend to have great confidence in our own brains, when in fact we should not.  It's not that eyewitness testimony by experts or people in uniform is better than that of the rest of us; it's all bad....  It's why we scientists put great faith in our instruments.  They don't care if they've had their morning coffee, or whether they got into an argument with their spouse -- they get it right every time."

****************************************

Saturday, December 7, 2024

Talking in your sleep

A little over a year ago, I decided to do something I've always wanted to do -- learn Japanese.

I've had a fascination with Japan since I was a kid.  My dad lived there for a while during the 1950s, and while he was there collected Japanese art and old vinyl records of Japanese folk and pop music, so I grew up surrounded by reminders of the culture.  As a result, I've always wanted to learn more about the country and its people and history, and -- one day, perhaps -- visit.

So in September of 2023 I signed up for Duolingo, and began to inch my way through learning the language.

[Image is in the Public Domain]

It's a challenge, to say the least.  Japanese usually shows up on lists of "the five most difficult languages to learn."  Not only are there three different scripts you have to master in order to be literate, but the grammatical structure is really different from English.  The trickiest part, at least thus far, is managing particles -- little words that follow nouns and indicate how they're being used in the sentence.  They're a bit like English prepositions, but there's a subtlety to them that is hard to grok.  Here's a simple example:

Watashi wa gozen juuji ni toshokan de ane ni aimasu.

(I) (particle indicating the subject of the sentence) (A.M.) (ten o'clock) (particle indicating movement or time) (library) (particle indicating where something is happening) (my sister) (particle marking the person being met) (am meeting with) = "I am meeting my sister at ten A.M. at the library."

Get the particles wrong, and the sentence ends up somewhere between grammatically incorrect and completely incomprehensible.

So I'm coming along.  Slowly.  I have a reasonably good affinity for languages -- I grew up bilingual (English/French) and have a master's degree in linguistics -- but the hardest part for me is simply remembering the vocabulary.  The grammar patterns take some getting used to, but once I see how they work, they tend to stick.  The vocabulary, though?  Over and over again I'll run into a word, and I'm certain I've seen it before and at one point knew what it meant, and it will not come back to mind.  So I look it up...

... and then go, "Oh, of course.  Duh.  I knew that."

But according to a study this week out of the University of South Australia, apparently what I'm doing wrong is simple: I need more sleep.

Researchers in the Department of Neuroscience took 35 native English speakers and taught them "Mini-Pinyin" -- an invented pseudolanguage that has Mandarin Chinese vocabulary but English sentence structure.  (None of them had prior experience with Mandarin.)  They were sorted into two groups; the first learned the language in the morning and returned twelve hours later to be tested, and the second learned it in the evening, slept overnight in the lab, and were tested the following morning.

The second group did dramatically better than the first.  Significantly, during sleep their brains showed a higher-than-average level of brain wave patterns called slow oscillations and sleep spindles, which are thought to be connected with memory consolidation -- uploading short-term memories from the hippocampus into long-term storage in the cerebral cortex.  Your brain, in effect, talks in its sleep, routing information from one location to another.

"This coupling likely reflects the transfer of learned information from the hippocampus to the cortex, enhancing long-term memory storage," said Zachariah Cross, who co-authored the study.  "Post-sleep neural activity showed unique patterns of theta oscillations associated with cognitive control and memory consolidation, suggesting a strong link between sleep-induced brainwave co-ordination and learning outcomes."

So if you're taking a language class, or if -- like me -- you're just learning another language for your own entertainment, you're likely to have more success in retention if you study in the evening, and get a good night's sleep before you're called upon to use what you've learned.

Of course, many of us could use more sleep for a variety of other reasons.  Insomnia is a bear, and poor sleep is linked with a whole host of health-related woes.  But a nice benefit of dedicating yourself to getting better sleep duration and quality is an improvement in memory.

And hopefully for me, better scores on my Duolingo lessons.

****************************************

Saturday, August 24, 2024

Pet warp

In recent posts we have dealt with the Earth being invaded by giant alien bugs, the possibility that Bigfoot and other cryptids are actually ghosts, and a claim that some soldiers in World War I were saved by the appearance of either an angel or else St. George, depending on which version you go for.  So I'm sure that what you're all thinking is, "Yes, Gordon, but what about pet teleportation?"

At this point, I should stop being surprised at the things that show up on websites such as the one in the link above, from the site Mysterious Universe.  In this particular article, by Brent Swancer (this is not his first appearance here at Skeptophilia, as you might imagine), we hear about times that Fido and Mr. Fluffums evidently took advantage of nearby wormholes to leap instantaneously across spacetime.

In one such instance, Swancer tells us, a woman had been taking a nap with her kitty, and got up, leaving the cat sleeping in bed.  Ten minutes later, she went back into the bedroom, and the cat was gone.  At that point, the phone rang.  It was a friend who lived across town -- calling to tell her that the cat had just showed up on their doorstep.

Another person describes his cat teleporting from one room in the house to another, after which the cat "seemed terrified":
All the fur on his back was standing up and he was crouched low to the ground. He looked like he had no idea what just happened, either.  That was about ten minutes ago.  He won’t leave my side now, which is strange in itself, because he likes independence, but he is still very unsettled and so am I.
And Swancer tells us that it's not just cats.  He recounts a tale from "the great biologist... Ivan T. Sanderson," who was working with leafcutter ants and found that the queen sometimes mysteriously disappeared from the nest.  "Further digging in some sites within hours," Sanderson tells us, "brought to light, to the dumbfoundment of everybody, apparently the same queen, all duly dyed with intricate identifying marks, dozens of feet away in another super-concrete-hard cell, happily eating, excreting and producing eggs!"

However, in the interest of honesty it must be said that Sanderson might not be the most credible witness in the world.  He did a good bit of writing about nature and biology, but is best known for his work in cryptozoology.  According to the Wikipedia article on him (linked above), he gave "special attention to the search for lake monsters, sea serpents, Mokèlé-mbèmbé, giant penguins, Yeti, and Sasquatch."  And amongst his publications are Abominable Snowman: Legend Come to Life and the rather vaguely-named Things, which the cover tells us is about "monsters, mysteries, and marvels uncanny, strange, but true."

So I'm inclined to view Sanderson's teleporting ants with a bit of a wry eye.

What strikes me about all of this is the usual problem of believing anecdotal evidence.  It's not that I'm accusing anyone of lying (although that possibility does have to be admitted); it's easy enough, given our faulty sensory processing equipment and plastic, inaccurate memory, to be absolutely convinced of something that actually didn't happen that way.  A study by New York University psychological researcher Elizabeth Phelps showed that people recalling 9/11 -- surely a big enough event to remember accurately -- got only 63% of the details right, despite the study participants' certainty that they were remembering what actually happened.  Worse, a study by Joyce W. Lacy (Azusa Pacific University) and Craig E. L. Stark (University of California-Irvine) showed that even how a question is asked by an interviewer can alter a person's memory -- and scariest of all, the person has no idea it's happened.  They remain convinced that what they "recall" is accurate.

Plus, there's the little problem of the lack of a mechanism.  How, exactly, could anything, much less your pet kitty, vanish from one place and simultaneously reappear somewhere else?  I have a hard time getting my dog Rosie even to move at sub-light speeds sometimes, especially when she's walking in front of me at a pace we call "the Rosie Mosey." In fact, most days her favorite speed seems to be "motionless," especially if she has her favorite plush toy to snuggle with:


Given all that, it's hard to imagine she'd have the motivation to accomplish going anywhere at superluminal velocity.

As intriguing as those stories are, I'm inclined to be a bit dubious.  Which I'm sure you predicted.  So you don't need to spend time worrying about how you'll deal with it when Rex and Tigger take a trip through warped space.  If they mysteriously vanish only to show up elsewhere, chances are they were traveling in some completely ordinary fashion, and the only thing that's awry is your memory of what happened.

****************************************



Monday, March 18, 2024

Memory boost

About two months ago I signed up with Duolingo to study Japanese.

I've been fascinated with Japan and the Japanese culture pretty much all my life, but I'm a total novice with the language, so I started out from "complete beginner" status.  I'm doing okay so far, although the fact that it's got three writing systems is a challenge, to put it mildly.  Like most Japanese programs, it's beginning with the hiragana system -- a syllabic script that allows you to work out the pronunciation of words -- but I've already seen a bit of katakana (used primarily for words borrowed from other languages) and even a couple of kanji (the ideographic script, where a character represents an entire word or concept).

[Image licensed under the Creative Commons 663highland, 140405 Tsu Castle Tsu MIe pref Japan01s, CC BY-SA 3.0]

While Duolingo focuses on getting you listening to spoken Japanese right away, my linguistics training has me already looking for patterns -- such as the fact that wa after a noun seems to act as a subject marker, and ka at the end of a sentence turns it into a question.  I'm still perplexed by some of the pronunciation patterns -- why, for example, vowel sounds sometimes don't get pronounced.  The first case of this I noticed is that the family name of the brilliant author Akutagawa Ryūnosuke is pronounced /ak'tagawa/ -- the /u/ in the second syllable virtually disappears.  I hear it happening fairly commonly in spoken Japanese, but I haven't been able to deduce what the pattern is.  (If there is one.  If there's one thing my linguistics studies have taught me, it's that all languages have quirks.  Try explaining to someone new to English why, for instance, -ough is pronounced differently in cough, rough, through, bough, and thorough.)

Still and all, I'm coming along.  I've learned some useful phrases like "Sushi and water, please" (Sushi to mizu, kudasai) and "Excuse me, where is the train station?" (Sumimasen, eki wa doko desu ka?), as well as less useful ones like "Naomi Yamaguchi is cute" (Yamaguchi Naomi-san wa kawaii desu), which is only critical to know if you have a cute friend who happens to be named Naomi Yamaguchi.

The memorization, however, is often taxing to my 63-year-old brain.  Good for it, I have no doubt -- a recent study found that being bi- or multi-lingual can delay the onset of dementia by four years or more -- but it definitely is a challenge.  I go through my hiragana flash cards at least once a day, and have copious notes for what words mean and for any grammatical oddness I happen to notice.  Just the sheer amount of memorization, though, is kind of daunting.

Maybe what I should do is find a way to change the context in which I have to remember particular words, phrases, or characters.  That seems to be the upshot of a paper I ran into a couple of days ago in Proceedings of the National Academy of Sciences, describing a study by a group from Temple University and the University of Pittsburgh on how to improve retention.

I'm sure all of us have experienced the effects of cramming for a test -- studying like hell the night before, then doing okay on the test but a week later barely remembering any of it.  This practice does two things wrong: not only does it stuff all the studying into a single session, it does it all the same way.

What this study showed was that two factors significantly improve long-term memory.  One was spacing out study sessions -- doing shorter sessions more often definitely helped.  I'm already approaching Duolingo this way, usually doing a lesson or two over my morning coffee, then hitting it again for a few more after dinner.  But the other interesting variable they looked at was context: test subjects' memories improved substantially when the context was changed -- when, for example, you're trying to remember as much as you can of what a specific person is wearing, but instead of being shown the same photograph over and over, you're given photographs of the person wearing the same clothes but in a different setting each time.

"We were able to ask how memory is impacted both by what is being learned -- whether that is an exact repetition or instead, contains variations or changes -- as well as when it is learned over repeated study opportunities," said Emily Cowan, lead author of the study.  "In other words... we could examine how having material that more closely resembles our experiences of repetition in the real world -- where some aspects stay the same but others differ -- impacts memory if you are exposed to that information in quick succession versus over longer intervals, from seconds to minutes, or hours to days."

I can say that this is one of the things Duolingo does right.  Words are repeated, but in different combinations and in different ways -- spoken, spelled out using the English transliteration, or in hiragana only.  Rather than always seeing the same word in the same context, there's a balance between the repetition we all need when learning a new language and pushing your brain to generalize to slightly different usages or contexts.
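To make those two ingredients concrete -- spaced-out sessions plus a rotating presentation -- here's a minimal sketch of what such a review schedule could look like.  It's purely illustrative (it's not code from the study, from Duolingo, or from anywhere else), and every name in it -- review_schedule, the interval lengths, the list of contexts -- is a made-up assumption.

    # Illustrative sketch only: schedule flashcard reviews so that (1) sessions
    # are spaced over increasing intervals and (2) each review presents the word
    # in a different context.  Names and intervals are hypothetical.
    from datetime import date, timedelta

    def review_schedule(word, contexts, start, intervals_days=(1, 3, 7, 14)):
        """Pair each spaced-out review date with a different usage context."""
        schedule = []
        day = start
        for i, gap in enumerate(intervals_days):
            day = day + timedelta(days=gap)          # factor 1: spacing between sessions
            context = contexts[i % len(contexts)]    # factor 2: vary the presentation
            schedule.append((day, word, context))
        return schedule

    if __name__ == "__main__":
        plan = review_schedule(
            "toshokan (library)",
            ["spoken audio", "romaji", "hiragana only", "inside a full sentence"],
            start=date(2024, 3, 18),
        )
        for when, word, context in plan:
            print(f"{when}: review '{word}' presented as {context}")

The particular intervals are arbitrary; the point is just that repetition, spacing, and variation are separate knobs, and the study suggests the last two matter more than we usually give them credit for.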

So all things considered, Duolingo had it figured out even before the latest research came out.  I'm hoping it pays off, because my son and I would like to take a trip to Japan at some point and be able to get along, even if we don't meet anyone cute named Naomi Yamaguchi.  But I should wind this up, so for now I'll say jaa ne, mata ashita (goodbye, see you tomorrow).

****************************************



Thursday, March 14, 2024

In memoriam

I want you to recall something simple.  A few to choose from:
  • your own middle name
  • the street you grew up on
  • your best friend in elementary school
  • the name of your first pet
  • your second-grade teacher's name
Now, I'm presuming that none of you were actively thinking about any of those before I asked.  So, here are a couple of questions:

Where was that information before I asked you about it?  And how did you retrieve it from wherever that was?

The simple answer is, "we don't know."  Well, we have a decent idea about where in the brain specific kinds of information are stored, mostly from looking at what gets lost when people have strokes or traumatic brain injury.  (A technique my Anatomy and Physiology professor described as "figuring out how a car functions by smashing parts of it with a hammer, and then seeing what doesn't work anymore.")

But how exactly is that information encoded?  That's an ongoing area of research, and one we're only beginning to see results from.  The prevailing idea for a long time has been that interactions between networks of neurons in the brain allow the storage and retrieval of memories -- for example, you have networks that encode memory of faces, ones that involve familiarity, ones that activate when you feel positive emotions, possibly ones that fire for particular stimuli like gray hair, glasses, being female, being elderly, or tone of voice -- and the intersection of these activates to retrieve the memory of your grandmother.

The problem is, all attempts to find a Venn-diagram-like cross-connected network in the brain have failed.  Even so, the idea that there could be a much smaller and more specific neural cluster devoted to a particular memory was ridiculed as the "grandmother cell model" -- the term was coined by neuroscientist Jerome Lettvin in the 1960s -- because it was thought to be nonsense that we could have anything like a one-to-one correlation between memories and neurons.  As neuroscientist Charles Edward Connor put it, the grandmother cell model had "become a shorthand for invoking all of the overwhelming practical arguments against a one-to-one object coding scheme.  No one wants to be accused of believing in grandmother cells."

[Image is in the Public Domain courtesy of photographer Michel Royon]

The problem came roaring back, though, when neurosurgeon Itzhak Fried and neuroscientist Rodrigo Quian Quiroga were working with an epileptic patient who had electrical brain-monitoring implants, and found that when he was shown a photograph of Jennifer Aniston, a specific neuron fired in his brain.  Evidently, we do encode specific memories in only a tiny number of neurons -- but how it works is still unknown.

We have over eighty billion neurons in the brain -- so even discounting the ones involved in autonomic functioning, you'd still think there's plenty to encode specific memories.  But... and this is a huge but... there's no evidence whatsoever that when you learn something new, somehow you're doing any kind of neural rewiring, much less growing new neurons.

The upshot is that we still don't know.

This comes up because of a study at Columbia University, published last week in Nature Human Behaviour, that looked at a newly-discovered type of brain wave -- a traveling wave -- which sweeps across the cerebrum during certain activities.  And what the researchers, led by biomedical engineer Joshua Jacobs, found is that when memories are formed, traveling waves tend to move from the back of the cerebrum toward the front, and in the opposite direction when memories are retrieved.

Of course, nothing in the brain is quite that simple.  Some people's brain waves went the other direction; it seems like the change in direction is what was critical.  "I implemented a method to label waves traveling in one direction as basically 'good for putting something into memory,'" said Uma Mohan, who co-authored the paper.  "Then we could see how the direction switched over the course of the task.  The waves tended to go in the participant’s encoding direction when that participant was putting something into memory and in the opposite direction right before they recalled the word.  Overall, this new work links traveling waves to behavior by demonstrating that traveling waves propagate in different directions across the cortex for separate memory processes."

The other limitation of the study is that it doesn't discern whether the traveling waves, and the change in direction, are a cause or an effect -- if the change in direction causes recall, or if the shift in wave direction is caused by some other process that is the actual trigger for recall -- so the direction change is merely a byproduct.  But it certainly is an intriguing start on a vexing question in neuroscience.

Me, I want to know what's going on with the "tip of the tongue" phenomenon.  Just about everyone experiences it -- you know the memory is in there somewhere, you can almost get it, but... nope.  Most puzzling (and frustrating), I find that giving up and going to The Google often triggers the memory to appear before I have the chance to look it up.  This happened not long ago -- for some reason I was trying to come up with the name of the third Musketeer.  Athos, Porthos, and... who?  I pondered on it, and then finally went, "to hell with it," and did a search, but before I could even hit "return" my brain said, "Aramis."

What the fuck, brain?  Do you do this just to taunt me?

At least I comfort myself in knowing that we don't really understand how any of this works.  Which is slim consolation -- but at least it means that my own brain is no more baffling than anyone else's.

****************************************



Friday, June 9, 2023

The myth of the Golden Age

You hear it all the time, don't you?  There's no such thing as common decency any more.  Moral values are in freefall.  Simple politeness is a thing of the past.  Kids today don't understand the value of (choose all that apply): hard work, honesty, compassion, loyalty, friendship, culture, intellectual pursuits.  The whole world has gone seriously downhill.

Oh, and we mustn't forget "Make America Great Again."  Implying that there was a time in the past -- usually unspecified -- when America was great, but it's kind of gone down the tubes since then.  But it's not just the Republicans; a 2015 study found that 76% of respondents in the United States believed that "addressing the moral breakdown of the country should be a high priority for their government."

This whole deeply pessimistic attitude is widespread -- that compared to the past, we're a hopeless mess.  The first clue that this might not be accurate, though, comes from history, and not just the fact that the past -- regardless which part of it you choose -- had some seriously bad parts.  Consider in addition that just about every era has felt the same way about its own past.  Nineteenth century Europe, for example, had a nearly religious reverence for the societies of classical Rome and Greece -- which is ironic, because the Greeks and Romans at the height of their civilizations both looked back to their ancestors as living in a "Golden Age of Heroes" that had, sadly, devolved into chaos and highly unheroic ugliness.

The Golden Age by Pietro de Cortona (17th century) [Image is in the Public Domain]

So psychologists Adam Mastroianni (of Columbia University) and Daniel Gilbert (of Harvard University) decided to see if there was any truth to the claim that we really are in moral decline.

Their findings, which were published last week in Nature, drew on sixty years of surveys about moral values, with respondents from 59 different countries.  These surveys not only asked questions regarding whether morality had declined over the respondents' lifetimes (84% said it had), they asked them to rate their own values and their peers'.

Interestingly, although most people said things were worse now than they had been in the past, there was no decline over time in how respondents rated the values and morality of the people around them in the present.  The percentage of people whom respondents knew and described as kind, decent, honest, or hard-working has remained completely flat over the past sixty years.

So what's going on?

Mastroianni and Gilbert say it's simple.

People idealize the past because they have bad memories.

It's the same phenomenon as when we recall vacations where there have been mishaps.  After a couple of years have passed, we remember the positive parts -- the walks on the beach, the excellent food, the beautiful weather -- and the sunburn, mosquito bites, delayed flights, and uncomfortable hotel room beds have all faded from memory.  It has to be really bad before the unpleasant memories come to mind first, such as the trip I took with my wife to Belize where the guests and staff of the lodge where we were staying all simultaneously came down with the worst food poisoning I've ever experienced.

Okay, that I remember pretty vividly.  But most vacation mishaps?  Barely remembered -- or only recalled with a smile, a laugh, a "can you believe that happened?"

What Mastroianni and Gilbert found was that we put that same undeserved gloss on the past in general.  It's an encouraging finding, really; people aren't getting worse, morality isn't going downhill, the world isn't going to hell in a handbasket.  In reality, most people now -- just like in the past -- are honest and decent and kind.

The problem, of course, is that given how widespread this belief is, and how resistant it is to change, it's hard to see how to get folks to stop looking at the past as some kind of Golden Age.  Because the fact is, we have made some significant strides in a great many areas; equality for women and minorities, LGBTQ rights and treatment, and concern for the environment are all far ahead of where they were even forty years ago.  There are a lot of ways the past wasn't all that great.

Believe me, as a closeted queer kid who grew up in the Deep South of the 1960s and 1970s, I wouldn't want to go back there for any money.

So maybe we need to turn our focus away from the past and look instead toward the future -- instead of lamenting some mythical and almost certainly false lost paradise, working toward making what's to come even better for everyone. 

****************************************



Saturday, April 8, 2023

Invention of things past

On July 7, 2005, an Islamic suicide bomber detonated an explosive device on a double-decker bus in Tavistock Square, London, killing thirteen people and injuring dozens of others.  It was part of a coordinated series of attacks that day that took 52 lives.

Understandably, investigators put a tremendous amount of effort into trying to determine what exactly had happened on that horrible day.  They questioned eyewitnesses, and of course the case was all over the news for weeks.  Three years later, a man named James Ost, of the University of Portsmouth, became interested in finding out what impact the event had made on people who lived nearby at the time, and began to interview locals.

A common theme was how traumatizing it had been to watch the CCTV footage of the actual Tavistock explosion.  Four out of ten people Ost interviewed had details seared into their brains -- hearing the screams, seeing the debris flying in all directions.  One man said he remembered actually seeing someone -- he wasn't sure if it was a passenger or the bomber himself -- blown to bits.  More than one said they had felt reluctant to watch it at the time, and afterwards regretted having done so.

All of which is fascinating -- because there is no CCTV footage of the explosion.  In fact, no video record of the bombing, of any kind, exists.

Ost's study was not the first to look at the phenomenon of false or invented memory, but it's justifiably one of the most famous.  Several things are remarkable about it.  First, Ost didn't give much of a prompt to the test subjects about video footage; he simply asked them to recall as much as they could about what they'd seen of the bombing, and the subjects came up with the rest on their own.  Second, the memories had astonishing detail, down to the color of clothing some of the people in the imagined video were wearing.  And third -- most disturbingly -- there was the sheer power of the false memory.  Several test subjects, when told there was no footage of the attack, simply refused to believe it.

"But I remember it," was the common refrain.

Our memories are incomplete and inaccurate, filled with lacunae (the psychological term for gaps in recall), and laced through with seemingly sharp details of events that never actually happened.  Those details can come from a variety of sources -- what we were told happened, what we imagine happened, what happened to someone else that we later misremembered as happening to us, and outright falsehoods.  Oh, sure, some of what we remember is accurate; but how do you know which part that is, when the false and inaccurate memories seem just as vivid, just as real?

[Image licensed under the Creative Commons © Michel Royon / Wikimedia Commons, Brain memory, CC0 1.0]

The scariest part is how quickly those errors start to form.  In the last fifteen seconds, I took a sip of my morning coffee, looked out of the window at a goldfinch on my bird feeder, noticed that my dog had gotten up because I could hear him eating his breakfast in the next room.  How in the hell could I be remembering any of that incorrectly, given that it all happened under a minute ago?

Well, a paper that appeared last week in PLOS ONE, about a study done at the University of Amsterdam, showed that inaccuracies in our memories increase by 150% in the time between a half-second and three seconds after the event occurs.

The study was simple and elegant.  Test subjects were shown words with highlighted letters, and asked to recall two things: which letter was highlighted, and whether the highlighted letter was shown in its normal orientation or reversed right-to-left.  Most people were pretty good at recalling what the highlighted letter was, but because seeing mirror-image letters is not something we expect, recognizing and recalling that took more effort.

And if you wait three seconds, the error rate for remembering whether the letter was reversed climbs from twenty to thirty percent.  Evidently, our memory very quickly falls back on "recalling" what it thinks we should have seen, and not what we actually did see.

It's a profoundly unsettling finding.  It's almost like our existence is this moving window of reality, and as it slips by, the images it leaves behind begin to degrade almost immediately.  "I know it happened that way, I remember it clearly" is, honestly, an absurd statement.  None of us remembers the past with any kind of completeness or clarity, however sure we feel about it.  Unless you have a video of the events in question, I'd hesitate to trumpet your own certainty too loudly.

And, of course, it also means you have to check to see if the video itself actually exists.

****************************************



Friday, February 24, 2023

Saucy savagery

Kids these days, ya know what I mean?

Wiser heads than mine have commented on the laziness, disrespectfulness, and general dissipation of youth.  Here's a sampler:
  • Parents themselves were often the cause of many difficulties.  They frequently failed in their obvious duty to teach self-control and discipline to their own children.
  • We defy anyone who goes about with his eyes open to deny that there is, as never before, an attitude on the part of young folk which is best described as grossly thoughtless, rude, and utterly selfish.
  • The children now love luxury; they have bad manners, contempt for authority; they show disrespect for elders and love chatter in place of exercise.  Children are now tyrants, not the servants of their households.  They no longer rise when elders enter the room.  They contradict their parents, chatter before company, gobble up dainties at the table, cross their legs, and tyrannize their teachers.
  • Never has youth been exposed to such dangers of both perversion and arrest as in our own land and day.  Increasing urban life with its temptations, prematurities, sedentary occupations, and passive stimuli just when an active life is most needed, early emancipation and a lessening sense for both duty and discipline, the haste to know and do all befitting man's estate before its time, the mad rush for sudden wealth and the reckless fashions set by its gilded youth--all these lack some of the regulatives they still have in older lands with more conservative conditions.
  • Youth were never more saucy -- never more savagely saucy -- as now... the ancient are scorned, the honourable are condemned, and the magistrate is not dreaded.
  • Our sires' age was worse than our grandsires'.  We, their sons, are more worthless than they; so in our turn we shall give the world a progeny yet more corrupt.
  • [Young people] are high-minded because they have not yet been humbled by life, nor have they experienced the force of circumstances…  They think they know everything, and are always quite sure about it.
Of course, I haven't told you where these quotes come from. In order:
  • from an editorial in the Leeds Mercury, 1938
  • from an editorial in the Hull Daily Mail, 1925
  • Kenneth John Freeman, Cambridge University, 1907
  • Granville Stanley Hall, The Psychology of Adolescence, 1904
  • Thomas Barnes, The Wise Man's Forecast Against the Evil Time, 1624
  • Horace, Odes, Book III, 20 B.C.E.
  • Aristotle, 4th century B.C.E.
So yeah.  Adults saying "kids these days" has a long, inglorious history.  (Nota bene: the third quote, from Kenneth Freeman, has often been misattributed to Socrates, but it seems pretty unequivocal that Freeman was the originator.)

Jan Miense Molenaar, Children Making Music (ca. 1630) [Image is in the Public Domain]

I can say from my admitted sample-size-of-one that "kids these days" are pretty much the same as they were when I first started teaching 35 long years ago.  Throughout my career there were kind ones and bullies, intelligent and not-so-much, hard-working and not-so-much, readers and non-readers, honest and dishonest.  Yes, a lot of the context has changed; just the access to, and sophistication of, technology has solved a whole host of problems and created a whole host of other ones, but isn't that always the way?  In my far-off and misspent youth, adults railed against rock music and long hair in much the same way that they do today about cellphones and social media, and with about as much justification.  Yes, there are kids who misuse social media and have their noses in their SmartPhones 24/7, but the vast majority handle themselves around these devices just fine -- same as most of my generation didn't turn out to be drug-abusing, illiterate, disrespectful dropouts.

This comes up because of a study in Science Advances by John Protzko and Jonathan Schooler, called "Kids These Days: Why the Youth of Today Seem Lacking."  And its unfortunate conclusion -- unfortunate for us adults, that is -- is that the sense of today's young people being irresponsible, disrespectful, and lazy is mostly because we don't remember how irresponsible, disrespectful, and lazy we were when we were teenagers.  And before you say, "Wait a moment, I was a respectful and hard-working teenager" -- okay, maybe.  But so are many of today's teenagers.  If you want me to buy that we're in a downward spiral, you'll have to convince me that more teenagers back then were hard-working and responsible, and that I simply don't believe.

And neither do Protzko and Schooler.

So the whole thing hinges more on idealization of the past, and our own poor memories, than on anything real.  I also suspect that a good many of the older adults who roll their eyes about "kids these days" don't have any actual substantive contact with young people, and are getting their impressions of teenagers from the media -- which certainly doesn't have a vested interest in portraying anyone as ordinary, honest, and law-abiding.

Oh, and another thing.  What really gets my blood boiling is the adults who on the one hand snarl about how complacent and selfish young people are -- and then when young people rise up and try to change things, such as Greta Thunberg and the activists from Marjory Stoneman Douglas High School, they say, "Wait, not like that."  What, you only accept youth activism if it supports the status quo?

All well and good for kids to have opinions, until they start contradicting the opinions of adults, seems like.

Anyhow, I'm an optimist about today's youth.  I saw way too many positive things in my years as a high school teacher to feel like this is going to be the generation that trashes everything through irresponsibility and disrespect for tradition.  And if after reading this, you're still in any doubt about that, I want you to think back on your own teenage years, and ask yourself honestly if you were as squeaky-clean as you'd like people to believe.

Or were you -- like the youth in Aristotle's day -- guilty of thinking you knew everything, and being quite sure about it?

****************************************