Skeptophilia (skep-to-fil-i-a) (n.) - the love of logical thought, skepticism, and thinking critically. Being an exploration of the applications of skeptical thinking to the world at large, with periodic excursions into linguistics, music, politics, cryptozoology, and why people keep seeing the face of Jesus on grilled cheese sandwiches.

Saturday, February 19, 2022

Remembrance of things past

Like many People Of A Certain Age, I'm finding that my memory isn't what it used to be.

I walk into a room, and then say, "Why did I come in here?"  I'll think, "I don't need a grocery list, I'm just going for a few things," and come back with half of them.  We just had our dogs in for their annual checkups and shots, and there were a few things for each of them we wanted to ask the vet about.  My wife and I dutifully sat down and made a list -- and both of us forgot to put something on the list that we'd talked about only the previous day.

It's shown up, too, in more academic pursuits.  For my birthday last year my wife got me an online course through Udemy in beginning Japanese, a language I've always wanted to learn.  My dad had been stationed in Japan in the 1950s, and he learned enough of the language to get by; I grew up around the Japanese art and music my dad brought back with him, and became a Japanophile for life.  So I was thrilled to have the opportunity to study the country's unique and beautiful language.  The course starts out with a brief pronunciation guide, then launches into the hiragana -- one of three scripts used in written Japanese.  Each of the 46 characters stands for either a phoneme or a syllable, and some of them look quite a bit alike, so it's a lot to remember.  I have flash cards I made for all 46, and there are some I consistently miss, every single time I go through them.

When I flip the card over, my response is always, "Damn!  Of course!  Now I remember it!"  I recognize the character immediately, and can often even remember the mnemonic the teacher suggested to use in recalling it.  I'm getting there -- of the 46, there are about ten that I still struggle with -- but I know that twenty years ago, I'd have them all down cold by now.

Kids playing a memory game [Image is in the Public Domain]

Understandably, there's a nasty little thought in the back of my mind about senility and dementia.  My mother's sister had Alzheimer's -- to my knowledge, the only person in my extended family to suffer from that horrific and debilitating disease -- and I watched her slow slide from a smart, funny woman who could wipe the floor with me at Scrabble, did crossword puzzles in ink, and read voraciously, to a hollow, unresponsive shell.  I can think of no more terrifying fate. 

A new piece of research in Trends in Cognitive Sciences has to some extent put my mind at ease.  In "Cluttered Memory Representations Shape Cognition in Old Age," psychologists Tarek Amer (of Columbia University), Jordana Wynn (of Harvard University), and Lynn Hasher (of the University of Toronto) found that the forgetfulness a lot of us experience as we age isn't a simple loss of information; it's a loss of access to information that's still there, blocked by the clutter of memories from the past.

The authors write:
Wisdom and knowledge, cognitive functions that surely depend on being able to access and use memory, grow into old age.  Yet, the literature on memory shows that intentional, episodic memory declines with age.  How are we to account for this paradox?  To do so, we need to understand three aspects of memory differences associated with aging, two of which have received extensive investigation: age differences in memory encoding and in retrieval.  A third aspect, differences in the contents of memory representations, has received relatively little empirical attention.  Here, we argue that this aspect is central to a full understanding of age differences in memory and memory-related cognitive functions.  We propose that, relative to younger adults, healthy older adults (typically between 60 and 85 years of age) process and store too much information, the result of reductions in cognitive control or inhibitory mechanisms.  When efficient, these mechanisms enable a focus on target or goal-relevant information to the exclusion (or suppression) of irrelevant information.  Due to poor control (or reduced efficiency), the mnemonic representations of older adults can include: (i) recently activated but no-longer-relevant information; (ii) task-unrelated thoughts and/or prior knowledge elicited by the target information; and/or (iii) task-irrelevant information cued by the immediate environment.  This information is then automatically bound together with target information, creating cluttered memory representations that contain more information than do those of younger adults.

It's like trying to find something in a cluttered, disorganized attic.  Not only is it hard to locate what you're looking for, you get distracted by the other things you run across.  "Wow, it's been years since I've seen this!  I didn't even know this was up here!... wait, what am I looking for?"

I've noticed this exact problem in the kitchen.  I'm the chief cook in our family, and I love to make complex dinners with lots of ingredients.  I've found that unless I want to make a dozen trips to the fridge or cabinets to retrieve three items, I need to focus on one thing at a time.  Get a green pepper from the vegetable crisper.  Find the bottle of cooking sherry.  Go get the bottle of tabasco sauce from the table.  If I try to keep all three in my mind at once, I'm sure to return to the stove and think, "Okay, what the hell do I need, again?"

I wonder if this mental clutter is at the heart of my struggle with memorizing the hiragana characters in Japanese.  I've done at least a cursory study of about a dozen languages -- I'm truly fluent in only a couple, but my master's degree in historical linguistics required me to learn at least the rudiments of the languages whose history I was studying.  Could my difficulty in connecting the Japanese characters to the syllables they represent be because my Language Module is clogged with Old Norse and Welsh and Scottish Gaelic and Icelandic, and those all get in the way?

In any case, it's kind of a relief that I'm (probably) not suffering from early dementia.  It also gives me an excuse the next time my wife gets annoyed at me for forgetting something.  "I'm sorry, dear," I'll say.  "I'd have remembered it, but my brain is full.  But at least I remembered that the character yo looks like a yo-yo hanging from someone's finger!"

Nah, I doubt that'll work, and the fact that I remembered one of the Japanese characters instead of stopping by the store to pick up milk and eggs will only make it worse.  When I want to be sure not to forget something, I guess I'll have to keep making a list.

The only problem then is that I need to remember where I put the list.

***************************************

People made fun of Donald Rumsfeld for his statement that there are "known unknowns" -- things we know we don't know -- but a far larger number of "unknown unknowns," which are all the things we aren't even aware that we don't know.

While he could have phrased it a little more clearly -- and understand that I'm not in any way defending Donald Rumsfeld's other actions and statements -- he was certainly right in this case.  It's profoundly humbling to find out how much we don't know, even about subjects in which we consider ourselves experts.  One of the most important things we can do is keep in mind not only that we might have things wrong, and that additional evidence may completely overturn what we thought we knew, but also that there are some things so far out of our ken that we may not even know they exist.

These ideas -- the perimeter of human knowledge, and the importance of being able to learn, relearn, change directions, and accept new information -- are the topic of psychologist Adam Grant's book Think Again: The Power of Knowing What You Don't Know.  In it, he explores not only how we are all riding around with blinders on, but how to take steps toward removing them, starting with not surrounding yourself with an echo chamber of like-minded people who might not even recognize that they have things wrong.  We should hold our own beliefs up to the light of scrutiny.  As Grant puts it, we should approach issues like scientists looking for the truth, not like a campaigning politician trying to convince an audience.

It's a book that challenges us to move past our stance of "clearly I'm right about this" to the more reasoned approach of "let me see if the evidence supports this."  In this era of media spin, fake news, and propaganda, it's a critical message -- and Think Again should be on everyone's to-read list.

[Note: if you purchase this book using the image/link below, part of the proceeds goes to support Skeptophilia!]


Wednesday, May 26, 2021

Thanks for the memories

I've always been fascinated with memory.  From the "tip of the tongue" phenomenon, to the peculiar (and unexplained) experience of déjà vu, to why some people have odd abilities (or inabilities) to remember certain types of information, to caprices of the brain such as its capacity for recalling a forgotten item once you stop thinking about it -- the way the brain handles storage and retrieval of memories is a curious and complex subject.

Two pieces of research have given us a window into how the brain organizes memories, and their connection to emotion.  In the first, a team at Dartmouth and Princeton Universities came up with a protocol to induce test subjects to forget certain things intentionally.  While this may seem like a counterproductive ability -- most of us struggle far harder to recall memories than to forget them deliberately -- consider the applicability of this research to debilitating conditions such as post-traumatic stress disorder.

In the study, test subjects were shown images of outdoor scenes as they studied two successive lists of words.  In one case, the test subjects were told to forget the first list once they received the second; in the other, they were instructed to try to remember both.

"Our hope was the scene images would bias the background, or contextual, thoughts that people had as they studied the words to include scene-related thoughts," said Jeremy Manning, an assistant professor of psychological and brain sciences at Dartmouth, who was lead author of the study.  "We used fMRI to track how much people were thinking of scene-related things at each moment during our experiment.  That allowed us to track, on a moment-by-moment basis, how those scene or context representations faded in and out of people's thoughts over time."

What was most interesting about the results is that in the case where the test subjects were told to forget the first list, the brain apparently purged its memory of the specifics of the accompanying outdoor scene images as well.  When subjects were told to remember both lists, they retained the scene images associated with both.

"[M]emory studies are often concerned with how we remember rather than how we forget, and forgetting is typically viewed as a 'failure' in some sense, but sometimes forgetting can be beneficial, too," Manning said.  "For example, we might want to forget a traumatic event, such as soldiers with PTSD.  Or we might want to get old information 'out of our head,' so we can focus on learning new material.  Our study identified one mechanism that supports these processes."

What's even cooler is that because the study was done with subjects connected to an fMRI, the scientists were able to see what contextual forgetting looks like in terms of brain firing patterns.  "It's very difficult to specifically identify the neural representations of contextual information," Manning said.  "If you consider the context you experience something in, we're really referring to the enormously complex, seemingly random thoughts you had during that experience.  Those thoughts are presumably idiosyncratic to you as an individual, and they're also potentially unique to that specific moment.  So, tracking the neural representations of these things is extremely challenging because we only ever have one measurement of a particular context.  Therefore, you can't directly train a computer to recognize what context 'looks like' in the brain because context is a continually moving and evolving target.  In our study, we sidestepped this issue using a novel experimental manipulation -- we biased people to incorporate those scene images into the thoughts they had when they studied new words.  Since those scenes were common across people and over time, we were able to use fMRI to track the associated mental representations from moment to moment."

In the second study, a team at UCLA looked at what happens when a memory is connected to an emotional state -- especially an unpleasant one.  What I find wryly amusing about this study is that the researchers chose as their source of unpleasant emotion the stress one feels in taking a difficult math class.

I chuckled grimly when I read this, because I had the experience of completely running into the wall, vis-à-vis mathematics, when I was in college.  Prior to that, I actually had been a pretty good math student.  I breezed through high school math, barely opening a book or spending any time outside of class studying.  In fact, even my first two semesters of calculus in college, if not exactly a breeze, at least made good sense to me and resulted in solid A grades.

Then I took Calc 3.

I'm not entirely sure what happened, but when I hit three-dimensional representations of graphs, and double and triple integrals, and calculating the volume of the intersection of four different solid objects, my brain just couldn't handle it.  I got a C in Calc 3 largely because the professor didn't want to have to deal with me again.  After that, I sort of never recovered.  I had a good experience with Differential Equations (mostly because of a stupendous teacher), but the rest of my mathematical career was pretty much a flop.

And the worst part is that I still have stress dreams about math classes.  I'm back at college, and I realize that (1) I have a major exam in math that day, and (2) I have no idea how to do what I'll be tested on, and furthermore (3) I haven't attended class for weeks.  Sometimes the dream involves homework I'm supposed to turn in but don't have the first clue about how to do.  Sometimes, I not only haven't studied for the exam I'm about to take, I can't find the classroom.

Keep in mind that this is almost forty years after my last-ever math class. And I'm still having anxiety dreams about it.



What the researchers at UCLA did was to follow students in an advanced calculus class, keeping track of both their grades and their self-reported levels of stress surrounding the course.  Their final exam grades were recorded -- and then, two weeks after the final, they were given a retest over the same material.

The fascinating result is that stress was unrelated to students' scores on the actual final exam, but the students who reported the most stress did significantly more poorly on the retest.  The researchers call this "motivated forgetting" -- that the brain is ridding itself of memories that are associated with unpleasant emotions, perhaps in order to preserve the person's sense of being intelligent and competent.

"Students who found the course very stressful and difficult might have given in to the motivation to forget as a way to protect their identity as being good at math," said study lead author Gerardo Ramirez.  "We tend to forget unpleasant experiences and memories that threaten our self-image as a way to preserve our psychological well-being.  And 'math people' whose identity is threatened by their previous stressful course experience may actively work to forget what they learned."

So that's today's journey through the recesses of the human mind.  It's a fascinating and complex place that never fails to surprise us, and it's amazing that we are beginning to understand how it works.  As my dear friend, Professor Emeritus Rita Calvo, Cornell University teacher and researcher in Human Genetics, put it: "The twentieth century was the century of the gene.  The twenty-first will be the century of the brain.  With respect to neuroscience, we are right now about where genetics was in the early 1900s -- we know a lot of the descriptive features of the brain, some of the underlying biochemistry, and other than that, some rather sketchy details about this and that.  We don't yet have a coherent picture of how the brain works.

"But we're heading that direction.  It is only a matter of time till we have a working model of the mind.  How tremendously exciting!"

***********************************

Saber-toothed tigers.  Giant ground sloths.  Mastodons and woolly mammoths.  Enormous birds like the elephant bird and the moa.  North American camels, hippos, and rhinos.  Glyptodons, armadillo relatives as big as a Volkswagen Beetle, with enormous spiked clubs on the ends of their tails.

What do they all have in common?  Besides being huge and cool?

They all went extinct, and all around the same time -- around 14,000 years ago.  Remnant populations persisted a while longer in some cases (there was a small herd of woolly mammoths on Wrangel Island, off the northeastern coast of Siberia, only four thousand years ago, for example), but these animals went from being the major fauna of North America, South America, Eurasia, and Australia to being completely gone in an astonishingly short time.

What caused their demise?

This week's Skeptophilia book of the week is The End of the Megafauna: The Fate of the World's Hugest, Fiercest, and Strangest Animals, by Ross MacPhee, which considers the question, and looks at various scenarios -- human overhunting, introduced disease, climatic shifts, catastrophes like meteor strikes or nearby supernova explosions.  Seeing how fast things can change is sobering, especially given that we are currently in the Sixth Great Extinction -- a recent paper said that current extinction rates are about the same as they were during the height of the Cretaceous-Tertiary Extinction 66 million years ago, which wiped out all the non-avian dinosaurs and a great many other species at the same time.  

Along the way we get to see beautiful depictions of these bizarre animals by artist Peter Schouten, giving us a picture of what this continent's wildlife would have looked like only fifteen thousand years ago.  It's a fascinating glimpse into a lost world, and an object lesson for the people currently creating our global environmental policy -- we're no more immune to the consequences of environmental devastation than the ground sloths and glyptodons were.

[Note: if you purchase this book using the image/link below, part of the proceeds goes to support Skeptophilia!] 


Saturday, December 26, 2020

Purging memory

Last night I had a good bit of trouble sleeping.

This isn't all that uncommon.  I've had issues with insomnia ever since I was a teenager.  Sometimes the issue is physical restlessness; sometimes it's anxiety, either over something real or something imagined.

Last night, it's because my brain was shrieking, over and over, "FELIZ NAVIDAD, FELIZ NAVIDAD, FELIZ NAVIDAD, PRÓSPERO AÑO Y FELICIDAD."

At least its timing was reasonably good, being that yesterday was Christmas.  What I wonder is why it couldn't choose a song that I don't hate.  I'm not one of those Bah-Humbug curmudgeons who dislikes all Christmas music; some of it I actually rather enjoy.

I'm of the opinion, however, that listening to "Feliz Navidad" over and over would have been ruled out as a torture device by Tomás de Torquemada on the basis of being too cruel.

Leaving aside my brain's questionable choice of which song to holler at me, a more interesting question is how to get rid of a song once it's stuck there.  I've found that for me, the best thing is to replace it with something less objectionable, which in this case would have been just about anything.  There are a couple of pieces of sedate classical music and a slow Irish waltz or two that I can usually use to shove away whatever gawdawful song is on repeat in my skull, if I concentrate on running them mentally in a deliberate fashion.  Last night it eventually worked, but it took much longer than usual.

José Feliciano is nothing if not persistent.

The reason all of this comes up -- besides my Christmas-music-based bout of insomnia -- is some research from a team at the University of Colorado Boulder and the University of Texas at Austin that appeared in Nature Communications this month.  Entitled "Changes to Information in Working Memory Depend on Distinct Removal Operations," by Hyojeong Kim, Harry Smolker, Louisa Smith, Marie Banich, and Jarrod Lewis-Peacock, this research shows that the kind of deliberate pushing away I use to purge bad music from my brain works in a lot of other situations as well -- and may have applications in boosting creativity and in relieving anxiety, obsessive-compulsive disorder, and PTSD.

What they did was to put a thought into their test subjects' heads -- a photograph of a face, a bowl of fruit, or an outdoor scene -- instruct them to think about it for four seconds, and then have them deliberately stop thinking about it, all the while watching what happened in their neural systems using an fMRI machine.  Ceasing to think about something without replacing it with something else is remarkably hard; it brings to mind my dad's recommended cure for hiccups, which is to run around the house three times without thinking of an elephant.

The three different sorts of things they asked the subjects to try -- to replace the thought with something else, to clear all thoughts completely, or to suppress that one thought without replacing it -- all resulted in different patterns on the fMRI.  "Replace" and "clear" both worked fairly rapidly, but both left a trace of the original thought pattern behind -- a "ghost image," as the researchers called it.  "Suppress" took longer, and subjects described it as being more difficult, but once accomplished, the original pattern had faded completely.

"We found that if you really want a new idea to come into your mind, you need to deliberately force yourself to stop thinking about the old one," said study co-author Marie Banich, in a press release from the University of Colorado.

Co-author Jarrod Lewis-Peacock concurred.  "Once we're done using that information to answer an email or address some problem, we need to let it go so it doesn't clog up our mental resources to do the next thing."

[Image © Michel Royon / Wikimedia Commons; used with permission]

This explains another phenomenon I've noticed: that when I'm trying to think of something I've forgotten, or come up with a solution to a problem that's stumping me, it often helps if I deliberately set it aside.  Just a couple of days ago, I was working on my fiction work-in-progress, and found I'd written myself into a corner.  I'd created a situation that called for some as-yet-undreamed-of plot twist, or else rewriting a big section to eliminate the necessity (something I didn't want to do).  After basically beating it with a stick for an hour or two, I gave up in frustration, and went to clean up my garage.

And while working on this chore, and not thinking about my writing at all, a clever solution to the problem simply popped into my head, seemingly out of nowhere.

This is far from the first time this sort of thing has happened to me, and the Kim et al. paper at least gives a first-order approximation as to how this occurs.  Pushing aside what you're thinking about, either consciously and deliberately or else (as in my garage-cleaning example) by replacing it with something unrelated, clears the cognitive thought patterns and gives your brain room to innovate.

Now, where exactly the creative solution comes from is another matter entirely.  I've described before how often my ideas for writing seem to originate from outside my own head.  I don't subscribe to a belief in any sort of Jungian collective unconscious, but sometimes it sure feels that way.

In any case, all of this gives us a lens into how to make our own thought processes more efficient -- in cases of clogged creativity as well as situations where errant thoughts are themselves causing problems, as in PTSD.  What the Kim et al. research suggests is that the first thing to work on is consciously purging the brain in order to create space for more positive and beneficial thoughts.

It's not necessarily easy, of course.  For example, my brain has finally stopped screaming "Feliz Navidad" at me, but has replaced it with "Let it Snow, Let it Snow, Let it Snow," which is only fractionally less annoying.  My considered opinion is that whoever wrote "Let it Snow, Let it Snow, Let it Snow" should be pitched, bare-ass naked, head-first into a snowbank.

Okay, so maybe I am a Bah-Humbug curmudgeon.  God bless us every one anyhow, I suppose.

****************************************

Not long ago I was discussing with a friend of mine the unfortunate tendency of North Americans and Western Europeans to judge everything based upon their own culture -- and to assume everyone else in the world sees things the same way.  (An attitude that, in my opinion, is far worse here in the United States than anywhere else, but since the majority of us here are the descendants of white Europeans, that attitude didn't come out of nowhere.)  

What that means is that people like me, who live somewhere WEIRD -- Western, educated, industrialized, rich, and democratic -- automatically have blinders on.  And these blinders affect everything, up to and including supposedly variable-controlled psychological studies, which are usually conducted by WEIRDs on WEIRDs, and so present their results as universal when they might well be culturally dependent.

This is the topic of a wonderful new book by anthropologist Joseph Henrich called The WEIRDest People in the World: How the West Became Psychologically Peculiar and Particularly Prosperous.  It's a fascinating lens into a culture that has become so dominant on the world stage that many people within it staunchly believe it's quantifiably the best one -- and some act as if it's the only one.  It's an eye-opener, and will make you reconsider a lot of your baseline assumptions about what humans are and the ways we see the world -- of which science historian James Burke rightly said, "there are as many different versions of that as there are people."

[Note:  If you purchase this book using the image/link below, part of the proceeds goes to support Skeptophilia!]



