Skeptophilia (skep-to-fil-i-a) (n.) - the love of logical thought, skepticism, and thinking critically. Being an exploration of the applications of skeptical thinking to the world at large, with periodic excursions into linguistics, music, politics, cryptozoology, and why people keep seeing the face of Jesus on grilled cheese sandwiches.

Friday, April 18, 2014

Fear, the amygdala, and "Whistle"

I find fear fascinating.

Fear is an eminently useful evolved characteristic; the ability to recognize and avoid threats has an obvious survival benefit.  Fears can be learned, but as the famous "Little Albert experiment" showed, a learned fear can overgeneralize far beyond its original object, which is probably the origin of phobias and other irrational fears.  (For those of you unfamiliar with this interesting, but dubiously ethical, experiment, researchers back in 1920 showed an infant a variety of objects, including a white rat -- and whenever the baby reached for the rat, they made a loud noise.  Soon, "Little Albert" would cry when shown the rat, but also when shown other white objects, including a rabbit, a stuffed bear, and a Santa mask.)

All of this comes up because of some recently published research at Cold Spring Harbor Laboratory, where neuroscientist Bo Li and his colleagues have discovered how we encode fear in the brain -- and how those memories result in behavior.

It has been known for some time that the fear response results from activation of neurons in the amygdala, an almond-shaped structure deep inside the brain.  In 1998, K. S. LaBar et al. used fMRI studies to demonstrate the role of the amygdala in responding to fear stimuli, but it was still unknown how that structure actuated the behaviors associated with the fear response -- sweating, increased pulse, freezing in place, and the fight-or-flight reaction.

Now Li and his colleagues at Cold Spring Harbor have found that there are neurons connecting the amygdala to the brainstem, and that activation of a fear response triggers feedback between the amygdala and the brainstem via those neurons -- thus turning an emotional response into a behavior.

"This study not only establishes a novel pathway for fear learning, but also identifies neurons that actively participate in fear conditioning," says Li.  "This new pathway can mediate the effect of the central amygdala directly, rather than signaling through other neurons, as traditionally thought."  Li hopes that his study will be useful in understanding, and perhaps remediating, cases of severe phobias and post-traumatic stress syndrome.

I find myself wondering, however, how this evolved system, with its elaborate architecture and neurochemistry, can explain why some people seek out fear-inducing experiences.  I've been drawn to horror stories since I can remember, and have written more than one myself.  The cachet that writers like Stephen King and Dean Koontz have is hard to explain evolutionarily -- given the fact that fear is unpleasant, intended to drive us to avoid whatever evokes the response, and is supposed to communicate to our brains the message, "Danger!  Danger!  Run!"

[image courtesy of the Wikimedia Commons]

Take, for example, one of my favorite scary stories -- one I remember well from my youth.  My grandma, who was an avid book collector, had a little paperback copy of a book by C. B. Colby called Strangely Enough.  This book had dozens of odd little one-to-two page stories, most of which fell into the "urban legend" or "folk tale" categories and were entertaining but not particularly memorable.  But one of them, a story called "The Whistle," has stuck with me -- and evidently not only with me.  While researching this post, I looked up Colby, and found that his little book had been mentioned more than once on websites about scary stories -- and damn near everyone mentioned "The Whistle" as being the scariest of the lot.

I don't have to tell you the story, though, because two film directors, Eric Walter and Jon Parke, thought it was good enough to make a short film based on the story.  It's only seven minutes long, but it captures the essence of what is chilling about Colby's story -- without a single word of dialogue.  It's not gory (for those of you who dislike such things), just viscerally terrifying.  And all of you should right now take seven minutes and watch "Whistle."

There.  Did I tell you?  What I find scariest about this film is that... almost nothing happens.  You never see the monster, if monster it was.  All there is is a whistling noise.  But it's got all the elements: a widowed woman living alone; a dog who tries to warn her that something is amiss; an old house; a radio that mysteriously malfunctions.

But why is it scary?  It is, I think, precisely because we fear the unknown.  What is known is (usually) harmless; what is unknown is (possibly) deadly.  We've undergone millions of years of evolutionary selection to create brain wiring that keeps us from making stupid decisions, such as confronting a predator while weaponless, trusting a stranger without caution... or staying outside when there's a strange, unearthly noise.

Perhaps that explains why we're drawn to horror fiction.  The character trapped in the story is in danger, perhaps mortal danger, from which (s)he may not be able to escape.  We, on the other hand, can experience the character's fear on a visceral level, but then we can turn the movie off, close the book, go back to our safe, normal lives, secure in the fact that we're not going to die, or at least not yet.  We can get the rush of terror, but then when the scary story is over, the pleasure-and-reward circuits that our brain also evolved can turn on and reassure us that the monsters didn't get us, that we've survived for another day.

And now Bo Li and his colleagues have discovered how the brain helps us to do that.  As for me, I'm going to go have some coffee, and wait until my amygdala calms down, because while I was doing this post, I had to watch "Whistle" again, and now I'm all creeped out.

Saturday, September 7, 2013

Thinking with both sides of the brain

One of the reasons I love science is that it challenges our preconceived notions about the way the world works.

We are data-gatherers and pattern-noticers, we humans.  Even as babies we are watching and learning, and trying to make generalizations about the world based on what we've experienced.  And while many of those generalizations turn out to be correct -- we wouldn't have lasted long as a species if they weren't -- we sometimes draw incorrect conclusions.

And when we do, we tend to hang onto them like grim death.  Once people have settled on a model, for whatever reason -- be it that "it seems like common sense" or that it has gained currency as some kind of "urban legend" -- it becomes extremely hard to undo, even when the science is unequivocal that our beliefs are wrong.

I ran across a particularly good example of that this week.  I teach an introductory neurology class, and when we start talking about brain physiology and its role in personality, inevitably someone brings up the phenomenon of brain lateralization -- the fact that, as we develop, one side of the brain exerts more influence over us physically than the other does.  This is why most of us have a dominant hand, foot, eye, and so forth.

Most common biological traits can be explained based upon some kind of evolutionary advantage they provide, but the jury's still out on this one.  In their 2005 paper in The Journal of Neuroscience, "Lateralization of the Vertebrate Brain: Taking the Side of Model Systems," Halpern et al. concluded that the evolutionary advantage of allowing one side of the brain to dominate the motor activity of the body is that it frees the other, non-dominant side to do other things -- something they call "parallel processing."  But even they admitted that this was speculation.

One claim that gained a lot of currency, beginning in the 1960s, was that people who were right brain dominant were artistic, creative, and saw things holistically, and that people who were left brain dominant were logical, verbal, mathematical, and sequential.

Now, there may be some truth to the claim that the sensory-processing centers on the two sides of the brain do see the world differently -- studies done on people who have had strokes in the cerebrum, and those with "split brains" (who have had the corpus callosum cut, preventing cross-talk between the two cerebral hemispheres), do seem to support that there is a dramatic difference in how the two sides of the brain interpret what you see.  (For an amazing personal account that supports this view, check out Jill Bolte Taylor's talk "My Stroke of Insight.")

The idea that people with intact brains are either artistic right-brainers or logical left-brainers has led to a whole slew of "therapies" meant to allow people to "balance their brains."  It has been especially targeted at the left-brainers, who are sometimes seen as cold and calculating.

Many of these treatments require such things as forcing people to write or perform actions with their non-dominant hands, or patching their dominant eye -- the claim being that this will force the poor, subjugated non-dominant side of the brain to feel free to express itself, resulting in an enlightened, fully-realized personality.

All of this, apparently, is pseudoscience.

I've suspected this for a while, frankly.  In my neurology class, we do a physical brain dominance test, and someone always asks about brain lateralization's role in personality.  When this happens, I have had to do something I am always reluctant to do, which is to say, "Well, I haven't seen any research, but this seems to me to be bogus."

I don't have to say that any more. 

Two weeks ago, the peer-reviewed journal PLOS ONE published a paper by Jared A. Nielsen, Brandon A. Zielinski, Michael A. Ferguson, Janet E. Lainhart, and Jeffrey S. Anderson entitled, "An Evaluation of the Left-Brain vs. Right-Brain Hypothesis with Resting State Functional Connectivity Magnetic Resonance Imaging."  In this paper they describe a series of experiments that looked at the actual structure of the brain, and its connectivity -- and they found that there's no such thing as a "right-brain" personality or a "left-brain" personality based upon anything real that is present in the brain wiring.  Here's what they said in their discussion section:
In popular reports, “left-brained” and “right-brained” have become terms associated with both personality traits and cognitive strategies, with a “left-brained” individual or cognitive style typically associated with a logical, methodical approach and “right-brained” with a more creative, fluid, and intuitive approach. Based on the brain regions we identified as hubs in the broader left-dominant and right-dominant connectivity networks, a more consistent schema might include left-dominant connections associated with language and perception of internal stimuli, and right-dominant connections associated with attention to external stimuli.

Yet our analyses suggest that an individual brain is not “left-brained” or “right-brained” as a global property, but that asymmetric lateralization is a property of individual nodes or local subnetworks, and that different aspects of the left-dominant network and right-dominant network may show relatively greater or lesser lateralization within an individual.
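
(If it helps to see what "lateralization as a property of individual nodes" looks like in practice, here's a back-of-the-envelope sketch -- the connectivity numbers below are invented, not anything from Nielsen et al.'s data.  Each hub gets its own left-right lateralization index, and nothing forces those indices to line up into a single whole-brain "left-brained" or "right-brained" score.)

```python
# Hypothetical illustration of per-node lateralization -- the connection
# strengths below are made up, not Nielsen et al.'s measurements.

# For each functional hub: strength of its connections within the left and
# right hemispheres (arbitrary units).
hubs = {
    "language hub":  {"left": 0.9, "right": 0.3},
    "attention hub": {"left": 0.2, "right": 0.8},
    "visual hub":    {"left": 0.5, "right": 0.5},
    "memory hub":    {"left": 0.6, "right": 0.4},
}

def lateralization_index(left: float, right: float) -> float:
    """(L - R) / (L + R): +1 is fully left-lateralized, -1 fully right-lateralized."""
    return (left - right) / (left + right)

indices = {name: lateralization_index(c["left"], c["right"]) for name, c in hubs.items()}
for name, li in indices.items():
    print(f"{name}: LI = {li:+.2f}")

# Some hubs lean left, some lean right, some neither -- and averaging them into
# one "whole brain" number throws that structure away, which is essentially the
# paper's point.
print(f"naive whole-brain average: {sum(indices.values()) / len(indices):+.2f}")
```
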
So the truth turns out to be more complicated, but more interesting, than the commonly-accepted model.  We humans tend to oversimplify like that a lot, don't we?  After all, what is much of pseudoscience but an attempt to impress order upon nature, to make it fit in neat little packages, to make it work the way we'd like it to?  Astrology, for example, would have you believe that there are twelve personality types, and that anything about your behavior that needs explanation can be filed under the heading of, "Oh, but of course I'm like that.  I'm a Scorpio."

But the world is complex and messy, and doesn't care about our desire for order.  However, it is also beautiful and mysterious and fascinating, and ultimately, understandable.  And science remains our best lens for understanding it -- for blowing away the dust and cobwebs of our preconceived notions, and helping us to comprehend the world as it is.

And it works regardless of which side of the brain you're thinking with.

Tuesday, May 7, 2013

The "honey trap," perception, and our sense of self

I think the human brain is fascinating.  Not a surprising statement, I suppose, coming from someone who teaches (amongst other things) an introductory neurology course.  What intrigues me most, though, is the way all of this rock-solid sense of self we all have -- the sum of our perceptions, attitudes, experiences, and memories -- is the result of a bunch of chemicals jittering around in 1.3 kilograms of skull-glop, and an electrical output that would only be sufficient to illuminate a twenty-watt light bulb.

And if that's not humbling enough, our personalities may not be as rock-solid as all that.  If something changes the chemistry or the pattern of electrical firings in your brain, who you are and what you experience changes.  As my long-ago physiology professor said, "In a very real way, your brain is the only sensory organ you have.  If your brain gets tricked, that is what you think you've seen, or heard, or felt."

It works all the way up to the level of our emotions and personality, too -- realms of the human experience that are supposed to be somehow "different."  Okay, we can accept it when a drug makes you hallucinate; that's just the brain's neural firings being altered.  But our attitudes, biases, preferences, emotional reactions -- no, that's something else entirely.  Those are all part of this "me" that is independent of the "meat machine" in my skull, this spiritual entity that is separate from mere biochemistry, a personal being that can well be imagined going on after the animal part dies.

Right?

Eight scientists in the Department of Human Environment Studies at Kyushu University in Japan have just punched another hole in this belief, with a paper that appeared in Scientific Reports last week entitled, "Minocycline, a microglial inhibitor, reduces 'honey trap' risk in human economic exchange."  In this study, Motoki Watabe et al. had observed that minocycline, a tetracycline-derivative antibiotic, had not only been useful for fighting infections but had led to improvement in psychological disorders in patients who were taking it.  In particular, taking minocycline seemed to improve patients' capacity for "sober decision-making."  So the group at Kyushu University decided to see if they could pinpoint what, exactly, was changing in the brain of a person on minocycline.

The results were, to say the least, eye-opening.

It's long been known that human males tend to trust physically attractive females, sometimes leading to their betrayal -- a tendency called the "honey trap" that has been used as a plot twist in hundreds of thrillers, all the way back to Milady de Winter and d'Artagnan in The Three Musketeers.


Well, the "honey trap" response vanishes in men on minocycline.

The men in the experiment were split into two groups -- one group got the antibiotic, the other a placebo.  None knew which they'd gotten:
In this experiment, 98 healthy males played a trust game with 8 photographed young females after a 4-day oral treatment course of either minocycline or placebo. Looking at a picture showing a female's face, male players decided how much out of 1300 yen (approximately 13 USD) they would give to each female. Males then evaluated how trustworthy each female was and how physically attractive she was using a 11-point Likert Scale (0: Not at all – 10: Perfectly so). Of note, all of the photographed females had actually decided, in advance, to choose ‘betray’ against the male players. Therefore, male participants played with untrustworthy female partners, but were unaware of the deception.
Overwhelmingly, the men in the control group showed a strong correlation between rating a woman as highly attractive and rating her as trustworthy; the group on minocycline showed no such correlation.  They recognized attractiveness, ranking some photographs as more attractive than others; but they ranked all of the women as about equal in trustworthiness.

A much more reasonable response, given that they all were strangers!
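
Just to make the shape of that result concrete, here's a toy sketch in Python -- the numbers are invented, not Watabe et al.'s data -- of the kind of correlation you'd compute between attractiveness ratings and the amount of money offered, for each group.

```python
# Hypothetical illustration of the honey-trap analysis; all numbers invented.
from statistics import correlation  # Pearson's r (Python 3.10+)

# One entry per (participant, photo) pairing.
placebo = {
    "attractiveness": [8, 3, 6, 9, 2, 7, 5, 4],                    # 0-10 Likert ratings
    "yen_offered":    [1100, 300, 800, 1250, 200, 950, 600, 450],  # out of 1300 yen
}
minocycline = {
    "attractiveness": [8, 3, 6, 9, 2, 7, 5, 4],
    "yen_offered":    [650, 600, 700, 620, 680, 640, 660, 630],    # roughly flat
}

for label, group in [("placebo", placebo), ("minocycline", minocycline)]:
    r = correlation(group["attractiveness"], group["yen_offered"])
    print(f"{label}: r(attractiveness, yen offered) = {r:+.2f}")

# With numbers like these, the placebo group shows a strong positive correlation
# (pretty face -> more money entrusted), while the minocycline group shows
# essentially none -- the pattern the study reported.
```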

Watabe et al. suggested that this indicates a role in cognition for the microglia -- cells that heretofore were thought mostly to mediate the brain's immune defense system and blood/brain barrier, and which are inhibited by minocycline.  Me, I'm more intrigued by the larger issue, that who we are, the central core of our personalities, might be far more dependent on minor changes in brain chemistry than most of us are comfortable admitting.

It's also why I have a hard time accepting the idea that the visions experienced by people on dimethyltryptamine (DMT) actually mean anything, in the spiritual sense.  People on DMT report overwhelming hallucinations that they describe as "spiritually transforming," in which they had the sense of being connected with "higher mind" -- i.e., with god.  Terence McKenna, one of the primary exponents of the use of this drug for inducing spiritual experiences, describes one of his trips this way:
(Y)ou, when you're shown one of these things, a single one of them, you look at it an you know, without a shadow of a doubt, in the moment of looking at this thing, that if it were right here, right now, this world would go mad.  It's like something from another dimension.  It's like an artifact from a flying saucer. It's like something falling out of the mind of God - such objects DO not exist in this universe, and yet, you're looking at it.  [Source]
My problem with all of this is not some kind of moralistic "don't do that stuff to your body," nor is it even a concern for the side effects; it's more that the whole thing strikes me as kind of... silly.  If you throw a monkey wrench into your neurotransmitters, of course you're going to see weird shit.  Acting as if what you're seeing has some sort of external reality seems to me to be a major stretch, landing us right into the weird world of such wingnuts as Carlos Castañeda with his datura root and magic mushrooms as a means of contacting the "ally."

I know, however, that we're also getting perilously close to a topic I touched briefly on a few weeks ago, namely, how we can prove that anything outside our experiences is real.  And I've no desire to skate out onto that philosophical thin ice once again.  But I do think that the scientists in Japan have given yet another blow to our sense of having some kind of permanent external "self" that is independent of our biology.  If all it takes is an antibiotic tablet to change who we trust, it seems that we are, on a fundamental level, what our brain chemistry is at the moment -- and not very much else.

Monday, March 4, 2013

The case of the telepathic mice

One area in which a lot of people could use some work is drawing logical connections.

Not that this is necessarily simple.  Given a lot of facts, the question, "Now what does this all mean?" can be decidedly non-trivial.  After all, if it were trivial, there would be only one political party, and the only job we skeptics would have would be uncovering what the facts actually are.  The deductive work, the drawing of a conclusion, would be quick and unanimous, and Washington DC would be a decidedly more congenial place.

To take a rather simpler example, let's look at the following picture, which has been making the rounds on social media lately:


Even ignoring the rather dubious religious aspect, this seems to me to be a rather ridiculous conclusion.  Just because these foods vaguely resemble a human organ (really vaguely, in the case of the tomato and the heart), does that mean they have some special beneficial effect on that organ?  It doesn't take a rocket scientist, nor a botanist, to find a dozen counterexamples of plants that look like a human organ but have no beneficial effects on that organ whatsoever.  (This whole idea goes back to medieval times, when it was known as the "Doctrine of Signatures."  It's why so many plants' names end in "-wort" -- wyrt was Old English for "plant," and the doctors of that time, whom we must hope had their malpractice insurance paid up, used lungwort, liverwort, spleenwort, and the rest to try to cure their patients.  No wonder the life expectancy back then was so low.)

On the other hand, Amanita mushrooms look a little like a penis, and if you eat one, you're fucked.  So maybe there's something to this after all.

In any case, let's move on to something a little trickier -- last week's story of the telepathic mice.

Miguel Nicolelis, of Duke University, announced last week that he'd been able to accomplish something no one had done before -- create a device that allowed the electrical firings in one brain (in this case, a mouse's) to be beamed to another brain, influencing that brain's firing.  In his paper, released in Nature, Nicolelis and his team describe engineering microelectrodes that were implanted in the primary motor cortex of mouse #1.  These electrodes are capable of detecting the neural firing pattern in the mouse's brain -- specifically, of determining which of a pair of levers the mouse selected to pull.  A second mouse has a different set of implants -- ones that stimulate neurons.  If mouse #1 pulls the right-hand lever, and mouse #2 does, too, they both get a treat.  They can't see each other -- but the electrodes in the brain of mouse #1 send a signal, via the electrode array, to the electrodes in the brain of mouse #2, stimulating it to choose the correct lever.
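
(For the curious, here's a deliberately oversimplified sketch of that decode-transmit-stimulate loop -- my own toy version in Python, not Nicolelis's actual signal processing, with invented spike counts.)

```python
# Toy sketch of the encoder/decoder loop.  The real study used multi-electrode
# recordings and intracortical microstimulation; this just traces the flow of
# one bit of information from one "brain" to another.

def decode_lever_choice(left_spikes: list[int], right_spikes: list[int]) -> str:
    """Decode the encoder mouse's lever choice by comparing summed firing rates
    of two hypothetical neuron populations in its motor cortex."""
    return "right" if sum(right_spikes) > sum(left_spikes) else "left"

def stimulate_decoder_mouse(choice: str) -> str:
    """Return the stimulation pattern that biases the decoder mouse toward `choice`
    (a stand-in for actually driving its implanted electrodes)."""
    return {"left": "stimulation pattern A", "right": "stimulation pattern B"}[choice]

# One trial: the 'right-lever' population in the encoder mouse fires harder...
encoder_left  = [3, 2, 4, 1]    # spikes per time bin
encoder_right = [9, 7, 8, 10]

choice = decode_lever_choice(encoder_left, encoder_right)
print(f"decoded choice: {choice} -> decoder mouse receives {stimulate_decoder_mouse(choice)}")
```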

Direct, brain-to-brain communication.  Obvious application to medicine... and the military.  But my problem is how it's been described in popular media.  Everyone's calling it "telepathy" -- making a number of psychic websites erupt in excited backslapping, claiming that this "scientifically proves telepathy to be real."  "They just showed what we've been claiming for decades," one thrilled woo-woo stated.

The problem is -- is this actually telepathy?  Well, in one limited sense, yes; the word, after all, comes from the Greek tele (distant) + pathéia (feeling).  So, yes, the mice were able to feel, or at least communicate, at a distance.  But remember that the only reason it worked was that both the encoder and the decoder mouse had electrode arrays stuck into their brains.  There's an understood mechanism at work here; Nicolelis knows exactly how the signal from mouse #1 got to mouse #2 and stimulated its brain to perform the task correctly.  This is in exact opposition to the usual claims of telepathy -- that somehow (no mechanism specified) one human brain can pick up information from another, sometimes over great distances.  Complex information, too; not just enough to know which lever to choose, but whole conversations, visual images, sounds, and emotions.

Oh, and some people think they can get into telepathic contact with their pets.  Which adds a whole new level of craziness to the claim.

So, actually, what Nicolelis got his mice to do isn't telepathy at all, at least not in the usual sense of the word.  But on a surface read, it would be easy to miss the difference -- and to miss why (in fact) his experiment makes the claims of the telepaths less likely, not more.  If it takes fancy arrays of electrodes to allow the transmission of even the simplest of information, how on earth could two brains communicate far more complex information, without any help at all?  Add that to the fact that there has not been a single experiment that has conclusively demonstrated that telepathy, as advertised, actually exists (for an excellent, and unbiased, overview of the history of telepathy experiments, go here).  It seems very likely, just based on the evidence, that telepathy doesn't exist -- not between Nicolelis' mice, and certainly not between humans.

Just as well, really.  I'd really rather people not read my mind.  For one thing, my brain can be a little... distractible:


Most days, reading my mind would be the telepathic equivalent of riding the Tilt-o-Whirl.  So probably better that my thoughts remain where they are, bouncing randomly off the inside of my skull as usual.

Tuesday, January 29, 2013

Remembrance of things past

I'm going to begin today's post with a bit of shameless self-promotion.  The wonderful site The Skeptic is sponsoring a contest to select (amongst other things) the best skeptical blog of 2012, and if you have read and enjoyed Skeptophilia, I'd like to toss aside my usual charming modesty and ask for your vote.  It takes only a moment -- click on the site link I posted above, and go down to the heading "Best Blog of 2012," and put in my website address (skeptophilia.blogspot.com).  I'd appreciate it immensely!

********************************

I've been interested in memory as long as I can remember.  Part of the reason is that my own personal brain seems to be made up of a rather peculiar assemblage of things I can remember with apparent ease and things that I don't seem to be able to remember at all.  I recall music with no effort whatsoever; I once put a nifty little Serbian dance tune into long-term storage for over twenty years after hearing it twice (and not practicing it or writing it down in the interim).  Names, likewise, stick with me; I know more scientific names of obscure species than is useful or even reasonable, and it's not from engaging in any sort of surreptitious memorization of taxonomic lists late at night when no one's looking.  That sort of stuff simply sticks.

On the other hand, numbers.  I know people who can remember what their phone number was in houses they haven't lived in for thirty years.  I'm lucky when I can remember what my phone number is now.  In this day of passwords, PINs, and so on, there are a variety of number/letter combinations I'm expected to remember, and the maximum number of these I seem to be able to recall is: one.  For all of the passwords where this is possible, I use the same one.  If anyone ever discovers it, I'm royally screwed.  Fortunately, it's pretty obscure, so I don't think it's likely (meaning you shouldn't waste your time trying to figure it out).

It does, however, point up something odd about memory, which is how compartmentalized it is.  People can be exceptionally good at certain types of memory, and rather bad at others.  A few things, however, seem common to all sorts of memory; repetition improves retention, memory consolidation increases after sleep, and we all get worse at it (all types) as we age.

This last one is the subject of a recent bit of research published in Nature (available here), by Zhenzhong Cui, Ruiben Feng, Stephanie Jacobs, Yanhong Duan, Huimin Wang, Xiaohua Cao, and Joe Z. Tsien, as a collaborative project between Georgia Health Sciences University and East China Normal University.  The experiments used transgenic mice that overproduced NR2A, a subunit of the NMDA-type glutamate receptor, and found that they were significantly poorer at forming new long-term memories than ordinary mice were.  The reason, the researchers speculate, is that this receptor subunit is involved in weakening the synaptic firing patterns from old memories.

Put another way, it seems like one of the reasons we become more forgetful as we age is that we aren't as good at getting rid of things we already have stored in there.  In an interview with The New York Times, study lead author Joe Z. Tsien compares our brains when young to a blank page, and older brains to a page from a newspaper.  "The difference is not how dark the pen is," he said, "but that the newspaper already has writing on it."

"What our study suggests," Tsien added, "is that it’s not just the strengthening of connections, but the weakening of the other sets of connections that creates a holistic pattern of synaptic connectivity that is important for long-term memory formation."

In other words, our brains really do fill up and (in some sense) run out of space.
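
If you want to see why that's plausible even in a toy system, here's a little simulation -- a bog-standard Hopfield-style associative memory, which has nothing to do with NR2A or with Tsien's mice, but which shows the general principle that a store which never weakens its old traces gets worse at holding onto new ones.

```python
# Toy "palimpsest" memory demo; purely illustrative, not a model of the study.
import numpy as np

rng = np.random.default_rng(0)
N = 200                                          # neurons per pattern
patterns = rng.choice([-1, 1], size=(60, N))     # 60 random memories, stored in order

def store_all(decay: float) -> np.ndarray:
    """Hebbian storage; decay < 1 gradually weakens older traces ('forgetting')."""
    W = np.zeros((N, N))
    for p in patterns:
        W = decay * W + np.outer(p, p) / N
        np.fill_diagonal(W, 0)
    return W

def recall_quality(W: np.ndarray, pattern: np.ndarray, steps: int = 20) -> float:
    """Cue with a 10%-corrupted copy of `pattern`, let the network settle,
    and report the fraction of bits recovered correctly."""
    state = pattern.astype(float)
    flip = rng.random(N) < 0.1
    state[flip] *= -1
    for _ in range(steps):
        state = np.sign(W @ state)
        state[state == 0] = 1.0
    return float(np.mean(state == pattern))

for decay, label in [(1.0, "old traces never weakened"), (0.9, "old traces allowed to fade")]:
    W = store_all(decay)
    q = recall_quality(W, patterns[-1])          # how well does the NEWEST memory come back?
    print(f"{label}: recall of most recent memory = {q:.0%} of bits correct")
```

With the "never weakened" weights, the network is stuffed well past its capacity and even the most recently stored pattern comes back poorly; let the old traces fade, and the newest one is recovered almost perfectly.  Which is, very loosely, the newspaper-versus-blank-page picture.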

It's a funny thought, isn't it?  One of the reasons I can't remember where I left my keys is that my brain is still determined to hang onto the name of my 7th grade English teacher (Mrs. Trowbridge).

I find this a fascinating result, partly because it contradicts my long-held belief (admittedly based on no evidence whatsoever) that no one ever gets close to the actual memory storage capacity of the brain.  Also, it brings up the questionably prudent possibility of developing technology to selectively erase memories, à la Eternal Sunshine of the Spotless Mind.  Not, in this case, to eliminate traumatic or unpleasant memories, as it was for Jim Carrey's character -- but to free up hard drive space.

In any case, this is only the beginning.  A dear friend of mine, the brilliant (now retired) Cornell human genetics professor Dr. Rita Calvo, once made the prediction that "if the 20th century was the century of the gene, the 21st will be the century of the brain."  We are, she said, right now with respect to our understanding of the brain approximately where we were in 1913 with respect to our understanding of genetics -- we know a little bit of the "what" and the "how much," but almost nothing about the "how" and the "why."

If so, we should be looking forward to some amazing advances over the next few years, and I'm sure I'll have to do a lot of reading to keep up with the research well enough to teach my Introductory Neurology class competently.  It's exciting, however, to think that we may finally be elucidating the inner workings of our most intricate organ, and finding out how it does one of the most mysterious things of all -- storing, and retrieving, information.

Oh, and one more thing; did you vote for my blog?  I hope you hadn't forgotten.

Friday, January 18, 2013

Amodal nudity

I am endlessly interested in human perception.  The way our sensory systems, and the sensory-integrative parts of the brain, work is one of the most fascinating parts of biology -- and one of the least understood.  And it only gets more interesting when you combine it with that topic that everyone thinks about and pretends that they don't: sex.

I started considering the connection between sex and perception because of an article on the James Randi Educational Foundation's website called "The Lurking Pornographer: Why Your Brain Turns Bubbles into Nude Bodies."  In it, we find photos that lead to the rather startling conclusion that when a swimsuit-clad individual has the swimsuit covered up by strategically placed blank space, our brain makes the executive decision that the person in the photograph is naked.

Don't believe me?  Take a look:


And lest you think it's just because men are sex-obsessed, it works for photographs of guys, too:


The author of the article, Kyle Hill, explains this effect as the "lurking pornographer" in the brain; that the brain is always "looking for the body parts we are trying to cover up" as an outcome of the pressure to reproduce.  I wonder, though, if it might be simpler than that; my guess is that this is just a form of amodal completion, where the brain tries to fill in the gaps in incomplete images in the way that requires the fewest assumptions.  A simple example is the Kanizsa triangle:


That you have a white triangle overlaid on top of a triangular outline and three black circles is a simpler explanation than having three V-shaped bits and three black Pac-Man shapes all laid out just so as to appear to make two triangles.  But amodal completion, like any inference based on incomplete information, can also get it wrong.  Consider the horse(s) and cat(s) in the following drawing:


Two horses (in the left-hand drawing) and two cats (in the right-hand one) are merged into one by the brain "forcing" a wrong interpretation -- amodally completing the two animals into a single, extra-long horse and cat because of trying to fill in the missing pieces in the simplest possible way.

Likewise, when we have no other information about a person -- all we see is skin -- inferring a swimsuit seems like a jump.  The easier solution is that they're running around naked.  And of course, the fact that this stimulates our brains in a different way makes us stick with that solution once we've arrived at it.

It's like our neural network is hardwired with a perceptual form of Ockham's Razor, especially when the simpler solution is one that's kind of fun to look at.

Still, there's no doubt that most of our brains are obsessed with sex.  The three chemicals that mediate the majority of the sexual response -- dopamine, oxytocin, and endorphin -- are a mighty powerful cocktail, and none of them have much to do with thinking.  Interestingly, dopamine is the same neurotransmitter that is involved in addiction -- which may explain why the "sex drive" is called a "drive."

Oh, and about all of the claims that men think about sex twice as much as women do; there may be something to it.  According to recent research by Dr. Ananya Mandal, the preoptic nucleus of the hypothalamus, which is one area of the brain involved in sexual response and mating behavior, is 2.2 times larger in men than in women.  Size apparently matters -- in that respect, at least.

Anyhow, I thought all of this was pretty cool.  It's always interesting to find out why we do what we do.  It's why I found Desmond Morris' classic book The Naked Ape so fascinating when I first ran into it, at age 17 -- I'd never before considered human behavior from the standpoint of looking at humanity as if we were just another animal species.  And far from being demeaning, that perceptual shift leaves me feeling interconnected to the rest of the natural world in a far more intricate way.  We have reasons for doing what we do, just like every other living thing on Earth -- including the birds and the bees.

Thursday, November 1, 2012

Brains, mysticism, and melting faces

"Well, I saw it.  I saw it with my own eyes."

You hear that a lot, in claims of the paranormal.  I was just sitting there, in my room, and the ghost floated in through the wall.  I was outside at night, and I saw the UFO zoom across the sky.  I was at the lake, and I saw ripples in the water, and a dinosaur's head poked out and looked at me.

In a court of law, "eyewitness testimony" is considered one of the strongest pieces of evidence, and yet time and again experimental science has shown that your sensory apparatus and your memory are flawed and unreliable.  It doesn't take much to confuse your perception -- witness how persuasive many optical illusions are -- and if you couple that with how easily things get muddled in your memory, it's no wonder that when eyewitness claims of the paranormal are presented to scientists, most scientists say, "Sorry.  We need more than that."

Just last week, a study published in the Journal of Neuroscience put another nail in the coffin of the idea that our perceptual-integrative systems are reliable, showing how easy it is to trigger someone to see something that isn't there in a completely convincing way.  [Source]  Ron Blackwell, an epileptic, was in the hospital having tests done to see if a bit of his brain that was causing his seizures could be safely removed.  As part of the pre-surgical tests at Stanford University Hospital, his doctor, Dr. Josef Parvizi, had placed a strip of electrodes across his fusiform gyrus, a structure in the temporal lobe of the cerebrum.  And when the electrodes were activated, Blackwell saw Dr. Parvizi's face melt.

"You just turned into somebody else," Blackwell said. "Your face metamorphosed.  Your nose got saggy, went to the left.  You almost looked like somebody I'd seen before, but somebody different."  He added, rather unnecessarily, "That was a trip."

This study has three interesting outcomes, as far as I'm concerned.

First, it shows that the fusiform gyrus has something to do with facial recognition.  I'm personally interested in this, because as I've described before in Skeptophilia, I have a peculiar inability to recognize faces.  I don't have the complete prosopagnosia that people like the eminent science writer Oliver Sacks have -- where he doesn't even recognize his own face in a mirror -- but the fact remains that I can see a person I've met many times before in an unfamiliar place or circumstance, and literally have no idea if I've ever seen them before.  However -- and this is relevant to Parvizi's study -- other human features, such as stance, gait, and voice, I find easily and instantly recognizable.  And indeed, Blackwell's experience of seeing his doctor's face morph left other body parts intact, and even while the electrodes were activated, Blackwell knew that Dr. Parvizi's body and hands were "his."  So it seems that what psychologists have claimed -- that we have a dedicated module devoted solely to facial recognition -- is correct, and this study has apparently pinpointed its location.

Second, this further supports a point I've made many times, which is that if you fool your brain, that's what you perceive.  Suppose Blackwell's experience had occurred a different way; suppose his fusiform gyrus had been stimulated by one of his seizures, away from a hospital, away from anyone who could immediately reassure him that what he was seeing wasn't real, with no one there who could simply turn the electrodes off and make the illusion vanish.  Is it any wonder that some people report absolutely convincing, and bizarre, visions of the paranormal?  If your brain firing pattern goes awry -- for whatever reason -- you will perceive reality abnormally.  And if you are already primed to accept the testimony of your eyes, you very likely will interpret what you saw as some sidestep into the spirit world.  Most importantly, your vehement claims that what you saw was real cannot be accepted into evidence by science.  Ockham's Razor demands that we accept the simpler explanation, the one that requires fewer ad hoc assumptions, which is (sorry) that you simply had an aberrant firing pattern occur in your brain.

Third, this study has significant bearing on the stories of people who claim to have had "spiritual visions" while under the influence of psychoactive drugs.  One in particular, DMT (dimethyltryptamine), present in such ritual concoctions as ayahuasca, is supposed to create a "window into the divine."  A number of writers, particularly Terence McKenna and Rick Strassman (the latter wrote a book called DMT: The Spirit Molecule), claim that DMT allows you to see and communicate with real entities that are always there, but which only the drug allows you to experience.  Consider McKenna's account of his first experience with the chemical:
So I did it and...there was a something, like a flower, like a chrysanthemum in orange and yellow that was sort of spinning, spinning, and then it was like I was pushed from behind and I fell through the chrysanthemum into another place that didn't seem like a state of mind, it seemed like another place.  And what was going on in this place aside from the tastefully soffited indirect lighting, and the crawling geometric hallucinations along the domed walls, what was happening was that there were a lot of beings in there, what I call self-transforming machine elves.  Sort of like jewelled basketballs all dribbling their way toward me.  And if they'd had faces they would have been grinning, but they didn't have faces.  And they assured me that they loved me and they told me not to be amazed; not to give way to astonishment.
A generation earlier, Carlos Castañeda recounted similar sorts of experiences after ingesting datura root and psilocybe mushrooms, and like McKenna and Strassman, Castañeda was convinced that what he was seeing was absolutely real, more real in fact than the ordinary world around us.

My response, predictably, is: of course you saw weird stuff, and thought it was real.  What did you expect?  You monkey around with your brain chemistry, and you will obviously foul up your perceptual apparatus, and your ability to integrate what's being observed.  It's no more surprising that this happens than it would be if you spilled a cup of coffee on your computer, and it proceeded to behave abnormally.  If you short out your neural circuitry, either electrically (as Parvizi did) or chemically (as McKenna and others did), it should come as no shock that things don't work right.  And those altered perceptions are hardly evidence of the existence of a mystical world.

In any case, Parvizi's accidental discovery is a fascinating one, and will have wide-reaching effects on the study of perceptual neuroscience.  All of which supports what a friend of mine, a retired Cornell University professor of human genetics, once told me: the 21st century will be the century of the brain.  We are, she said, at a point of our understanding of how the brain works that corresponds to where geneticists were in 1912 -- we can see some of the pieces, but have no idea how the whole system fits together.  Soon, she predicts, we will begin to put together the underlying mechanism -- and at that point, we will be starting to develop a complete picture of how our most complex organ actually works.