Skeptophilia (skep-to-fil-i-a) (n.) - the love of logical thought, skepticism, and thinking critically. Being an exploration of the applications of skeptical thinking to the world at large, with periodic excursions into linguistics, music, politics, cryptozoology, and why people keep seeing the face of Jesus on grilled cheese sandwiches.

Friday, December 9, 2022

It's a bird, it's a plane... no, it's both

One topic I've come back to over and over again here at Skeptophilia is how flawed our sensory and perceptive apparatus is.  Oh, it works well enough; most of the time, we perceive the external world with sufficient clarity not to walk into walls or get run over by oncoming trains.  But our impression that we experience the world as it is -- that our overall ambient sense of everything around us, what the brilliant neuroscientist David Eagleman calls our umwelt, is a crystal-clear reflection of the real universe -- is simply false.

All it takes is a little messing about with optical illusions to convince yourself how easily our brains and sensory organs are fooled.  For example, in the following drawing, which is darker: square A or square B?


They're exactly the same.  Don't believe me?  Here's the same drawing, with a pair of gray lines superimposed on it:



Because your brain decided that B was in the shadow and A wasn't, it concluded that A had to be intrinsically darker.  What still baffles me about this illusion is that even once you know how the trick works, it's impossible to see it any other way.

As astronomer Neil deGrasse Tyson put it, "Our brains are rife with ways of getting it wrong.  You know optical illusions?  That's not what they should call them.  They should call them brain failures.  Because that's what they are.  A few cleverly drawn lines, and your brain can't handle it."

Well, we just got another neat hole shot in our confidence that what we're experiencing is irrefutable concrete reality, courtesy of a study that appeared in the journal Psychological Science this week.  The researchers attempted to confound the senses of sight and hearing by showing test subjects a photograph of one object morphing into another -- say, a bird into an airplane.  While the subjects studied the photograph, they heard a selection from a list of sounds, two of which were relevant (birdsong and the noise of a jet engine) and a number of which were irrelevant distractors (like a hammer striking a nail).

The subjects were then told to use a sliding scale to estimate where in the bird-into-airplane transformation the image was (e.g. seventy percent bird, thirty percent airplane).  The researchers found that people were strongly biased by what they were hearing: birdsong led subjects to overestimate the birdiness of the photograph, and the reverse happened with the sound of a jet engine.  The irrelevant noises didn't affect their choices (and thus, when exposed to the irrelevant noises, their visual perceptions of the image were more accurate).
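The reported effect can be summed up in a toy model -- this is my own illustration, not the study's analysis, and the size of the bias (0.125) is invented for the sake of the example:

```python
# Hypothetical sketch of the reported bias: a viewer's estimate of the
# morph fraction is pulled toward the category whose sound is playing,
# while irrelevant sounds leave the estimate unbiased.

def perceived_bird_fraction(true_bird_fraction, sound, shift=0.125):
    """Return a biased estimate of how 'bird-like' the image looks.

    sound: 'birdsong', 'jet', or 'irrelevant' (e.g. a hammer).
    The shift size is made up for illustration.
    """
    if sound == "birdsong":
        estimate = true_bird_fraction + shift   # pulled toward "bird"
    elif sound == "jet":
        estimate = true_bird_fraction - shift   # pulled toward "airplane"
    else:
        estimate = true_bird_fraction           # irrelevant sounds: unbiased
    return min(1.0, max(0.0, estimate))         # clamp to [0, 1]

print(perceived_bird_fraction(0.5, "birdsong"))    # 0.625
print(perceived_bird_fraction(0.5, "jet"))         # 0.375
print(perceived_bird_fraction(0.5, "irrelevant"))  # 0.5
```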

"When sounds are related to pertinent visual features, those visual features are prioritized and processed more quickly compared to when sounds are unrelated to the visual features," said Jamal Williams, of the University of California - San Diego, who led the study, in an interview with Science Daily.  "So, if you heard the sound of a birdsong, anything bird-like is given prioritized access to visual perception.  We found that this prioritization is not purely facilitatory and that your perception of the visual object is actually more bird-like than if you had heard the sound of an airplane flying overhead."

I guess it could be worse; at least hearing birdsong didn't make you see a bird that wasn't there.  But it does once again make me wonder how eyewitness testimony is still considered to carry the most weight in a court of law when experiment after experiment has demonstrated not only how incomplete and easily biased our perceptions are, but how flawed our memories are.

Something to keep in mind next time you are tempted to say "I know it happened that way, I saw it with my own eyes."

****************************************


Monday, May 4, 2020

The return of the senses

The news has been pretty uniformly dismal lately.

I don't even have to list all the ways.  We've all been inundated by the headlines, not to mention how these developments have changed our lives, and it's becoming increasingly clear those changes aren't going away soon.  It's easy to get discouraged, to decide that everything is bleak and hopeless.

So today, I want to look at a new development that should make you feel at least a little better about what humanity can accomplish -- in this case, for people who have been through the devastating experience of losing a limb.

A high school friend of mine was involved in a terrible accident on his family farm and ended up losing both of his arms from the elbow down.  He was fitted with prosthetic arms, and after recovering managed amazingly well -- his courage and fortitude through this ordeal were something that inspired our entire school, and still inspire me to this day.  But his prostheses were no real replacements for lower arms and hands, and there was (and is) a lot he could not do.

Those limitations might soon be a thing of the past.

A collaboration among Chalmers University of Technology, Sahlgrenska University Hospital, the University of Gothenburg, Integrum AB (a Swedish medical technology firm), the Medical University of Vienna, and the Massachusetts Institute of Technology has produced prosthetic arms for three amputees in Sweden that interface directly with the user's nerves, muscles, and skeleton.  Not only does this give the patient much improved fine motor control over the prosthetic hand, but the nerve connection runs both ways: it delivers output to control what the hand does, and it relays input received by the hand back to the brain.

Put simply: this prosthesis has a sense of touch.

"Our study shows that a prosthetic hand, attached to the bone and controlled by electrodes implanted in nerves and muscles, can operate much more precisely than conventional prosthetic hands," said Max Ortiz Catalan, who headed the research and was lead author on the paper describing it that appeared last week in the New England Journal of Medicine, in an interview with Science Daily.  "We further improved the use of the prosthesis by integrating tactile sensory feedback that the patients use to mediate how hard to grab or squeeze an object.  Over time, the ability of the patients to discern smaller changes in the intensity of sensations has improved."

The new prostheses, as amazing as they are, are just the first step.  "Currently, the sensors are not the obstacle for restoring sensation," said Ortiz Catalan.  "The challenge is creating neural interfaces that can seamlessly transmit large amounts of artificially collected information to the nervous system, in a way that the user can experience sensations naturally and effortlessly."

It's kind of amazing how fluid the human brain can be.  Neuroscientist David Eagleman, in his brilliant talk "Can We Create New Senses for Humans?", describes our sensory organs as being like the peripherals in a computer system -- and explains how quickly the brain can learn to obtain the same information from a different peripheral.  Some of his examples:
  • blind people using echolocation -- clicks -- to create a "soundscape" and navigate their surroundings
  • in a separate experiment, the blind using a head-mounted camera connected by an electrical lead to a flat, horseshoe-shaped piece of metal resting on the tongue -- the camera translates what it "sees" into a pattern of tiny voltage changes in the piece of metal, which the brain converts to rudimentary visual images
  • the hearing impaired using a vibrating vest hooked up to a microphone to learn to "hear" through the vibrations on their skin
For me, the most stunning thing about these examples is that the brain learns to reinterpret the signals coming from the "peripheral" -- in the first example, sounds activate the visual cortex; in the second, touch stimuli activate the visual cortex; in the third, touch stimuli activate the auditory cortex.  All neural signals are the same; the brain simply decides how to interpret them.  You literally are seeing with your ears, seeing with your tongue, or hearing with your skin.
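The vest idea in particular can be sketched in a few lines of code.  This is my own toy illustration of the general principle -- not Eagleman's actual algorithm, and the band choices and motor scale are invented: sound is reduced to the energy in a few frequency bands, and each band's energy drives one vibration motor.

```python
import math

# Toy "vibrating vest" pipeline: audio -> per-band energy -> motor levels.

def band_energies(samples, sample_rate, bands):
    """Crude per-band energy: correlate the signal against a cosine and
    sine at each band's center frequency (a one-bin DFT per band)."""
    energies = []
    for f in bands:
        re = sum(s * math.cos(2 * math.pi * f * n / sample_rate)
                 for n, s in enumerate(samples))
        im = sum(s * math.sin(2 * math.pi * f * n / sample_rate)
                 for n, s in enumerate(samples))
        energies.append(math.hypot(re, im))
    return energies

def motor_levels(energies, max_level=255):
    """Map band energies to integer motor drive levels (0..max_level)."""
    peak = max(energies) or 1.0
    return [round(max_level * e / peak) for e in energies]

# A pure 440 Hz tone should drive mainly the 440 Hz motor.
rate = 8000
tone = [math.sin(2 * math.pi * 440 * n / rate) for n in range(800)]
levels = motor_levels(band_energies(tone, rate, [110, 440, 1760]))
print(levels)  # strongest at the middle (440 Hz) band
```

The brain's job, on Eagleman's account, is then to learn that these patterns of vibration on the skin carry the same structure that sound does.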

Here, though, the peripheral really is a peripheral, i.e., a machine.  You're not co-opting one of your pre-existing senses for a different purpose; you're hooking in an external apparatus to your brain, receiving input from an array of computerized sensors.  You may have been reminded, as I was, of Luke Skywalker:


It's a phenomenal improvement over previous prostheses, which were moved by muscle contractions in the arm they were attached to; here, the prosthesis is not only mind-controlled, it sends information to the brain about what it's touching, giving the wearer back at least the beginnings of a sense of touch.

"Right now, patients in Sweden are participating in the clinical validation of this new prosthetic technology for arm amputation," said Ortiz Catalan.  "We expect this system to become available outside Sweden within a couple of years, and we are also making considerable progress with a similar technology for leg prostheses, which we plan to implant in a first patient later this year."

So the news these days isn't all bad, even if you have to dig a bit to find the heartening parts.  Regardless of what's happening now, I remain an optimist about human compassion and human potential.  I'm reminded of the final lines of the beautiful poem "Desiderata" by Max Ehrmann: "With all its sham, drudgery and broken dreams, it is still a beautiful world.  Be cheerful.  Strive to be happy."

**********************************

This week's Skeptophilia book recommendation is about a phenomenal achievement: the breathtaking New Horizons mission that gave us our first close-up views of the distant, frozen world of Pluto.

In Alan Stern and David Grinspoon's Chasing New Horizons: Inside the Epic First Mission to Pluto, you follow the lives of the men and women who made this achievement possible, flying nearly five billion kilometers with what can only be called pinpoint accuracy, then zinging past the target at fifty thousand kilometers per hour while sending 6.25 gigabytes of data and images back to NASA.

The spacecraft still isn't done -- it's currently soaring outward through the Kuiper Belt, the vast, diffuse ring of icy bodies beyond Neptune's orbit.  What it will see out there and send back to us here on Earth can only be imagined.

The story of how this was accomplished makes for fascinating reading.   If you are interested in astronomy, it's a must-read.

[Note: if you purchase this book using the image/link below, part of the proceeds goes to support Skeptophilia!]




Saturday, December 24, 2016

Signal out of noise

I think I share with a lot of people a difficulty in deciphering what someone is saying during a conversation in a noisy room.  I can often pick out a few words, but understanding entire sentences is tricky.  A related phenomenon I've noticed: if a song is playing while there's noise going on -- in a bar, or on earphones at the gym -- I often have no idea what the song is.  I can't understand a single word, pick up the beat, or figure out the melody, until something clues me in to what the song is.  Then, all of a sudden, I find I'm able to hear it clearly.

Some neuroscientists at the University of California - Berkeley have just found out what's happening in the brain that causes this oddity in auditory perception.  In a paper in Nature Communications that came out earlier this week, authors Christopher R. Holdgraf, Wendy de Heer, Brian Pasley, Jochem Rieger, Nathan Crone, Jack J. Lin, Robert T. Knight, and Frédéric E. Theunissen studied how the perception of garbled speech changes when subjects are told what's being said -- and found through a technique called spectrotemporal receptive field mapping that the brain is able to retune itself in less than a second.

The authors write:
Experience shapes our perception of the world on a moment-to-moment basis.  This robust perceptual effect of experience parallels a change in the neural representation of stimulus features, though the nature of this representation and its plasticity are not well-understood.  Spectrotemporal receptive field (STRF) mapping describes the neural response to acoustic features, and has been used to study contextual effects on auditory receptive fields in animal models.  We performed a STRF plasticity analysis on electrophysiological data from recordings obtained directly from the human auditory cortex.  Here, we report rapid, automatic plasticity of the spectrotemporal response of recorded neural ensembles, driven by previous experience with acoustic and linguistic information, and with a neurophysiological effect in the sub-second range.  This plasticity reflects increased sensitivity to spectrotemporal features, enhancing the extraction of more speech-like features from a degraded stimulus and providing the physiological basis for the observed ‘perceptual enhancement’ in understanding speech.
What astonishes me about this is how quickly the brain is able to accomplish this -- although that is certainly matched by my own experience of suddenly being able to hear lyrics of a song once I recognize what's playing.  As James Anderson put it, writing about the research in ReliaWire, "The findings... confirm hypotheses that neurons in the auditory cortex that pick out aspects of sound associated with language, the components of pitch, amplitude and timing that distinguish words or smaller sound bits called phonemes, continually tune themselves to pull meaning out of a noisy environment."

A related phenomenon is visual priming, which occurs when people are presented with a seemingly meaningless pattern of dots and blotches, such as the following:


Once you're told that the image is a cow, it's easy enough to find -- and after that, impossible to unsee.

"Something is changing in the auditory cortex to emphasize anything that might be speech-like, and increasing the gain for those features, so that I actually hear that sound in the noise," said study co-author Frédéric Theunissen.  "It’s not like I am generating those words in my head.  I really have the feeling of hearing the words in the noise with this pop-out phenomenon.  It is such a mystery."

Apparently, once the set of possibilities of what you're hearing (or seeing) is narrowed, your brain is much better at extracting meaning from noise.  "Your brain tries to get around the problem of too much information by making assumptions about the world," co-author Christopher Holdgraf said.  "It says, ‘I am going to restrict the many possible things I could pull out from an auditory stimulus so that I don’t have to do a lot of processing.’ By doing that, it is faster and expends less energy."
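That "restrict the possibilities" strategy can be illustrated with a toy decoder -- this is my own sketch of the general idea, not the study's method, and the word lists are invented: a degraded input that is ambiguous against a wide-open candidate set becomes unambiguous once a prior narrows the candidates.

```python
# Decode a degraded word by picking the closest template, with and
# without a "prior" that restricts the set of possibilities.

def hamming(a, b):
    """Count mismatched positions between two equal-length strings."""
    return sum(x != y for x, y in zip(a, b))

def decode(degraded, candidates):
    """Return the candidate template closest to the degraded input."""
    return min(candidates, key=lambda w: hamming(degraded, w))

# "bird" with two characters corrupted by noise:
degraded = "b?rk"

# With no prior, a non-bird word happens to be the nearest match:
all_words = ["bird", "bark", "byrd", "dirk", "park"]
print(decode(degraded, all_words))      # 'bark'

# Told in advance that we're hearing an animal word, the candidate set
# shrinks -- and the same noisy input now decodes correctly:
animal_words = ["bird", "crow"]
print(decode(degraded, animal_words))   # 'bird'
```

Less to search through means faster, cheaper decoding -- which is exactly the trade-off Holdgraf describes.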

So there's another fascinating, and mind-boggling, piece of how our brains make sense of the world.  It's wonderful that evolution could shape such an amazingly adaptive device, although the survival advantage is obvious.  The faster you are at pulling a signal out of the noise, the more likely you are to make the right decisions about what it is that you're perceiving -- whether you're talking to a friend in a crowded bar or you're a proto-hominid on the African savanna trying to figure out if that odd shape in the grass is a crouching lion.

Monday, September 15, 2014

Hearing through your skin

I first ran into David Eagleman when a student of mine loaned me his phenomenal book Incognito: The Secret Lives of the Brain.

Even considering that I have a decent background in neuroscience, this book was an eye-opener.  Eagleman, a researcher at Baylor College of Medicine, is not only phenomenally knowledgeable in his field, he is a fine writer (and needless to say, those two don't always go together).  His insights about how our own brains work were fascinating, revealing, and often astonishing, and for anyone with an interest in cognitive science, it's a must-read.  (The link above will bring you to the book's Amazon page, should you wish to buy it, which all of you should.)

I've since watched a number of Eagleman's videos, and always come away with the feeling, "This guy is going to turn our understanding of the mind upside down."  And just yesterday, I found out about a Kickstarter project that he's undertaking that certainly makes some strides in that direction.

It's widely known that the brain can use a variety of inputs to get sensory data, substituting another when one of them isn't working.  Back in 2009, some scientists at Wicab, Inc. developed a device called the BrainPort that gave blind people the ability to get visual information about their surroundings, through a horseshoe-shaped output device that sits on the tongue.  A camera acts as a sensor, and transmits visual data into the electrode array on the output device, which then stimulates the surface of the tongue.  After a short training period, test subjects could maneuver around obstacles in a room.
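The camera-to-tongue mapping, as I understand the description, amounts to downsampling each frame to a coarse electrode grid and turning brightness into stimulation intensity.  Here's a minimal sketch of that idea -- the grid size and quantization levels are invented for illustration, not taken from the BrainPort's actual specifications:

```python
# Downsample a grayscale camera frame to a coarse "electrode grid" and
# quantize each cell's average brightness into a stimulation level.

def to_electrode_grid(frame, grid_rows, grid_cols, levels=8):
    """frame: 2-D list of grayscale pixels (0-255).
    Returns a grid_rows x grid_cols grid of stimulation levels
    0..levels-1 (assumes frame dimensions divide evenly)."""
    rows, cols = len(frame), len(frame[0])
    rh, cw = rows // grid_rows, cols // grid_cols   # cell height/width
    grid = []
    for gr in range(grid_rows):
        row = []
        for gc in range(grid_cols):
            cell = [frame[r][c]
                    for r in range(gr * rh, (gr + 1) * rh)
                    for c in range(gc * cw, (gc + 1) * cw)]
            avg = sum(cell) / len(cell)
            row.append(int(avg * levels / 256))     # quantize brightness
        grid.append(row)
    return grid

# A frame that is dark on the left, bright on the right:
frame = [[0] * 4 + [255] * 4 for _ in range(4)]
print(to_electrode_grid(frame, 2, 2))  # [[0, 7], [0, 7]]
```

The remarkable part isn't the mapping, which is trivial -- it's that the brain learns to read the resulting pattern of tongue stimulation as vision.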

And the coolest part is that the scientists found that the device was somehow stimulating the visual cortex of the brain -- the brain figured out that it was receiving visual data, even though the information was coming through the tongue.  And the test subjects were sensing visual images of their surroundings, even though nothing whatsoever was coming through their eyes.

So Eagleman had an idea: could you use the tactile sense to replace any other sense?  He started by trying to substitute tactile stimulation for hearing -- because, after all, they both work more or less the same way.  Touch and hearing both rely on mechanoreceptors, nerves that fire in response to vibration or deflection.  (Taste, which relies on chemoreceptors, and sight, which relies on photoreceptors, are much further apart in how they function.)



It's a vest equipped with a smartphone and hundreds of tiny motors -- a transducer activates the motors, turning any sounds picked up by the phone into a pattern of vibrations on your upper body.  And just as with the BrainPort, a short training period is all that's needed before you can, effectively, hear with your skin.

Trials have already allowed deaf individuals to identify words at a rate far higher than chance; and you can only imagine that the skill, like any other, would improve with time.  Eagleman writes:
We hypothesize that our device will work for deaf individuals, and even be good enough to provide a new perception of hearing. This itself has a number of societal benefits: such a device would cost orders of magnitude less than cochlear implants (hundreds-to-thousands as opposed to tens-of-thousands), be discrete, and give the wearer the freedom to not be attached to it all the time. The cost effectiveness of the device would also make it realistic to distribute it widely in developing countries.
More exciting than this, however, is what this proof of principle might enable: the ability to feed all sorts of new and profound sensory information into our brains.
I find this sort of thing absolutely fascinating.  The brain, far from being the static and rigid device we used to believe it was, has amazing plasticity.  Given new sources of information, it responds by integrating those into the data set it uses to allow us to explore the world.  And even though the VEST is currently being considered primarily for restoration of a sense to individuals who have lost one, I (like Eagleman) can't help but wonder about its use in sensory enhancement.

What sorts of things are we missing, through our imperfect sensory apparatus, that such a device might allow us to see?

Consider giving Eagleman's Kickstarter your attention -- he's the sort of innovative genius who could well change the world.  Just what he's done thus far is phenomenal, moving us into possibilities that heretofore were confined to science fiction.

And man, do I want to try one of those vests.  I hear just fine, but still.  How cool would that be?

Wednesday, April 9, 2014

Wine, violins, and trusting your senses

There's still time to put in your guess and enter the 50/50 contest for when Skeptophilia will reach its one-millionth hit!  The cost to enter is $10 (PayPal link on the right, or contact me by email).  Be sure to add a note telling me your guess!

*******************************

I guess I know too much about neuroscience to trust my own senses.  It's a point I've made before; we get awfully cocky about our own limited perspective, when rightfully we should have remarkably little faith in what we see or hear (or remember, for that matter).  Oh, our perceptions are enough to get by on; we wouldn't have lasted long as a species if our sight and hearing led us astray more often than not.

But the devil is in the details, they say, and in this case it proves remarkably (and perhaps regrettably) true.  What you think your senses are telling you is probably not accurate.

At all.

And the worst part is, it doesn't matter if you're an expert.  It might even be worse if you are.  Not only does your confidence blind you to your own mistakes, at times your expectations about what you're experiencing seem to predispose you to blundering more than an amateur would in similar circumstances.

I first ran into this rather troubling phenomenon last year, when a study came out that indicated that wine experts couldn't tell the difference between an expensive wine and a cheap one -- if they were deprived of the information on the label:
French academic Frédéric Brochet... presented the same Bordeaux superior wine to 57 volunteers a week apart and in two different bottles – one for a table wine, the other for a grand cru. 
The tasters were fooled. 
When tasting a supposedly superior wine, their language was more positive – describing it as complex, balanced, long and woody.  When the same wine was presented as plonk, the critics were more likely to use negatives such as weak, light and flat.
Then Brochet pissed off the wine snobs even worse with a subsequent experiment in which it became apparent that the tasters couldn't even tell the difference between a red and a white wine:
[Brochet] asked 54 wine experts to test two glasses of wine -- one red, one white. Using the typical language of tasters, the panel described the red as "jammy" and commented on its crushed red fruit.
The critics failed to spot that both wines were from the same bottle. The only difference was that one had been coloured red with a flavourless dye.
Now lest you think that this phenomenon only applies to wine snobbery, a study has come out from Claudia Fritz at the University of Paris showing that the same expert-and-expectation bias can occur with our perception of sounds -- she demonstrated that expert violinists couldn't reliably tell the difference between a Stradivarius and a newly made modern violin:
“During both sessions, soloists wore modified welders’ goggles, which together with much-reduced ambient lighting made it impossible to identify instruments by eye,” the researchers write. In addition, the new violins were sanded down a bit to “eliminate any tactile clues to age, such as unworn corners and edges...” 
In the concert hall, the violinists were given free rein: They could ask for feedback from a designated friend or colleague, and a pianist was on hand so they could play excerpts from sonatas on the various violins. 
Afterwards, they rated each instrument for various qualities, including tone quality, projection, articulation/clarity, “playability,” and overall quality. Finally, they briefly played six to eight of the instruments and guessed whether each was old or new. 
The results: Six of the soloists chose new violins as their hypothetical replacement instruments, while four chose ones made by Stradivari. One particular new violin was chosen four times, and one Stradivarius was chosen three times, suggesting those instruments were the clear favorites.
You can understand how these results might upset classical violinists, perhaps even more than Brochet's experiment ruffled the feathers of the wine tasters.  Stradivarius, after all, is considered the touchstone for sound quality in a string instrument.

[image courtesy of photographer Håkan Svensson and the Wikimedia Commons]

There are 650 known Stradivari instruments, and their market value is estimated to be in the hundreds of thousands to millions of dollars.  Each.  The idea that a new -- albeit excellent -- violin could compete with a Strad in sound quality is profoundly unsettling to a lot of people.

Reasonably speaking, however, I don't know why it should be (other than the monetary aspect, of course).  Wines and music are both rich sensory experiences, and our appreciation of either (or both) is the result of not only the stimulation of millions of sensory neurons, but the release of a complex broth of neurochemicals that creates a feedback loop with our sense organs, emotional centers, and cognitive processes.  We shouldn't expect that experiencing either wine or music would be a predictable thing; if it was, they probably wouldn't have the resonance they do.

So it's not surprising, really, that our expectations about the taste of a wine or the sound of a violin should change our perceptions.  It's just one more kick in the pants to our certainty, however, that what we see and hear and feel is accurate in its details.  The idea doesn't bother me much, honestly.  Nothing that a little Riunite on ice can't fix.