Skeptophilia (skep-to-fil-i-a) (n.) - the love of logical thought, skepticism, and thinking critically. Being an exploration of the applications of skeptical thinking to the world at large, with periodic excursions into linguistics, music, politics, cryptozoology, and why people keep seeing the face of Jesus on grilled cheese sandwiches.
Showing posts with label perception. Show all posts

Monday, July 21, 2025

Cats in boxes

Any cat owners amongst my readers will undoubtedly know about the strange propensity of cats to climb into boxes.  Apparently it works for cats of all sizes:

With apologies to Robert Burns, a cat's a cat for a' that.

In fact, it doesn't even have to be a real box:


I've never heard a particularly convincing explanation of why cats do this.  Some people suggest it's because being in close quarters gives them a sense of security, perhaps a remnant of when they lived in the wild and slept in burrows or caves.  Me, I suspect it's just because cats are a little weird.  I've been of this opinion ever since owning a very strange cat named Puck, who used to sleep on the arm of the couch with one front and one back leg hanging limp on one side of the arm and the other two dangling over the other side, a pose that earned her the nickname "Monorail Cat."  She also had eyes that didn't quite line up, and a broken fang that caused her tongue to stick out of one side of her mouth.  She was quite a sweet-natured cat, really, but even people who love cats thought Puck looked like she had a screw loose.

The topic comes up because of a delightful piece of research in the journal Applied Animal Behaviour Science.  The paper was titled "If I Fits, I Sits: A Citizen Science Investigation into Illusory Contour Susceptibility in Domestic Cats," by Gabriella Smith and Sarah-Elizabeth Byosiere (of Hunter College) and Philippe Chouinard (of La Trobe University), and looked at data collected from cat owners to find out if cats are fooled by the Kanizsa Rectangle Illusion.

The Kanizsa Rectangle Illusion is an image that tricks the brain into seeing contours that aren't there.  Here's one representation of it:


To most people, this looks like an opaque white rectangle laid over four black hexagons, and not what it really is -- four black hexagons with triangular wedges cut out.  Apparently the brain goes with an Ockham's Razor-ish approach to interpreting what it sees, deducing that a white rectangle on top of black hexagons is much more likely than having the cut-out bits just happening to line up perfectly.  It's amazing, though, how quickly this decision is made; we don't go through a back-and-forth "is it this, or is it that?"; the illusion is instantaneous, and so convincing that many of us can almost see the entire boundary of the rectangle even though there's nothing there.

Well, apparently, so can cats.  And, as one would expect, they sit in the middle of the nonexistent rectangle just as if it was a real box.  The authors write:
A well-known phenomenon to cat owners is the tendency of their cats to sit in enclosed spaces such as boxes, laundry baskets, and even shape outlines taped on the floor.  This investigative study asks whether domestic cats (Felis silvestris catus) are also susceptible to sitting in enclosures that are illusory in nature, utilizing cats’ attraction to box-like spaces to assess their perception of the Kanizsa square visual illusion...  [T]his study randomly assigned citizen science participants booklets of six randomized, counterbalanced daily stimuli to print out, prepare, and place on the floor in pairs.  Owners observed and videorecorded their cats’ behavior with the stimuli and reported findings from home over the course of the six daily trials...  This study revealed that cats selected the Kanizsa illusion just as often as the square and more often than the control, indicating that domestic cats may treat the subjective Kanizsa contours as they do real contours.
It's a fascinating result, and an indication that other animal species see the world much as we do.  It still doesn't explain why cats like to sit in boxes, though.  I think my conclusion ("cats are weird") covers it about as well as anything.  But at least in one way, our perceptual/interpretive centers are just as weird as the cats' are.  I'm not inclined to go sit in a box, but it does make me wonder what our pets would think if we showed them other optical illusions.

I doubt my dogs would be interested.  If what they're looking at has nothing to do with food, petting, napping, or playing, they pretty much ignore it.  Must be nice to see the world in such simple terms.

****************************************


Friday, March 14, 2025

In the blink of an eye

One of the things I love about science is how it provides answers to questions that are so ordinary that few of us appreciate how strange they are.

I remember how surprised I was when I first heard a question about our vision that had honestly never occurred to me.  You know how images jump around when you're filming with a hand-held videocamera?  Even steady-handed people make videos that are seriously nausea-inducing, and when the idea is to make it look like it's filmed by amateurs -- such as in the movie The Blair Witch Project -- the result looks like it was produced by strapping a camera to the head of a kangaroo on crack.

What's a little puzzling is why the world doesn't appear to jump around like that all the time.  I mean, think about it; if you walk down the hall holding a videocamera on your shoulder, and watch the video and compare it to the way the hall looked while you were walking, you'll see the image bouncing all over the place on the video, but won't have experienced that with your eyes.  Why is that?

The answer certainly isn't obvious.  One guess scientists have is that we stabilize the images we see, and compensate for small movements of our head, by using microsaccades -- tiny, involuntary, constant jitters of the eyes.  The thought is that those little back-and-forth movements allow your brain to smooth out the image, keeping us from seeing the world as jumping around every time we move.

Another question about visual perception that I had never thought about was the subject of some research out of New York University and the University Medical Center of Göttingen that was published in the journal Current Biology.  Why don't you have the perception of the world going dark for a moment when you blink?  After all, most of us blink about once every five seconds, and we don't have the sense of a strobe effect.  In fact, most of us are unaware of any change in perception whatsoever.

[Image licensed under the Creative Commons Mcorrens, Iris of the Human Eye, CC BY-SA 3.0]

By studying patients who had lesions in the cerebrum, and comparing them to patients with intact brains, the scientists were not only able to answer this question, but to pinpoint exactly where this phenomenon happens -- the dorsomedial prefrontal cortex, a part of the brain immediately behind the forehead.  What they found was that individuals with an intact dmPFC store a perceptual memory of what they've just seen, and use that to form the perception they're currently seeing, so the time during which there's no light falling on the retina -- when you blink -- doesn't even register.  On the other hand, a patient with a lesion in the dmPFC lost that ability, and didn't store immediate perceptual memories.  The result?  Every time she blinked, it was like a shutter closed on the world.

"We were able to show that the prefrontal cortex plays an important role in perception and in context-dependent behavior," said neuroscientist Caspar Schwiedrzik, who was lead author of the study.  "Our research shows that the medial prefrontal cortex calibrates current visual information with previously obtained information and thus enables us to perceive the world with more stability, even when we briefly close our eyes to blink...  This is not only true for blinking but also for higher cognitive functions.  Even when we see a facial expression, this information influences the perception of the expression on the next face that we look at."

All of which highlights that all of our perceptual and integrative processes are way more sophisticated than they seem at first.  It also indicates something that's a little scary; that what we're perceiving is partly what's really out there, and partly what our brain is telling us it thinks is out there.  Which is right more often than not, of course.  If that weren't true, natural selection would have finished us off a long time ago.  But that fraction of the times that it's wrong, it can create some seriously weird sensations -- or make us question things that we'd always taken for granted.

****************************************


Wednesday, September 25, 2024

Reality, nightmares, and the paranormal

I was giving some thought this morning to why I've turned into such a diehard doubter of paranormal occurrences.  And I think one of the main reasons is because I know enough neuroscience to have very little faith in my own brain and sensory organs.

I'm not an expert on the topic, mind you.  I'm a raving generalist, what some people describe as "interested in everything" and more critical sorts label as a shallow dilettante.  But I know enough about the nervous system to have taught a semester-long elective in introductory neuroscience for years, and that plus my native curiosity has always kept me reading about new developments.

This is what prompted a friend of mine to hand me the late Oliver Sacks's book Hallucinations.  I love Sacks's writing -- The Man Who Mistook His Wife for a Hat and Musicophilia are tours de force -- but this one I hadn't heard of.

And let me tell you, if you are the type who is prone to say, "I know it happened, I saw it with my own eyes!", you might want to give this book a read.

The whole book is a devastating blow to our confidence that what we see, hear, and remember is reality.  But the especially damning part began with his description of hypnopompic hallucinations -- visions that occur immediately upon waking.  Unlike the more common hypnagogic experiences, which are dreamlike states in light sleep, hypnopompic experiences have the additional characteristic that when you are in one, you are (1) convinced that you are completely awake, and (2) certain that what you're seeing is real.

Sacks describes one of his own patients who suffered from frequent hypnopompic hallucinations. Amongst the things the man saw were:
  • a huge figure of an angel
  • a rotting corpse lying next to him in bed
  • a dead child on the floor, covered in blood
  • hideous faces laughing at him
  • giant spiders
  • a huge hand suspended over his face
  • an image of himself as an older man, standing by the foot of the bed
  • an ugly-looking primitive man lying on the floor, with tufted orange hair
Fortunately for him, Sacks's patient was a rational man and knew that what he was experiencing was hallucination, i.e., not real.  But you can see how if you were even slightly inclined to believe in the paranormal, this would put you over the edge (possibly in more than one way).

But it gets worse.  There's cataplexy, which is a sudden and total loss of muscular strength, resulting in the sufferer falling to the ground while remaining completely conscious.  Victims of cataplexy often also experience sleep paralysis, which is another phenomenon that occurs upon waking, and in which the system that is supposed to re-sync the voluntary muscles with the conscious mental faculties fails to engage, resulting in a terrifying inability to move.  As if this weren't bad enough, cataplexy and sleep paralysis are often accompanied by hallucinations -- one woman Sacks worked with experienced an episode of sleep paralysis in which she saw "an abnormally tall man in a black suit...  He was greenish-pale, sick-looking, with a shock-ridden look in the eyes.  I tried to scream, but was unable to move my lips or make any sounds at all.  He kept staring at me with his eyes almost popping out when all of a sudden he started shouting out random numbers, like FIVE-ELEVEN-EIGHT-ONE-THREE-TWO-FOUR-NINE-TWENTY, then laughed hysterically."

After this the paralysis resolved, and the image of the man "became more and more blurry until he was gone."

Johann Heinrich Füssli, The Nightmare (1790) [Image is in the Public Domain]

Then there are grief-induced hallucinations, an apparently well-documented phenomenon which I had never heard of before.  A doctor in Wales, W. D. Rees, interviewed three hundred people who had recently lost loved ones, and found that nearly half of them had at least fleeting hallucinations of seeing the deceased.  Some of these hallucinations persisted for months or years.

Given all this, is it any wonder that every culture on Earth has legends of ghosts, demons, and spirits?

Of course, the True Believers in the studio audience (hey, there have to be some, right?) are probably saying, "Sacks only calls them hallucinations because that's what he already believed to be true -- he's as guilty of confirmation bias as the people who believe in ghosts."  But the problem with this is, Sacks also tells us that there are certain medications which make such hallucinations dramatically worse, and others that make them diminish or go away entirely.  Hard to explain why, if the ghosts, spirits, et al. have an external reality, taking a drug can make them go away.

But the psychics probably will just respond by saying that the medication is making people "less attuned to the frequencies of the spirit world," or some such.  You can't win.

Nota bene: I'm not saying ghosts, or spirits, or the afterlife, don't exist or, even more, can't exist.  Just that there's an alternate plausible explanation for these experiences that relies on nothing but known science.  As skeptic Robert Carroll put it, "Before you accept a paranormal or supernatural account of the world, you had better make sure that you've ruled out all the normal and natural ones first."

In any case, I highly recommend Sacks's book.  It will, however, have the effect of making you doubt everything you're looking at.  Not that that's necessarily a bad thing; a little less certainty, and a little more acknowledgement of doubt, would certainly make my job a hell of a lot easier.

****************************************


Wednesday, June 26, 2024

Primed and ready

One of the beefs a lot of aficionados of the paranormal have with us skeptics has to do with a disagreement over the quality of evidence.

Take, for example, Hans Holzer, who was one of the first serious ghost hunters.  His work in the field started in the mid-twentieth century and continued right up to his death in 2009 at the venerable age of 89, during which time he not only visited hundreds of allegedly haunted sites but authored 120 books documenting his experiences.

No one doubts Holzer's sincerity; he clearly believed what he wrote, and was not a hoaxer or a charlatan.  But if you read his books, what will strike anyone of a skeptical bent is that virtually all of it consists of anecdote.  Stories from homeowners, accounts of "psychic mediums," recountings of old tales and legends.  None of it is demonstrated scientifically, in the sense of encounters that occur in controlled circumstances where credulity or outright fakery by others can be rigorously ruled out.

After all, Holzer may well have been scrupulously honest, but that doesn't mean that the people he worked with were.

I'll just interject my usual disclaimer; none of this constitutes disproof, either.  But in the absence of evidence that meets the minimum standard acceptable in science, the most parsimonious explanation is that Holzer's many stories are accounted for by human psychology, flaws in perception, and the plasticity of memory -- along with the possibility that at least some of his informants were exaggerating or lying about their own experiences.

As an illustration of just one of the difficulties with accepting anecdote, consider the phenomenon of priming.  What we experience is strongly affected by what we expect to experience; even a minor interjection ahead of time of a mental image (for example) can alter how we see, interpret, and remember something else that occurred afterward.  A simple example -- if someone is shown a yellow object and afterward asked to name a fruit, they come up with "banana" or "lemon" far more frequently than someone who was shown a different color (or who wasn't primed at all).  It all occurs beneath conscious awareness; the person who was primed typically has no idea it happened.

This becomes more insidious when it starts affecting how people understand the world around them.  To take another lightweight example, but one that gets at how claims of the supernatural start, consider the currently popular "paranormal game" called "Red Door, Yellow Door."  "Red Door, Yellow Door" is a little like the game that all of us Of A Certain Age will remember, the one called "Bloody Mary."  The way "Bloody Mary" works is that you stand in front of a mirror, stare into it, and chant "Bloody Mary" over and over, and after a moment, nothing happens.

What's supposed to happen is that your face turns into the blood-dripping visage of a woman, or else you see her over your shoulder.  Most of us who tried it, of course, got what the paranormal investigators call "disappointing results."  But "Red Door, Yellow Door" moves one step further from verifiable reality, because the whole thing takes place in your mind.  You're supposed to lie down and close your eyes, while a friend (the "guide") massages your temples and says, "Red door, yellow door, any other color door" over and over.  You're supposed to picture a hallway in your mind, and as soon as you've got a clear image, you give a hand signal to the guide to stop chanting.  Then you explore, entering doors as you see fit and describing to the guide what you're seeing.

[Image licensed under the Creative Commons dying_grotesque from Richards Bay, South Africa, Red Door (3275822777), CC BY 2.0]

Thus far, it's just an exercise in imagination, and innocent enough; but the claim is, what you're seeing is real -- and can harm you.  Because of the alleged danger, there are a variety of rules you are supposed to remember.  If a room you enter has clocks in it, get out fast -- you can get trapped permanently.  If there are staircases, never take one leading downward.  If you meet a man in a suit, open your eyes and end the game immediately, because he's evil and can latch on to you and start following you around in real life if you don't act quickly enough.

Oh, and to add the obligatory frisson to the whole thing: if you die in the game, you actually die.

What's striking about "Red Door, Yellow Door" is that despite the fact that its claims are patently absurd, there are huge numbers of apparently completely serious people who have had terrifying experiences while playing it -- not only manifestations during the game, but afterward.  (If you search for the game, you'll find hundreds of accounts, many of them warning people against ever playing it because they were so traumatized by it.)  The thing is, what did they expect would happen?  They'd been primed by all of the setup; it's unsurprising they saw clocks and eerie staircases descending into darkness and evil guys in suits, and that those same images haunted their memories for some time after the game ended.

And if a silly game for gullible teenagers can do that, how much more do our perception and memory get tainted by how we're primed, especially by our prior notions of what might be going on?  Hang out in graveyards and spooky attics, and you're likely to see ghosts whether or not they're there.

As I recounted in Monday's post, I've been fascinated by tales of the supernatural since I was a kid, and on some level, I'm like Fox Mulder -- "I Want To Believe."  But the fact is, the evidence we have thus far just isn't enough.  Humans are way too suggestible to rely entirely on anecdote.

Astrophysicist Neil deGrasse Tyson put it most succinctly: "In science, we need more than 'you saw it.'  When you have something tangible we can bring back to the lab and analyze, then we can talk."

****************************************



Saturday, March 23, 2024

Twisted faces

One of the most terrifying episodes The X-Files ever did was called "Folie à Deux."  In the opening scene, a man sees his boss not as a human but as a hideous-looking insectile alien who is, one by one, turning the workers in the company into undead zombies.

The worst part is that he's the only one who sees all of this.  Everyone else thinks everything is perfectly normal.

The episode captures in appropriately ghastly fashion the horror of psychosis -- the absolute conviction that the awful things you're experiencing are real despite everyone's reassurance that they're not.  In the show, of course, they are real; it's the people who aren't seeing it who are delusional.  But when this sort of thing happens in the real world, it is one of the scariest things I can imagine.  As I used to point out in my neuroscience classes, your brain is taking the information it receives from your sensory organs and trying to assemble a picture of reality from those inputs; if something goes wrong, and the brain puts that information together incorrectly, that flawed picture becomes your reality.  At that point, there is no reliable way to distinguish reality from hallucination.

I was, unfortunately, reminded of that episode when a friend and loyal reader of Skeptophilia sent me a link yesterday to a story in NBC News Online about a man with prosopometamorphopsia, a (thank heaven) rare disorder that causes the patient's perception of human faces to go awry.  When he looks at another person, he sees their face as grotesquely stretched, with deep grooves in the forehead and cheeks.

Computer-generated images of what the patient describes seeing [Image credit: Antônio Mello, Dartmouth College]

Weirdly, it doesn't happen when he looks at a drawing or a photograph; only actual faces trigger the shift.  A moving face -- someone talking, for example -- accentuates the distortion.

Some people with prosopometamorphopsia (PMO) have it from birth; most, though, acquire it through physical damage to the brain, such as a stroke or traumatic brain injury.  The patient who was the first subject of this study shows up in MRI images with a lesion on the left side of his brain that is undoubtedly the origin of the distorted perception.  As for the origin of the lesion, he had a severe concussion in his forties (he's now 59), but he also suffered accidental carbon monoxide poisoning four months before the onset of symptoms.  Whether one of those is the root cause, or it's something else entirely, is unknown.

At least now that he knows what's going on, he has been reassured that he's not going insane -- or, worse, seeing the world as it actually is and, like the man in "Folie à Deux," being the only one who does.  "My first thought was I woke up in a demon world," the patient told researchers, regarding how he felt when the symptoms started.  "I came so close to having myself institutionalized.  If I can help anybody from the trauma that I experienced with it and keep people from being institutionalized and put on drugs because of it, that’s my number-one goal."

I was immediately reminded of a superficially similar disorder called Charles Bonnet syndrome. (Nota bene: Charles Bonnet is no relation.  My French great-grandfather's name was changed upon arrival in the United States, so my last name shouldn't even be Bonnet.)  In this disorder, people with partial blindness, often from macular degeneration, start putting together the damaged and incomplete information their eyes are relaying to their brains in novel ways, causing what are called visual release hallucinations.  They can be complex -- one elderly woman saw what appeared to be tame lions strolling about in her house -- but there's no actual psychosis.  The people experiencing them, as with PMO, know (or can be convinced) that what they're seeing isn't real, which takes away a great deal of the anxiety, fear, and trauma of having hallucinations.

So at least that's one upside for PMO sufferers.  Still, it's got to be disorienting to look at the world around you and know for certain that what you're seeing isn't the way it actually is.  My eyesight isn't great, even with bifocals, but at least what I am seeing is real.  I'll take that over twisted faces and illusory lions any day.

****************************************



Wednesday, September 20, 2023

Faces in the woods

One of the first things I ever wrote about in this blog was the phenomenon of pareidolia -- because the human brain is wired to recognize faces, we sometimes see faces where there are only random patterns of lights and shadows that resemble a face.  This is why, as children, we all saw faces in clouds and on the Moon; and it also explains the Face on Mars, most "ghost photographs," and the countless instances of seeing the face of Jesus on grilled cheese sandwiches, tortillas, and concrete walls.

When I first mentioned pareidolia, almost thirteen years ago, it seemed like most people hadn't heard of it.  Recently, however, the idea has gained wider currency, and now when some facelike thing is spotted, and makes it into the mainstream press, the word seems to come up with fair regularity.  Which is all to the good.

But it does leave the woo-woos in a bit of a quandary, doesn't it?  If all of their ghost photographs and Faces on Mars and grilled cheese Jesuses (Jesi?) are just random patterns, perceived as faces because that's how the human brain works, what's a woo-woo to do?

Well, a post at the website Crystal Life gives us the answer.

Entitled "A Visit With the Nature Spirits," the author admits that pareidolia does occur:
How do you see nature spirits in trees?  You use pareidolia, a faculty of the mind that enables you to see patterns in objects where none supposedly exist.  It’s how we see faces and shapes and animals in water, rocks, and tree trunks.  Conventional psychology regards this faculty as pure imagination, but if it is used in a certain way, it can open you up to subtler realities of which conventional psychology is unaware.
Okay, so far so good.  So how do we tell the difference between imagining a face (which surely we all do from time to time), and seeing a face because there's a "nature spirit" present?  We can't, the writer says, because even if it is pareidolia, the spirits are still there.  She gives an example:
“Trees like to express their environment,” she [a like-minded person she was talking with] observes, and so create forms, such as burls, in their bark to reflect what they experience.  I could see the figures she described, although my immediate impression had been that of an energy like that of an octopus.  Atala explained that various people will see different images and aspects of the trees’ energy.  Overall her experiences of the nature spirit were more visual (she took many photographs), while mine were more kinesthetic.  It’s possible that with the pine tree, I was simply picking up certain tendrils of energy that it was extending toward me.
So, in other words -- if I'm understanding her correctly -- even if analysis of the photograph showed that the image we thought was a Nature Spirit turned out to be a happenstance arrangement of leaves and branches, it's still a Nature Spirit -- it's just that the Spirit used the leaves and branches to create his face?  (At this point, you should go back and click the link, if you haven't already done so; it includes some photographs of "Woodland Spirits" that she took, and that are at least mildly entertaining, including one of a guy "coming into rapport" with a tree.)

[Image licensed under the Creative Commons Lauren raine, Greenman mask with eyes, CC BY-SA 3.0]

Well, to a skeptic's ear, all of this sounds mighty convenient.  It's akin to a ghost hunter saying, "No -- the ghostly image wasn't just a smudge on the camera lens; the ghost created a smudge on your camera lens in order to leave his image on the photograph."  What this does, of course, is to remove photographic evidence from the realm of the even potentially falsifiable -- any alternate explanations simply show that the denizens of the Spirit World can manipulate their surroundings, your mind, and the camera or recording equipment.

The whole thing puts me in mind of China Miéville's amazing (and terrifying) short story "Details," in which a woman admits that cracks in sidewalks and stains on walls and patterns in carpet that happen to resemble faces are just random and meaningless -- but at the same time, they are monsters.  Here's how the main character, the enigmatic Mrs. Miller, describes it:
"For most people, it's just chance, isn't it?" Mrs Miller said.  "What shapes they see in a tangle of wire.  There's a thousand pictures there, and when you look, some of them just appear.  But now... the thing in the lines chooses the pictures for me.  It can thrust itself forward.  It makes me see it.  It's found its way through."
It does bear keeping in mind, though, that however wonderful Miéville's story is, you will find it on the "Fiction" aisle in the bookstore.  For a reason.

Of course, it's not like any hardcore skeptic considers photographic evidence all that reliable in the first place.  Besides pareidolia and simple camera malfunctions, programs like Photoshop have made convincing fakes too easy to produce.  This is why scientists demand hard evidence when people make outlandish claims -- show me, in a controlled setting, that what you are saying is true.  If you think there's a troll in the woods, let's see him show up in front of reliable witnesses.  Let's have a sample of troll hair on which to perform DNA analysis, or a troll bone to study in the lab.  If you say a house is haunted by a "spirit," design me a Spirit-o-Meter that can detect the "energy field" that you people always blather on about -- don't just tell me that you sensed a Great Disturbance in the Force, and if I didn't, it's just too bad that I don't have your level of psychic sensitivity.  Also, for cryin' in the sink, don't tell me that my "disbelief is getting in the way," which is another accusation I've had leveled at me.  Honestly, you'd think that, far from being discouraged by my disbelief, a ghost would want to appear in front of skeptics like myself, just for the fun of watching us piss our pants in abject terror.  ("I do believe in spooks, I do believe in spooks, I do believe, I do believe...")

In any case, the article on Crystal Life gives us yet another example of how the worlds of science and the paranormal define the word "evidence" rather differently.  The two views, I think, are probably irreconcilable.  So I'll end here, on that rather pessimistic note, not only because I've reached the end of my post for the day, but also because I just spilled a little bit of coffee on my desk, and I want to wipe it up before the Coffee Fairy fashions it into a scary-looking face.

****************************************



Thursday, June 15, 2023

Trompe l'oeil

I have a fascination for optical illusions.

Not only are they cool, they often point out some profound information about how we process sensory input.  Take the famous two-and-a-half pronged fork:


The problem here is that we're trying to interpret a two-dimensional drawing as if it were a three-dimensional object, and the two parts of the drawing aren't compatible under that interpretation.  Worse, when you try to force your brain to make sense of it -- following the drawing from the bottom left to the top right, and trying to figure out when the object goes from three prongs to two -- you fail utterly.

Neil deGrasse Tyson used optical illusions as an example of why we should be slow to accept eyewitness testimony.  "We all love optical illusions," he said. "But that's not what they should call them.  They should call them 'brain failures.'  Because that's what they are.  A clever drawing, and your brain can't handle it."

(If you have some time, check out this cool compendium of optical illusions collected by Michael Bach, which is even more awesome because he took the time to explain why each one happens, at least where an explanation is known.)

It's even more disorienting when an illusion occurs because of two senses conflicting.  Which was the subject of a paper out of Caltech, "What You Saw Is What You Will Hear: Two New Illusions With Audiovisual Postdictive Effects," by Noelle R. B. Stiles, Monica Li, Carmel A. Levitan, Yukiyasu Kamitani, and Shinsuke Shimojo.  What they did is an elegant experiment to show two things -- how sound can interfere with visual processing, and how a stimulus can influence our perception of an event, even if the stimulus occurs after the event did!

Sounds like the future affecting the past, doesn't it?  It turns out the answer is both simpler and more humbling; it's another example of a brain failure.

Here's how they did the experiment.

In the first trial, they played a beep three times, 58 milliseconds apart.  The first and third beeps were accompanied by a flash of light.  Most people thought there were three flashes -- a middle one coincident with the second beep.

The second setup was, in a way, the opposite of the first.  They showed three flashes of light, on the right, middle, and left of the computer screen.  Only the first and third were accompanied by a beep.  Almost everyone failed to see -- or, more accurately, failed to register -- the middle flash, and thought there were only two lights.

Sorry, I had to.
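The two setups boil down to a pair of stimulus timelines, which can be sketched in a few lines of code.  This is purely my own illustration -- the event lists, the function, and the crude "fused" model are mine, not the researchers' -- but it captures the structure of the two trials described above:

```python
# My own sketch of the two stimulus schedules (not the authors' code).
# Each event is (time in milliseconds, flash shown?, beep played?).
ILLUSORY_FLASH = [    # beep-beep-beep, flashes on the 1st and 3rd
    (0,   True,  True),
    (58,  False, True),   # beep alone -- most people report a flash here too
    (116, True,  True),
]

ILLUSORY_SILENCE = [  # flash-flash-flash, beeps on the 1st and 3rd
    (0,   True,  True),
    (58,  True,  False),  # flash alone -- most people fail to register it
    (116, True,  True),
]

def perceived_flashes(schedule, fused=True):
    """Toy model: when audio and video are 'fused', the beep pattern wins."""
    if fused:
        return sum(1 for _, _, beep in schedule if beep)
    return sum(1 for _, flash, _ in schedule if flash)

print(perceived_flashes(ILLUSORY_FLASH))    # 3 -- an extra, illusory flash
print(perceived_flashes(ILLUSORY_SILENCE))  # 2 -- the middle flash vanishes
```

The point of the toy model is just that in both trials, what subjects reported seeing tracked the *auditory* pattern, not the visual one.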

"The significance of this study is twofold," said study co-author Shinsuke Shimojo.  "First, it generalizes postdiction as a key process in perceptual processing for both a single sense and multiple senses.  Postdiction may sound mysterious, but it is not—one must consider how long it takes the brain to process earlier visual stimuli, during which time subsequent stimuli from a different sense can affect or modulate the first.  The second significance is that these illusions are among the very rare cases where sound affects vision, not vice versa, indicating dynamic aspects of neural processing that occur across space and time.  These new illusions will enable researchers to identify optimal parameters for multisensory integration, which is necessary for both the design of ideal sensory aids and optimal training for low-vision individuals."

All cool stuff, and more information about how the mysterious organ in our skull works.  Of course, this makes me wonder what we imagine we see because our brain anticipates that it will be there, or perhaps miss because it anticipates that something out of place shouldn't be there.  To end with another quote from Tyson: "Our brains are unreliable as signal-processing devices.  We're confident about what we see, hear, and remember, when in fact we should not be."

****************************************



Friday, June 9, 2023

The myth of the Golden Age

You hear it all the time, don't you?  There's no such thing as common decency any more.  Moral values are in freefall.  Simple politeness is a thing of the past.  Kids today don't understand the value of (choose all that apply): hard work, honesty, compassion, loyalty, friendship, culture, intellectual pursuits.  The whole world has gone seriously downhill.

Oh, and we mustn't forget "Make America Great Again."  Implying that there was a time in the past -- usually unspecified -- when America was great, but it's kind of gone down the tubes since then.  But it's not just the Republicans; a 2015 study found that 76% of respondents in the United States believed that "addressing the moral breakdown of the country should be a high priority for their government."

This whole deeply pessimistic attitude is widespread -- that compared to the past, we're a hopeless mess.  The first clue that this might not be accurate, though, comes from history, and not just the fact that the past -- regardless which part of it you choose -- had some seriously bad parts.  Consider in addition that just about every era has felt the same way about its own past.  Nineteenth century Europe, for example, had a nearly religious reverence for the societies of classical Rome and Greece -- which is ironic, because the Greeks and Romans at the height of their civilizations both looked back to their ancestors as living in a "Golden Age of Heroes" that had, sadly, devolved into chaos and highly unheroic ugliness.

The Golden Age by Pietro de Cortona (17th century) [Image is in the Public Domain]

So psychologists Adam Mastroianni (of Columbia University) and Daniel Gilbert (of Harvard University) decided to see if there was any truth to the claim that we really are in moral decline.

Their findings, which were published last week in Nature, drew on sixty years of surveys about moral values, with respondents from 59 different countries.  These surveys not only asked questions regarding whether morality had declined over the respondents' lifetimes (84% said it had), they asked them to rate their own values and their peers'.

Interestingly, although most people said things were worse now than they had been in the past, there was no decline over time in how people rated the values and morality of the people around them in the present.  The percentage of people respondents knew and described as kind, decent, honest, or hard-working has remained completely flat over the past sixty years.

So what's going on?

Mastroianni and Gilbert say it's simple.

People idealize the past because they have bad memories.

It's the same phenomenon as when we recall vacations where there have been mishaps.  After a couple of years have passed, we remember the positive parts -- the walks on the beach, the excellent food, the beautiful weather -- and the sunburn, mosquito bites, delayed flights, and uncomfortable hotel room beds have all faded from memory.  It has to be really bad before the unpleasant memories come to mind first, such as the trip I took with my wife to Belize where the guests and staff of the lodge where we were staying all simultaneously came down with the worst food poisoning I've ever experienced.

Okay, that I remember pretty vividly.  But most vacation mishaps?  Barely remembered -- or only recalled with a smile, a laugh, a "can you believe that happened?"

What Mastroianni and Gilbert found was that we put that same undeserved gloss on the past in general.  It's an encouraging finding, really; people aren't getting worse, morality isn't going downhill, the world isn't going to hell in a handbasket.  In reality, most people now -- just like in the past -- are honest and decent and kind.

The problem, of course, is figuring out how to get folks to stop looking at the past as some kind of Golden Age, given how widespread this belief is and how resistant it is to change.  Because the fact is, we have made significant strides in a great many areas: equality for women and minorities, LGBTQ rights and treatment, and concern for the environment are all far ahead of where they were even forty years ago.  There are a lot of ways the past wasn't all that great.

Believe me, as a closeted queer kid who grew up in the Deep South of the 1960s and 1970s, I wouldn't want to go back there for any money.

So maybe we need to turn our focus away from the past and look instead toward the future -- instead of lamenting some mythical and almost certainly false lost paradise, working toward making what's to come even better for everyone. 

****************************************



Thursday, March 9, 2023

Pitch perfect

I've been a music lover since I was little.  My mom used to tell the story of my being around four years old and begging her to let me put records on the record player.  At first, she was reluctant, but for once my persistence won the day, and she finally relented.  To my credit, despite my youth I was exceedingly careful and never damaged a record; the privilege was too important to me to risk revocation.  There were certain records I played over and over, such as Rimsky-Korsakov's Scheherazade (a piece I love to this day).

I've always been fascinated with the question of whether musicality is inborn or learned.  My parents, while they had a decent record collection, weren't musical themselves; they certainly didn't have anything like the passion for it I experienced.  While the capacity for appreciating music is still poorly understood, today I'd like to tell you about some research indicating that the way our brains interpret tone structure is inborn.

First, a little background.

While it may appear at first glance that the major scale -- to take the simplest iteration of tone structure as an example -- must be arbitrary, there's an interesting relationship between the frequencies of the notes.  Middle C, for example, has a frequency of about 260 hertz (depending on how your piano is tuned), and the C above middle C (usually written C') has exactly twice that frequency, 520 hertz.  Each note is half the frequency of the note one octave above it.  The G above middle C (which musicians would say is "a fifth above") has a frequency 3/2 that of the root note, or tonic (middle C itself): 390 hertz.  The E above middle C (a third above) has a frequency 5/4 that of middle C, or 325 hertz.  Together, these three make up the "major triad" -- a C major chord.  (The other notes in the major scale also have simple fractional values relative to the frequency of the tonic.)

[Nota bene: Music-theoretical types are probably bouncing up and down right now and yelling that this is only true if the scale is in just temperament, and that a lot of Western orchestral instruments are tuned instead in equal temperament, where each interval is an integer power of the fixed frequency ratio of one half-tone.  My response is: (1) yes, I know, (2) what I just told you is about all I understand of the difference, and (3) the technical details aren't really germane to the research I'm about to reference.  So you must forgive my oversimplifications.]
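For the numerically inclined, here's a quick sketch of the arithmetic -- the just-intonation ratios described above alongside their equal-temperament counterparts.  It assumes middle C = 260 Hz, as in the example (real tuning standards put it closer to 261.6 Hz):

```python
# Illustrative sketch: frequencies of the C major triad in just intonation
# vs. twelve-tone equal temperament, assuming middle C = 260 Hz as above.
C4 = 260.0

# Just intonation: simple whole-number ratios relative to the tonic.
just = {
    "C (tonic)":   C4 * 1,        # 260 Hz
    "E (third)":   C4 * 5 / 4,    # 325 Hz
    "G (fifth)":   C4 * 3 / 2,    # 390 Hz
    "C' (octave)": C4 * 2,        # 520 Hz
}

# Equal temperament: each half-tone multiplies frequency by 2**(1/12).
# E is 4 half-tones above C, G is 7, and the octave is 12.
equal = {name: C4 * 2 ** (semitones / 12)
         for name, semitones in
         [("C (tonic)", 0), ("E (third)", 4), ("G (fifth)", 7), ("C' (octave)", 12)]}

for name in just:
    print(f"{name:12s} just = {just[name]:7.2f} Hz   equal = {equal[name]:7.2f} Hz")
```

Notice that the two systems agree exactly on the octave but differ slightly on the third and the fifth -- which is the whole point of the nota bene above.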

Because there are such natural relationships between the notes in a scale, it's entirely possible that our ability to perceive them is hard-wired.  It takes no training, for example, to recognize the relationship between a string that is vibrating at a frequency of f (the lower wave on the diagram) and one that is vibrating at a frequency of 2f (the upper wave on the diagram).  There are exactly twice the number of peaks and troughs in the higher-frequency wave as there are in the lower-frequency wave.


Still, being able to see a relationship and hear an analogous one is not a given.  It seems pretty instinctive; if I asked you (assuming you're not tone deaf) to sing a note an octave up or down from one I played on the piano, you probably could do it, as long as it was in your singing range.

But is this ability learned because of our early exposure to music that uses that chord structure as its basis?  Testing this would require comparing a Western person's ability to match pitch and jump octaves (or other intervals) with that of someone who had no exposure to music with that structure -- and that's not easy, because most of the world's music has octaves, thirds, and fifths somewhere, even if there are other differences, such as the use of quarter-tones in a lot of Middle Eastern music.

This brings us to a paper in the journal Current Biology called "Universal and Non-universal Features of Musical Pitch Perception Revealed by Singing," by Nori Jacoby (of the Max Planck Institute and Columbia University), Eduardo A. Undurraga, Joaquín Valdés, and Tomás Ossandón (of the Pontificia Universidad Católica de Chile), and Malinda J. McPherson and Josh H. McDermott (of MIT).  And what this team discovered is something startling: there's a tribe in the Amazon which has had no exposure to Western music, and while they are fairly good at mimicking the relationships between pairs of notes, they seemed entirely unaware that they were singing completely different notes (as an example, if the researchers played a C and a G -- a fifth apart -- the test subjects might well sing back an A and an E -- also a fifth apart, but entirely different notes unrelated to the first two).
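The "same interval, different notes" point is easy to make concrete.  On a logarithmic pitch scale, C-to-G and A-to-E span the same distance even though none of the four notes match.  A quick sketch (the frequencies are standard A440 tuning values; the helper function is my own, not from the paper):

```python
import math

def interval_semitones(f1, f2):
    """Interval between two frequencies on a logarithmic (equal-tempered) scale."""
    return 12 * math.log2(f2 / f1)

# Standard-tuning frequencies: C4 -> G4 and A4 -> E5 are both a fifth apart.
C4, G4 = 261.63, 392.00
A4, E5 = 440.00, 659.26

print(round(interval_semitones(C4, G4), 2))  # 7.0 semitones -- a fifth
print(round(interval_semitones(A4, E5), 2))  # 7.0 semitones -- same interval,
                                             # entirely different notes
```

The Tsimane' singers, in effect, preserved the output of `interval_semitones` while ignoring which notes they started on.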

The authors write:
Musical pitch perception is argued to result from nonmusical biological constraints and thus to have similar characteristics across cultures, but its universality remains unclear.  We probed pitch representations in residents of the Bolivian Amazon—the Tsimane', who live in relative isolation from Western culture—as well as US musicians and non-musicians.  Participants sang back tone sequences presented in different frequency ranges.  Sung responses of Amazonian and US participants approximately replicated heard intervals on a logarithmic scale, even for tones outside the singing range.  Moreover, Amazonian and US reproductions both deteriorated for high-frequency tones even though they were fully audible.  But whereas US participants tended to reproduce notes an integer number of octaves above or below the heard tones, Amazonians did not, ignoring the note “chroma” (C, D, etc.)...  The results suggest the cross-cultural presence of logarithmic scales for pitch, and biological constraints on the limits of pitch, but indicate that octave equivalence may be culturally contingent, plausibly dependent on pitch representations that develop from experience with particular musical systems.
Which is a very curious result.

It makes me wonder if our understanding of a particular kind of chord structure isn't hardwired, but is learned very early from exposure -- explaining why so much of pop music has a familiar four-chord structure (hilariously lampooned by the Axis of Awesome in this video, which you must watch).  I've heard a bit of the aforementioned Middle Eastern quarter-tone music, and while I can appreciate the artistry, there's something about it that "doesn't make sense to my ears."

Of course, to be fair, I feel the same way about jazz.

In any case, I thought this was a fascinating study, and like all good science, opens up a variety of other angles of inquiry.  Myself, I'm fascinated with rhythm more than pitch or chord structure, ever since becoming enthralled by Balkan music about thirty years ago.  Their odd rhythmic patterns and time signatures -- 5/8, 7/8, 11/16, 13/16, and, no lie, 25/16 -- take a good bit of getting used to, especially for people used to good old Western threes and fours.

So to conclude, here's one example -- a lovely performance of a dance tune called "Gankino," a kopanica in 11/16.  See what sense you can make of it.  Enjoy!

****************************************



Friday, February 24, 2023

Saucy savagery

Kids these days, ya know what I mean?

Wiser heads than mine have commented on the laziness, disrespectfulness, and general dissipation of youth.  Here's a sampler:
  • Parents themselves were often the cause of many difficulties.  They frequently failed in their obvious duty to teach self-control and discipline to their own children.
  • We defy anyone who goes about with his eyes open to deny that there is, as never before, an attitude on the part of young folk which is best described as grossly thoughtless, rude, and utterly selfish.
  • The children now love luxury; they have bad manners, contempt for authority; they show disrespect for elders and love chatter in place of exercise.  Children are now tyrants, not the servants of their households.  They no longer rise when elders enter the room.  They contradict their parents, chatter before company, gobble up dainties at the table, cross their legs, and tyrannize their teachers.
  • Never has youth been exposed to such dangers of both perversion and arrest as in our own land and day.  Increasing urban life with its temptations, prematurities, sedentary occupations, and passive stimuli just when an active life is most needed, early emancipation and a lessening sense for both duty and discipline, the haste to know and do all befitting man's estate before its time, the mad rush for sudden wealth and the reckless fashions set by its gilded youth--all these lack some of the regulatives they still have in older lands with more conservative conditions.
  • Youth were never more saucy -- never more savagely saucy -- as now... the ancient are scorned, the honourable are condemned, and the magistrate is not dreaded.
  • Our sires' age was worse than our grandsires'.  We, their sons, are more worthless than they; so in our turn we shall give the world a progeny yet more corrupt.
  • [Young people] are high-minded because they have not yet been humbled by life, nor have they experienced the force of circumstances…  They think they know everything, and are always quite sure about it.
Of course, I haven't told you where these quotes come from. In order:
  • from an editorial in the Leeds Mercury, 1938
  • from an editorial in the Hull Daily Mail, 1925
  • Kenneth John Freeman, Cambridge University, 1907
  • Granville Stanley Hall, The Psychology of Adolescence, 1904
  • Thomas Barnes, The Wise Man's Forecast Against the Evil Time, 1624
  • Horace, Odes, Book III, 20 B.C.E.
  • Aristotle, 4th century B.C.E.
So yeah.  Adults saying "kids these days" has a long, inglorious history.  (Nota bene: the third quote, from Kenneth Freeman, has often been misattributed to Socrates, but it seems pretty unequivocal that Freeman was the originator.)

Jan Miense Molenaar, Children Making Music (ca. 1630) [Image is in the Public Domain]

I can say from my admitted sample-size-of-one that "kids these days" are pretty much the same as they were when I first started teaching 35 long years ago.  Throughout my career there were kind ones and bullies, intelligent and not-so-much, hard-working and not-so-much, readers and non-readers, honest and dishonest.  Yes, a lot of the context has changed; just the access to, and sophistication of, technology has solved a whole host of problems and created a whole host of other ones, but isn't that always the way?  In my far-off and misspent youth, adults railed against rock music and long hair in much the same way that they do today about cellphones and social media, and with about as much justification.  Yes, there are kids who misuse social media and have their noses in their smartphones 24/7, but the vast majority handle themselves around these devices just fine -- same as most of my generation didn't turn out to be drug-abusing, illiterate, disrespectful dropouts.

This comes up because of a study in Science Advances by John Protzko and Jonathan Schooler, called "Kids These Days: Why the Youth of Today Seem Lacking."  And its unfortunate conclusion -- unfortunate for us adults, that is -- is that the sense of today's young people being irresponsible, disrespectful, and lazy is mostly because we don't remember how irresponsible, disrespectful, and lazy we were when we were teenagers.  And before you say, "Wait a moment, I was a respectful and hard-working teenager" -- okay, maybe.  But so are many of today's teenagers.  If you want me to buy that we're in a downward spiral, you'll have to convince me that more teenagers back then were hard-working and responsible, and that I simply don't believe.

And neither do Protzko and Schooler.

So the whole thing hinges more on idealization of the past, and our own poor memories, than on anything real.  I also suspect that a good many of the older adults who roll their eyes about "kids these days" don't have any actual substantive contact with young people, and are getting their impressions of teenagers from the media -- which certainly doesn't have a vested interest in portraying anyone as ordinary, honest, and law-abiding.

Oh, and another thing.  What really gets my blood boiling is the adults who on the one hand snarl about how complacent and selfish young people are -- and then when young people rise up and try to change things, such as Greta Thunberg and the activists from Marjory Stoneman Douglas High School, they say, "Wait, not like that."  What, you only accept youth activism if it supports the status quo?

All well and good for kids to have opinions, until they start contradicting the opinions of adults, seems like.

Anyhow, I'm an optimist about today's youth.  I saw way too many positive things in my years as a high school teacher to feel like this is going to be the generation that trashes everything through irresponsibility and disrespect for tradition.  And if after reading this, you're still in any doubt about that, I want you to think back on your own teenage years, and ask yourself honestly if you were as squeaky-clean as you'd like people to believe.

Or were you -- like the youth in Aristotle's day -- guilty of thinking you knew everything, and being quite sure about it?

****************************************