Skeptophilia (skep-to-fil-i-a) (n.) - the love of logical thought, skepticism, and thinking critically. Being an exploration of the applications of skeptical thinking to the world at large, with periodic excursions into linguistics, music, politics, cryptozoology, and why people keep seeing the face of Jesus on grilled cheese sandwiches.

Thursday, May 23, 2019

Dog days

The nature/nurture debate has been going on for some time -- whether our behavior and personalities are controlled by our genetics or our environment.  Most scientists believe it's both; the argument now is over how much, and which parts, of our personalities are due to each factor.

The search for answers has led to some rather startling results.  Separated-twin studies -- involving locating sets of identical twins who were separated at birth and raised in different homes -- resulted in some correspondences that were astonishing.  The weirdest one, done back in 1979, found a pair of identical twin brothers who were separated at age four weeks and didn't even know of each other's existence -- and they were both named Jim, drove the same type of car, both had tension headaches and were chronic nail-biters, were both firefighters, and chain-smoked the same brand of cigarettes.

Now, even the most die-hard proponent of the personality-is-inborn explanation wouldn't claim that the kind of car you drive is genetic.  Some of these similarities are clearly due to the Law of Large Numbers -- in a big enough sample size, you'll find strange coincidences that don't really mean anything profound.  Add to that Dart-Thrower's Bias -- our tendency to notice and remember outliers -- and the Two Jims aren't really that hard to explain.  (And, of course, out of the thousands of pairs of twins studied, the media is going to point out the one that is the oddest.)
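That Law of Large Numbers point can be made concrete with a quick simulation.  This is a minimal sketch with invented numbers (1,000 twin pairs, 20 independent traits, a 10% chance of matching on any one trait by luck alone) -- not data from any actual twin study:

```python
import random

random.seed(1)  # fixed seed so the illustration is reproducible

def pairs_with_coincidences(n_pairs, n_traits, p_match):
    """Count how many of n_pairs match on at least 3 of n_traits
    purely by chance, when each trait matches with probability p_match."""
    count = 0
    for _ in range(n_pairs):
        matches = sum(random.random() < p_match for _ in range(n_traits))
        if matches >= 3:
            count += 1
    return count

# With 1,000 pairs, 20 traits, and a 10% chance per trait, roughly a
# third of the pairs end up sharing three or more traits by chance alone.
print(pairs_with_coincidences(1000, 20, 0.1))
```

So a pair of twins who "amazingly" share a handful of traits isn't amazing at all once you're looking across thousands of pairs -- and the media will always report the weirdest pair.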

But still.  Consider some of the other similarities.  While cigarette brand choice is certainly not genetic, addictive behaviors have a heritable component.  So does anxiety, which accounts for the headaches and the nail-biting.  And to be a firefighter, you have to have physical strength, courage, and an ability to take risks -- all features of personality that could well have an origin in our DNA.

Last week, a paper appeared in Scientific Reports that supports a strange conjecture -- that dog ownership is partly genetic.  The research, which came out of Uppsala University in Sweden, looked at the concordance rates of dog ownership between identical twins (who share 100% of their DNA) and fraternal twins (who share, on average, half of their DNA).  And the data were clear: the concordance between identical twins is far higher, supporting a substantial degree of heritability for dog ownership.
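The standard way to turn twin concordance figures like these into a heritability estimate is Falconer's formula, which doubles the gap between the identical-twin and fraternal-twin correlations.  Here's a minimal sketch; the correlation values are invented for illustration and are not the figures from the Fall et al. paper:

```python
def falconer_heritability(r_mz, r_dz):
    """Falconer's formula: broad-sense heritability estimated as
    h^2 = 2 * (r_MZ - r_DZ), where r_MZ and r_DZ are the trait
    correlations for identical and fraternal twins respectively."""
    return 2 * (r_mz - r_dz)

# Hypothetical numbers: if identical twins correlate at 0.50 and
# fraternal twins at 0.25, the estimate is ~50% heritability.
print(falconer_heritability(0.50, 0.25))  # → 0.5
```

The logic is that both kinds of twins typically share a home environment, so any extra similarity in the identical twins is attributed to their extra shared DNA.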

"We were surprised to see that a person's genetic make-up appears to be a significant influence in whether they own a dog," said Tove Fall, professor of medical epidemiology and lead author of the study.  "As such, these findings have major implications in several different fields related to understanding dog-human interaction throughout history and in modern times.  Although dogs and other pets are common household members across the globe, little is known how they impact our daily life and health.  Perhaps some people have a higher innate propensity to care for a pet than others."


"The study has major implications for understanding the deep and enigmatic history of dog domestication," said Keith Dobney, Chair of Human Palaeoecology in the Department of Archaeology, Classics and Egyptology at the University of Liverpool, who co-authored the study.  "Decades of archaeological research have helped us construct a better picture of where and when dogs entered into the human world, but modern and ancient genetic data are now allowing us to directly explore why and how."

My sense of why I have dogs is that I've somehow become convinced that my house just isn't filthy enough.  The entryway from our back yard was once off-white linoleum, but years of tracking by various dogs we've owned has left it way more off than white.  And I'm assuming the indoor-outdoor carpeting in our basement has been tan all along, but who knows?

Of course, we own dogs for more than just the random carpet stains and pieces of dismembered squirrel.  They're sweet, cuddly, love to play, and have a boundless enthusiasm for enjoying life that I can only aspire to.  Every time we lose one -- inevitable, given their shorter life span -- it leaves me grieving for months.  And while I've sometimes contemplated not replacing them when they're gone, within the year I'm already perusing PetFinder looking for another rescue puppy to give a home.

Unsurprisingly, given the Fall et al. study, my parents were also major dog lovers.  My dad had a little terrier named Max whom he adopted while working for the post office as a letter carrier.  He had a walking route, and Max would meet him every morning and follow him the entire way.  My dad started bringing snacks for him, and eventually Max's owners said, "You know, we don't have time for him anyway, would you like to take Max home?"  My dad agreed -- and his canine pal spent the rest of his life following him around, even after Max had gone completely blind from cataracts.  In fact, Max would walk behind my dad, keeping track of where he was by sound and smell, and when my dad stopped Max would keep going till he bumped into my dad's leg, then stand there, nose pressed against him, until he started walking again.

Old habits die hard.

Anyhow, this gives us a new perspective on dog ownership, and the strange relationship between our genes and our behavior.  But I need to wind this up, because Guinness wants to play ball, and track more mud around the basement.  You know how it goes.

***********************************

Back in 1989, the United States dodged a serious bullet.

One hundred wild monkeys were imported for experimental purposes, and housed in a laboratory facility in Reston, Virginia, outside of Washington DC.  Soon afterwards, the monkeys started showing some odd and frightening symptoms.  They'd spike a fever, become listless and glassy-eyed, and at the end would "bleed out" -- capillaries would start rupturing all over their body, and they'd bleed from every orifice including the pores of the skin.

Precautions were taken, but at first the researchers weren't overly concerned.  Most viruses have a feature called host specificity, which means that they tend to be infectious only in one species of host.  (This is why you don't need to worry about catching canine distemper, and your dog doesn't need to worry about catching your cold.)

It wasn't until someone realized the parallels with a (then) obscure viral outbreak in 1976 in Zaire (now the Democratic Republic of the Congo) that the researchers realized things might be much more serious.  To see why, let me just say that the 1976 epidemic, which completely wiped out three villages, occurred on...

... the Ebola River.

Of course, you know that the feared introduction of this deadly virus into the United States didn't happen.  But to find out why -- and to find out just how lucky we were -- you should read Richard Preston's book The Hot Zone.  It's a brilliantly written book detailing the closest we've come in recent years to a pandemic, and that from a virus that carries with it a 95% mortality rate.  (One comment: the first two chapters of this book require a strong stomach.  While Preston doesn't go out of his way to be graphic, the horrifying nature of this disease makes some nauseating descriptions inevitable.)

[Note:  If you purchase this book through the image/link below, part of the proceeds will go to supporting Skeptophilia!]





Wednesday, May 22, 2019

Decoding an enigma

One of the most difficult problems in linguistics is deciphering a text when (1) you don't know what language it's in, and (2) you don't know what sound or sounds the symbols stand for.

The classic example of a case where this seemingly impossible task was accomplished is the (rightly) celebrated decipherment of the Linear B script of Crete by Alice Kober and Michael Ventris.  Earlier attempts had incorrectly identified it as Etruscan, Phoenician, an early Celtic language, Basque, or Hittite.  It was only when Kober meticulously catalogued symbols that frequently appeared together at the ends of words -- evidence that the underlying language was inflected -- that the way was opened for Ventris to identify the script as the written form of an archaic dialect of Mycenaean Greek, and to decipher it.

So: not easy, but a tempting job for a dedicated linguist.  This is why people have been working for years on the celebrated Voynich Manuscript, a fifteenth-century illustrated text with an unknown orthography.  Only with the Voynich Manuscript, there was the additional possibility of its being a hoax -- random symbols spelling out gibberish.

The latter possibility is supported by the fact that some of the "words" in the manuscript appear three or more times in a row -- and that the distribution of symbols within words is strange.  Gonzalo Rubio, professor of ancient languages at Pennsylvania State University, said, "[T]he things we know as 'grammatical markers' – things that occur commonly at the beginning or end of words, such as 's' or 'd' in our language, and that are used to express grammar, never appear in the middle of 'words' in the Voynich manuscript.  That's unheard of for any Indo-European, Hungarian or Finnish language."  There's also an unusually high occurrence of words that differ by only one letter -- another feature that is highly odd.  Such characteristics, said cryptanalyst Elizebeth Friedman, mean that any attempts to decode it are "doomed to frustration."
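The kind of positional analysis Rubio is describing is easy to sketch: for each symbol, tally how often it appears at the start, middle, or end of a word.  The sample words below are invented stand-ins for Voynich transcriptions, purely for illustration:

```python
from collections import Counter

def positional_counts(words):
    """Tally each symbol's frequency in word-initial, word-medial,
    and word-final position."""
    counts = {"initial": Counter(), "medial": Counter(), "final": Counter()}
    for word in words:
        for i, ch in enumerate(word):
            if i == 0:
                counts["initial"][ch] += 1
            elif i == len(word) - 1:
                counts["final"][ch] += 1
            else:
                counts["medial"][ch] += 1
    return counts

sample = ["daiin", "daiin", "qokeedy", "shedy"]  # invented sample "words"
counts = positional_counts(sample)
print(counts["final"])  # which symbols only ever appear word-finally?
```

In a real natural language, grammatical markers favor word edges but still turn up medially; a symbol inventory in which certain glyphs are rigidly locked to one position is part of what makes Voynichese look so unlike ordinary language.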

This hasn't stopped people from trying, though.  Like the early guesses about Linear B, Voynich has provoked some strange hypotheses -- that it represented Chinese, a dialect of northern German, Latin, English, a synthetic/constructed language (like Klingon and Elvish), or that it was a true code where only certain symbols carried meaning (such as codes where the first letter in each word spells out a message).  None of these has panned out, and a lot of linguists have declared the manuscript untranslatable.

A page from the Voynich Manuscript [Image is in the Public Domain]

Which still hasn't stopped people from trying.  And just two weeks ago, a paper showed up in the journal Romance Studies claiming that a linguist named Gerard Cheshire at the University of Bristol has cracked the Voynich Manuscript -- and that it's written in a language he calls "proto-Romance" that is somewhere between Church Latin and Italian.

Of course, this got a lot of linguists pretty stirred up.  One of the chief complaints -- one which I had myself when I read the paper -- is that he gives no explanation of how he arrived at his answer, only that "the solution was found by employing an innovative and independent technique of thought experiment."

Myself, I find that unconvincing.  To return to the Linear B decipherment, rightly considered to be the gold standard for such undertakings, Kober and Ventris had a strong and detailed argument for why the order and distribution of the symbols supported the hypothesis that the language it represented was Mycenaean Greek.  The attitude in the scientific world, in every field, is that if you want your claim to be believed, you need to present not only your data and your conclusions, but a cogent argument of how you got from one to the other.  Here, we have nothing more to go on but Cheshire's claim of an "innovative technique" and his end result, the sound/symbol correspondence.

Some linguists were a lot more blunt than I'm being.  Lisa Fagin Davis, executive director of the Medieval Academy of America, tweeted a response to Cheshire's paper, "Sorry, folks, 'proto-Romance language' is not a thing.  This is just more aspirational, circular, self-fulfilling nonsense."  In an article in Ars Technica, she explains in more detail:
As with most would-be Voynich interpreters, the logic of this proposal is circular and aspirational: he starts with a theory about what a particular series of glyphs might mean, usually because of the word’s proximity to an image that he believes he can interpret.  He then investigates any number of medieval Romance-language dictionaries until he finds a word that seems to suit his theory.  Then he argues that because he has found a Romance-language word that fits his hypothesis, his hypothesis must be right.  His “translations” from what is essentially gibberish, an amalgam of multiple languages, are themselves aspirational rather than being actual translations.
After the criticisms began to appear, the University of Bristol walked back their support of Cheshire, stating, "Following media coverage, concerns have been raised about the validity of this research from academics in the fields of linguistics and medieval studies. …  The research was entirely the author’s own work and is not affiliated with the University of Bristol, the school of arts nor the Centre for Medieval Studies."

Which is pretty harsh.  Cheshire, of course, is not backing down, and says he'll be vindicated.  And after all, that's how science works.  Someone proposes a model to explain some set of data, throws the model out there to his/her colleagues, and lets the feeding frenzy begin.  Those models that bear up to peer review, replication, and deep analysis are supported, and ones that don't are thrown out.

My guess is that this will turn out to be the latest in a long list of unsuccessful attempts to decipher one of the oddest artifacts from the Middle Ages, but who knows?  Maybe Cheshire will turn out to have hit the bullseye.  After all, people didn't believe Kober and Ventris at first.  And if he's right, it'll be a coup -- and provide us with an answer to one of the most persistent enigmas in linguistics.

***********************************

Tuesday, May 21, 2019

The stone hand illusion

One of the reasons I trust science is that I have so little trust in my own brain's ability to assess correctly the nature of reality.

Those may sound like contradictions, but they really aren't.  Science is a method that allows us to evaluate hard data -- measurements by devices that are designed to have no particular biases.  By relying on measurements from machines, we are bypassing our faulty sensory equipment, which can lead us astray in all sorts of ways.  In Neil deGrasse Tyson's words, "[Our brains] are poor data-taking devices... that's why we have machines that don't care what side of the bed they woke up on that morning, that don't care what they said to their spouse that day, that don't care whether they had their morning caffeine.  They'll get the data right regardless."

But we still believe that we're seeing what's real, don't we?  "I saw it with my own eyes" is still considered the sine qua non for establishing what reality is.  Eyewitness testimony is still the strongest evidence in courts of law.  Because how could it be otherwise?  Maybe we miss minor things, but how could we get it so far wrong?

A scientist in Italy has knocked another gaping hole in our confidence that our brain can correctly interpret the sensory information it's given -- this time with an actual hammer.

Some of you may have heard of the "rubber hand illusion," demonstrated in a study back in 1998 by Matthew Botvinick and Jonathan Cohen.  In their experiment, a rubber hand is placed in view of a person whose actual hand is shielded from view by a curtain.  The rubber hand is stroked with a feather at the same time as the person's real (but out-of-sight) hand receives a similar stroke -- and within minutes, the person becomes strangely convinced that the rubber hand is his own.

The Italian experiment, which I found out about from an article in Discover Online, substitutes an auditory stimulus for the visual one -- with an even more startling result.

Irene Senna, professor of psychology at Milano-Bicocca University in Milan, rigged up a scenario similar to Botvinick and Cohen's.  A subject sits with one hand through a screen.  On the back of the subject's hand is a small piece of foil which connects an electrical lead to a computer.  The subject sees a hammer swinging toward her hand -- but the hammer stops just short of smashing it, touching the foil only gently (which, of course, she can't see).  The touch of the hammer sends a signal to the computer -- which then produces a hammer-on-marble chink sound.

And within minutes, the subject feels like her hand has turned to stone.

[Image licensed under the Creative Commons Tony Hisgett from Birmingham, UK, Hand Sculpture 1 (22797821268), CC BY 2.0]

What is impressive about this illusion is that the feeling persists even after the experiment ends and the screen is removed -- and even though the test subjects knew what was going on.  Subjects felt afterwards as if their hands were cold, stiff, heavier, and less sensitive.  They reported difficulty bending their wrists.

To me, the coolest thing about this is that our knowledge centers, the logical and rational prefrontal cortex and associated areas, are completely overcome by the sensory-processing centers when presented with this scenario.  We can know something isn't real, and simultaneously cannot shake the brain's decision that it is real.  None of the test subjects was crazy; they all knew that their hands weren't made of stone.  But presented with sensory information that contradicted that knowledge, they couldn't help but come to the wrong conclusion.

And this once again illustrates why I trust science, and am suspicious of eyewitness reports of UFOs, Bigfoot, ghosts, and the like.  Our brains are simply too easy to fool, especially when emotions (particularly fear) run high.  We can be convinced that what we're seeing or hearing is the real deal, to the point that we are unwilling to admit the possibility of a different explanation.

But as Senna's elegant little experiment shows, we just can't rely on what our senses tell us.  Data from scientific measuring devices will always be better than pure sensory information.  To quote Tyson again: "We think that the eyewitness testimony of an authority -- someone wearing a badge, or a pilot, or whatever -- is somehow better than the testimony of an average person.  But no.  I'm sorry... it's all bad."

***********************************

Monday, May 20, 2019

All in the family

In the latest from the Wishful Thinking department, we have a woman in Murrysville, Pennsylvania who claims she is the Virgin Mary's first cousin, 65 times removed.

Mary Beth Webb began her inquiry into her genealogy in 1999, shortly after her brother was diagnosed with terminal cancer.  Like most of us who have done genealogical research, Webb started with census and other vital records, and used online resources like Ancestry.com and Rootsweb.  But this evidently proved inadequate -- she began to run into dead ends, which genealogists call "brick walls."  I have several of these frustrating people in my own family tree, the most annoying of which is the direct paternal ancestor of my grandmother.  His name is recorded as John Scott in all of the records -- but a recent Y-DNA study of one of his patrilineal descendants proved beyond question that he was actually a Hamilton, allied to the Scottish Clan Hamilton of Raploch.  And interestingly... two of his grandsons were named Hamilton Scott.

But we have been unable to find anything more about his origins, despite extensive research.

Perhaps, though, we should take a page from Webb's book.  Because when she became stymied by various long-dead ancestors, she adopted a novel method for researching her roots.  She simply asked her parents.

The "novel" part comes in because her parents were both dead at the time.

Fortunately for her, her cousin is a medium, and was happy to contact her parents for her, and (after his death) her brother.  And all three of the dear departed told her all sorts of details about her ancestors, because (after all) the whole lot of them were up in heaven with them.

I don't know if that'd work so well in my family.  I've got some seriously sketchy ancestry, including a guy who spent years in prison in New Jersey for "riot, poaching, and mischief," a Scottish dude who lost his soul to the devil in a game of cards, and a French military officer who very nearly got hanged for killing a guy he found in flagrante delicto with his wife.  So I might have better success if the medium tried contacting people down below, if you get my drift.

"Yes... great-great-great grandpa Jean-Pierre says to tell you hi, and also to let you know you're a direct descendant of Attila the Hun.  Also, please send down an air conditioner, because it's a bit toasty down here.  Thanks bunches."

But of course, Webb's relatives all were either nicer or luckier or both, so she got scads of heaven-sent information about her genealogy.  And after a bit of this kind of "research," she found out that she was a direct descendant of Joseph of Arimathea, who was allegedly the Virgin Mary's uncle. According to Webb's calculations, this makes her Mary's first cousin 65 times removed.

The problem is, the whole thing about Descent From Antiquity (as genealogists refer to any claim of proven pre-medieval ancestry) is that the best historians don't consider any of it to be true.  The time between the Fall of Rome and the beginning of the Medieval Age was seriously lacking in reliable documentation, and what we have in the way of such records stands a good chance of being (1) a forgery, or (2) a lie.  Or (3) both.  By the time the Medieval Age was in full swing, the Roman era was looked upon as a Golden Age, despite the fact that a good fraction of the nobility in ancient Rome seemed to have some major screws loose.  So there were lots of people claiming descent from the Emperors and Empresses to boost their own stature, with several proposed routes going through the proconsul Flavius Afranius Syagrius, and thence to the Egyptian pharaohs and whatnot.


But some people one-up even Webb's claims, and trace their lineages all the way back to Adam and Eve.  I kid you not. If you go into Rootsweb, you can do a search for people descended from Adam and Eve, and find thousands.

Now that's what I call descent from antiquity.

But, sadly, even the descent from the Romans relies on poor historical research and lots of wishful thinking, as does Webb's claim to have proven descent from Joseph of Arimathea.  About as far back as anyone with European ancestry can reliably get is Charlemagne, which sounds cool but isn't, because damn near everyone with European ancestry descends from him -- he was proficient at one other thing besides ruling most of Western Europe, if you catch my meaning.

But honestly, that's really not that surprising.  Given the small size of the population back then, if you go back far enough (some geneticists say 1200 C.E. is sufficient), you descend from everyone in your ancestral homeland who left descendants.  Put another way: prior to 1200 C.E., you can divide all of humanity into two groups: those who were the ancestors of nearly everyone alive on Earth today, and those who were the ancestors of no one.  So we're all cousins, really.  And if Joseph of Arimathea left progeny -- which no one knows for sure -- then chances are, Mary Beth Webb is his descendant.
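The arithmetic behind that claim is straightforward: your nominal number of ancestors doubles every generation, and within a few dozen generations that number overtakes the population of the planet, forcing lines of descent to overlap massively.  A back-of-the-envelope sketch, assuming roughly 30 years per generation (my assumption, not a figure from any study):

```python
GENERATION_YEARS = 30          # assumed average generation length
WORLD_POP_1200 = 400_000_000   # rough estimate of world population c. 1200 C.E.

generations = (2019 - 1200) // GENERATION_YEARS   # ≈ 27 generations back
nominal_ancestors = 2 ** generations              # ancestor "slots" at that depth

print(generations, nominal_ancestors)
# 27 generations gives 134,217,728 slots -- a large fraction of everyone
# alive in 1200 C.E., so the same people must fill many slots over and over.
```

That overlap ("pedigree collapse") is why everyone alive back then who left descendants at all ends up in essentially everyone's family tree today.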

But chances are so am I, and (if you have European or Middle Eastern ancestry), so are you.

But I don't know that because my dead relatives told me so, I just know it because of genetic studies and logic.  Which may be less cool, but is a damn sight more reliable than trying to get a direct line to great-great-great grandpa Jean-Pierre down in hell.

***********************************

Saturday, May 18, 2019

The path to acceptance

Lately, it's been mighty hard to stay upbeat.  Insularity, fear, intolerance, and suspicion have ruled the day, along with their inevitable outcomes -- racism, homophobia, and other forms of bigotry.

Even so, I've always been optimistic about humanity in general.  Yes, we're capable of some horrifying actions, but there are just as many (or more) cases where people acted with astonishing selflessness.  We're a complex species, and we often don't respond the way you'd think -- in fact, sometimes we even surprise ourselves.

What a lot of social scientists would like to know is how to decrease the former and increase the latter.  And new research in Proceedings of the National Academy of Sciences, by Miguel R. Ramos, Matthew R. Bennett, Douglas S. Massey, and Miles Hewstone, supports a contention I've had for years: that it's very hard to stay prejudiced against a group once you start interacting with its members.

[Image licensed under the Creative Commons Frerieke from The Hague, The Netherlands, Diversity and Unity, CC BY 2.0]

In a paper published last week called "Humans Adapt to Social Diversity Over Time," Ramos et al. describe an interesting tendency.  Faced with changes to what had been a homogeneous population, through immigration or a change in the acknowledgment of a group (as in LGBTQ individuals coming out), initially there's resistance, but over time the original population responds by adjusting and becoming more accepting overall.

The authors write:
Humans have evolved cognitive processes favoring homogeneity, stability, and structure.  These processes are, however, incompatible with a socially diverse world, raising wide academic and political concern about the future of modern societies.  With data comprising 22 [years] of religious diversity worldwide, we show across multiple surveys that humans are inclined to react negatively to threats to homogeneity (i.e., changes in diversity are associated with lower self-reported quality of life, explained by a decrease in trust in others) in the short term.  However, these negative outcomes are compensated in the long term by the beneficial influence of intergroup contact, which alleviates initial negative influences.  This research advances knowledge that can foster peaceful coexistence in a new era defined by globalization and a socially diverse future.

In other words, bigotry can be cured.  I know more than one case of a family where one of the individuals was seriously homophobic -- until someone they're close to came out.  At that point, the bigoted individual has to adjust those negative stereotypes to what (s)he knows of the person (s)he loves and has known for years.  It's hard to hate someone once you recognize their common humanity, when you see they laugh, love, hurt, and bleed just like you do.

Of course, it can go the other way.  There are all too many cases of bigotry that has survived contact with members of the disparaged group -- of people still targeted by coworkers, neighbors, and family members, of people victimized by those they've known for years.  But the hopeful message of the Ramos et al. paper is that this reaction is far less common than an increase in acceptance, trust, and understanding.

It's easy to focus on the negative, and certainly the media encourages that.  Outrage increases readership, and (let's face it) there's still a lot to be outraged about.  But this research gives us a way to combat the human tendencies toward insularity, suspicion, and prejudice.  And it's also a hopeful note for our own society, which is becoming more heterogeneous whether we want it to or not.  What Ramos et al. suggest is that we can expect some growing pains -- of the sort exemplified by our current leadership -- but that over time, we will come to accept, and even appreciate, our diversity, to look upon it as a strength rather than a threat.

*******************************************

When the brilliant British neurologist and author Oliver Sacks died in August of 2015, he was working on a collection of essays that delved into some of the deepest issues scientists consider: evolution, creativity, memory, time, and experience.  A year and a half ago, that collection was published under the title The River of Consciousness, and in it he explores those weighty topics with his characteristic humor, insight, and self-deprecating humility.

Those of us who were captivated by earlier works such as The Man Who Mistook His Wife for a Hat, Musicophilia, Awakenings, and Everything in its Place will be thrilled by this book -- the last thoughts of one of the best thinkers of our time.

[Note:  If you purchase this book using the image/link below, part of the proceeds goes to support Skeptophilia!]





Friday, May 17, 2019

Lurching toward Gilead

Two months ago, an eleven-year-old girl in Argentina was forced to give birth to a baby conceived because she was raped by her grandmother's 65-year-old boyfriend.

The girl was clear that she wanted to terminate the pregnancy.  She told her doctors, "I want to remove what the old man put inside me."  But the doctors hemmed and hawed -- lied, even, giving her vitamins they claimed would make the fetus develop faster.  In February, she gave birth by caesarean section to a baby that is not expected to survive.  During the procedure, the girl's blood pressure rose to life-threatening levels, requiring emergency treatment; she nearly died herself.

This is the kind of situation we will ultimately face in Alabama, which this week passed the most stringent abortion restrictions in the United States, signed into law by Governor Kay Ivey.  In fact, by current law, had the Argentinian girl been a citizen of Alabama and gone through with terminating the pregnancy, she and the doctor who performed it would have been punished far more harshly than the rapist who violated her.

Make no mistake here.  This is not about discouraging unwanted pregnancies.  This is about controlling women and restricting their choices.  If elected officials in Alabama were actually concerned about decreasing unwanted pregnancies, they'd mandate comprehensive sex education and increase access to contraceptives, which are two things that have been shown to actually work.  (States with both of those have a lower abortion rate overall, and as access to birth control has become more common worldwide, the overall abortion rate has fallen steadily.)  Instead, what they have now -- abstinence-only sex education and restrictions on contraceptives -- is correlated with a higher rate of teen pregnancy.

[Image licensed under the Creative Commons internets_dairy, Pro-choice chants (2509914840), CC BY 2.0]

The most infuriating part is these anti-choice politicians calling themselves "pro-life."  They're not "pro-life."  They're "pro-embryo."  They talk about how precious and God-given a little blob of cells is, but once that baby's born?  It's on its own.  Alabama is tied for fourth-to-worst in terms of infant mortality, fourth-to-worst in health care and health care outcomes, fifth-to-worst in terms of access and funding for mental health care, and dead last in terms of education.

So "pro-life"?  Give me a break.  The attitude in Alabama is that a person's rights begin at conception and end at birth.

Hell, we're giving women less bodily autonomy than we give corpses.  It is illegal to take an organ from a corpse unless the person gave express permission prior to death, even if it would save someone's life.  So once again: this isn't about saving lives.  It's about controlling the choices of women.

And it's not that I'm "pro-abortion."  Come on, really?  No one is "pro-abortion."  Abortion is not something anyone takes lightly.  It is a gut-wrenching decision and often is simply the best of bad choices.  The decision is between a woman and her doctor.  My beliefs or opinions about it have no place in the discussion.  None.

That misrepresentation of what "pro-choice" means has been turbo-charged by the current administration, where Donald Trump once again (surprise!) blatantly lied to stir up his fanatical base by saying that under current rules, babies can be killed at birth.  His exact words:
The baby is born.  The mother meets with the doctor.  They take care of the baby.  They wrap the baby beautifully.  And then the doctor and the mother determine whether or not they will execute the baby.
Which, of course, doesn't happen.  Ever.  The truth is that only 1% of abortions take place after twenty weeks into the pregnancy, and those are almost always because the life of the mother is at risk.  So as usual, we have Trump making shit up as he goes along, and his followers enshrining it as revealed truth.  Because, after all, it's much easier to demonize your opponents if you represent their position as a straw man, especially when the people who support you don't question a single damn thing you say.

And while we're at it: why is there no discussion amongst the legislators in Alabama about penalizing the men who fathered aborted fetuses?  Women can be sent to prison for life, as can their doctors.  The guy who's responsible for the pregnancy?  Nada.

So this is where we are in Alabama.  Also Georgia, Ohio, and (soon) Missouri.  This is, pure and simple, the crafting of law based on religion (which is the impetus for most "pro-life" talking points), with no acknowledgment of the complexity of the issue, of the impact this has on women's autonomy, of what this says to victims of rape and incest.  It's certain to be challenged in the courts, but if it makes its way to the Supreme Court, my fear is that it'll stand -- thus the Right's stalwart defense of vehemently anti-choice Brett Kavanaugh.  (Remember Susan Collins's mealy-mouthed support of Kavanaugh, that he wouldn't "overturn established law?"  Yeah.  If you don't think that the emboldened anti-choicers are now going to go after Roe v. Wade, I've got oceanfront property in Nebraska I'd like to sell you.)

I'm going to keep talking, keep writing, keep fighting, but it's taking a toll.  Those of us who object to the rightward lurch this country has taken, which a friend of mine calls "Gileadification," have to stand up and speak.  Loudly.  But we're having to fight this on so many fronts -- the upsurge in white supremacy, the eroding of rights for LGBTQ individuals and minorities, the rampant corruption in the federal government, the warmongering, the blocking by Mitch McConnell and his cronies of any legislation that hints of bipartisanship, and the daily barrage of lies -- and it's exhausting me.  I'm not anywhere near giving up, but man, I am seriously ready for some good news for a change.

*******************************************


Thursday, May 16, 2019

Walls in our minds

One of the biggest unsolved mysteries of science is how the brain encodes what we know, think about, and experience.

For many types of information, we know where in the brain they're stored.  But in what form, and how we retrieve them, is still unknown.  As I tell my neuroscience students: think of something simple, like your middle name, the name of your first pet, or the name of the first president of the United States.  Now: where in your mind was that information before I asked you to remember it?

Pretty bizarre to consider, isn't it?

Interpretation of sensory input is as mysterious as memory.  When I look around my office, where I'm currently writing this, I can recognize all sorts of objects -- my CD collection, books, the masks hanging on the wall, a wine glass hand-made by my son, an antique typewriter my wife got me for my last birthday.  But... how?  What's being projected onto my retina is just a bunch of splotches of light, shadow, and color, and my brain has to take that chaos and somehow make sense of it.  How can we tell where the edge of an object is?  How do we recognize things -- and know them to be the same objects when seen from a different angle, even though the pattern of colors thrown onto our retinas has completely changed?

Interestingly, there are some people who can't do this.  In apperceptive visual agnosia, usually caused by damage to the visual cortex, people are cognitively normal in every respect except that they can't recognize anything they see.  It all looks like a random moving kaleidoscope of colors.  Because they're otherwise normal, they can remember what they're told and respond appropriately -- if someone said, "Hey, you see that blob of blue and tan and brown over there?  That's a person, and he's named Gordon," a sufferer from apperceptive visual agnosia would be able to say, "Oh, hi, Gordon," and have a normal conversation with me.  But if I stood up (changing the shape of the blob of color), or changed my shirt, or made any other alteration to what he was seeing, he'd no longer recognize me -- not only as a particular human, but as human at all.  Because he's perfectly intelligent, he might be able to reason that since Gordon was over there a few seconds ago, and there's a different blob of blue and tan and brown nearby, that's probably Gordon too, but it wouldn't be because he actually recognized me.  It would be a logical inference, not visual recognition.

An interesting piece was added to the puzzle last week with a paper in Neuron, from research at Columbia University led by neuroscientist Nikolaus Kriegeskorte.  He and his team were investigating how we perceive walls -- how we know where the edges and barriers are in our environment, a critical skill for spatial navigation.  By showing participants images containing walls and other barriers, allowing them to navigate the space virtually, and using fMRI and magnetoencephalography (MEG) neuroimaging, they were able to localize edge and obstacle processing to the "occipital place area" (OPA), one of the visual processing centers.

[Image licensed under the Creative Commons Pawel Wozniak, Brick wall close-up view, CC BY-SA 3.0]

"Vision gives us an almost instant sense where we are in space, and in particular of the geometry of the surfaces -- the ground, the walls -- which constrain our movement," Kriegeskorte said.  "It feels effortless, but it requires the coordinated activity of multiple brain regions.  How neurons work together to give us this sense of our surroundings has remained mysterious.  With this study, we are a step closer to solving that puzzle...  Previous studies had shown that OPA neurons encode scenes, rather than isolated objects.  But we did not yet understand what aspect of the scenes this region's millions of neurons encoded...  We would like to put these [data] together and build computer vision systems that are more like our own brains, systems that have specialized machinery like what we observe here in the human brain for rapidly sensing the geometry of the environment."

Once again, we have an idea of where our perception of barriers is housed, but not so much information about how it's stored or accessed.  How, when we see a wall -- especially, as in this experiment, a two-dimensional representation of a wall -- do we recognize it as such, and not just as a smear of colors, lines, and angles?  As Kriegeskorte said, we're a step closer, which is fantastic -- but still a long way away from solving the puzzle of perception.

*******************************************
