Skeptophilia (skep-to-fil-i-a) (n.) - the love of logical thought, skepticism, and thinking critically. Being an exploration of the applications of skeptical thinking to the world at large, with periodic excursions into linguistics, music, politics, cryptozoology, and why people keep seeing the face of Jesus on grilled cheese sandwiches.

Wednesday, February 19, 2020

Thin places

My first trip overseas, back in 1995, was an ambitious one; I did a month-long solo hike across England, starting on the shore of the Irish Sea in Blackpool and ending on a hill overlooking the North Sea in Whitby.  I decided to have a theme for the trip -- a practice I have continued to this day -- and the theme I chose was monasteries.

A great many of the abbeys in England were destroyed during the "Dissolution of the Monasteries," when King Henry VIII decided the church was getting way too rich and powerful and decided to see what he could do to remedy that.  Between 1536 and 1541, over eight hundred monasteries, abbeys, and convents were closed and their property sold off, the abbots, priests, and nuns turned out or arrested outright, the majestic buildings left to sink slowly into ruin.

Along the path I took, which largely coincides with the North York Moors Trail, there were a number of these relics, and I made a point of seeing as many as I could.  They were impressive, beautiful, tragic places, monuments not only to spirituality but to greed (on both sides of the struggle).

Unsurprisingly, the spiritual side of it didn't have a great impact on me, beyond my sympathy for the religious men and women who had dedicated themselves to the contemplative life and then had those lives turned upside down by the conflict.  But it all seemed relegated to the distant past, unable to touch my modern experience except as a historical footnote.

Until I got to Rievaulx Abbey, near the town of Helmsley.

My hike into Rievaulx was on a gorgeous day -- one of the few I had during a four-week period that was cold and rainy even by English standards.  That day the weather was mild and sunny, with only a few white clouds in an azure sky.  I crested a low line of hills, and looked down into the little valley in which the ruins of the abbey sit, and was dumbstruck.

It was not solely because of the place's beauty, although beautiful it certainly is.  The place gave me chills, as if I were looking at something not quite of this world -- a reaction I had never experienced before and haven't experienced since.

[Image licensed under the Creative Commons WyrdLight.com, RievaulxAbbey-wyrdlight-24588, CC BY-SA 3.0]

Now, twenty-five years ago, I was every bit as much of a skeptic as I am now, but I couldn't shake the feeling the entire time I wandered around the abbey grounds.  I dropped my pack and shucked my shoes by the side of a little tumbling river that runs through the valley, cooling my sore feet, and kept thinking about the men and women who had lived here -- and whose presence I could still, inexplicably, feel around me.

During my visit, I struck up a conversation with a friendly middle-aged couple, who ended up inviting me to have mid-afternoon tea with them.  I mentioned my odd sensations to them, and the woman immediately smiled.  "Oh, yes," she said.  "Lots of people feel that way about Rievaulx.  You get the impression not that the place is sacred because it's the site of an abbey, but that the abbey was built there because the place was already sacred."

I have never been able to explain what I felt during that visit, other than my rational side's certainty that the beauty of the day and the history of the place simply got the better of me.  But I keep coming back to the fact that I never had those sensations in any of the other religious sites I saw on that trip -- which included gorgeous, history-laden places such as York Cathedral, Fountains Abbey, Kirkham Priory, and Grey Friars Tower.  There was something different about Rievaulx, but what that something is, I've never put my finger on.

The Scots call spots like Rievaulx "thin places."  We walk side-by-side with the spirit world, the legends go, separated by an invisible veil; but in some places the veil is thin, and we get a glimpse -- or sometimes just a feeling -- that there is something more there than meets the eye.  Places like that aren't haunted in the conventional sense, but true believers will tell you that you can't go there and come away unscathed.

I won't say that my visit to Rievaulx convinced me of some kind of ineffable otherworld; after all, here I am, over two decades later, still talking about rationalism and skepticism and for the most part casting a wry eye at claims of the paranormal.  But something happened to me in that little valley, whether I was picking up on a thin spot in the veil or it was simply the product of my senses acting on my often-overwrought imagination.

And while I don't agree with his basic assumptions, the whole experience gives this quote from the Romanian philosopher Mircea Eliade a strange resonance for me: "Man becomes aware of the sacred because it manifests itself, shows itself, as something wholly different from the profane...  In each case we are confronted by the same mysterious act — the manifestation of something of a wholly different order, a reality that does not belong to our world, in objects that are an integral part of our natural profane world."

*******************************

This week's book recommendation is a fascinating journey into a topic we've visited often here at Skeptophilia -- the question of how science advances.

In The Second Kind of Impossible, Princeton University physicist Paul Steinhardt describes his thirty-year-long quest to prove the existence of a radically new form of matter: quasicrystals, materials that are ordered but non-periodic.  Faced for years with scoffing from other scientists, who pronounced the whole concept impossible, Steinhardt persisted, ultimately demonstrating that an aluminum-manganese alloy he and the geologist Luca Bindi created had all the characteristics of a quasicrystal -- a discovery that earned them the 2018 Aspen Institute Prize for Collaboration and Scientific Research.

Steinhardt's book, however, doesn't bog down in technical details.  It reads like a detective story -- a scientist's search for evidence to support his explanation for a piece of how the world works.  It's a fascinating tale of persistence, creativity, and ingenuity -- one that ultimately led to a reshaping of our understanding of matter itself.

[Note: if you purchase this book from the image/link below, part of the proceeds goes to support Skeptophilia!]





Tuesday, February 18, 2020

Back to Africa

I was kind of tickled, when I got my 23andMe results back a couple of years ago, to find out that I had 284 identified Neanderthal markers -- more Neanderthal ancestry than 60% of the samples tested.

The reason I was happy about this is not because it gave me an explanation for why I like my steaks rare and have a general aversion to wearing clothes.  It was more because I find the Neanderthals a fascinating bunch.  Far from the low-intelligence cave trolls a lot of us picture them as -- witness the use of their name as an insult -- by the end they actually had larger brains than your average modern Homo sapiens.  They had culture; they anointed and buried their dead, seem to have had music (if the archaeologists are correct about the origin of the Divje Babe flute), and might even have had language -- they had the same variant of the FOXP2 gene that we do, a gene instrumental to our ability to understand and produce spoken language.

[Image licensed under the Creative Commons Clemens Vasters, Neanderthal in a business suit, CC BY 2.0]

It also wasn't terribly surprising in my own case, as Europeans generally have more Neanderthal ancestry than any other group, and my test results showed me to be -- also unsurprising, given what I know of my family tree -- nearly 100% of European origin, mainly French, Scottish, Dutch, German, and English.  The Neanderthals themselves are named after the Neander Valley of Germany, where archaeologists found the first recognized fossils of the species (or subspecies, depending on whom you believe).  So once it was determined that they had interbred with early modern Homo sapiens, their geographical distribution led to a (correct) surmise that Europeans would have more Neanderthal ancestry than other ethnic groups, for the same reason that southeast Asians and Native Australians have more Denisovan ancestry than the rest of us.

That's why a paper in Cell two weeks ago came as such a shock.  In it, a team led by Lu Chen of Princeton University found that a number of African ethnic groups, especially those in northern and western Africa, have a lot more Neanderthal ancestry than anyone realized.

Because it's still not as much as the Europeans have, and the African Neanderthal genes identified are variants usually found in Europe, the guess is that some of the European Homo sapiens/Neanderthal hybrids made their way across (or around) the Mediterranean in a "back-to-Africa" migration, injecting Neanderthal genes into groups that previously had little to no Neanderthal ancestry.

It also means that geneticists may have been underestimating the number of Neanderthal markers in the rest of us.  Those estimates were made by comparing sample DNA with that of people thought to have no Neanderthal ancestry at all -- such as the Yoruba of Nigeria.  If those African groups actually do have Neanderthal ancestry, courtesy of a back-to-Africa migration by European hybrids, then the baseline shifts upward -- and the former estimates of 1.7-1.8% Neanderthal DNA for your average person of European descent might be on the low side.
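The baseline logic can be sketched with a toy calculation.  The round numbers below are hypothetical, and the study itself used a more sophisticated reference-free method (IBDmix) rather than simple subtraction -- but the arithmetic shows why a reference panel that secretly carries some Neanderthal ancestry drags every estimate down:

```python
# Toy illustration (made-up round numbers, not the paper's data or method)
# of how a reference panel wrongly assumed to have zero Neanderthal
# ancestry biases ancestry estimates low.

true_european = 0.020   # hypothetical: Europeans really carry 2.0% Neanderthal DNA
reference     = 0.003   # hypothetical: the "zero-ancestry" reference really carries 0.3%

# A subtraction-style estimate measures only the excess over the reference:
naive = true_european - reference
print(f"naive estimate:     {naive:.1%}")      # 1.7% -- roughly the old figure

# Recognizing the reference's own Neanderthal ancestry raises the baseline:
corrected = naive + reference
print(f"corrected estimate: {corrected:.1%}")  # 2.0%
```

The difference between 1.7% and 2.0% looks small, but it is the same shift, in miniature, that the Cell paper suggests for the real estimates.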

"Our work highlights how humans and Neanderthals interacted for hundreds of thousands of years, with populations dispersing out of and back into Africa," said study co-author Joshua Akey in an interview with Science News.  "Remnants of Neanderthal DNA survive in every modern human population studied to date."

I find it fascinating how DNA is now being used to track relationships and migratory patterns not only of other animal species, but of humans.  And it's gotten pretty accurate.  It picked up my Ashkenazic ancestry, identifying it at 6% -- just about right based on my one great-great-grandfather, Solomon Meyer-Lévy of Dauendorf, Alsace, who emigrated from his birthplace and joined a small community of French-speaking Jews in Donaldsonville, Louisiana in around 1850.  The other interesting result in my own DNA that made sense was a smattering of Italian ancestry, undoubtedly because my father's paternal line ancestor, Jacques-Esprit Ariey-Bonnet, was born in a little town in the French Alps, quite close to the border of Italy.

So the amount we can learn about our own past from our genes is staggering, and I'm sure there are other surprises in store for us.  It informs us not only of our physical makeup but our history, a millions-of-years-long trail leading back to our most distant hominid ancestors on the savannas of Kenya and Tanzania.

Although it still doesn't explain the rare steaks and nudity thing.

*******************************






Monday, February 17, 2020

The universal language

Sometimes I have thoughts that blindside me.

The last time that happened was three days ago, while I was working in my office and our elderly coonhound, Lena, was snoozing on the floor.  Well, as sometimes happens to dogs, she started barking and twitching in her sleep, and followed it up with sinister-sounding growls -- all the more amusing because while awake, Lena is about as threatening as your average plush toy.

So my thought, naturally, is to wonder what she was dreaming about.  Which got me thinking about my own dreams, and recalling some recent ones.  I remembered some images, but mostly what came to mind were narratives -- first I did this, then the slimy tentacled monster did that.

That's when the blindside happened.  Because Lena, clearly dreaming, was doing all that without language.

How would thinking occur without language?  For almost all humans, our thought processes are intimately tied to words.  In fact, the experience of having an experience or thought that isn't describable using words is so unusual that we have a word for it -- ineffable.

Mostly, though, our experience is completely, um, effable.  So much so that trying to imagine how a dog (or any other animal) experiences the world without language is, for me at least, nearly impossible.

What's interesting is how powerful this drive toward language is.  There have been studies of pairs of "feral children" who grew up together but with virtually no interaction with adults, and in several cases those children invented spoken languages with which to communicate -- each complete with its own syntax, morphology, and phonetic structure.

A fascinating new study that came out last week in the Proceedings of the National Academy of Sciences, detailing research by Manuel Bohn, Gregor Kachel, and Michael Tomasello of the Max Planck Institute for Evolutionary Anthropology, showed that you don't even need the extreme conditions of feral children to induce the invention of a new mode of symbolic communication.  The researchers set up Skype conversations between monolingual English-speaking children in the United States and monolingual German-speaking children in Germany, but simulated a computer malfunction where the sound didn't work.  They then instructed the children to communicate as best they could anyhow, and gave them some words/concepts to try to get across.

They started out with some easy ones.  "Eating" resulted in the child miming eating from a plate, unsurprisingly.  But they moved to harder ones -- like "white."  How do you communicate the absence of color?  One girl came up with an idea -- she was wearing a polka-dotted t-shirt, and pointed to a white dot, and got the idea across.

But here's the interesting part.  When the other child later in the game had to get the concept of "white" across to his partner, he didn't have access to anything white to point to.  He simply pointed to the same spot on his shirt that the girl had pointed to earlier -- and she got it immediately.

Language is defined as arbitrary symbolic communication -- arbitrary because, with the exception of a few cases like onomatopoeic words (bang, pow, ping, etc.), there is no logical connection between the sound of a word and its referent.  Well, here we have a beautiful case of the origin of an arbitrary symbol -- in this case, a gesture -- that gained meaning only because the recipient of the gesture understood the context.

I'd like to know if such a gesture-language could gain another characteristic of true language -- transmissibility.  "It would be very interesting to see how the newly invented communication systems change over time, for example when they are passed on to new 'generations' of users," said study lead author Manuel Bohn, in an interview with Science Daily.  "There is evidence that language becomes more systematic when passed on."

Because this, after all, is when languages start developing some of the peculiarities (also seemingly arbitrary) that led Edward Sapir and Benjamin Whorf to develop the hypothesis that now bears their names -- that the language we speak alters our brains and changes how we understand abstract concepts.  In his brilliant book The Last Speakers, K. David Harrison tells of a conversation with some members of a nomadic tribe in Siberia who always described the positions of objects relative to the four cardinal directions -- so my coffee cup wouldn't be on my right, it would be south of me.  When Harrison tried to explain to his Siberian friends how we describe positions, at first he was greeted with outright bafflement.

Then, they all erupted in laughter.  How arrogant, they told him, that you see everything as relative to your own body position -- as if when you turn around, suddenly the entire universe shifts to compensate for your movement!


Another interesting example of this was the subject of a 2017 study by the linguists Emanuel Bylund and Panos Athanasopoulos, this one focused not on our experience of space but of time.  And they found something downright fascinating.  Some languages (like English) are "future-in-front," meaning we think of the future as lying ahead of us and the past behind us, turning time into something very much like a spatial dimension.  Other languages retain the spatial aspect but reverse the direction -- such as Aymara, spoken in the Andes.  For Aymara speakers, the past is in front, because you can remember it, just as you can see what's in front of you.  The future is behind you -- and therefore invisible.

Mandarin takes the spatial axis and turns it on its head -- the future is down, the past is up (so the literal translation of the Mandarin expression for "next week" is "down week").  Asked to order photographs of someone in childhood, adolescence, adulthood, and old age, Mandarin speakers will place them vertically, with the youngest on top.  English and Swedish speakers tend to think of time as a line running from left (past) to right (future); Spanish and Greek speakers tend to picture time as a spatial volume, as if it were something filling a container (so emptier = past, fuller = future).

All of which underlines how fundamental to our thinking language is.  And further baffles me when I try to imagine how other animals think.  Because whatever Lena was imagining in her dream, she was clearly understanding and interacting with it -- even if she didn't know to attach the word "squirrel" to the concept.

*******************************






Saturday, February 15, 2020

Bridging the Great Divide

One of the main things that separates scientists from the rest of us is that they notice things we would just take for granted.

Gregor Mendel started the research that would eventually uncover the fundamental laws of inheritance when he noticed that some traits in pea plants seemed to skip a generation.  Percy Spencer was messing around with vacuum tubes, and noticed that in a certain configuration they caused a chocolate bar in his pocket to melt -- further inquiry led to the invention of the microwave oven.  The French physicist Henri Becquerel discovered radioactivity when he accidentally ruined some photographic plates with what turned out to be a chunk of uranium ore.  Alexander Fleming saved countless lives with the discovery of penicillin -- found because he wondered why a colony of mold on one of his culture plates seemed to be killing the bacteria near it.

I consider myself at least a little above average, savvy-wise, but I don't have that ability -- to look at the world and think, "Hmm, I wonder why that happened?"  Mostly I just assume "that's the way it is" and don't consider it much further, a characteristic I suspect I share with a lot of people.  So here's some recent research about something I've known about since I first started reading junior books on astronomy, when I was maybe ten years old, and never thought was odd -- or even worth giving any thought to.

There's a strange gap, something astronomers call "the Great Divide," between Mars and Jupiter.  The space between the two orbits is enormous -- more than twice the distance from the Sun to Mars.  In that gap is a narrow band called the Asteroid Belt -- and not a hell of a lot else.

Even more peculiar, when you think about it (which, as I said, I didn't), is that inside the Great Divide all the planets are small, dense, and rocky, while outside it they're low-density gas giants (I do remember being shocked by the density thing as a kid, when I read that Saturn's overall density is lower than that of water -- so if you had a swimming pool big enough, Saturn would float).

[Image is in the Public Domain courtesy of NASA/JPL]

The problem with these sorts of observations, though -- even if you stop to wonder about them -- is that until very recently, we pretty much had a sample size of one Solar System to work with, so there was no way to tell if any particular feature of ours was odd or commonplace.  Even now, with the discovery of so many exoplanets that it's estimated there are a billion in our galaxy alone, we only have tentative information about the arrangement of planets around stars, to determine if there's any sort of pattern there, such as the apparent one in our neck of the woods.

Well, it looks like the physicists may have explained the Great Divide and the compositional difference of the planets on either side of it in one fell swoop.  A team from the Tokyo Institute of Technology and the University of Colorado Boulder has found that the Great Divide may be a relic of a ring of material that formed around the early Sun and then was pulled apart and essentially "sorted" by the gravitational pulls of the coalescing planets.

The authors write:
We propose... that the dichotomy was caused by a pressure maximum in the disk near Jupiter’s location...  One or multiple such—potentially mobile—long-lived pressure maxima almost completely prevented pebbles from the Jovian region reaching the terrestrial zone, maintaining a compositional partition between the two regions.  We thus suggest that our young Solar System’s protoplanetary disk developed at least one and probably multiple rings, which potentially triggered the formation of the giant planets.
And once the process started, it accelerated, pulling dense, rocky material inward and lightweight, organic-chemical-rich material outward, resulting in a gap -- and an outer Solar System with gas giants surrounding an inner Solar System with small, terrestrial worlds.

"Young stellar systems were often surrounded by disks of gas and dust," said Stephen Mojzsis of the University of Colorado Boulder, who co-authored the paper, which appeared in Nature Astronomy three weeks ago.  "If a similar ring existed in our own solar system billions of years ago, it could theoretically be responsible for the Great Divide, because such a ring would create alternating bands of high- and low-pressure gas and dust.  Those bands, in turn, might pull the solar system's earliest building blocks into several distinct sinks -- one that would have given rise to Jupiter and Saturn, and another to Earth and Mars.

"It is analogous to the way the Continental Divide in the Rocky Mountains causes water to drain one way or another.  That's similar to how this pressure bump would have divided material in the early Solar System...  But that barrier in space was not perfect.  Some outer Solar System material still climbed across the divide.  And those fugitives could have been important for the evolution of our own world...  Those materials that might go to the Earth would be those volatile, carbon-rich materials.  And that gives you water.  It gives you organics."

And ultimately, it gives the Earth life.

So here we have a strange phenomenon that most of us probably shrugged about (if we noticed it at all), one that not only was instrumental to the formation of our own Solar System, but might (1) drive the arrangement of planets in star systems everywhere in the universe, and (2) have implications for the origin of life on our own -- and probably other -- worlds.

All of which brings to mind the wonderful quote from the Hungarian biochemist Albert Szent-Györgyi: "Discovery consists of seeing what everyone has seen, and thinking what nobody has thought."

*********************************

This week's Skeptophilia book of the week is a dark one, but absolutely gripping: the brilliant novelist Haruki Murakami's Underground: The Tokyo Gas Attack and the Japanese Psyche.

Most of you probably know about the sarin attack in the subways of Tokyo in 1995, perpetrated by members of the Aum Shinrikyo cult under the leadership of Shoko Asahara.  Acting on Asahara's orders, five Aum members released sarin on crowded trains during rush hour, killing thirteen people and injuring over a thousand others.  Asahara and twelve other members of the cult were hanged in 2018 for the crimes.

Murakami does an amazing job in recounting the events leading up to the attack, and getting into the psyches of the perpetrators.  Amazingly, most of them were from completely ordinary backgrounds and had no criminal records at all, nor any other signs of the horrors they had planned.  Murakami interviewed commuters who were injured by the poison and also a number of first responders, and draws a grim but fascinating picture of one of the darkest days in Japanese history.

You won't be able to put it down.

[Note: if you purchase this book using the image/link below, part of the proceeds goes to support Skeptophilia!]





Friday, February 14, 2020

A visit from the death reaper

In case you needed more incentive not to take a time machine back to the Cretaceous Period, paleontologists have just discovered a new species of dinosaur from Alberta and christened it Thanatotheristes -- Greek for "the reaper of death."

Add that to the fact that it looked a bit like a cross between a toucan and a Tyrannosaurus rex, and you've got some real nightmare fuel.

[Artist's concept of Thanatotheristes by Julius Csotonyi of the Royal Tyrrell Museum]

The fossils date from 79.5 million years ago, so about 14 million years (give or take) from the giant meteorite collision that would spell the end of the Age of Dinosaurs.

"It definitely would have been quite an imposing animal, roughly 2.4 meters at the hips," said study lead researcher Jared Voris, a doctoral student of paleontology at the University of Calgary in Alberta, in an interview with LiveScience.

Add to that the fact that it was eight meters from tip to tail and had seven-centimeter-long teeth that invite the inevitable comparison to steak knives, and you have a seriously badass creature.

It was found near bones of a Triceratops relative and a species of pachycephalosaurid, both herbivores, and it's a fair guess these were on the Thanatotheristes dinner menu.

It's amazing to think about what the biodiversity must have been like back then, when Alberta was a tropical forest near the equator.  For one thing, we tend to have the impression that the species we've found are all there were, so a new discovery like this is somehow a surprising addition to the menagerie.  In reality, the conditions that result in fossilization are so specific, and so rare, that it's kind of a wonder we have any fossil record at all.  Most dead animals and plants are gone with nary a trace in only a few years; the fact that these bones survived, substantially intact, for almost eighty million years is a little mindblowing.

So what that means is that the species we know about constitute only a very small percentage of the animals and plants that were alive back then.  How small a percentage is a matter of speculation, but it's a safe bet that it's less than 0.1% -- meaning, even at that generous estimate, for every one species we have fossils of -- and therefore a sense of what it looked like -- there are 999 that we not only don't know about, but have no way of knowing about.

It is only a slight exaggeration to say that our current situation is like trying to draw a good picture of today's biodiversity using only the bones of a rabbit, a largemouth bass, a reticulated python, and a hummingbird, a handful of mollusk shells, a pile of various insect exoskeletons, and some leaves.  So our ideas about the prehistoric world -- even about times much more recent than the Cretaceous Period -- are not so much wrong as they are wildly, hugely incomplete.

Which they will always be, unless we develop that time machine.  And even that brings up its own set of problems, which you know all too well if you've read any science fiction.  I'm reminded of the first time I came across the idea of how fraught time travel into the past would be, when I read Ray Bradbury's brilliant, disturbing short story "A Sound of Thunder" -- in which a safari into the Cretaceous Period results in catastrophe in the present despite the organizers' attempts to prevent it.

So there are a variety of reasons that it might be prudent to remain in ignorance of what critters were around back then, even if we were somehow able to.  Changing the past has drastic consequences, and if you believe science fiction stories, they're almost always bad ones.  Then there's the more direct danger of being eaten by a Toucanosaurus rex.  You can see how that would kind of suck.

*********************************






Thursday, February 13, 2020

Timing out

One of my ongoing frustrations when I was a teacher was the failure of the educational community to use the latest scientific research to guide our approach to pedagogy.

Of course, I shouldn't be surprised.  We here in the United States have made a national pastime out of ignoring scientific research -- climate change and the safety/efficacy of vaccinations being two of the most obvious examples.  Still, it was maddening to watch high school students struggle in Spanish I when, if we put our resources into bilingual education in preschool, kids would learn a second language as easily as they did their first.

And research into the window of opportunity for language learning has been around for thirty years.

Another example was the subject of a paper this week in Nature Human Behaviour.  In "Interplay of Chronotype and School Timing Predicts School Performance," by Andrea P. Goldin, Mariano Sigman, Gisela Braier, Diego A. Golombek, and María J. Leone, of Universidad Torcuato Di Tella (Buenos Aires, Argentina), we find out that people have chronotypes -- natural biological clocks that time our periods of highest and lowest alertness -- and that when school schedules run counter to a student's chronotype, performance suffers drastically.

[Image licensed under the Creative Commons Robbert van der Steeg, Eternal clock, CC BY-SA 2.0]

I know this from my own experience.  I'm naturally a lark -- up with the sun.  Often earlier, even.  My morning classes in college (and teaching morning classes during my career as an educator) were easy for me.  But I have a slow fade after lunchtime -- and by a time that for many people is Evening Party Time, I'm ready to be curled up in bed with a good book.

I literally haven't slept past eight o'clock in maybe twenty years.  And staying up past ten PM?

Not if you want me to be halfway coherent.

But I was painfully aware that a lot of my students seemed to be on the opposite schedule.  Trying to get them to learn biology first thing in the morning (hell, trying to get them to stay awake) was an ongoing challenge.  And I can't tell you the number of students who told me that they regularly stayed up till three AM -- not because of homework or social media (although those did tend to fill the wakeful hours), but because they were wide awake, and going to bed earlier than that would have been an exercise in frustration.

So it's a double whammy.  We take kids who are naturally night owls, make them get up early (depriving them of much-needed sleep), and then expect them to perform optimally on intellectual tasks.

Goldin et al. pull no punches about this:
Most adolescents exhibit very late chronotypes and attend school early in the morning, a misalignment that can affect their health and psychological well-being.  Here we examine how the interaction between the chronotype and school timing of an individual influences academic performance, studying a unique sample of 753 Argentinian students who were randomly assigned to start school in the morning (07:45), afternoon (12:40) or evening (17:20).  Although chronotypes tend to align partially with class time, this effect is insufficient to fully account for the differences with school start time.  We show that (1) for morning-attending students, early chronotypes perform better than late chronotypes in all school subjects, an effect that is largest for maths; (2) this effect vanishes for students who attend school in the afternoon; and (3) late chronotypes benefit from evening classes.  Together, these results demonstrate that academic performance is improved when school times are better aligned with the biological rhythms of adolescents.
And I strongly suspect that the effect this research will have on the educational community is... nada.

My wife has a poster in her office showing a dude hauling ass in the annual Pamplona Running of the Bulls, a thousand pounds of snorting animal right behind him.  The caption is: "TRADITION: Just because we've always done it this way doesn't mean it's not a really, really stupid idea."

To which the educational establishment of the United States tends to say, "Oh, well, too bad."

The most frustrating thing is that apparently it doesn't take much of a change to make a difference.  Bumping school start times ahead by an hour -- so from eight to nine AM, in the school district where I taught -- was shown to improve daytime alertness and the quality/length of sleep in adolescents in a study done six years ago.  It still wouldn't be optimal for students who are really night owls, but at this point any gain at all would be an improvement.

But given how most schools have responded to thirty-year-old research on language learning, the Goldin et al. study will probably be filed away with lots of other research, in a folder labeled, "Well, It Would Be Nice, But..."

Along with recommendations to our federal government for halting climate change and mandatory vaccination programs.  Seems like it's an uphill battle for most things these days.


Wednesday, February 12, 2020

Very early Californians

Yesterday's post about self-correction and reframing in science prompted a friend of mine, the phenomenal author, blogger, and all-around polymath Gil Miller, to send me an interesting link about an archaeological site in California.

The striking thing about the site is its alleged age.  It contains mastodon bones showing signs of butchering dated to 130,000 years ago -- 115,000 years older than the oldest previously recorded human-occupied site in North America.

[Image licensed under the Creative Commons Dantheman9758 at the English Wikipedia, Mastodon+Human, CC BY-SA 3.0]

Accepting this would require a major retooling of our understanding of the human colonization of the continent.  Were there other hominids living here, who then died out prior to the ancestors of the Native Americans crossing here from Asia?  If so, why aren't there more fossils and other relics?  Was this an early wave of colonization from Siberia, which made it to California but ultimately was unsuccessful?  Is the damage to the bones caused by something other than butchering?  Are the dates of the bones in error?

The answer to the last-asked question, at least, seems to be no.  If a site shows anomalous dates, the first thing you do is... check the dates.  Which was done, and it seems like the radioisotope dating of the mastodon bones is correct.
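To get a feel for where a number like 130,000 years comes from, here's a minimal sketch of the textbook parent-daughter age equation -- not the actual diffusion-adsorption-decay model the researchers applied to the bones, and with a made-up isotope ratio chosen purely for illustration:

```python
import math

def radiometric_age(daughter_to_parent_ratio, half_life_years):
    """Age implied by a measured daughter/parent ratio, assuming a
    closed system with no daughter isotope present at the start:
    t = ln(1 + D/P) / lambda, where lambda = ln(2) / half-life."""
    decay_constant = math.log(2) / half_life_years
    return math.log(1 + daughter_to_parent_ratio) / decay_constant

# Sanity check: when D/P = 1, exactly one half-life has elapsed.
print(radiometric_age(1.0, 75_600))   # half-life of thorium-230

# A hypothetical ratio that lands near the date reported for this site
# (again, the real 230Th/U method is considerably more involved).
age = radiometric_age(daughter_to_parent_ratio=2.315,
                      half_life_years=75_600)
print(f"{age:,.0f} years")
```

The point is simply that the age falls straight out of the decay law once the ratio is measured -- which is why "check the dates" mostly means checking the measurement and the closed-system assumptions, not the arithmetic.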

As far as the others, here's what the authors have to say (from their 2017 paper on the topic in Nature).  The passage is a bit long, but it gives you the scope of their argument:
The earliest dispersal of humans into North America is a contentious subject, and proposed early sites are required to meet the following criteria for acceptance: (1) archaeological evidence is found in a clearly defined and undisturbed geologic context; (2) age is determined by reliable radiometric dating; (3) multiple lines of evidence from interdisciplinary studies provide consistent results; and (4) unquestionable artefacts are found in primary context.  Here we describe the Cerutti Mastodon (CM) site, an archaeological site from the early late Pleistocene epoch, where in situ hammerstones and stone anvils occur in spatio-temporal association with fragmentary remains of a single mastodon (Mammut americanum).  The CM site contains spiral-fractured bone and molar fragments, indicating that breakage occurred while fresh.  Several of these fragments also preserve evidence of percussion.  The occurrence and distribution of bone, molar and stone refits suggest that breakage occurred at the site of burial.  Five large cobbles (hammerstones and anvils) in the CM bone bed display use-wear and impact marks, and are hydraulically anomalous relative to the low-energy context of the enclosing sandy silt stratum.  230Th/U radiometric analysis of multiple bone specimens using diffusion–adsorption–decay dating models indicates a burial date of 130.7 ± 9.4 thousand years ago.  These findings confirm the presence of an unidentified species of Homo at the CM site during the last interglacial period (MIS 5e; early late Pleistocene), indicating that humans with manual dexterity and the experiential knowledge to use hammerstones and anvils processed mastodon limb bones for marrow extraction and/or raw material for tool production.  Systematic proboscidean bone reduction, evident at the CM site, fits within a broader pattern of Palaeolithic bone percussion technology in Africa, Eurasia, and North America.
The CM site is, to our knowledge, the oldest in situ, well-documented archaeological site in North America and, as such, substantially revises the timing of arrival of Homo into the Americas.
As I discussed yesterday, the process of science is that when someone makes a claim, his/her fellow scientists immediately jump in and tear it apart, looking at it from every angle, asking if there are other explanations that account for all the evidence, and in general trying to refute it.  They don't do this to be mean.  The idea is to see if the proposed model can withstand scrutiny -- to see, in scientific parlance, if it is "robust."

And this claim had an additional caveat; to accept it meant to undo our entire previous understanding of how the Americas were colonized by humans.  Carl Sagan's "ECREE Principle" applies here -- "Extraordinary Claims Require Extraordinary Evidence."  So the cross-checking and verification was especially intense.

Most archaeologists were unconvinced.  In an article in BBC Online by BBC science editor Paul Rincon, David Meltzer, professor of archaeology at Southern Methodist University in Dallas, Texas, put it most succinctly: "Nature is mischievous and can break bones and modify stones in a myriad of ways.  With evidence as inherently ambiguous as the broken bones and nondescript broken stones described in the paper, it is not enough to demonstrate they could have been broken/modified by humans; one has to demonstrate they could not have been broken by nature.  This is an equifinality problem: multiple processes can cause the same product."

Which is the problem in all of science.  In order to demonstrate your claim, you have not only to provide evidence, but also to show that no alternative explanation accounts for that evidence equally well.  Especially here, when accepting the claim requires rewriting everything we know about Western Hemisphere archaeology -- invoking another good rule of thumb, Ockham's Razor.

Usually framed as, "if there are competing explanations, the one that requires the fewest ad hoc assumptions is most likely to be correct."

So at the moment, the consensus about the Cerutti Mastodon site claim is "maybe, but probably not."  The Holen et al. paper has not stood the test of scientific scrutiny.  Which doesn't mean the claim is wrong; plenty of weird claims have later been shown, by virtue of additional evidence, to have been correct.  But in the absence of that evidence, we have to be able to say, "We don't know."

As an extraordinary claim, thus far it seems not to have reached the bar of support we expect in science.


Tuesday, February 11, 2020

Self-correction, paradigm shifts, and Rapa Nui

It's not even 7 AM and I've already broken the cardinal rule of the internet, which is, "Don't feed the trolls."

Posting as I do about controversial issues like climate change, evolution, and religion, it's only to be expected that I get pretty heated commentary sometimes.  While I'm always ready and willing to consider a well-reasoned argument against my viewpoints, I'm usually smart enough to let the "You only say that because you're a (choose one): radical leftist, godless heathen, anti-American, tree-hugger, cynic, whiny liberal, complete idiot" comments roll off me.

Usually.

The one that got me today was the person who responded to a story I retweeted about a recent discovery in evolutionary biology with the time-honored snarl, "Your sciencism is more of a faith than my religion is.  I don't get why you trust something that could be completely disproven tomorrow.  I'll stick with eternal truths."

I spent a long time (well, to quote Lieutenant Commander Data, "0.68 seconds... to an android, that is nearly an eternity") trying to talk myself out of responding.  That effort being unsuccessful, I wrote, "... and I don't get why you trust something that is completely incapable of self-correcting, and therefore wouldn't recognize if something was wrong, much less fix it."

This resulted in another time-honored snarl, namely, "fuck you, asshole," which I am always surprised to hear from someone whose holy book says lots of stuff about "turn the other cheek" and "love thy enemies" and "pray for those who persecute you" but very little about "say 'fuck you' to any assholes who challenge you."

Anyhow, I do find it puzzling that self-correction is considered some kind of fault.  Upon some consideration, I have come to the conclusion that the problem is based in a misunderstanding of what kind of self-correction science does.  The usual sort is on the level of details -- a rearrangement of an evolutionary tree for a particular group of animals (that was the link that started the whole argument), a revision of our model for stellar evolution, an adjustment to estimates for the rate of global temperature increase.  As I've pointed out more than once here at Skeptophilia, truly paradigm-changing reversals in science -- something like the discovery of plate tectonics in the 1960s -- stand out primarily because of how uncommon they are.

So "this all might be proven wrong tomorrow" should be followed up with, "yeah, but it almost certainly won't."

As an example of how self-correction in science actually works, consider the paper that appeared last week in The Journal of Archaeological Science questioning a long-held narrative of the history of Rapa Nui, more commonly known by its European-given name of Easter Island.

[Image licensed under the Creative Commons TravelingOtter, Moai at Rano Raraku - Easter Island (5956405378), CC BY 2.0]

The previous model, made famous in Jared Diamond's book Collapse, was that the inhabitants of Rapa Nui did themselves in by felling all the trees (in part to make rollers and skids for moving the massive blocks of stone that became the iconic moai, the stone heads that dot the island's terrain) and ignoring the signs of oncoming ecological catastrophe.  The story is held as a cautionary tale about our own overutilization of natural resources and blindness to the danger signals from the Earth's environmental state, and the whole thing had such resonance with the eco-minded that it wasn't questioned.

Well, a team of researchers led by Robert DiNapoli of the University of Oregon has said, "Not so fast."

In "A Model-Based Approach to the Tempo of 'Collapse': The Case of Rapa Nui (Easter Island)," the anthropologists and archaeologists studying the island found that the story may not be so clear-cut.  The pre-European-contact ecological collapse was much more gradual than previously believed, and the building of moai appears to have gone on long after the date Diamond and others had estimated it ceased.  The authors write:
Rapa Nui (Easter Island, Chile) presents a quintessential case where the tempo of investment in monumentality is central to debates regarding societal collapse, with the common narrative positing that statue platform (ahu) construction ceased sometime around AD 1600 following an ecological, cultural, and demographic catastrophe.  This narrative remains especially popular in fields outside archaeology that treat collapse as historical fact and use Rapa Nui as a model for collapse more generally.  Resolving the tempo of “collapse” events, however, is often fraught with ambiguity given a lack of formal modeling, uncritical use of radiocarbon estimates, and inattention to information embedded in stratigraphic features.  Here, we use a Bayesian model-based approach to examine the tempo of events associated with arguments about collapse on Rapa Nui.  We integrate radiocarbon dates, relative architectural stratigraphy, and ethnohistoric accounts to quantify the onset, rate, and end of monument construction as a means of testing the collapse hypothesis.  We demonstrate that ahu construction began soon after colonization and increased rapidly, sometime between the early-14th and mid-15th centuries AD, with a steady rate of construction events that continued beyond European contact in 1722.  Our results demonstrate a lack of evidence for a pre-contact ‘collapse’ and instead offer strong support for a new emerging model of resilient communities that continued their long-term traditions despite the impacts of European arrival.
All of which points out that -- like yesterday's post about the Greenland Vikings' demise -- complex events seldom result from single causes.  Note that the researchers are not saying the islanders' felling of the native trees was inconsequential, nor that contact with Europeans was without negative repercussions for the natives -- just as the paper cited yesterday didn't completely overturn the prior understanding that the Greenland Vikings had been done in by the onset of the Little Ice Age.  What it demonstrates is that we home in on the truth in science by questioning prior models and looking at the actual evidence, not by hanging on to the previous explanation simply because it fits our version of how the world works.
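The "Bayesian model-based approach" the abstract describes can be illustrated with a toy version -- entirely made-up date estimates standing in for calibrated radiocarbon dates, and a simple grid posterior rather than the authors' actual stratigraphy-aware model:

```python
import numpy as np

# Toy Bayesian estimate of when an event (say, a construction episode)
# occurred, given several independent noisy date measurements.
grid = np.arange(1200, 1900)                 # candidate calendar years AD
posterior = np.ones(grid.shape)              # start from a flat prior
posterior /= posterior.sum()

# Hypothetical (mean, sigma) estimates -- NOT real Rapa Nui data.
estimates = [(1450, 40), (1520, 35), (1610, 50)]

# Multiply in a Gaussian likelihood for each measurement, renormalizing.
for mean, sigma in estimates:
    likelihood = np.exp(-0.5 * ((grid - mean) / sigma) ** 2)
    posterior *= likelihood
    posterior /= posterior.sum()

best = grid[np.argmax(posterior)]            # most probable year
print(best)
```

Each new measurement narrows the posterior rather than replacing it wholesale -- which is exactly the kind of incremental self-correction the post is describing, as opposed to "wild guess in, wild guess out."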

So the self-correction about the history of Rapa Nui isn't a paradigm shift, unless you count the fact that its use as a caution about our own perilous ecological situation won't be quite so neat and tidy any more.  And what it definitely doesn't do is call into question the methods of science themselves.  The proponents of "it could all be proven wrong tomorrow" seem to feel that all scientists are doing is making wild guesses, then finding out the guesses are wrong and replacing them with other wild guesses.  The reality is closer to an electrician trying to figure out what's wrong with the wiring in your house, first testing one thing and then another, gradually ruling out hypotheses about what the problem might be and homing in on where the fault lies so it can be repaired.  (And ultimately, ending up with functional circuitry that works every time you use it.)

But the truth is, I probably shouldn't have engaged with the person who posted the original comment.  There seems little to be gained by online snark-fests other than raising the blood pressure on both sides.  So I'm recommitting myself to not feeding the trolls, and hoping that my resolution will last longer than 0.68 seconds this time.
