Skeptophilia (skep-to-fil-i-a) (n.) - the love of logical thought, skepticism, and thinking critically. Being an exploration of the applications of skeptical thinking to the world at large, with periodic excursions into linguistics, music, politics, cryptozoology, and why people keep seeing the face of Jesus on grilled cheese sandwiches.

Friday, June 26, 2020

Jumbo shrimp

After yesterday's rather humbling post about how easy it is to fool the human senses, today we get knocked down another peg or two by some new research showing that our visual perception is beaten hands down...

... by a species of shrimp.

You've probably heard the term refresh rate used in regard to computer monitors, but it also applies to our eyes.  The photoreceptors in your retina have to reset after firing, and during that time -- the refractory period -- the receptor cell is insensitive to further stimuli.  I recall learning in my animal physiology class at the University of Washington thirty years ago that human photoreceptors reset in about 1/60th of a second.  This is why the flicker of a fluorescent light is barely detectable to the human eye: the light is driven by alternating current oscillating at sixty hertz, and because we have millions of photoreceptors, all out of phase with each other, the signal gets smoothed out into what looks like one continuous, evenly-bright light.

To a fly, however, which has a refresh rate double ours -- about 120 times per second -- a fluorescent light would look like a strobe, brightening and dimming every sixtieth of a second.

Must be really freakin' annoying.  Yet another reason I'm glad I'm not a fly.

But even they are not the fastest.  A paper in Biology Letters this week describes research into the visual systems of a species of snapping shrimp (Alpheus heterochaelis), which is already badass enough -- it snaps its claws together with such force that it creates a shock wave in the water, stunning its prey.  And this little marine crustacean has a refresh rate of 160 times per second.

So what looks like a blur of motion to other animals is visible as clear, discrete images moving across its field of vision.
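If you want to see how that cashes out, here's a minimal sketch in Python -- my own illustration, not anything from the paper -- that treats each photoreceptor "frame" as the average brightness over one integration window, and compares a 60-per-second eye, a 120-per-second eye, and a 160-per-second eye looking at the same sixty-hertz flicker:

import numpy as np

# Minimal sketch (illustrative only): each "frame" is the average of a 60 Hz
# flickering light over one integration window.  A window as long as the flicker
# period averages the flicker away (flicker fusion); shorter windows still see
# the brightness swinging up and down -- i.e., a strobe.

FLICKER_HZ = 60.0  # fluorescent light driven at mains frequency, as described above

def perceived_levels(refresh_hz, n_frames=12):
    """Average brightness registered in each successive integration window."""
    window = 1.0 / refresh_hz
    t = np.linspace(0.0, 1.0, 100_000)
    light = 0.5 * (1.0 + np.sin(2.0 * np.pi * FLICKER_HZ * t))  # flickering intensity
    return [light[(t >= i * window) & (t < (i + 1) * window)].mean()
            for i in range(n_frames)]

for hz in (60, 120, 160):   # human, fly, snapping shrimp
    levels = perceived_levels(hz)
    print(f"{hz:>3}/s eye: brightness swing = {max(levels) - min(levels):.3f} "
          f"(0 means a steady glow; bigger means visible strobing)")

Run it and the 60-per-second eye registers essentially no variation, while the faster eyes see the light pulsing.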

Not only that, they have one of the widest ranges of sensitivity to light level known, functioning well with only 1 lux of incident light (the light intensity of late twilight) all the way up to 100,000 lux (direct, intense sunlight).

[Image licensed under the Creative Commons Rickard Zerpe, Snapping shrimp (Synalpheus sp.) (23806570264), CC BY-SA 2.0]

The snapping shrimp isn't the only amazing crustacean out there.  Its cousins, the mantis shrimps (Order Stomatopoda) don't just snap their claws and stun their prey, they actually punch the shit out of them.  They can accelerate their claws at the astonishing rate of 102,000 m/s^2, delivering a force of 1,500 Newtons (equivalent to the Earth's gravitational pull on a 150 kilogram mass).  Not only that, but they move their claws so quickly they overcome the cohesion of the water molecules as they pass through, creating vapor-filled bubbles (a process called cavitation) in their wake.  These bubbles then collapse with astounding force, delivering a second deadly shock wave to the unfortunate recipient.
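Those numbers are worth a quick sanity check.  Here's the arithmetic (standard physics, nothing from the paper itself):

# Quick arithmetic check of the figures quoted above.
g = 9.8                   # m/s^2, acceleration due to Earth's gravity
force = 1500.0            # N, the quoted strike force
acceleration = 102_000.0  # m/s^2, the quoted claw acceleration

print(f"Equivalent weight: {force / g:.0f} kg")         # ~153 kg -- the "150 kilogram mass"
print(f"Claw acceleration: {acceleration / g:,.0f} g")  # roughly 10,000 times gravity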

No wonder the folks in the Caribbean have nicknamed the native species of mantis shrimp "the thumb-splitter."

But wild as that is, it's not why I brought up mantis shrimp.  They have the most sensitive color vision of any animal known.  Humans are (mostly) trichromats, having three functioning types of color receptor in our eyes.  Dogs are dichromats -- they have only two, which is why their color acuity is worse than ours.  A few lucky humans, and a great many bird species, are tetrachromats, having four kinds of color receptors.

Mantis shrimp have sixteen.  They can not only see in the ultraviolet region of the spectrum -- a range of light completely invisible to the human eye -- they can detect polarization angle, and even have sensors for detecting circular polarization, something that is thought to be unique in the animal kingdom.

Why they need this many different kinds of light receptors is unknown, although it probably has to do with predator-prey interactions -- finding lunch and avoiding being made into lunch.  With so many different strategies used by shallow tropical marine species to confound the eye -- shimmering scales, transparency, cryptic coloration, countershading -- having eyes that beat everyone else in sensitivity and range would be a pretty neat adaptation.

So that's yet another excursion into the weird world of sensory perception.  It never fails to fascinate me to think about what a different kind of animal's experience of the world must be like.  As philosopher Thomas Nagel pointed out, the only way to know what it's like to be a bat is to be a bat; all of our ideas about echolocation and flight and being nocturnal only give us the answer to what it's like for a human to think about being a bat.

But even so, and all pondering about the mind/body problem aside, I can't help but wonder what the world looks like to a shrimp.

**************************************

I know I sometimes wax rhapsodic about books that really are the province only of true science geeks like myself, and fling around phrases like "a must-read" perhaps a little more liberally than I should.  But this week's Skeptophilia book recommendation of the week is really a must-read.

No, I mean it this time.

Kathryn Schulz's book Being Wrong: Adventures in the Margin of Error is something that everyone should read, because it points out the remarkable frailty of the human mind.  As wonderful as it is, we all (as Schulz puts it) "walk around in a comfortable little bubble of feeling like we're absolutely right about everything."  We accept that we're fallible, in a theoretical sense; yeah, we all make mistakes, blah blah blah.  But right now, right here, try to think of one thing you might conceivably be wrong about.

Not as easy as it sounds.

She shocks the reader pretty much from the first chapter.  "What does it feel like to be wrong?" she asks.  Most of us would answer that it can be humiliating, horrifying, frightening, funny, revelatory, infuriating.  But she points out that these are actually answers to a different question: "what does it feel like to find out you're wrong?"

Actually, she tells us, being wrong doesn't feel like anything.  It feels exactly like being right.

Reading Schulz's book makes the reader profoundly aware of our own fallibility -- but it is far from a pessimistic book.  Error, Schulz says, is the window to discovery and the source of creativity.  It is only when we deny our capacity for error that the trouble starts -- when someone in power decides that (s)he is infallible.

Then we have big, big problems.

So right now, get this book.  I promise I won't say the same thing next week about some arcane tome describing the feeding habits of sea slugs.  You need to read Being Wrong.

Everyone does.

[Note: if you purchase this book using the image/link below, part of the proceeds goes to support Skeptophilia!]




Thursday, June 25, 2020

The stone hand illusion

One of the reasons I trust science is that I have so little trust in my own brain's ability to assess correctly the nature of reality.

That may sound like a contradiction, but it really isn't.  Science is a method that allows us to evaluate hard data -- measurements by devices designed to have no particular biases.  By relying on measurements from machines, we bypass our faulty sensory equipment, which can lead us astray in all sorts of ways.  In astronomer Neil deGrasse Tyson's words, "[Our brains] are poor data-taking devices... that's why we have machines that don't care what side of the bed they woke up on that morning, that don't care what they said to their spouse that day, that don't care whether they had their morning caffeine.  They'll get the data right regardless."

We still believe that we're seeing what's real, don't we?  "I saw it with my own eyes" is still considered the sine qua non for establishing what reality is.  Eyewitness testimony is still treated as some of the strongest evidence in courts of law.  Because how could it be otherwise?  Maybe we miss minor things, but how could we get it so far wrong?

But as I wrote about two weeks ago, even our perception of something as simple as color is flawed, and is mostly a construct of the brain, not a direct readout of what's really out there.  We ignore as much as we perceive, make stuff up to bridge gaps, and in general create a montage of what's actually there, what our brains decide is important enough to pay attention to, and inferences to fill in the spaces in between.

If that's not bad enough, a scientist in Italy has knocked another gaping hole in our confidence that our brain can correctly interpret the sensory information it's given -- this time with an actual hammer.

Some of you may have heard of the "rubber hand illusion," demonstrated in an experiment back in 1998 by Matthew Botvinick and Jonathan Cohen.  The two scientists placed a rubber hand in view of a person whose actual hand was shielded from view by a curtain.  The rubber hand was stroked with a feather at the same time as the person's real (but out-of-sight) hand received a similar stroke -- and within minutes, the person became strangely convinced that the rubber hand was his own.

The Italian experiment, which I found out about in an article in Discover Online, substitutes an auditory stimulus for the visual one -- with an even more startling result.

Irene Senna, professor of psychology at Milano-Bicocca University in Milan, rigged up a scenario similar to Botvinick and Cohen's.  A subject sits with one hand through a screen.  On the back of the subject's hand is a small piece of foil connected by an electrical lead to a computer.  The subject sees a hammer swinging toward her hand -- but the hammer stops just short of smashing her hand and only touches the foil gently (which, of course, she can't see).  The touch of the hammer sends a signal to the computer -- which then produces a hammer-on-marble clink sound.

After repeating this only a few times, the subject feels absolutely convinced that her hand has turned to stone.

[Image is in the Public Domain]

What is impressive about this illusion is that the feeling persists even after the experiment ends and the screen is removed -- and even though the test subjects knew all along what was going on.  Subjects reported afterwards that their hands felt colder, stiffer, heavier, and less sensitive, and that they had difficulty bending their wrists.

To me, the coolest (and freakiest) thing about this is that our knowledge centers, the logical and rational prefrontal cortex and associated areas, are completely overcome by the sensory-processing centers when presented with this scenario.  We can know something isn't real, and simultaneously cannot shake the brain's decision that it is real.  None of the test subjects was crazy; they all knew that their hands weren't made of stone.  But presented with sensory information that contradicted that knowledge, they couldn't help but come to the wrong conclusion.

And this once again illustrates why I trust science, and am suspicious of eyewitness reports of UFOs, Bigfoot, ghosts, and the like.  Our brains are simply too easy to fool, especially when emotions (particularly fear) run high.  We can be convinced that what we're seeing or hearing is the real deal, to the point that we are unwilling to admit the possibility of a different explanation.

But as Senna's elegant little experiment shows, we can't rely on what our senses tell us.  Data from scientific measuring devices will always be better than pure sensory information.  To quote Tyson again: "We think that the eyewitness testimony of an authority -- someone wearing a badge, or a pilot, or whatever -- is somehow better than the testimony of an average person.  But no.  I'm sorry... but it's all bad."

**************************************





Wednesday, June 24, 2020

Descent into chaos

There's an interesting concept called sensitive dependence on initial conditions.

Here's a simple example.  If you take a deep bowl, and drop a marble into it, it doesn't take any great intelligence or insight to predict what the end state will be.  Marble on the bottom of the bowl.  It doesn't matter how high you drop it from or where exactly it hits the sides first.  After a bit of rolling around, the marble will stop moving at the bottom.

Now, do the same thing -- but with the bowl flipped over.  Where will the marble end up?

Impossible to say, because it is an inherently chaotic system.  You could do it a hundred times and the marble would end up in a different place each time, because its final resting spot depends on the exact speed and angle of its path, where it first hits the curved surface of the bowl, even whether the marble is spinning a little.  A system like this is said to be "sensitive to initial conditions" -- and therefore unpredictable.  Perturb it in even a tiny way, and you get a completely different outcome.

Here's a much cooler example that I stumbled across while doing research for this post.  It's called a double compound pendulum.  Take two rigid rods, and suspend one so it's free to swing.  Then attach the second rod to the bottom of the first so it, too, can swing freely.  Start with the rods pulled up horizontal, then let go.  Can you predict how the whole system will move?

Simple answer: no.  It's a chaotic system.


[GIF is in the Public Domain]

A little mesmerizing to watch, isn't it?
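You don't need a physical pendulum to see sensitive dependence in action.  Here's a tiny Python sketch -- using the logistic map, a standard textbook chaotic system, as a stand-in for the pendulum (whose equations of motion would take a page) -- in which two starting points differing by one part in ten billion end up bearing no relation to each other after a few dozen steps:

# Sensitive dependence on initial conditions, illustrated with the logistic map
# x -> r*x*(1 - x) in its chaotic regime (r = 4).  Two nearly identical starting
# values diverge until they have nothing to do with each other.

r = 4.0
x, y = 0.2, 0.2 + 1e-10    # initial conditions differing by one part in ten billion

for step in range(1, 51):
    x = r * x * (1.0 - x)
    y = r * y * (1.0 - y)
    if step % 10 == 0:
        print(f"step {step:2d}:  x = {x:.6f}   y = {y:.6f}   difference = {abs(x - y):.2e}")

By step fifty the difference is as big as the values themselves -- which is exactly why the marble under the upside-down bowl, the pendulum, and (as we'll see) the climate are so hard to forecast.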

The reason this comes up is that there's decent evidence the intersection between the Earth's climate and human society is a chaotic system with at least some degree of sensitive dependence on initial conditions.  If you perturb it, it may not respond the way you expect -- and sometimes small changes in one location can lead to big ones somewhere else.  (This concept was made famous as "the butterfly effect.")

As an example of this, take the research released just last week in Proceedings of the National Academy of Sciences, the link to which was sent to me yesterday by a friend and loyal reader of Skeptophilia.  In "Extreme Climate After Massive Eruption of Alaska’s Okmok Volcano in 43 BCE and Effects on the Late Roman Republic and Ptolemaic Kingdom," by a team led by Joseph R. McConnell of the Desert Research Institute, we find out about an Alaskan volcanic eruption that may have been one of the significant factors leading to the collapse of the Roman Republic and its consolidation into an empire -- events that radically changed the course of history in Europe and North Africa.

Geologists on the team identified tephra (volcanic ash) in Arctic ice cores that was fingerprinted chemically and shown to come from Okmok, a volcano in the Aleutian Islands.  The dating of the tephra deposit shows that the eruption happened in 43 B.C.E. -- the year after the assassination of Julius Caesar, during a time when Rome was in chaos as various political factions duked it out for control.  The eruption of this volcano halfway around the world also coincides with the coldest year Europe had seen in centuries, possibly longer.  Snow fell in summer, crops failed, and there were famines and repeated uprisings by desperate, starving citizens.

This sudden drop in temperature was one of the factors that contributed to the realignment of the Roman government as someone emerged who said he knew what to do to fix the situation -- Octavian (later known as Augustus), Julius Caesar's great-nephew.  And he did it, establishing the Pax Romana, quelling the revolts and ushering in two centuries of relative peace and prosperity for Roman citizens (and wreaking havoc on the Gauls, Celts, Teutons, and whatever other tribes happened to be in the way of the Roman Legions).

It helped, of course, that once the volcanic tephra from Okmok settled out, the temperature rebounded, and the first years of Augustus's reign were noted for a beneficent climate and rich crop yields.  Not all of the good bits of the Pax Romana were due to Augustus's skill as an emperor; he got lucky because of conditions he had no control over and could not have predicted, just as the last leaders of the Republic got unlucky for the same reasons.

The point here is that we should be wary of perturbing chaotic systems, which is exactly what we're doing with our rampant dumping of carbon dioxide into the atmosphere.  And what we've seen over the last few decades is exactly the sort of unpredictable response -- some areas experiencing droughts, others floods; deadly heat waves and displaced polar vortexes that drop whole regions into the deep freeze for weeks; more frequent hurricanes, tornadoes, and bomb cyclones.  One of the frustrations felt by the people who understand climate systems is that the average layperson doesn't see this kind of unpredictability as precisely what you'd expect from pushing on an inherently chaotic system.  The attitude seems to be that if you can't make predictions with pinpoint accuracy -- "okay, because the climate is changing, you can expect it to be 95 F in Omaha on July 19" -- then it's nothing to be concerned about.

"The scientists don't even know what's going on," you'll hear them say.  "Why should we believe it's a problem if they can't tell us what the outcome is going to be?"

But that's exactly why we shouldn't be messing with it.  Systems with sensitive dependence on initial conditions are dramatically unpredictable, and can get pushed out of equilibrium quickly -- sometimes with catastrophic results.

As the leaders in the final years of the Roman Republic found out.

I feel like another figure from the Classical world -- Cassandra -- for even bringing this up.  Cassandra, you may recall, was the woman cursed by the gods with accurate foresight and knowledge of the future, along with the catch that no one would ever believe what she said.  The climatologists have been sounding the alarm about this for decades, to little effect.  If you can't accurately predict the outcome, to most politicians, it doesn't exist.

Which makes me wonder if before we try to get our leaders to get on board with addressing anthropogenic climate change, we should require they sit through some lectures on chaos theory.

**************************************





Tuesday, June 23, 2020

A monumental change

You've probably heard the recent controversy about removing the statues of Confederate officers from prominent positions in the Deep South, with the anti-removal crowd saying "It's our heritage" and the pro-removal crowd saying "... but it's celebrating racism."  I don't intend to explore the reasoning behind either position, since I suspect that (1) we all know what our opinions on the issue are, and (2) it's unlikely anything I say would change anyone's mind.  But I do want to offer an alternative, which was (unfortunately) not my idea but the brainstorm of some folks in West Virginia.  They want to replace the statues celebrating the Confederacy with...

... statues of Mothman.

West Virginia high school teacher Jay Sisson explains:
To many West Virginians, Mothman carries more significance than any Confederate general.  In fact, the legend originated in the town of Point Pleasant, when locals spotted a “man-sized bird creature” prior to the 1967 Silver Bridge collapse that killed 46 people.  Mothman was blamed and retroactively seen as a bad omen that foreshadowed the disaster.  From there, the story of the Mothman spread across the country and became an urban legend of sorts.
Twitter user Brenna (@HumanBrennapede) has an additional reason for preferring Mothman; unlike most Confederate generals, she says, Mothman has "a six-pack and an objectively good ass."  The statue of the creature in Point Pleasant, West Virginia, illustrates this:



And I have to admit she's right that he has quite a shapely posterior.  It does remind me, however, of my days teaching Ancient Greek to high schoolers.  One of my classes complained to me one day that they were sick of learning phrases of limited modern utility like "O Zeus, accept my sacrifice" and "Prometheus's liver is being devoured by an eagle."

"Well, what do you want to learn how to say?" I asked.

One boy said, "How about, 'You have a nice ass.'"

I shrugged and said, "Okay.  It'd be, 'kalein pygian ekheis.'"  (Transliterated roughly into English letters.)

They all laughed, and I added, "I guess if you know how to say 'you have a nice ass,' you'd better learn how to say 'thank you.'"  So I had them repeat after me: "sas eukharisto."

At this point, the class was in hysterics.  Something seemed off -- it wasn't that funny.  So I turned around...

... and the principal was standing in the doorway.

Fortunately, he has an awesome sense of humor, and joined in the laughter at my obvious discombobulation.  And the students used that as their greeting to each other in the hall for the rest of the school year: "Kalein pygian ekheis."  "Sas eukharisto!"

Never let it be said that I didn't make an impact as a teacher.

But I digress.

Anyhow, I think the Mothman statue idea is brilliant.  It could be applied to lots of other states, too, each of which has its own terrifying and inhuman monster.  Florida could have statues of the Skunk Ape.  Louisiana has the Grunch.  Arkansas has the Boggy Creek Monster.  Kentucky has Mitch McConnell.

You get the idea.

So that would solve the problem of injuring state pride, and focus people's attention away from a bunch of military leaders who (to be brutally frank) lost anyhow.

But I'm not expecting it to catch on.  Inspired ideas usually don't.  Even ones that involve making statues of a creature with "an objectively good ass."

**************************************





Monday, June 22, 2020

Along came a spider...

Sometimes there's a discovery that's just so cool that I need to tell y'all about it.  It's not apropos of much except that you never know what science is going to uncover next.

Have you ever wondered why some animals' eyes glow in the dark, and others don't?  I've seen raccoons, cats, and (given where I live, lots of) deer get caught in my car headlights at night, and been fascinated by the eerie reflections from their eyes.

But not all animals do this.  You can shine a bright light into a squirrel's eyes all you want, and you'll never see anything but a pissed-off squirrel.

The reason is a fascinating structure, found only in nocturnal animals, called the tapetum lucidum.  This is basically a mirror behind the retina.  When light enters the eye and focuses on the retina, some of it passes through that tissue-thin layer of cells without activating any of the light-sensing structures -- i.e., it gets absorbed without contributing to vision.  If you're a diurnal animal (like us) this is no big deal; most of the time we have light to spare, and in fact a more common problem is too much light, which is why we have a sensitive iris that acts like the aperture diaphragm on an old-fashioned camera, narrowing the opening so the inside of the eye doesn't get fried.

But nocturnal animals need every photon they can get, so many of them have evolved a tapetum.  Some of the light hitting the retina passes through and gets "wasted" -- but then it hits the tapetum and reflects back through the retina, providing a second opportunity to detect it.  This gives animals with this structure much better dim-light vision -- and makes their eyes glow in headlights.
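A back-of-the-envelope way to see why the second pass matters (my own toy numbers, not measurements): if one trip through the retina catches a useful photon with probability p, a reflective layer behind it raises the catch probability to 1 - (1 - p)^2, i.e. p plus a second chance at the ones that got through.

# Toy calculation (illustrative probabilities, not real physiology): a tapetum
# gives each photon a second pass through the retina, so the capture probability
# rises from p to 1 - (1 - p)^2.

for p in (0.1, 0.3, 0.5):
    with_tapetum = 1 - (1 - p) ** 2
    print(f"single-pass capture {p:.0%} -> with tapetum {with_tapetum:.0%} "
          f"({with_tapetum / p:.2f}x the photons detected)")

The dimmer the light (the smaller p is), the bigger the relative payoff -- which is exactly where a nocturnal animal needs the help.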

I got to see another, more surprising, animal with a tapetum when I was on a night hike in Belize some years ago looking for nocturnal birds (many of which also have tapeta, for what it's worth).  We were all wearing headlamps, and the guide pointed out that on the trail there were what looked like hundreds of tiny rubies, emeralds, and sapphires, glittering as the beam of the lamps passed across.

"What do you think those are?" she asked, and her eyes were also glittering -- with mischief.

"No idea," I said.

"Get closer," she said.

I knelt down and peered at the little jewel-like sparkles, and very quickly discovered they were...

... spiders.

"Each species glows a different color," she said.  "We're not really sure why."

Fortunately, I'm not an arachnophobe, but I did stand up rather quickly.

So that brings us to the discovery, which appeared last week in the Journal of Systematic Palaeontology.  Spider fossils aren't very common, given that spiders are small and don't have bones or teeth (the most commonly preserved parts), and most known spider fossils have come from amber.  So it was quite a surprise to find a beautifully-preserved fossilized spider in Korea in a formation of chert.  Chert is a sedimentary rock (obviously, since that's pretty much where you find fossils) made up of tiny crystals of quartz.  Most of it comes from layers of the evocatively-named siliceous ooze that coats the deep ocean floor, made mostly of the silica skeletons of diatoms -- microscopic algae that build shells out of glass.  But some chert forms when water passes through cracks in silica-rich rocks, dissolving bits of them that are then deposited in layers somewhere else.  (This is, essentially, the process that forms petrified wood.)

Here, it preserved this little spider so well that after 110 million years, you can still see its tapetum -- meaning it was a nocturnal hunter, rather like a modern wolf spider.

Without further ado, here he is:


... and the tapeta still reflect light.

"This is an extinct family of spiders that were clearly very common in the Cretaceous and were occupying niches now occupied by jumping spiders that didn’t evolve until later," said Paul Selden of the University of Kansas, who co-authored the paper, in a press release.  "But these spiders were doing things differently.  Their eye structure is different from jumping spiders.  It’s nice to have exceptionally well-preserved features of internal anatomy like eye structure.  It’s really not often you get something like that preserved in a fossil."

So pretty amazing stuff, and my apologies to the arachnophobes in the studio audience.  Hopefully the title of the post was enough to forewarn you.  Me, I think they're cool, although I wasn't as sanguine as my guide in Belize was about getting nose-to- ... um... nose-to-chelicerae with them.  But this illustrates something I've mentioned many times before; science never loses its capacity to astonish us with the complexity and beauty of the natural world.

**************************************





Saturday, June 20, 2020

Mirror, mirror, on the wall

Sometimes my mental processes are like a giant exercise in free association.

I've always been this way.  My personal motto could be, "Oh, look, something shiny!"  When I was a kid my parents had a nice set of the Encyclopaedia Britannica, and in those pre-internet days I used them for research for school projects.  So I'd start by looking something up -- say, the provisions of the Twelfth Amendment to the United States Constitution -- and I'd notice something in the article, which I'd then have to look up, then I'd notice something there, and so forth and so on, and pretty soon I was reading the entry about the mating habits of wombats.

My younger son inherited this tendency.  Conversations between the two of us resemble a pinball game.  More than once we've stopped and tried to figure out how we got from Point A to Point Z, but sometimes the pathway is just too weird and convoluted to reconstruct.  Maybe that's why I love James Burke's iconic television series Connections; the lightning-fast zinging from event to event and topic to topic, which Burke uses to brilliant (and often comical) effect, is what's happening inside my skull pretty much all the time.

It's a wonder I ever get anything done.

The reason this comes up is because I was chatting with a friend of mine, the wonderful author K. D. McCrite, about trying to find a topic for Skeptophilia that I hadn't covered before.  She asked if I'd ever looked at the role of mirrors in claims of the paranormal.  I said I hadn't, but that it was an interesting idea.

So I started by googling "mirrors paranormal," and this led me to the Wikipedia article on "scrying."  Apparently this was the practice of gazing into one of a wide variety of objects or substances to try to contact the spirit world.  The article says:
The media most commonly used in scrying are reflective, refractive, translucent, or luminescent surfaces or objects such as crystals, stones, or glass in various shapes such as crystal balls, mirrors, reflective black surfaces such as obsidian, water surfaces, fire, or smoke, but there is no special limitation on the preferences or prejudices of the scryer; some may stare into pitch dark, clear sky, clouds, shadows, or light patterns against walls, ceilings, or pond beds.  Some prefer glowing coals or shimmering mirages. Some simply close their eyes, notionally staring at the insides of their own eyelids, and speak of "eyelid scrying."
I think next time I'm taking a nap and my wife wants me to get up and do yard chores, I'm going to tell her to leave me alone because I'm "eyelid scrying."

Yeah, that'll work.

Anyhow, what scrying seems like to me is staring into something until you see something, with no restrictions on what either something is.  It does mean that you're almost guaranteed success, which is more than I can say for some divinatory practices.  But this brought me to the "Hermetic Order of the Golden Dawn," because they apparently recommended mirror-scrying as a way of seeing who was exerting a positive or negative effect on you, and believed that if you stared into a mirror you'd see faces of those people standing behind you.  This was preferably done in a dimly-lit room, because there's nothing like making everything harder to see for facilitating your seeing whatever you thought you were gonna see.

[Image is in the Public Domain]

On this site, there is a list of famous members, and to my surprise one of them was Charles Williams, a novelist who was a close friend of J. R. R. Tolkien and C. S. Lewis.  His novels Descent into Hell, All Hallows' Eve, The Greater Trumps, The Place of the Lion, and War in Heaven are fascinatingly weird, like nothing else I've ever read -- a combination of urban fantasy and fever dream.  He was also a devout Christian, so his membership in the Golden Dawn strikes me as odd, but I guess he wasn't the only one to try blending Christianity with occult mysticism.

At this point I felt I was getting a little far afield from my original intent, so I decided to leave Wikipedia (with its multiple internal links and temptations to wander) and found a site about the history of mirrors and their uses.  On this site I learned that there's a tradition of covering all the mirrors in the house when a family member dies, to prevent the dear departed's soul from becoming trapped in the mirror.  The problem is, if the deceased's spirit wants to hang around, it can simply sidestep -- there's a whole lore about spirits and other paranormal entities which can only be seen out of the corner of your eye.

This immediately grabbed my attention because it's the basis of my novella Periphery, which is scheduled to come out in a collection called A Quartet for Diverse Instruments in the summer of 2021.  The idea of the story is that an elderly woman decides to have laser surgery to correct her nearsightedness, and afterwards she starts seeing things in her peripheral vision that no one else sees, and which disappear (or resolve into ordinary objects) when she looks at them straight-on.

The problem is, these things are real.

*cue scary music*

This led me to look into accounts of "shadow people" who exist on the fringes of reality and are only (partly) visible as dark silhouettes that flicker into and out of existence in your peripheral vision.  From there, I jumped to a page over at the ever-entertaining site Mysterious Universe about "static entities," which are not only vague and shadowy but appear to be made of the same stuff as static on a television screen.  I don't want to steal the thunder from Brent Swancer (the post's author) because the whole thing is well worth reading, but here's one example of an account he cites:
All of a sudden I had a really powerful urge to look at the end of the hallway.  We had recently brought a coat stand from a bootsale and this was in the middle of the hallway now.  As I stood there I saw a human outline but entirely filled with TV like static, I remember little bits of yellow and blue in it but was mainly white and it came out of the bedroom on the left and was in a running stance but it was really weird because it was in slow motion and it ran from the left to the back door on the right.  As it ran it grabbed the coat stand and pulled it down with it and it fell to the floor. I was just standing there after in shock...  I ran to my sister and told her what happened and when we went back to the hallway the stand was still on the floor.  That was the only time I saw it, I don’t know why I saw it or why it pulled the stand down, it was all just surreal.  I did have some other experiences in that house that were paranormal so maybe it was connected.
But unfortunately at the end of this article was a list of "related links," and one of them was, "Raelians' ET Embassy Seeks UN Help and Endorsement," which is about a France-based group who believes that the Elohim of the Bible were extraterrestrials who are coming back, and they want the United Nations to prepare a formal welcome for them, so of course I had to check that out.

At this point, I stopped and said, "Okay, what the hell was I researching again?"  The only one in the room with me was my dog, and he clearly had no idea.  So my apologies to K. D., not to mention my readers.  The whole mirrors thing was honestly a good idea, and it probably would have made an awesome post in the hands of someone who has an attention span longer than 2.8 seconds and isn't distracted every time a squirrel farts in the back yard.  But who knows?  Maybe you learned something anyhow.  And if you followed any of the links, tell me where you ended up.  I can always use a new launch point for my digressions.

****************************************

These days, I think we all are looking around for reasons to feel optimistic -- and they seem woefully rare.  This is why this week's Skeptophilia book recommendation of the week is Hans Rosling's wonderful Factfulness: Ten Reasons We're Wrong About the World--and Why Things Are Better Than You Think.  

Rosling looks at the fascinating bias we have toward pessimism.  Especially when one or two things seem seriously amiss with the world, we tend to assume everything's falling apart.  He gives us the statistics on questions that many of us think we know the answers to -- such as:  What percentage of the world’s population lives in poverty, and has that percentage increased or decreased in the last fifty years?  How many girls in low-income countries will finish primary school this year, and once again, is the number rising or falling?  How has the number of deaths from natural disasters changed in the past century?

In each case, Rosling considers our intuitive answers, usually based on the doom-and-gloom prognostications of the media (who, after all, have an incentive to sensationalize, because bad news draws viewers and keeps sponsors happy).  And what we find is that things are not as horrible as a lot of us might be inclined to believe.  Sure, there are some terrible things going on now, and especially in the past few months, there's a lot to be distressed about.  But Rosling's book gives you the big picture -- which, fortunately, is not as bleak as you might think.

[Note: if you purchase this book using the image/link below, part of the proceeds goes to support Skeptophilia!]




Friday, June 19, 2020

The walk of life

It's remarkably difficult to decide exactly what we mean when we say something is alive.

As a biologist, I find this kind of embarrassing.  After all, "biology" means "the study of life."  So in a very real sense, we're studying something when we're not even sure what it is.  Oh, sure, there are some clear-cut examples; a dog is alive, a rock is not.  But amazingly enough, when you try to pinpoint what the dog is doing that the rock is not, you find yourself on shaky ground -- relying on rules that are rife with exceptions.

How about "capable of reproducing"?  You can't just say "reproduces," because a good many organisms don't reproduce, whether by choice or by circumstance.  And let's even set aside the timeworn exceptions of infertile hybrids like mules and ligers as genetic anomalies.  But what about worker ants?  Worker ants are females in which the development of functional reproductive anatomy has been suppressed, so they are completely infertile; but they're not genetic accidents the way infertile hybrids are.  They are not even theoretically capable of reproducing, yet I doubt seriously anyone would argue that they're not alive.

Then there's "limited life span."  Living things die, usually after a length of time characteristic of the particular species.  However, the bristlecone pine (Pinus longaeva) doesn't seem to have an upper bound on its life span.  Most plants, even most trees, age out after a while -- birch trees live thirty to forty years, silver maples eighty or ninety, red oaks two hundred, white oaks as much as eight hundred -- but bristlecone pines don't do that.  Unless they meet with misfortune, they just keep on living.  A bristlecone in the Inyo National Forest of California is 4,851 years old.  To put that into perspective, when the Great Pyramid at Giza was built, this tree was already around three hundred years old.
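The arithmetic checks out, give or take a few decades (the pyramid's construction date, roughly 2560 B.C.E., is the commonly cited estimate, not something from the tree-ring data):

# Rough date check.  The Great Pyramid's construction date (~2560 BCE) is the
# commonly cited estimate; the tree's age is as quoted above (4,851 years in 2020).
tree_age_in_2020 = 4851
germination_year = 2020 - tree_age_in_2020        # negative = BCE; about 2831 BCE
pyramid_built = -2560
print(f"Tree germinated around {-germination_year} BCE")
print(f"Age when the Great Pyramid went up: about {pyramid_built - germination_year} years")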

And it isn't just plants.  There's a jellyfish, Turritopsis dohrnii, that is effectively immortal -- when it reaches senescence, it begins to despecialize its cells, returning to the polyp (juvenile) stage, then redifferentiating.  There seems to be no limit to the number of times it can do this -- putting the Time Lords with their twelve regeneration cycles to considerable shame.

And don't get me started on viruses, which are an exception to the majority of the usually-accepted characteristics of life.

The upshot is that the whole topic is way more controversial than you'd think.  Even such seemingly-obvious criteria as "composed of one or more cells" and "encodes genetic information as DNA or RNA" may be looking at things from an Earth-bound perspective; life on other planets might well compartmentalize its metabolic processes and store its genetic information in entirely different ways, and still be recognizably "alive."

There's one characteristic, though, that very few people whose opinions count will argue over; living things are subject to evolution by natural selection.  (Okay, the creationists will argue like hell about it, but they conspicuously fail on the "opinions counting" qualification.)  Clearly living things evolve, and it's hard to imagine a non-living thing that would do so.  This, then, would make "evolution by natural selection" not only a necessary condition for being alive, but a sufficient one.

Which would settle once and for all the question of whether viruses are alive.  They clearly evolve, which is why one flu shot doesn't make you immune for life.

[Image licensed under the Creative Commons Myworkforwiki, Major Evolutionary Transitions digital, CC BY-SA 4.0]

Well, as I am wont to do, I've been leading you down the garden path.  Because if you have been nodding your head and saying, "Okay, that makes sense" to what I've written above...

... scientists in a research lab in Germany have just created life.

Christian Mayer, a chemist at the Center for Nanointegration, and Ulrich Schreiber, a geologist at the University of Duisburg-Essen, have long been of the opinion that life on Earth began underground, not in shallow tide pools (the more common hypothesis).  The heat and pressure in deep crevices in the Earth create conditions that would lead to the formation of vesicles -- water-filled bubbles surrounded by a lipid-bilayer membrane.  These are thought to be the precursors of the earliest cells, eventually trapping bits of RNA and leading to the first true life-forms.

So Mayer and Schreiber decided to recreate these deep crevices.  In a lab apparatus simulating deep-Earth conditions, they allowed the temperature to fluctuate between 40 and 80 °C, and the pressure between 60 and 80 bar.  Sure enough, under those conditions, a "primordial soup" forms vesicles readily.  Like soap bubbles, they are created and destroyed rapidly, some lasting longer than others.

But unlike soap bubbles, these vesicles evolve.

For full impact, here's the relevant quote from the press release:
In their laboratory experiment, they regularly changed the pressure in the system at 20-minute intervals, thereby changing the quality of the solvent, as it also occurs in nature through tidal forces and geysers.  In the process, the vesicles were periodically destroyed and re-formed.  Thus, a total of 1,500 generations of vesicles were created and disintegrated again within two weeks. 
The researchers discovered that an increasing number of vesicles survived the generation change.  Analyses showed that these vesicles had embedded specific sequences of 10 to 12 amino acids from the pool of possible peptides into their membrane in a cluster-like manner.  Further tests, specifically carried out with one of these peptides, revealed three effects on the vesicles in question: They became thermally more stable, smaller and hence more resistant and – most importantly – the permeability of their membrane was considerably increased.
Put simply, the vesicles underwent natural selection and evolved to increase their stability and permeability.  The embedded peptides they mention are the first approach to the transmembrane channel proteins that every cell has, allowing it to transport materials across the membrane as needed.
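To make the logic of that selection concrete, here's a toy simulation -- entirely my own construction, not the authors' model, and with invented probabilities: vesicles that happen to carry a stabilizing peptide survive each pressure cycle a bit more often, destroyed ones are replaced by vesicles re-forming from surviving material, and the peptide-bearing fraction climbs generation after generation.

import random

# Toy model of selection among vesicles (invented numbers, illustrative only).
# True = vesicle carries a stabilizing peptide; False = plain lipid membrane.
random.seed(1)
POP = 1000
P_FRESH_PEPTIDE = 0.01                      # chance a newly formed vesicle embeds one
SURVIVE_PLAIN, SURVIVE_STABLE = 0.30, 0.45  # per-cycle survival probabilities

population = [random.random() < P_FRESH_PEPTIDE for _ in range(POP)]

for generation in range(1, 1501):
    survivors = [v for v in population
                 if random.random() < (SURVIVE_STABLE if v else SURVIVE_PLAIN)]
    # destroyed vesicles are replaced by ones re-forming from surviving material,
    # which occasionally pick up a fresh peptide from the surrounding soup
    refill = [random.choice(survivors) or (random.random() < P_FRESH_PEPTIDE)
              for _ in range(POP - len(survivors))]
    population = survivors + refill
    if generation in (1, 10, 50, 200, 1500):
        print(f"generation {generation:4d}: "
              f"{sum(population) / POP:.1%} of vesicles carry the stabilizing peptide")

Nothing in the model "wants" anything; differential survival plus inheritance of membrane composition is enough to shift the whole population.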

"As we have simulated in time-lapse, functions could have been created billions of years ago that made such vesicles stable enough to come to the surface from the depths, for example with the flow of tectonic fluids or during geyser eruptions," said study co-author Ulrich Schreiber.  "Subsequently, a first metabolism with concentration gradients as an energy source could have developed.  If the ability to self-replicate is eventually acquired, then even from a biological point of view an inanimate component slowly becomes a living organism, a first cell."

So there you are.  Mayer and Schreiber, being cautious scientists, are not saying they've created life, but the implication is there -- and even the most hesitant amongst us (not you, creationists) would have to admit that whatever you want to call it, this represents a huge step toward generating something that is unequivocally alive.

Which I find to be somewhere beyond mind-boggling.

****************************************
