Skeptophilia (skep-to-fil-i-a) (n.) - the love of logical thought, skepticism, and thinking critically. Being an exploration of the applications of skeptical thinking to the world at large, with periodic excursions into linguistics, music, politics, cryptozoology, and why people keep seeing the face of Jesus on grilled cheese sandwiches.

Thursday, August 27, 2020

Rewarding the daredevil

There were three magic words that used to be able to induce me to do almost anything, regardless of how catastrophically stupid it was: "I dare you."

It's how I ended up walking the ridgeline of a friend's house when I was in eighth grade:
Friend: My house has such a steep roof.  I don't know how anyone could keep his balance up there.
Me:  I bet I could. 
Friend (dubiously):  You think? 
Me:  Yeah. 
Friend:  I dare you. 
Me:  Get me a ladder.
That I didn't break my neck was as much due to luck as skill, although it must be said that back then I did have a hell of a sense of balance, even if I didn't have much of any other kind of sense.

[Image licensed under the Creative Commons Øyvind Holmstad, A yellow house with a sheltering roof, CC BY-SA 3.0]

Research by neuroscientists Lei Zhang (University of Vienna) and Jan Gläscher (University Medical Center Hamburg-Eppendorf) has given us some insight into why I was prone to doing that sort of thing (beyond my parents' explanation, which boiled down to "you sure are an idiot").  Apparently the whole thing has to do with something called "reward prediction error" -- and they've identified the part of the brain where it occurs.

Reward prediction error occurs when there is a mismatch between the expected reward and the actual reward.  If the reward matches the expectation, the prediction error is low, and you get some reinforcement via neurochemical release in the putamen and right temporoparietal junction, which form an important part of the brain's reward circuit.  A prediction error can go two ways: (1) the reward can be lower than the expectation, in which case you learn by changing your expectations; or (2) the reward can be higher than the expectation, in which case you get treated to a flood of endorphins.
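To make that concrete, here's a minimal sketch of a prediction-error update in Python.  It's purely illustrative -- the learning rate and the reward values are invented, not taken from Zhang and Gläscher's study -- but it's the same basic bookkeeping the reward circuit is thought to do:

```python
# Minimal reward-prediction-error sketch (illustrative numbers only).
# delta = actual reward - expected reward; the expectation then shifts
# a little way toward what actually happened.

def update_expectation(expected, actual, learning_rate=0.1):
    delta = actual - expected              # the reward prediction error
    new_expected = expected + learning_rate * delta
    return new_expected, delta

expected = 0.2   # "I'll probably fall off the roof"
actual = 1.0     # "I didn't -- and my friend is awed"

for trial in range(5):
    expected, delta = update_expectation(expected, actual)
    print(f"trial {trial}: prediction error = {delta:+.2f}, new expectation = {expected:.2f}")
```

A big positive delta is case (2) above -- the reward beat the expectation -- which is exactly the situation that gets reinforced.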

Which explains my stupid roof-climbing behavior, and loads of other activities that begin with the words "hold my beer."  I wasn't nearly as fearless as I was acting; I fully expected to lose my balance and go tumbling down the roof.  When that didn't happen, and I came ambling back down the ladder afterward to the awed appreciation of my friend, I got a neurochemical bonus that nearly guaranteed that next time I heard "I dare you," I'd do the same thing again.

The structure of the researchers' experiment was interesting.  Here's how it was described in a press release in EurekAlert:
[The] researchers... placed groups of five volunteers in the same computer-based decision-making experiment, where each of them was presented with two abstract symbols.  Their objective was to find out which symbol would lead to more monetary rewards in the long run.  In each round of the experiment, every person first made a choice between the two symbols, and then they observed which symbols the other four people had selected; next, every person could decide to stick with their initial choice or switch to the alternative symbol.  Finally, a monetary outcome, either a win or a loss, was delivered to every one according to their second decision...  In fact, which symbol was related to more reward was always changing.  At the beginning of the experiment, one of the two symbols returned monetary rewards 70% of the time, and after a few rounds, it provided rewards only 30% of the time.  These changes took place multiple times throughout the experiment...  Expectedly, the volunteers switched more often when they were confronted with opposing choices from the others, but interestingly, the second choice (after considering social information) reflected the reward structure better than the first choice.
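The reward structure described above is easy to sketch in code.  Here's a toy version in Python; the number of rounds, the reversal interval, and the follow-the-majority rule for the second choice are all invented for illustration -- they're not the task parameters or the model from the actual paper:

```python
import random

# Toy version of the probabilistic reversal task described above: one symbol
# pays off 70% of the time, the other 30%, and the mapping flips every few
# rounds.  (All specific numbers here are made up for illustration.)

N_ROUNDS = 60
REVERSAL_EVERY = 12
reward_prob = {"A": 0.7, "B": 0.3}
total_reward = 0

for round_number in range(1, N_ROUNDS + 1):
    first_choice = random.choice(["A", "B"])                 # the player's initial pick
    others = [random.choice(["A", "B"]) for _ in range(4)]   # the four other players
    # second choice: in this sketch the player simply goes with the group majority
    all_choices = others + [first_choice]
    second_choice = max(set(all_choices), key=all_choices.count)
    if random.random() < reward_prob[second_choice]:
        total_reward += 1
    if round_number % REVERSAL_EVERY == 0:                   # the contingencies flip
        reward_prob = {"A": reward_prob["B"], "B": reward_prob["A"]}

print(f"rewards earned over {N_ROUNDS} rounds: {total_reward}")
```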
So social learning -- making your decisions according to your friends' behaviors and expectations -- is actually not a bad strategy.  "Direct learning is efficient in stable situations," said study co-author Jan Gläscher, "and when situations are changing and uncertain, social learning may play an important role together with direct learning to adapt to novel situations, such as deciding on the lunch menu at a new company."

Or deciding whether or not it's worth it to climb the roof of a friend's house.

We're social primates, so it's no surprise we rely a great deal on the members of our tribe for information about what we should and should not do.  This works well when we're looking to older and wiser individuals, and not so well when the other members of our tribe are just as dumb as we are.  (This latter bit explains a lot of the behavior we're currently seeing in the United States Senate.)  But our brains are built that way, for better or for worse.

Although for what it's worth, I no longer do ridiculous stunts when someone says "I dare you."  So if you were planning on trying it, don't get your hopes up.

*********************************

This week's Skeptophilia book recommendation is a brilliant retrospective of how we've come to our understanding of one of the fastest-moving scientific fields: genetics.

In Siddhartha Mukherjee's wonderful book The Gene: An Intimate History, we're taken from the first bit of research that suggested how inheritance took place: Gregor Mendel's famous study of pea plants that established a "unit of heredity" (he called them "factors" rather than "genes" or "alleles," but he got the basic idea spot on).  From there, he traces how our understanding of heredity was refined -- from how DNA was identified as the chemical that houses genetic information, to how that information is encoded and translated, to cutting-edge research in gene-modification techniques like CRISPR-Cas9.  At each step, he paints a very human picture of researchers striving to understand, many of them with inadequate tools and resources, finally leading up to today's fine-grained picture of how heredity works.

It's wonderful reading for anyone interested in genetics and the history of science.

[Note: if you purchase this book using the image/link below, part of the proceeds goes to support Skeptophilia!]




Wednesday, August 26, 2020

The rising waters

2020 has become some kind of nihilist joke.  How many versions of the "2020 Apocalypse Bingo Card" have you heard?  And it does seem like things are just piling on.  Right now, we have a major category-3 hurricane bearing down on southern Louisiana -- that just got hit by a totally different tropical storm three days ago.  We're in the middle of a pandemic that is showing no signs of letting up.  There are record-setting wildfires in California.  The economy is giving serious signs of tanking in a big way.  Protests against police brutality seem to erupt every other day.  Last, we're in the middle of the Republican National Convention, where the platform seems to be, "Look how fucked up everything has gotten in the last four years!  Give us another four and we'll do the same thing again but even bigger this time!"

In a situation like this, I'm always reluctant to add to the doom and gloom.  But I would be remiss in not pointing out that all of the above is small potatoes, really.  A lot of us, in fact, are concerned at how the current chaos has distracted us from a far, far bigger problem.  We are facing an unprecedented climate catastrophe, not in a hundred years, not in fifty years, but right now -- and three papers in the past two weeks have added to what was already a clarion call to action.

Let's start with the deep oceans.  The abyssal region of the Earth's oceans is supposed to be one of the most stable ecosystems on Earth.  Saline, completely pitch dark, crushing pressures, and always at just shy of four degrees Celsius -- the temperature at which water is its densest.  No change, no matter what's happening up above.

But last week a paper in Nature Climate Change looked into the deeps of the ocean, and found something terrifying.  The anthropogenic climate change signature is showing up in a place that is supposed to be about as insulated from human effects as you could imagine.

A team led by oceanographer Yona Silvy of Sorbonne Université wrote the following:
[U]sing 11 climate models, we define when anthropogenic temperature and salinity changes are expected to emerge from natural variability in the ocean interior along density surfaces.  The models predict that in 2020, 20–55% of the Atlantic, Pacific and Indian basins have an emergent anthropogenic signal; reaching 40–65% in 2050 and 55–80% in 2080.  The well-ventilated Southern Ocean water masses emerge very rapidly, as early as the 1980–1990s, while the Northern Hemisphere water masses emerge in the 2010–2030s.  Our results highlight the importance of maintaining and augmenting an ocean observing system capable of detecting and monitoring persistent anthropogenic changes.
Perhaps this should have been unsurprising, considering that 93% of the anthropogenic heating the Earth is experiencing is being absorbed by bodies of water.  But the idea that this absorption isn't limited to the surface -- that we're actually impacting the deepest parts of the world's oceans -- is seriously scary to anyone who knows anything about the environment and climate models.

Scientists have long been concerned about the tipping point -- the point at which climatic catastrophe becomes inevitable no matter what we do.  A second study, this one out of Ohio State University, has shown conclusively that we've passed that point with respect to one of the Earth's systems: the melting of the Greenland Ice Sheet.

What the researchers found was that up until about the year 2000, the glaciers in Greenland were pretty well in balance.  The amount of ice loss during the summer was nearly equal to the amount of ice gain from snowfall during the winter.  But around 2000, the situation changed, and since then Greenland has lost a staggering 50 gigatons (50 billion tons) more ice than it gained.

"Glacier retreat has knocked the dynamics of the whole ice sheet into a constant state of loss," said Ian Howat, who co-authored the paper.  "Even if the climate were to stay the same or even get a little colder, the ice sheet would still be losing mass."

[Image licensed under the Creative Commons Christine Zenino from Chicago, US, Greenland Glaciers outside of Ammassalik (5562580093), CC BY 2.0]

The polar bears aren't the only ones who should be concerned.  Greenland is second only to Antarctica in its potential effect on sea level rise.  If the Greenland Ice Sheet melts -- which it has sometimes done during warm periods in Earth's climate -- it would raise sea levels by six meters.  Everywhere below six meters of elevation would be under water.

So wave goodbye to New Orleans, Antwerp, Charleston, Boston, a good chunk of New York City and Long Island, and most of Florida, Delaware, the Netherlands, and Bangladesh.

If that's not bad enough, a paper in The Cryosphere last week, authored by a team from three universities -- Leeds, Edinburgh, and University College London -- considered the situation worldwide, and found that in the past twenty-three years, the Earth lost 28 trillion tons of ice.

"To put that in context, every centimeter of sea-level rise means about a million people will be displaced from their low-lying homelands," said Andy Shepherd, director of Leeds University's Centre for Polar Observation and Modelling, in an interview with The Guardian.  "In the past researchers have studied individual areas – such as the Antarctic or Greenland – where ice is melting.  But this is the first time anyone has looked at all the ice that is disappearing from the entire planet...  What we have found has stunned us.  There can be little doubt that the vast majority of Earth's ice loss is a direct consequence of climate warming."

It's easy to focus on what's right in front of your face and forget about the big picture.  This would be okay if the big picture wasn't so deeply horrifying.  I hate to be another purveyor of pessimism, but we have got to start taking this seriously.  I'm as upset about the pandemic and the global political chaos as the next guy, but this isn't a time to be distracted away from a much bigger issue -- the long-term habitability of the planet.

Let's keep our eyes on the ball, here.

*********************************

This week's Skeptophilia book recommendation is a brilliant retrospective of how we've come to our understanding of one of the fastest-moving scientific fields: genetics.

In Siddhartha Mukherjee's wonderful book The Gene: An Intimate History, we're taken from the first bit of research that suggested how inheritance took place: Gregor Mendel's famous study of pea plants that established a "unit of heredity" (he called them "factors" rather than "genes" or "alleles," but he got the basic idea spot on).  From there, he traces how our understanding of heredity was refined -- from how DNA was identified as the chemical that houses genetic information, to how that information is encoded and translated, to cutting-edge research in gene-modification techniques like CRISPR-Cas9.  At each step, he paints a very human picture of researchers striving to understand, many of them with inadequate tools and resources, finally leading up to today's fine-grained picture of how heredity works.

It's wonderful reading for anyone interested in genetics and the history of science.

[Note: if you purchase this book using the image/link below, part of the proceeds goes to support Skeptophilia!]




Tuesday, August 25, 2020

The left-handed universe

I first ran into the concept of chirality when I was a fifteen-year-old Trekkie science fiction nerd.

I grew up watching the original Star Trek, which impressed the hell out of me as a kid even though rewatching some of the episodes now generates painful full-body cringes at the blatant sexism and near-jingoistic chauvinism.  Be that as it may, after going through the entire series I don't even know how many times, I started reading some of the fan fiction.

The fan fiction, of course, was more uneven than the show had been.  Some of it was pretty good, some downright terrible.  One that had elements of both, putting it somewhere in the "fair to middling" category, was Spock Must Die by James Blish.  Blish had gotten into the Star Trek universe writing short-story adaptations of most of the original series episodes, but this one was entirely new.

Well, mostly.  It springboarded off an original series episode, "Errand of Mercy," in which the Federation and the Klingons are fighting over the planet Organia, which is populated by a peaceful, pastoral society.  Kirk et al. are trying to stop the Klingons from massacring the Organians, but much to Kirk's dismay, the Organians refuse Federation protection, insisting they don't need any help.  And it turns out they don't -- in the end, you find out that the Organians are super-powerful aliens who only assumed human-ish form to communicate with the two humanoid invading forces, and are so far beyond both of them that they indeed had nothing to fear.

In Spock Must Die, the crew of the Enterprise is sent to investigate why Organia has suddenly gone radio-silent.  It turns out that the Klingons have surrounded the entire planet with a force field.  Spock volunteers to try to transport through it, which fails -- but after the attempt, suddenly there are two Spocks in the transporter room, each claiming to be the real, original Vulcan.

[spoiler alert, if anyone is actually going to go back and read it...]  What happened is that the transporter beam was reflected off the surface of the force field, and it duplicated Spock -- there was the original (who never left the transporter pad) and the duplicate (the reflection, recreated in place).  Since both the original and the duplicate were identical down to the last neuron, each of them had the same memories, and each was convinced he was the real Spock.

The key turned out to be the fact that the duplicate had been reflected all the way down to the molecular level.

Why this matters is that a number of molecules in our bodies -- amino acids and sugars being two common examples -- are chiral, meaning they have a "handedness."  Just like a glove, they exist in two possible forms, a "right-handed" and a "left-handed" one, which are mirror images of each other.  And for reasons unknown, all of the amino acids our cells build proteins from are left-handed.  No known organism builds its proteins from right-handed amino acids.  Further, if you synthesized right-handed amino acids -- which can be done in the laboratory -- and fed them to a terrestrial organism, the organism would starve.

But the reflected Spock, of course, is exactly the opposite.  Kirk eventually figures out what's happened because one of the Spocks barricades himself in one of the science laboratories, claiming the other Spock wants to kill him.  The truth was he had to have access to a lab in order to synthesize the right-handed amino acids without which he'd die.

Clever concept for a story, right there.

[Image licensed under the Creative Commons Petritap, Finnish mittens, CC BY-SA 3.0]

Chirality is quite a mystery.  Like I said, the left-handedness of amino acids is shared by all known terrestrial organisms, so that bias must have happened very early in the generation of life.

Why it happened is another matter entirely.  A persistent question in scientific inquiries into the origin of life on Earth (and the possibility of life elsewhere) is how much of our own biochemistry and metabolism is constrained.  We code our genetic information as DNA; could it be done a different way elsewhere?  Our primary energy driver is ATP.  Are there other ways organisms might store and access chemical energy?  The question of constraint goes all the way up the scale to macroscopic features, such as cephalization -- the clustering of the sensory processing organs near the anterior end of the animal.  Makes sense; you want your sensors facing (1) the direction you're traveling, and (2) what you're eating.  But are there other equally sensible ways to put an animal together?

Some things we take for granted almost certainly aren't constrained, like bilateral symmetry.  So many animals are bilaterally symmetrical that the ones that aren't (like adult flounders) stand out as bizarre.  Aficionados of H. P. Lovecraft might remember that amongst the innovative ideas he used was that the aliens in "At the Mountains of Madness" weren't bilateral, but had five-way symmetry -- something completely unknown on Earth.  (You may be thinking, "wait... starfish?"  Starfish have what I'd call pseudo-pentaradial symmetry.  As larvae, they're clearly bilateral, and they lose a lot of bilateral features when they mature.  But some characteristics -- like the position of the sieve plate, their water-intake device -- give away that deep down, they are still basically bilateral.)

Anyhow, all this comes up because of a recent discovery by astrobiologists at NASA's Goddard Space Flight Center.  In a press release, we hear about a meteorite discovered in Antarctica called Asuka 12236, which is a carbonaceous chondrite -- a peculiar type of meteorite that is rich in organic compounds.  Asuka 12236 contained large quantities of amino acids, which isn't as bizarre as it sounds; amino acids have been shown to form relatively easily if there are raw materials and a source of energy.

What stands out is that the amino acids in Asuka 12236 show a pronounced left-handed excess -- the same bias we see in life on Earth.

The scientists studying the meteorite are up front that the first thing to do is rule out the possibility that the amino acids in the meteorite are contaminants absorbed after the rock crash-landed.  Most of the experts, however, think this is unlikely, and that we're looking at a genuine sample of extraterrestrial amino acids.  And the fact that they show the same left-handed bias is pretty remarkable -- suggesting that the chirality of our biochemicals might, in fact, be constrained, and that we could well find biochemistry similar to our own on other planets.

In that way, at least.

So that's one less thing to worry about if we ever go to an alien world.  Unlike the right-handed reflected Mr. Spock, we'd be able to metabolize alien amino acids just fine.

Of course, how familiar-looking everything else would be is still open to question.

*********************************

This week's Skeptophilia book recommendation is a brilliant retrospective of how we've come to our understanding of one of the fastest-moving scientific fields: genetics.

In Siddhartha Mukherjee's wonderful book The Gene: An Intimate History, we're taken from the first bit of research that suggested how inheritance took place: Gregor Mendel's famous study of pea plants that established a "unit of heredity" (he called them "factors" rather than "genes" or "alleles," but he got the basic idea spot on).  From there, he traces how our understanding of heredity was refined -- from how DNA was identified as the chemical that houses genetic information, to how that information is encoded and translated, to cutting-edge research in gene-modification techniques like CRISPR-Cas9.  At each step, he paints a very human picture of researchers striving to understand, many of them with inadequate tools and resources, finally leading up to today's fine-grained picture of how heredity works.

It's wonderful reading for anyone interested in genetics and the history of science.

[Note: if you purchase this book using the image/link below, part of the proceeds goes to support Skeptophilia!]




Monday, August 24, 2020

How to prove you exist

Let me say right up front that I don't mean any of what I'm saying here as criticism of the researchers themselves.

But there are times that it is damn frustrating that the research has to be done in the first place.

This comes up because of a paper that was published in Proceedings of the National Academy of Sciences a couple of weeks ago, by a team led by Jeremy Jabbour of the Department of Psychology at Northwestern University.  In "Robust Evidence for Bisexual Orientation Among Men," we read:
The question whether some men have a bisexual orientation—that is, whether they are substantially sexually aroused and attracted to both sexes—has remained controversial among both scientists and laypersons.  Skeptics believe that male sexual orientation can only be homosexual or heterosexual, and that bisexual identification reflects nonsexual concerns, such as a desire to deemphasize homosexuality.  Although most bisexual-identified men report that they are attracted to both men and women, self-report data cannot refute these claims.  Patterns of physiological (genital) arousal to male and female erotic stimuli can provide compelling evidence for male sexual orientation.  (In contrast, most women provide similar physiological responses to male and female stimuli.)  We investigated whether men who self-report bisexual feelings tend to produce bisexual arousal patterns.  Prior studies of this issue have been small, used potentially invalid statistical tests, and produced inconsistent findings.  We combined nearly all previously published data (from eight previous studies in the United States, United Kingdom, and Canada), yielding a sample of 474 to 588 men (depending on analysis).  All participants were cisgender males.  Highly robust results showed that bisexual-identified men’s genital and subjective arousal patterns were more bisexual than were those who identified as exclusively heterosexual or homosexual.  These findings support the view that male sexual orientation contains a range, from heterosexuality, to bisexuality, to homosexuality.
So basically what they did was to show naked pics of both men and women to self-identified bisexual guys, and check to see if they got hard-ons from both.

As I said up front, I don't fault the researchers for this one; I'm glad this research was done, because there is doubt out there.  I've heard that doubt go two ways -- that bisexuals are straight people looking for attention or for a kinky thrill, or that bisexuals are gay people who are afraid to admit it.  I remember clearly being told by a student -- long before I was out of the closet -- that she could understand there being homosexuals and heterosexuals, but she couldn't see how there could be bisexuals.  "How can they be attracted to both at the same time?" she asked me.  "Why don't they just make up their minds?"

I fell back on the research -- that bisexuality and the spectrum-nature of sexual orientation was well-established -- but even after seeing the data, she wasn't convinced.  "I just don't believe it," she said.

Not only was I appalled by this because, in essence, she was talking about me -- telling me that my own identity was an impossibility -- but also because, even when presented with evidence, she went with her "feelings" on the topic rather than (1) the conclusions of the scientists, and worse, (2) people's assessment of their own orientation.

Because that's the thing, isn't it?  How does anyone have the fucking temerity to say, "No, that's not who you are.  I know better.  Here's who you actually are."?  People in the trans community know this all too well; how often are they told that someone else knows their gender better than they do?

And here, we're told we have to prove we even exist.

How about just believing us?

[Image licensed under the Creative Commons Peter Salanki from San Francisco, USA, The bisexual pride flag (3673713584), CC BY 2.0]

I've known I was bisexual since I was fifteen years old.  There was never any doubt about my attraction to both men and women.  Hell, I knew it before I'd ever even heard the word "bisexuality."  The fact that now, over forty years later, there has to be a study published in a major scientific journal to convince people that I actually know who I am -- that I'm not delusional or lying -- is nothing short of infuriating.

So thanks to Jabbour et al. for publishing peer-reviewed research that I hope and pray will put this question to rest once and for all.  I know it won't convince everyone -- my long-ago evidence-proof student being a case in point -- but maybe we'll move toward accepting that gender and sexual orientation are complex and completely non-binary, and better still, toward valuing people's understanding of who they are over society's pronouncements of who they should be.

And as I've said before: I wish I'd been strong enough and fearless enough to claim my own identity when I first realized it as a teenager.  I have often wondered what trajectory my life would have taken if I'd spent all those years free of the humiliation and fear I was raised with, and proud of who I was instead of ashamed of it.  You can't change past mistakes, more's the pity, but at least I can state who I am now and hope that my voice will add more volume to the call that each of us should be free to celebrate who we are without having to prove anything to anyone.

*********************************

This week's Skeptophilia book recommendation is a brilliant retrospective of how we've come to our understanding of one of the fastest-moving scientific fields: genetics.

In Siddhartha Mukherjee's wonderful book The Gene: An Intimate History, we're taken from the first bit of research that suggested how inheritance took place: Gregor Mendel's famous study of pea plants that established a "unit of heredity" (he called them "factors" rather than "genes" or "alleles," but he got the basic idea spot on).  From there, he traces how our understanding of heredity was refined -- from how DNA was identified as the chemical that houses genetic information, to how that information is encoded and translated, to cutting-edge research in gene-modification techniques like CRISPR-Cas9.  At each step, he paints a very human picture of researchers striving to understand, many of them with inadequate tools and resources, finally leading up to today's fine-grained picture of how heredity works.

It's wonderful reading for anyone interested in genetics and the history of science.

[Note: if you purchase this book using the image/link below, part of the proceeds goes to support Skeptophilia!]




Saturday, August 22, 2020

A prehistoric hoax

One of the hazards of becoming more aware of how biased and (sometimes) duplicitous popular media can be is that you might finally swing to the opposite extreme, and stop believing anything you read and hear.

It's called being a "cynic," and it's just as lazy as being gullible.  However, because the credulous are often derided as silly or ignorant, cynics sometimes feel that they must therefore be highly intelligent, and that disbelieving everything means that you're too smart to be "taken in."

In reality, cynicism is an excuse, a justification for having stopped thinking.  "The media always lies" isn't any closer to the truth than "everything you eat causes cancer" or "all of the science we're being told now could be wrong."  It gives you an automatic reason not to read (or not to watch your diet, or not to learn science), and in the end, it is simply a statement of willful ignorance.

Take, for example, the site Clues Forum, which has as its tagline, "Exposing Media Fakery."  In particular, consider the thread that was started several years ago, but which continues to circulate, lo, even unto this very day... entitled "The (Non-religious) Dinosaur Hoax Question."


Muttaburrasaurus skeleton (Queensland Museum)  [Image is in the Public Domain]

And yes, it means what you think it means.  And yes, the "Question" should simply be answered "No."  But let's look a little more deeply at what they're saying... because I think it reveals something rather insidious.

Take a look at how it starts:
Dinosaurs have, in recent years, become a media subject rivaling the space program in popularity and eliciting similar levels of public adoration towards its researchers and scientists.  The science of dinosaurs and other prehistoric life is also directly linked to other controversial scientific topics such as evolution, fuel production, climate and even the space program (i.e., what allegedly killed them).
So right from the outset, we've jumped straight into the Motive Fallacy -- the idea that a particular individual's motive for saying something has any bearing on that statement's truth value.  Those scientists, the author says, have a motive for our believing in dinosaurs.  Supporting controversial ideas for their own nefarious reasons.  Getting us worried about the climate and the potential for cataclysmic asteroid strikes.  Therefore: they must be lying.  We're never told, outright, why the scientists would lie about such things, but the seed is planted, right there in the first paragraph.

Then we're given more reasons for doubt, when we're told that (*gasp*) scientists make mistakes.  A dinosaur skeleton found in New Jersey, and now on display at the New Jersey State Museum, was reconstructed with a skull based on an iguana's, since the actual skull could not be found.  The article, though, uses the word "fake" -- as if the museum staff, and the scientists, were deliberately trying to pull the wool over people's eyes, instead of interpolating the missing pieces -- something that is routinely done by paleontologists.  And those wily characters even gave away the game by admitting what they were up to, right beneath a photograph of the skeleton:
Above is the full-size Hadrosaurus mount currently on display at the New Jersey State Museum in Trenton.  The posture is now recognized as incorrect.  At the same time the skeleton is fitted with the wrong skull of another type of duck-bill dinosaur.  Signs at the exhibit acknowledge that both the mounted skeleton as well as nearby illustrated depictions of what the living animal looked like are both wrong.  Both are slated for correction at some unspecified future date.
Because that's what clever conspirators these scientists are.  Covering up the fact that they're giving out erroneous information on dinosaurs by... um... admitting they had some erroneous information about dinosaurs.

But according to Clues Forum, this is yet another hole punched in our confidence, with the revelation that (*horrors*) there are things scientists don't know.  Instead of looking at that as a future line of inquiry, this article gives you the impression that such holes in our knowledge are an indication that everything is suspect.

Last, we're told that it's likely that the paleontologists are creating the fossils themselves, because fossils are just "rock in rock," leaving it a complete guessing game as to where the matrix rock ends and the fossil begins.  So for their own secret, evil reasons, paleontologists spend days and weeks out in the field, living in primitive and inhospitable conditions, grinding rocks into the shape of bones so as to hoodwink us all:
But, in our hoax-filled world of fake science, doesn't this rock-in-rock situation make it rather easy for creative interpretations of what the animal really looked like?  And, once a particular animal is “approved” by the gods of the scientific community, wouldn't all subsequent representations of that same animal have to conform with that standard?
By the time you've read this far, you're so far sunk in the mire of paranoia that you would probably begin to doubt that gravity exists.  Those Evil, Evil Scientists!  They're lying to us about everything!

Of course, what we're seeing here is the phenomenon I started with: substituting lazy disbelief for lazy gullibility.  All the writer would have to do is sign up for a paleontology class, or (better yet) go on a fossil dig, to find out how the science is really done.

But I've found that people like this will seldom take any of those steps.  Once you suspect everyone, there's no one to lean on but yourself -- and (by extension) on your own ignorance.  At that point, you're stuck.  

So I should correct a statement I made earlier.  There is a difference between gullibility and cynicism.

Gullibility is far easier to cure.

***************************

Fan of true crime stories?  This week's Skeptophilia book recommendation is for you.

In The Poisoner's Handbook: Murder and the Birth of Forensic Medicine in Jazz Age New York, by Deborah Blum, you'll find out how forensic science got off the ground -- through the efforts of two scientists, Charles Norris and Alexander Gettler, who took on the corruption-ridden law enforcement offices of Tammany Hall in order to stop people from literally getting away with murder.

In a book that reads more like a crime thriller than it does history, Blum takes us along with Norris and Gettler as they turned crime detection into a true science, resulting in hundreds of people being brought to justice for what would otherwise have been unsolved murders.  In Blum's hands, it's a fast, brilliant read -- if you're a fan of CSI, Forensic Files, and Bones, get a copy of The Poisoner's Handbook; you won't be able to put it down.

[Note: if you purchase this book using the image/link below, part of the proceeds goes to support Skeptophilia!]




Friday, August 21, 2020

Deadly fireworks

I've always thought it would be amazingly cool to witness a supernova.

Imagine it.  Within a few hours, a dim, ordinary-looking star increases in luminosity until it outshines every other astronomical object in the sky except the Sun and Moon.  It's visible during the day and you can read by its light at night.  It's not a blink-and-you'll-miss-it phenomenon, either; the light from the massive explosion peaks rapidly but declines slowly.  Most supernovae will be visible for months, before dimming to near-invisibility, ending as neutron stars or black holes.
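To get a feel for just how bright that is, here's a quick distance-modulus estimate in Python.  The peak absolute magnitudes are typical textbook values (roughly -19 for a Type Ia, roughly -17 for a core-collapse supernova), not predictions for these particular stars, and the distances are the ones quoted in the list below:

```python
import math

# Apparent magnitude from the distance modulus: m = M + 5 * log10(d / 10 pc).
# Peak absolute magnitudes here are typical textbook values, not predictions
# for these particular stars.

LY_PER_PARSEC = 3.26

def apparent_magnitude(absolute_magnitude, distance_ly):
    distance_pc = distance_ly / LY_PER_PARSEC
    return absolute_magnitude + 5 * math.log10(distance_pc / 10)

print("Betelgeuse as a core-collapse SN (~720 ly):", round(apparent_magnitude(-17.0, 720), 1))
print("IK Pegasi as a Type Ia SN (~154 ly):       ", round(apparent_magnitude(-19.3, 154), 1))
# For comparison, Venus peaks around magnitude -4.6 and the full Moon is
# about -12.7 (more negative = brighter).
```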

There are lots of candidates for what could be the next supernova, although don't get your hopes up; most of these fall into the "some time in the next million years" category.  Yeah, it could happen tomorrow, but I wouldn't put money on it.  Still, the list is sizable, and here are four of the best possibilities:
  • Betelgeuse (720 light years away, in the constellation Orion).  This one got some serious press a few months ago because it suddenly started to decrease in brightness, and astronomers wondered if this was a prelude to an explosion.  What appears to have happened is that there was turbulence in the star's core that blew a cloud of dust from its surface, obscuring the star and making it appear to dim.  So we're still waiting for this red supergiant to explode, and probably will be for a while.
  • IK Pegasi (154 light years away, in the constellation Pegasus).  IK Pegasi isn't well known because at an apparent magnitude of about 6 it sits right at the limit of naked-eye visibility, but it bears mention as the nearest serious supernova candidate.  It's a double star -- a main-sequence star and a massive white dwarf orbiting a common center of mass.  As the main-sequence star evolves, it will become a red giant, with a radius large enough that its white dwarf companion will start suctioning matter from its surface.  When the white dwarf reaches what's called the Chandrasekhar Limit -- 1.4 solar masses -- it will explode cataclysmically as a Type Ia supernova.  This will not only be spectacular but potentially dangerous -- a topic we will revisit shortly.
  • VY Canis Majoris (3,820 light years away, in the constellation Canis Major).  Another star not visible to the naked eye, VY Canis Majoris is a lot more spectacular than you'd think to look at it.  It's one of the largest stars known, with a mass around fifteen times that of the Sun, and a radius so large that if you put it where the Sun is, its surface would be about at the orbit of Jupiter (so we'd be inside the star).  This "hypergiant" is one of the most luminous stars in the Milky Way, and is only dim because it's so far away.  This one is certain to go supernova, probably some time in the next 100,000 years, and the remnants will collapse into a black hole.
  • Eta Carinae (7,500 light years away, in the constellation Carina).  Eta Carinae is another huge star, with a radius twenty times that of the Sun, but what makes this one stand out is its bizarre behavior.  In 1837 it suddenly brightened to being one of the five brightest stars in the night sky, then over the next sixty years faded to the point that it was only visible in binoculars.  Detailed observations have shown that it blew out a huge cloud of material in "The Great Eruption," which is now the Homunculus Nebula.  It's a unique object, which makes it hard to predict its future behavior.  What seems certain is that it'll eventually explode, but there's no telling when that might occur.
The consensus amongst astronomers, however, is that the next likely supernova probably isn't on the list -- that it will be a previously-unknown white dwarf or an unremarkable-looking red giant.  We know so little about supernovas that it's impossible to predict them with any kind of accuracy.  And while this is an exciting prospect, we'd better hope that the next supernova isn't too close.

The Homunculus Nebula with Eta Carinae at the center [Image licensed under the Creative Commons ESA/Hubble, Cosmic Fireworks in Ultraviolet Eta Carinae Nebula, CC BY 4.0]

Not only do supernovas produce a lot of light, they generate a tremendous amount of radiation of other kinds, including cosmic rays.  A close supernova could produce enough cosmic rays to wipe out the ozone layer -- leading to a huge influx of ultraviolet light from the Sun, with devastating effects.

Scarily, this may have already happened in Earth's history.  One of the lesser-known mass extinctions occurred at the end of the Devonian Period, 359 million years ago.  Because it is poorly understood, and was dwarfed by the cataclysmic Permian-Triassic Extinction a little over a hundred million years later, it's not one you tend to read about in the paleontology-for-the-layperson books.  Even so, it was pretty significant, wiping out 19% of known families and 50% of known genera, including placoderms (armored fish), cystoids (a relative of the starfish), and graptolites (colonial animals not closely related to any living species).  Most striking were the collapse of reef-forming corals -- reefs didn't begin to form again on any significant scale until the Mesozoic Era, almost two hundred million years later -- and the near-complete wipeout of vertebrates.  The latter left no vertebrate species over a meter long (most of them were under ten centimeters), and again, it was millions of years before any kind of recovery took place.

Fortunately for us, it eventually did, because we're talking about our ancestors, here.

The cause of this catastrophe has been a matter of speculation, but a team led by Brian Fields, astrophysicist at the University of Illinois, may have found a smoking gun.  In a paper this week in Proceedings of the National Academy of Sciences, we find out that the most likely cause for the End-Devonian Extinction is a nearby supernova that caused the collapse of the ozone layer, leading to the Earth's surface being scorched by ultraviolet light.  This triggered a massive die-off of plants -- which had only recently colonized the land -- and worldwide anoxia.  

The result?  A mass extinction that hit just about every taxon known.

The idea that a supernova might have been to blame for the End-Devonian Extinction came from the presence of hundreds of thousands of plant spores in sedimentary rock layers that showed evidence of what appeared to be radiation damage.  This isn't conclusive, of course; the Fields et al. team is up front that this is only a working hypothesis.  What they'll be looking for next is isotopes of elements in those same rock layers that are only produced by bombardment with radiation, such as plutonium-244 and samarium-146.  "When you see green bananas in Illinois, you know they are fresh, and you know they did not grow here," Fields said, in an interview in Science Daily.  "Like bananas, Pu-244 and Sm-146 decay over time.  So if we find these radioisotopes on Earth today, we know they are fresh and not from here -- the green bananas of the isotope world -- and thus the smoking guns of a nearby supernova."

So as much as I'd love to witness a supernova in my lifetime, it'd be nice if it was one well outside of the terrifyingly-named "kill zone" (thought to be about 25 light years or so).  And chances are, there's nothing inside that radius we need to worry about.  If any of the known supernova candidates explode, we'll almost certainly be able to enjoy the fireworks from a safe distance.

***************************

Fan of true crime stories?  This week's Skeptophilia book recommendation is for you.

In The Poisoner's Handbook: Murder and the Birth of Forensic Medicine in Jazz Age New York, by Deborah Blum, you'll find out how forensic science got off the ground -- through the efforts of two scientists, Charles Norris and Alexander Gettler, who took on the corruption-ridden law enforcement offices of Tammany Hall in order to stop people from literally getting away with murder.

In a book that reads more like a crime thriller than it does history, Blum takes us along with Norris and Gettler as they turned crime detection into a true science, resulting in hundreds of people being brought to justice for what would otherwise have been unsolved murders.  In Blum's hands, it's a fast, brilliant read -- if you're a fan of CSI, Forensic Files, and Bones, get a copy of The Poisoner's Handbook; you won't be able to put it down.

[Note: if you purchase this book using the image/link below, part of the proceeds goes to support Skeptophilia!]




Thursday, August 20, 2020

Of rhinos and puppies

You're not alone if you immediately think "Africa" when you hear the word "rhinoceros."  The two best-known species -- the black (Diceros bicornis) and white (Ceratotherium simum) rhinos -- are both native to the southern parts of Africa.  There are three additional extant species in southern Asia, however: the Indian (Rhinoceros unicornis), Javan (Rhinoceros sondaicus), and Sumatran (Dicerorhinus sumatrensis) rhinos.  The latter two are amongst the most endangered mammals in the world, with only about 60 and 245 individuals left, respectively.

Rhinos, though, used to be much more diverse, and much more common.  One of the most remarkable fossils ever discovered is the Blue Lake rhino, a fifteen-million-year-old cast of an extinct rhinoceros species called Diceratherium in what is now eastern Washington state.  The "remarkable" part is that it's fossilized in igneous rock, which isn't supposed to happen -- fossils are supposed to all be in sedimentary rock, right?  But what happened is there was a colossal eruption fifteen million years ago that produced the Columbia River Flood Basalts, releasing an estimated 174,000 cubic kilometers of lava, an amount that's hard to fathom.  Anyhow, this poor rhino was peacefully grazing, minding its own business, and suddenly BAM, it gets hit by a fast-moving, highly liquid lava flow, its body entombed then burned away.  Fast forward to 1935, when a fossil hunter named Haakon Friele discovered a strange cave in a basalt formation, crawled inside with a flashlight, and somehow thought, "Hey, this hole is shaped just like a rhino."  A bit later, a crew of paleontologists from the University of California - Berkeley were called in, and they made a plaster cast of the interior -- and sure enough, it's a cast of a very surprised-looking rhino who was very much in the wrong place at the wrong time.

There were other rhino species more recently, however.  The woolly rhinoceros (Coelodonta antiquitatis) was an ice-age species that lived pretty much everywhere in what is now Asia and Europe, but started declining in population about forty thousand years ago, dwindling until only a remnant population was left in Siberia.  The last ones died fourteen thousand years ago, give or take.

[Image licensed under the Creative Commons ДиБгд, Wooly Rhino15, CC BY-SA 4.0]

The woolly rhino's demise has usually been blamed on overhunting by early humans, but recent research suggests the cause was actually climate change.  In the paper "Pre-Extinction Demographic Stability and Genomic Signatures in the Woolly Rhinoceros," by a team led by Edana Lord of the Swedish Museum of Natural History, we read the following:
Ancient DNA has significantly improved our understanding of the evolution and population history of extinct megafauna.  However, few studies have used complete ancient genomes to examine species responses to climate change prior to extinction.  The woolly rhinoceros (Coelodonta antiquitatis) was a cold-adapted megaherbivore widely distributed across northern Eurasia during the Late Pleistocene and became extinct approximately 14 thousand years before present (ka BP).  While humans and climate change have been proposed as potential causes of extinction, knowledge is limited on how the woolly rhinoceros was impacted by human arrival and climatic fluctuations.  Here, we use one complete nuclear genome and 14 mitogenomes to investigate the demographic history of woolly rhinoceros leading up to its extinction.  Unlike other northern megafauna, the effective population size of woolly rhinoceros likely increased at 29.7 ka BP and subsequently remained stable until close to the species’ extinction.  Analysis of the nuclear genome from a ∼18.5-ka-old specimen did not indicate any increased inbreeding or reduced genetic diversity, suggesting that the population size remained steady for more than 13 ka following the arrival of humans.  The population contraction leading to extinction of the woolly rhinoceros may have thus been sudden and mostly driven by rapid warming in the Bølling-Allerød interstadial.
So at least that's one calamity we're not responsible for.

On the other hand, another recent discovery shows that while we might not have doomed the woolly rhino, our best friends might have had a hand -- um, a paw -- in it.  A friend and long-time loyal reader of Skeptophilia sent me a link to an article about the mummified body of a dog found in Siberia that, when analyzed, turned out to have bits of meat from a woolly rhino in its stomach.  "This puppy, we know already, has been dated to roughly 14,000 years ago," said researcher Love Dalén, also of the Swedish Museum of Natural History.  "We also know that the woolly rhinoceros goes extinct 14,000 years ago.  So, potentially, this puppy has eaten one of the last remaining woolly rhinos."

Dogs: Eating Stuff They Shouldn't Eat For the Past Fourteen Thousand Years.

So that's today's excursion into weird cul-de-sacs of zoology.  And honestly, I'm just as glad the temperate-area rhino species are gone, cool as they undoubtedly were.  We have enough trouble keeping the groundhogs and rabbits out of the vegetable garden; I can't imagine how we'd deal with rhinos tromping around the place.

***************************

Fan of true crime stories?  This week's Skeptophilia book recommendation is for you.

In The Poisoner's Handbook: Murder and the Birth of Forensic Medicine in Jazz Age New York, by Deborah Blum, you'll find out how forensic science got off the ground -- through the efforts of two scientists, Charles Norris and Alexander Gettler, who took on the corruption-ridden law enforcement offices of Tammany Hall in order to stop people from literally getting away with murder.

In a book that reads more like a crime thriller than it does history, Blum takes us along with Norris and Gettler as they turned crime detection into a true science, resulting in hundreds of people being brought to justice for what would otherwise have been unsolved murders.  In Blum's hands, it's a fast, brilliant read -- if you're a fan of CSI, Forensic Files, and Bones, get a copy of The Poisoner's Handbook; you won't be able to put it down.

[Note: if you purchase this book using the image/link below, part of the proceeds goes to support Skeptophilia!]