Skeptophilia (skep-to-fil-i-a) (n.) - the love of logical thought, skepticism, and thinking critically. Being an exploration of the applications of skeptical thinking to the world at large, with periodic excursions into linguistics, music, politics, cryptozoology, and why people keep seeing the face of Jesus on grilled cheese sandwiches.

Monday, August 31, 2020

Fighting visual malware with water

One of the things that baffles me about woo-woos is how they never, ever give up.

When I'm proven wrong, I usually (1) feel extremely embarrassed about it, and (2) retreat in disarray.  Oh, and (3) do everything I can to make sure I don't make the same error the next time.  I mean, everyone makes mistakes, so I probably shouldn't overreact to it the way I do; but I like to think that as a writer on science and skepticism, I'm conscientious enough to check my facts and sources.  Otherwise, I'm really no better than the people I rail against on a daily basis.

So getting caught out hits me where it hurts, you know?

Not so, apparently, in the woo-woo world.  You can be laughed into oblivion, and you just keep moseying on ahead as if nothing were wrong.

As an especially good example of this, remember Dr. Charlene Werner?  She was the star of a viral YouTube video a few years back called "Crazy Homeopathy Lady," the title of which you'd think would be devastating enough.  In this video, she attempts to explain homeopathy thusly:
  • The mass in the universe is "infinitesimal."  Since mass is the "m" in E = mc², she says that because the mass is so small you can cross the "m" out, which means that "energy = light."  (The whole effect is accentuated by the fact that she pronounces the word "infant-esimal," which sounds like a descriptor for a really little baby.)
  • Something about "Stephen Hawkings" and vibrations and quantum.
  • Fascinating discourse on string theory, starting with the fact that strings are "little u-ies."
  • A bizarre analogy wherein she compares homeopathy's effects to a neighbor's dog pooping on your lawn, causing you to throw a bomb at your neighbor's house.
If you've never seen this video, I highly recommend it.  I can say from experience that it's even more fun to watch while drunk, although I won't be held responsible if you laugh so hard you fall out of your chair and spill scotch all over your carpet.

[Image is in the Public Domain]

So anyhow.  This video has received millions of views, and tens of thousands of comments, most of which were of the "Holy shit, this woman is insane" variety.  So you'd think that any normal human being who got this kind of feedback would sort of vanish from the public eye.  Most of us, in fact, would probably want to crawl under a rock.

Not so Charlene Werner.  She's baaaaaack, on a website called "Simply Healthy Self," wherein she makes statements that very nearly exceed the wackiness of the ones she made in the video.  Here's a sampler:
Imagine your vision system has qualities similar to a computer.  The photoreceptors are like your keys on your keyboard.  There are approximately 1.2 million of them in each eye.  When clicked or activated with light, the data from your 'visual keyboard' relays to your brain.  Your brain has characteristics similar to a hard drive with an operating system that runs all the 'software programs' or functions in your body, such as moving your eye muscles, tracking, focusing, and visual memory.  Even your heart, kidney, lungs, and all your bodily functions depend on accurate key strokes from your photoreceptors and other sensory input, access to your brain (hard drive), a powerful operating system, and efficient use of software programs.
Yup.  Your kidneys depend on information from your eyes.  Which explains why blind people never have to pee.
Homeopathy then scans your system to eliminate 'viruses' or 'malware', which are often belief systems or programmed patterns that interrupt your system's smooth functioning.
So a bottle of water with no active ingredients is the medical equivalent of Norton AntiVirus?  If only we'd realized sooner that these "remedies" can fix faulty belief systems, we might have avoided having our government turned into Corruption "R" Us by a man whose chief claim to fame seems to be embodying all Seven Deadly Sins in one individual.
When we consider the whole of man we can even make a further leap... that mass in the universe by definition is matter, matter is substance, the substance of man is cells, and cells can be broken down into compounds, compounds into elements, and elements into tiny particles of energy called electrons, protons, neutrons, and sub-atomic particles held together by an "invisible" force such that what may look like a physical body is merely energy.
An explanation which is to physics what "The foot-bone's connected to the shin-bone, the shin-bone's connected to the knee-bone" is to medical science.

Then we get bunches of testimonials about how Dr. Werner's treatments have cured everything from rheumatoid arthritis to bad eyesight to being lousy at sports.

Which is pretty impressive, because homeopathy has failed to show measurable results in every controlled study ever done.  Ever.  Clear enough?  What she's proposing is unscientific horse waste, and her "success stories" are the result of the placebo effect at best.

None of which, of course, is going to change a thing.  If the reception her bizarre YouTube video received didn't make her reconsider her position, nothing will.  Unfortunately, there are still people who buy what she's selling (literally and figuratively), although it's to be hoped that the support for such completely disproven modalities as homeopathy is waning.

The chance of convincing Dr. Werner, however, is "infant-esimal."

**********************************

This week's Skeptophilia book recommendation should be in everyone's personal library.  It's the parting gift we received from the brilliant astrophysicist Stephen Hawking, who died two years ago after beating the odds against ALS's death sentence for over fifty years.

In Brief Answers to the Big Questions, Hawking looks at our future -- our chances at stopping anthropogenic climate change, preventing nuclear war, curbing overpopulation -- as well as addressing a number of the "big questions" he references in the title.  Does God exist?  Should we colonize space?  What would happen if the aliens came here?  Is it a good idea to develop artificial intelligence?

And finally, what is humanity's chance of surviving?

In a fascinating, engaging, and ultimately optimistic book, Hawking gives us his answers to the questions that occupy the minds of every intelligent human.  Published posthumously -- Hawking died in March of 2018, and Brief Answers hit the bookshelves in October of that year -- it's a final missive from one of the finest brains our species ever produced.  Anyone with more than a passing interest in science or philosophy should put this book on the to-read list.

[Note: if you purchase this book using the image/link below, part of the proceeds goes to support Skeptophilia!]



Saturday, August 29, 2020

Goodness gracious...

Are you feeling like your love life is a little cooler than you'd like?  Are you lacking in the ardor department?  Does it seem like you just don't have the romantic sizzle you once had?

If so, I have the solution.

All you have to be willing to do is to have someone set your crotch on fire.

I'm not making this up, and I wish I were, because after researching this I now feel like I need to spend the rest of the day in a protective crouch.  According to a link sent to me by a loyal reader of Skeptophilia, we find out that in China there has been a surge in the popularity of treating waning sex drive by placing alcohol-soaked towels over guys' privates, and then setting the towels on fire.

If the description wasn't enough, we have photographs:


I don't know about you, but I can't imagine that my reaction to having flames spouting from my reproductive region would be just to lie there, hands behind my head, with a blissful expression on my face.  Now that I come to think of it, I can imagine no circumstance in which I'd allow anyone to come near my reproductive region with flames in the first place.  But apparently, there are guys in China who love this.  The article quotes a 33-year-old banker, Ken Cho, who says, "It is all about keeping blood flow moving rapidly.  The warmth from the burning towels speeds the blood through the body and it makes me perform 50% better in bed.  I have tried all sorts of therapies in the past to keep my sexual performance up to speed but this is by far the best."

Which raises several questions.  With guys, the issue isn't with getting the blood to flow rapidly, it's more with getting the blood to stay put.  If you get my drift.  And the whole "50% better" statistic just makes me think he's making shit up.  50% better for whom?  Did he query his girlfriend one night, asking her to rate his performance, and then get the Great Balls Afire Treatment, and they did the deed again, and she said afterwards, "Yes, dear, that was at least 50% better than last time"?

Somehow I don't think this is the kind of thing that lends itself to a controlled study.

What I really wonder, though, is how anyone thought of this to begin with.  Because, after all, some poor schmuck had to be the first to try it.  Can't you picture it?  Dude goes to his doctor, and says, "Doc, I've been experiencing low sex drive lately," and the doctor says, "Oh, we can treat that.  All we have to do is set your penis on fire."

I don't know about you, but I would run, not walk, out of the office.  Even if many of us would fancy being a Hunka Hunka Burnin' Love, this is not the way to do it.

So what we have here is a combination of the placebo effect, self-delusion, wishful thinking, and high tolerance of risk.  If there was any doubt.

Anyhow, that's our contribution from the Extremely Alternative Medicine department for today.  Which brings up yet again my contention that every time I think I've found the most completely idiotic idea humanity is capable of, someone breaks the previous record.

*********************************

This week's Skeptophilia book recommendation is a brilliant retrospective of how we've come to our understanding of one of the fastest-moving scientific fields: genetics.

In Siddhartha Mukherjee's wonderful book The Gene: An Intimate History, we're taken from the first bit of research that suggested how inheritance took place: Gregor Mendel's famous study of pea plants that established a "unit of heredity" (he called them "factors" rather than "genes" or "alleles," but he got the basic idea spot on).  From there, he looks at how our understanding of heredity was refined -- how DNA was identified as the chemical that housed genetic information, to how that information is encoded and translated, to cutting-edge research in gene modification techniques like CRISPR-Cas9.  Along each step, he paints a very human picture of researchers striving to understand, many of them with inadequate tools and resources, finally leading up to today's fine-grained picture of how heredity works.

It's wonderful reading for anyone interested in genetics and the history of science.

[Note: if you purchase this book using the image/link below, part of the proceeds goes to support Skeptophilia!]




Friday, August 28, 2020

The body exchange

When we think about our own bodies, we tend to externalize them.  It's subtle, but ponder it for a moment; when I say "this is my hand," where is the "me" who is the hand's owner?  We usually put our "selves" in our heads (or hearts), so the rest of the pieces belong to whoever that "self" actually is.

In support of this, consider the unpleasant possibility of losing a limb, a sense, the ability to walk.  Something huge and devastating.  Even with such a major change, most of us feel that our "self" would remain intact.  Switch brains, though (if such a thing were possible), and you wouldn't be you any more -- there's something about that sense of self that resides there, in what my neurophysiology professor called "the meat machine."

René Descartes's illustration of mind-body dualism

Predictably, the reality may be more complex than that.  In a fascinating experiment run at the Karolinska Institutet of Sweden, researchers used virtual reality headsets to give two friends lying near each other the sense that they'd switched bodies.  In "Perception of Our Own Body Influences Self-Concept and Self-Incoherence Impairs Episodic Memory," by Pawel Tacikowski, Marieke Weijs, and Henrik Ehrsson, which came out in iScience this week, we find out that the sense of who we are is much more intimately connected to our bodies than we might realize.

The researchers did personality assessments prior to the swap.  Each participant rated both him/herself and the friend on a number of characteristics.  While wearing the headsets, they were asked to re-rate both themselves and their friends -- and across the board, while they were in the body swap, they rated themselves as closer to how they had previously rated their friend!

Another interesting feature was that both before and after the swap, participants were given memory tests.  They were also asked how convincing the illusion was -- how real it seemed that they were inhabiting their friend's body while the headset was on.  Last, how comfortable were they with the illusion?  Did they find it intriguing, exciting, scary, disorienting?  Curiously, the people who were the most comfortable and curious about being "inside a friend's body" did significantly better on the memory tests, leading to the conjecture that a skew between your bodily awareness and your sense of self can interfere with cognitive activity.

"We show that the self-concept has the potential to change really quickly, which brings us to some potentially interesting practical implications," said study lead author Pawel Tacikowski, in an interview with Neuroscience News.  "People who suffer from depression often have very rigid and negative beliefs about themselves that can be devastating to their everyday functioning.  If you change this illusion slightly, it could potentially make those beliefs less rigid and less negative."

The authors write:
[Our findings extend] previous knowledge in several important ways.  First, it challenges a common assumption that self-concept is relatively fixed over time and emphasizes the role of the body in the continuous construction of our sense of who we are; this role has been largely neglected in past social psychology research.  Second, this result shows that perceptual aspects of the bodily self dynamically shape multiple, abstract beliefs that constitute our conscious self-concept rather than only selected aspects of self-representation that are perceptual, body-related, or implicit.  Third, this finding clarifies that the illusory ownership of another person's body not only modifies attitudes toward this person or toward a social group to which this person belongs but also, and perhaps predominantly, modifies beliefs about the self.
What this immediately made me think of is people with body dysmorphia -- often at the root of not only disorders like anorexia, in which a person who is thin to the point of emaciation looks in a mirror and sees him/herself as overweight, but in trans individuals, who often describe the feeling as "not being in the right body."  It's no wonder both conditions are devastating, and linked to depression and suicidal ideation.  What the Tacikowski et al. study showed is that our sense of self is deeply connected to our own bodies -- and a disconnect between the self and the body has profound cognitive and emotional effects.

Naturally, the next step is to find out what's actually happening in the brain during the illusion.  "Now, my mind is occupied with the question of how this behavioral effect works — what the brain mechanism is," Tacikowski said.  "Then, we can use this model for more specific clinical applications to possibly develop better treatments."  I'm also curious to find out how long-lasting the effects were.  Did this trigger a long-term change in how the person sees his/her friend?  Or did the change evaporate as soon as the headset was turned off and the participant was back in his/her own body?

No question, though, that it's a fascinating result, and worthy of a lot more inquiry.  It gives some new insight into the age-old "mind-body problem" that has plagued philosophers since the time of Plato.  Perhaps the mind and the body aren't as independent of each other as they seem -- and our sense of self is much more tied to our physical flesh-and-blood presence than was apparent.





Thursday, August 27, 2020

Rewarding the daredevil

There were three magic words that used to be able to induce me to do almost anything, regardless of how catastrophically stupid it was: "I dare you."

It's how I ended up walking the ridgeline of a friend's house when I was in eighth grade:
Friend: My house has such a steep roof.  I don't know how anyone could keep his balance up there.
Me:  I bet I could. 
Friend (dubiously):  You think? 
Me:  Yeah. 
Friend:  I dare you. 
Me:  Get me a ladder.
That I didn't break my neck was as much due to luck as skill, although it must be said that back then I did have a hell of a sense of balance, even if I didn't have much of any other kind of sense.

[Image licensed under the Creative Commons Øyvind Holmstad, A yellow house with a sheltering roof, CC BY-SA 3.0]

Research by neuroscientists Lei Zhang (University of Vienna) and Jan Gläscher (University Medical Center Hamburg-Eppendorf) has given us some insight into why I was prone to doing that sort of thing (beyond my parents' explanation, which boiled down to "you sure are an idiot").  Apparently the whole thing has to do with something called "reward prediction error" -- and they've identified the part of the brain where it occurs.

Reward prediction error occurs when there is a mismatch between the expected reward and the actual reward.  If expected reward occurs, prediction error is low, and you get some reinforcement via neurochemical release in the putamen and right temporoparietal junction, which form an important part of the brain's reward circuit.  A prediction error can go two ways: (1) the reward can be lower than the expectation, in which case you learn by changing your expectations; or (2) the reward can be higher than the expectation, in which case you get treated to a flood of endorphins.
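The two-way update described above is essentially the classic "delta rule" from reinforcement learning.  Here's a minimal sketch of it in Python -- my own illustration, not the model used in the study:

```python
# Minimal sketch of reward prediction error (the classic delta rule).
# This is illustrative only, not the computational model from the paper.

def update_value(expected, reward, learning_rate=0.1):
    """Return the prediction error and the revised expectation."""
    prediction_error = reward - expected               # delta = r - V
    new_expected = expected + learning_rate * prediction_error
    return prediction_error, new_expected

# Expecting to tumble off the roof (low expected reward) but succeeding
# (high actual reward) produces a big positive prediction error -- the
# neurochemical jackpot -- and nudges the expectation upward for next time:
delta, v = update_value(expected=0.1, reward=1.0)
print(round(delta, 2), round(v, 2))  # 0.9 0.19
```

A negative prediction error works the same way in reverse: the outcome falls short, and the expectation is revised downward.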

Which explains my stupid roof-climbing behavior, and loads of other activities that begin with the words "hold my beer."  I wasn't nearly as fearless as I was acting; I fully expected to lose my balance and go tumbling down the roof.  When that didn't happen, and I came ambling back down the ladder afterward to the awed appreciation of my friend, I got a neurochemical bonus that nearly guaranteed that next time I heard "I dare you," I'd do the same thing again.

The structure of the researchers' experiment was interesting.  Here's how it was described in a press release in EurekAlert:
[The] researchers... placed groups of five volunteers in the same computer-based decision-making experiment, where each of them was presented with two abstract symbols.  Their objective was to find out which symbol would lead to more monetary rewards in the long run.  In each round of the experiment, every person first made a choice between the two symbols, and then they observed which symbols the other four people had selected; next, every person could decide to stick with their initial choice or switch to the alternative symbol.  Finally, a monetary outcome, either a win or a loss, was delivered to everyone according to their second decision...  In fact, which symbol was related to more reward was always changing.  At the beginning of the experiment, one of the two symbols returned monetary rewards 70% of the time, and after a few rounds, it provided rewards only 30% of the time.  These changes took place multiple times throughout the experiment...  Expectedly, the volunteers switched more often when they were confronted with opposing choices from the others, but interestingly, the second choice (after considering social information) reflected the reward structure better than the first choice.
So social learning -- making your decisions according to your friends' behaviors and expectations -- is actually not a bad strategy.  "Direct learning is efficient in stable situations," said study co-author Jan Gläscher, "and when situations are changing and uncertain, social learning may play an important role together with direct learning to adapt to novel situations, such as deciding on the lunch menu at a new company."
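The reward structure the volunteers faced -- payoff probabilities that flip between 70% and 30% partway through -- can be sketched in a few lines of Python.  This is my own toy version of the setup, not the researchers' code, and the round counts and flip schedule are made up for illustration:

```python
import random

# Toy version of the experiment's "reversal learning" structure: symbol A
# pays off 70% of the time until the contingencies flip, after which it
# pays off only 30% of the time.  Illustrative only.

def run_rounds(n_rounds=60, flip_every=20, seed=42):
    """Simulate symbol A's payoffs across reversals; return its win rate."""
    random.seed(seed)
    p_a = 0.7                      # symbol A's current reward probability
    wins_a = 0
    for t in range(n_rounds):
        if t > 0 and t % flip_every == 0:
            p_a = 1.0 - p_a        # reversal: 0.7 <-> 0.3
        if random.random() < p_a:
            wins_a += 1
    return wins_a / n_rounds

print(run_rounds())
```

The point of the reversals is that no fixed choice stays good for long -- which is exactly the kind of unstable environment where, per Gläscher, social information becomes most useful.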

Or deciding whether or not it's worth it to climb the roof of a friend's house.

We're social primates, so it's no surprise we rely a great deal on the members of our tribe for information about what we should and should not do.  This works well when we're looking to older and wiser individuals, and not so well when the other members of our tribe are just as dumb as we are.  (This latter bit explains a lot of the behavior we're currently seeing in the United States Senate.)  But our brains are built that way, for better or for worse.

Although for what it's worth, I no longer do ridiculous stunts when someone says "I dare you."  So if you were planning on trying it, don't get your hopes up.





Wednesday, August 26, 2020

The rising waters

2020 has become some kind of nihilist joke.  How many versions of the "2020 Apocalypse Bingo Card" have you heard?  And it does seem like things are just piling on.  Right now, we have a major category-3 hurricane bearing down on southern Louisiana -- that just got hit by a totally different tropical storm three days ago.  We're in the middle of a pandemic that is showing no signs of letting up.  There are record-setting wildfires in California.  The economy is giving serious signs of tanking in a big way.  Protests against police brutality seem to erupt every other day.  Last, we're in the middle of the Republican National Convention, where the platform seems to be, "Look how fucked up everything has gotten in the last four years!  Give us another four and we'll do the same thing again but even bigger this time!"

In a situation like this, I'm always reluctant to add to the doom and gloom.  But I would be remiss in not pointing out that all of the above is small potatoes, really.  A lot of us, in fact, are concerned at how the current chaos has distracted us from a far, far bigger problem.  We are facing an unprecedented climate catastrophe, not in a hundred years, not in fifty years, but right now.  And three papers in the past two weeks have added to what was already a clarion call to action.

Let's start with the deep oceans.  The abyssal region of the Earth's oceans is supposed to be one of the most stable ecosystems on Earth.  Saline, completely pitch dark, crushing pressures, and always at just shy of four degrees Celsius -- the temperature at which water is its densest.  No change, no matter what's happening up above.

But last week a paper in Nature Climate Change looked into the deeps of the ocean, and found something terrifying.  The anthropogenic climate change signature is showing up in a place that is supposed to be about as insulated from human effects as you could imagine.

A team led by oceanographer Yona Silvy of the Université Sorbonne wrote the following:
[U]sing 11 climate models, we define when anthropogenic temperature and salinity changes are expected to emerge from natural variability in the ocean interior along density surfaces.  The models predict that in 2020, 20–55% of the Atlantic, Pacific and Indian basins have an emergent anthropogenic signal; reaching 40–65% in 2050 and 55–80% in 2080.  The well-ventilated Southern Ocean water masses emerge very rapidly, as early as the 1980–1990s, while the Northern Hemisphere water masses emerge in the 2010–2030s.  Our results highlight the importance of maintaining and augmenting an ocean observing system capable of detecting and monitoring persistent anthropogenic changes.
Perhaps this should have been unsurprising, considering that 93% of the anthropogenic heating the Earth is experiencing is being absorbed by bodies of water.  But the idea that this absorption isn't limited to the surface -- that we're actually impacting the deepest parts of the world's oceans -- is seriously scary to anyone who knows anything about the environment and climate models.

Scientists have long been concerned about the tipping point -- the point at which climatic catastrophe becomes inevitable no matter what we do.  A second study, out of Ohio State University, has shown conclusively that we've passed that point with respect to one of the Earth's systems: the melting of the Greenland Ice Sheet.

What the researchers found was that up until about the year 2000, the glaciers in Greenland were pretty well in balance.  The amount of ice loss during the summer was nearly equal to the amount of ice gain from snowfall during the winter.  But around 2000, the situation changed, and since then Greenland has lost a staggering 50 gigatons (50 billion tons) more ice than it gained.

"Glacier retreat has knocked the dynamics of the whole ice sheet into a constant state of loss," said Ian Howat, who co-authored the paper.  "Even if the climate were to stay the same or even get a little colder, the ice sheet would still be losing mass."

[Image licensed under the Creative Commons Christine Zenino from Chicago, US, Greenland Glaciers outside of Ammassalik (5562580093), CC BY 2.0]

The polar bears aren't the only ones who should be concerned.  Greenland is second only to Antarctica in its potential effect on sea level rise.  If the Greenland Ice Sheet melts -- which it has sometimes done during warm periods in the Earth's climatic past -- it would raise sea levels by six meters.  Everywhere below six meters of elevation would be under water.

So wave goodbye to New Orleans, Antwerp, Charleston, Boston, a good chunk of New York City and Long Island, and most of Florida, Delaware, the Netherlands, and Bangladesh.

If that's not bad enough, a paper in The Cryosphere last week, authored by a team from three universities -- Leeds, Edinburgh, and University College London -- considered the situation worldwide, and found that in the past twenty-three years, the Earth lost 28 trillion tons of ice.

"To put that in context, every centimeter of sea-level rise means about a million people will be displaced from their low-lying homelands," said Andy Shepherd, director of Leeds University's Centre for Polar Observation and Modelling, in an interview with The Guardian.  "In the past researchers have studied individual areas – such as the Antarctic or Greenland – where ice is melting.  But this is the first time anyone has looked at all the ice that is disappearing from the entire planet...  What we have found has stunned us.  There can be little doubt that the vast majority of Earth's ice loss is a direct consequence of climate warming."
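For a rough sense of scale, here's my own back-of-envelope arithmetic (not from the paper), using the standard oceanographic conversion that roughly 361.8 gigatons of added meltwater raises global sea level by about one millimeter:

```python
# Back-of-envelope sea-level arithmetic -- my illustration, not the paper's.
# Standard conversion: ~361.8 Gt of added meltwater ~= 1 mm of global
# sea-level rise (based on the ocean's ~3.618e8 km^2 surface area).
GT_PER_MM = 361.8

total_ice_gt = 28_000            # 28 trillion tons = 28,000 gigatons
rise_mm = total_ice_gt / GT_PER_MM
print(round(rise_mm, 1), "mm")   # ~77.4 mm, IF it all reached the ocean

# Caveat: a large share of that 28 trillion tons was floating sea ice and
# shelf ice, which doesn't raise sea level when it melts, so the actual
# contribution to date is substantially smaller than this upper bound.
```

Even so, at Shepherd's "a centimeter displaces about a million people," the arithmetic makes clear why the land-based portion of that loss is the number to watch.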

It's easy to focus on what's right in front of your face and forget about the big picture.  This would be okay if the big picture wasn't so deeply horrifying.  I hate to be another purveyor of pessimism, but we have got to start taking this seriously.  I'm as upset about the pandemic and the global political chaos as the next guy, but this isn't a time to be distracted away from a much bigger issue -- the long-term habitability of the planet.

Let's keep our eyes on the ball, here.





Tuesday, August 25, 2020

The left-handed universe

I first ran into the concept of chirality when I was a fifteen-year-old Trekkie science fiction nerd.

I grew up watching the original Star Trek, which impressed the hell out of me as a kid even though rewatching some of the episodes now generates painful full-body cringes at the blatant sexism and near-jingoistic chauvinism.  Be that as it may, after going through the entire series I don't even know how many times, I started reading some of the fan fiction.

The fan fiction, of course, was more uneven than the show had been.  Some of it was pretty good, some downright terrible.  One that had elements of both, putting it somewhere in the "fair to middling" category, was Spock Must Die by James Blish.  Blish had gotten into the Star Trek universe writing short-story adaptations of most of the original series episodes, but this one was entirely new.

Well, mostly.  It springboarded off an original series episode, "Errand of Mercy," in which the Federation and the Klingons are fighting over the planet Organia, which is populated by a peaceful, pastoral society.  Kirk et al. are trying to stop the Klingons from massacring the Organians, but much to Kirk's dismay, the Organians refuse Federation protection, insisting they don't need any help.  And it turns out they don't -- in the end, you find out that the Organians are super-powerful aliens who only assumed human-ish form to communicate with the two humanoid invading forces, and are so far beyond both of them that they indeed had nothing to fear.

In Spock Must Die, the crew of the Enterprise is sent to investigate why Organia has suddenly gone radio-silent.  It turns out that the Klingons have surrounded the entire planet with a force field.  Spock volunteers to try to transport through it, which fails -- but after the attempt, suddenly there are two Spocks in the transporter room, each claiming to be the real, original Vulcan.

[spoiler alert, if anyone is actually going to go back and read it...]  What happened is that the transporter beam was reflected off the surface of the force field, and it duplicated Spock -- there was the original (who never left the transporter pad) and the duplicate (the reflection, recreated in place).  Since both the original and the duplicate were identical down to the last neuron, each of them had the same memories, and each was convinced he was the real Spock.

The key turned out to be the fact that the duplicate had been reflected all the way down to the molecular level.

Why this matters is that a number of molecules in our bodies -- amino acids and sugars being two common examples -- are chiral, meaning they have a "handedness."  Just like a glove, they exist in two possible forms, a "right-handed" and a "left-handed" one, which are mirror images of each other.  And for reasons unknown, the amino acids in our proteins are all left-handed (glycine, which is achiral, being the lone exception).  No known organism builds its proteins from right-handed amino acids.  Further, if you synthesized right-handed amino acids -- which can be done in the laboratory -- and fed them to a terrestrial organism, the organism would starve.
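If you want to see how a reflection flips handedness, here's a little Python sketch of my own (not from the post or the novel): for a tetrahedral center, take the vectors from the central atom to three of its neighbors; the sign of their scalar triple product distinguishes the two mirror-image forms, and reflecting all the vectors flips that sign.  The coordinates below are made-up toy values, not real molecular geometry.

```python
# Chirality as the sign of a scalar triple product: reflecting a chiral
# arrangement through a mirror plane flips the sign, i.e., the handedness.

def triple_product(a, b, c):
    """Scalar triple product a . (b x c) of three 3-vectors."""
    bx_c = (b[1] * c[2] - b[2] * c[1],
            b[2] * c[0] - b[0] * c[2],
            b[0] * c[1] - b[1] * c[0])
    return a[0] * bx_c[0] + a[1] * bx_c[1] + a[2] * bx_c[2]

def handedness(a, b, c):
    """Label the two enantiomers by the sign of the triple product."""
    return "left" if triple_product(a, b, c) < 0 else "right"

def reflect(v):
    """Mirror-reflect a vector across the yz-plane (x -> -x)."""
    return (-v[0], v[1], v[2])

# Toy bond vectors around a chiral center (tetrahedral-ish geometry)
a, b, c = (1.0, 1.0, 1.0), (1.0, -1.0, -1.0), (-1.0, 1.0, -1.0)

# The molecule and its mirror image come out with opposite handedness
print(handedness(a, b, c))
print(handedness(reflect(a), reflect(b), reflect(c)))
```

That sign flip is exactly what happened to the duplicate Spock: reflect every molecule, and every chiral center switches hands.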

But the reflected Spock, of course, is exactly the opposite.  Kirk eventually figures out what's happened because one of the Spocks barricades himself in one of the science laboratories, claiming the other Spock wants to kill him.  The truth was he had to have access to a lab in order to synthesize the right-handed amino acids without which he'd die.

Clever concept for a story, right there.

[Image licensed under the Creative Commons Petritap, Finnish mittens, CC BY-SA 3.0]

Chirality is quite a mystery.  Like I said, the left-handedness of amino acids is shared by all known terrestrial organisms, so the bias must have arisen very early in the history of life.

Why it happened is another matter entirely.  A persistent question in scientific inquiries into the origin of life on Earth (and the possibility of life elsewhere) is how much of our own biochemistry and metabolism is constrained.  We code our genetic information as DNA; could it be done a different way elsewhere?  Our primary energy driver is ATP.  Are there other ways organisms might store and access chemical energy?  The question of constraint goes all the way up the scale to macroscopic features, such as cephalization -- the clustering of the sensory processing organs near the anterior end of the animal.  Makes sense; you want your sensors facing (1) the direction you're traveling, and (2) what you're eating.  But are there other equally sensible ways to put an animal together?

Some things we take for granted almost certainly aren't constrained, like bilateral symmetry.  So many animals are bilaterally symmetrical that the ones that aren't (like adult flounders) stand out as bizarre.  Aficionados of H. P. Lovecraft might remember that amongst the innovative ideas he used was that the aliens in "At the Mountains of Madness" weren't bilateral, but had five-way symmetry -- something completely unknown on Earth.  (You may be thinking, "wait... starfish?"  Starfish have what I'd call pseudo-pentaradial symmetry.  As larvae, they're clearly bilateral, and they lose a lot of bilateral features when they mature.  But some characteristics -- like the position of the sieve plate, their water-intake device -- give away that deep down, they are still basically bilateral.)

Anyhow, all this comes up because of a recent discovery by astrobiologists at NASA's Goddard Space Flight Center.  In a press release, we hear about a meteorite discovered in Antarctica called Asuka 12236, which is a carbonaceous chondrite -- a peculiar type of meteorite that is rich in organic compounds.  Asuka 12236 contained large quantities of amino acids, which isn't as bizarre as it sounds; amino acids have been shown to form relatively easily if there are raw materials and a source of energy.

What stands out is that all of the amino acids in Asuka 12236 are left-handed -- just like the ones on Earth.

The scientists studying the meteorite are up front that the first thing to do is rule out the possibility that the amino acids are contaminants absorbed after the rock crash-landed.  Most of the experts, however, think this is unlikely, and that we're looking at a genuine sample of extraterrestrial amino acids.  And the fact that they all show left-handed chirality is pretty remarkable -- suggesting that the chirality of our biochemicals might, in fact, be constrained, and that we could well find biochemistry similar to our own on other planets.

In that way, at least.

So that's one less thing to worry about if we ever go to an alien world.  Unlike the right-handed reflected Mr. Spock, we'd be able to metabolize alien amino acids just fine.

Of course, how familiar-looking everything else would be is still open to question.


Monday, August 24, 2020

How to prove you exist

Let me say right up front that I don't mean any of what I'm saying here as criticism of the researchers themselves.

But there are times that it is damn frustrating that the research has to be done in the first place.

This comes up because of a paper that was published in Proceedings of the National Academy of Sciences a couple of weeks ago, by a team led by Jeremy Jabbour of the Department of Psychology at Northwestern University.  In "Robust Evidence for Bisexual Orientation Among Men," we read:
The question whether some men have a bisexual orientation—that is, whether they are substantially sexually aroused and attracted to both sexes—has remained controversial among both scientists and laypersons.  Skeptics believe that male sexual orientation can only be homosexual or heterosexual, and that bisexual identification reflects nonsexual concerns, such as a desire to deemphasize homosexuality.  Although most bisexual-identified men report that they are attracted to both men and women, self-report data cannot refute these claims.  Patterns of physiological (genital) arousal to male and female erotic stimuli can provide compelling evidence for male sexual orientation.  (In contrast, most women provide similar physiological responses to male and female stimuli.)  We investigated whether men who self-report bisexual feelings tend to produce bisexual arousal patterns.  Prior studies of this issue have been small, used potentially invalid statistical tests, and produced inconsistent findings.  We combined nearly all previously published data (from eight previous studies in the United States, United Kingdom, and Canada), yielding a sample of 474 to 588 men (depending on analysis).  All participants were cisgender males.  Highly robust results showed that bisexual-identified men’s genital and subjective arousal patterns were more bisexual than were those who identified as exclusively heterosexual or homosexual.  These findings support the view that male sexual orientation contains a range, from heterosexuality, to bisexuality, to homosexuality.
So basically what they did was to show naked pics of both men and women to self-identified bisexual guys, and check to see if they got hard-ons from both.

Let me be clear: I'm glad this research was done, because there is doubt out there.  I've heard that doubt go two ways -- that bisexuals are straight people looking for attention or a kinky thrill, or that bisexuals are gay people who are afraid to admit it.  I remember clearly being told by a student -- long before I was out of the closet -- that she could understand there being homosexuals and heterosexuals, but she couldn't see how there could be bisexuals.  "How can they be attracted to both at the same time?" she asked me.  "Why don't they just make up their minds?"

I fell back on the research -- that bisexuality and the spectrum-nature of sexual orientation was well-established -- but even after seeing the data, she wasn't convinced.  "I just don't believe it," she said.

Not only was I appalled by this because, in essence, she was talking about me -- telling me that my own identity was an impossibility -- but also because, even when presented with evidence, she went with her "feelings" on the topic rather than (1) the conclusions of the scientists, and worse, (2) people's assessment of their own orientation.

Because that's the thing, isn't it?  How does anyone have the fucking temerity to say, "No, that's not who you are.  I know better.  Here's who you actually are."?  People in the trans community know this all too well; how often are they told that someone else knows their gender better than they do?

And here, we're told we have to prove we even exist.

How about just believing us?

[Image licensed under the Creative Commons Peter Salanki from San Francisco, USA, The bisexual pride flag (3673713584), CC BY 2.0]

I've known I was bisexual since I was fifteen years old.  There was never any doubt about my attraction to both men and women.  Hell, I knew it before I'd ever even heard the word "bisexuality."  The fact that now, over forty years later, there has to be a study published in a major scientific journal to convince people that I actually know who I am -- that I'm not delusional or lying -- is nothing short of infuriating.

So thanks to Jabbour et al. for publishing peer-reviewed research that I hope and pray will put this question to rest once and for all.  I know it won't convince everyone -- my long-ago evidence-proof student being a case in point -- but maybe we'll move toward accepting that gender and sexual orientation are complex and completely non-binary, and better still, toward valuing people's understanding of who they are over society's pronouncements of who they should be.

And as I've said before: I wish I'd been strong enough and fearless enough to claim my own identity when I first realized it as a teenager.  I have often wondered what trajectory my life would have taken if I'd spent all those years free of the humiliation and fear I was raised with, and proud of who I was instead of ashamed of it.  You can't change past mistakes, more's the pity, but at least I can state who I am now and hope that my voice will add more volume to the call that each of us should be free to celebrate who we are without having to prove anything to anyone.


Saturday, August 22, 2020

A prehistoric hoax

One of the hazards of becoming more aware of how biased and (sometimes) duplicitous popular media can be is that you might swing to the opposite extreme and stop believing anything you read and hear.

It's called being a "cynic," and it's just as lazy as being gullible.  However, because the credulous are often derided as silly or ignorant, cynics sometimes feel that they must therefore be highly intelligent, and that disbelieving everything means that you're too smart to be "taken in."

In reality, cynicism is an excuse, a justification for having stopped thinking.  "The media always lies" isn't any closer to the truth than "everything you eat causes cancer" or "all of the science we're being told now could be wrong."  It gives you an automatic reason not to read (or not to watch your diet, or not to learn science), and in the end, it's simply a statement of willful ignorance.

Take, for example, the site Clues Forum, which has as its tagline, "Exposing Media Fakery."  In particular, consider the thread that was started several years ago, but which continues to circulate, lo up unto this very day... entitled "The (Non-religious) Dinosaur Hoax Question."


Muttaburrasaurus skeleton (Queensland Museum)  [Image is in the Public Domain]

And yes, it means what you think it means.  And yes, the "Question" should simply be answered "No."  But let's look a little more deeply at what they're saying... because I think it reveals something rather insidious.

Take a look at how it starts:
Dinosaurs have, in recent years, become a media subject rivaling the space program in popularity and eliciting similar levels of public adoration towards its researchers and scientists.  The science of dinosaurs and other prehistoric life is also directly linked to other controversial scientific topics such as evolution, fuel production, climate and even the space program (i.e., what allegedly killed them).
So right from the outset, we've jumped straight into the Motive Fallacy -- the idea that a particular individual's motive for saying something has any bearing on that statement's truth value.  Those scientists, the author says, have a motive for our believing in dinosaurs.  Supporting controversial ideas for their own nefarious reasons.  Getting us worried about the climate and the potential for cataclysmic asteroid strikes.  Therefore: they must be lying.  We're never told, outright, why the scientists would lie about such things, but the seed is planted, right there in the first paragraph.

Then more reasons for doubt are thrown our way, when we're told that (*gasp*) scientists make mistakes.  A dinosaur skeleton found in New Jersey, now on display at the New Jersey State Museum, was reconstructed with a skull modeled on an iguana's, since the actual skull was never found.  The article, though, uses the word "fake" -- as if the museum staff and the scientists were deliberately trying to pull the wool over people's eyes, instead of interpolating the missing pieces, something paleontologists do routinely.  And those wily characters even gave away the game by admitting what they were up to, right beneath a photograph of the skeleton:
Above is the full-size Hadrosaurus mount currently on display at the New Jersey State Museum in Trenton.  The posture is now recognized as incorrect.  At the same time the skeleton is fitted with the wrong skull of another type of duck-bill dinosaur.  Signs at the exhibit acknowledge that both the mounted skeleton as well as nearby illustrated depictions of what the living animal looked like are both wrong.  Both are slated for correction at some unspecified future date.
Because that's what clever conspirators these scientists are.  Covering up the fact that they're giving out erroneous information on dinosaurs by... um... admitting they had some erroneous information about dinosaurs.

But according to Clues Forum, this is yet another hole punched in our confidence, with the revelation that (*horrors*) there are things scientists don't know.  Instead of looking at that as a future line of inquiry, this article gives you the impression that such holes in our knowledge are an indication that everything is suspect.

Last, we're told that it's likely that the paleontologists are creating the fossils themselves, because fossils are just "rock in rock," leaving it a complete guessing game as to where the matrix rock ends and the fossil begins.  So for their own secret, evil reasons, paleontologists spend days and weeks out in the field, living in primitive and inhospitable conditions, grinding rocks into the shape of bones so as to hoodwink us all:
But, in our hoax-filled world of fake science, doesn't this rock-in-rock situation make it rather easy for creative interpretations of what the animal really looked like?  And, once a particular animal is “approved” by the gods of the scientific community, wouldn't all subsequent representations of that same animal have to conform with that standard?
By the time you've read this far, you're so far sunk in the mire of paranoia that you would probably begin to doubt that gravity exists.  Those Evil, Evil Scientists!  They're lying to us about everything!

Of course, what we're seeing here is the phenomenon I started with: replacing lazy gullibility with lazy disbelief.  All the writer would have to do is sign up for a paleontology class, or (better yet) go on a fossil dig, to find out how the science is really done.

But I've found that people like this will seldom take any of those steps.  Once you suspect everyone, there's no one to lean on but yourself -- and (by extension) on your own ignorance.  At that point, you're stuck.  

So I should correct a statement I made earlier.  There is a difference between gullibility and cynicism.

Gullibility is far easier to cure.

***************************

Fan of true crime stories?  This week's Skeptophilia book recommendation is for you.

In The Poisoner's Handbook: Murder and the Birth of Forensic Medicine in Jazz Age New York, by Deborah Blum, you'll find out how forensic science got off the ground -- through the efforts of two scientists, Charles Norris and Alexander Gettler, who took on the corruption-ridden law enforcement offices of Tammany Hall in order to stop people from literally getting away with murder.

In a book that reads more like a crime thriller than history, Blum takes us along with Norris and Gettler as they turn crime detection into a true science, resulting in hundreds of people being brought to justice for what would otherwise have been unsolved murders.  In Blum's hands, it's a fast, brilliant read -- if you're a fan of CSI, Forensic Files, and Bones, get a copy of The Poisoner's Handbook; you won't be able to put it down.

[Note: if you purchase this book using the image/link below, part of the proceeds goes to support Skeptophilia!]




Friday, August 21, 2020

Deadly fireworks

I've always thought it would be amazingly cool to witness a supernova.

Imagine it.  Over the course of a few days, a dim, ordinary-looking star increases in luminosity until it outshines every other astronomical object in the sky except the Sun and Moon.  It's visible during the day, and you can read by its light at night.  It's not a blink-and-you'll-miss-it phenomenon, either; the light from the massive explosion peaks quickly but declines slowly.  Most supernovae remain visible for months before dimming to near-invisibility, the collapsed stellar core ending up as a neutron star or black hole.

There are lots of candidates for what could be the next supernova, although don't get your hopes up; most of these fall into the "some time in the next million years" category.  Yeah, it could happen tomorrow, but I wouldn't put money on it.  Still, the list is sizable, and here are five of the best possibilities:
  • Betelgeuse (720 light years away, in the constellation Orion).  This one got some serious press a few months ago because it suddenly started to decrease in brightness, and astronomers wondered if this was a prelude to an explosion.  What appears to have happened is that a convective upwelling in the star's outer layers blew a cloud of dust off its surface, obscuring the star and making it appear to dim.  So we're still waiting for this red supergiant to explode, and probably will be for a while.
  • IK Pegasi (154 light years away, in the constellation Pegasus).  IK Pegasi isn't well known because at an apparent magnitude of 6 it sits right at the limit of naked-eye visibility, but it bears mention as the nearest serious supernova candidate.  It's a double star -- a main-sequence star and a massive white dwarf orbiting a common center of mass.  As the main-sequence star evolves, it will become a red giant, with a radius large enough that its white dwarf companion will start siphoning matter from its surface.  When the white dwarf reaches what's called the Chandrasekhar Limit -- 1.4 solar masses -- it will explode cataclysmically as a Type Ia supernova.  This will not only be spectacular but potentially dangerous -- a topic we will revisit shortly.
  • VY Canis Majoris (3,820 light years away, in the constellation Canis Major).  Another star not visible to the naked eye, VY Canis Majoris is a lot more spectacular than you'd think to look at it.  It's one of the largest stars known, with a mass around fifteen times that of the Sun, and a radius so large that if you put it where the Sun is, its surface would reach past the orbit of Jupiter (so we'd be inside the star).  This "hypergiant" is one of the most luminous stars in the Milky Way, and is only dim because it's so far away.  This one is certain to go supernova, probably some time in the next 100,000 years, and the remnant will collapse into a black hole.
  • Eta Carinae (7,500 light years away, in the constellation Carina).  Eta Carinae is another enormous star, with a mass on the order of a hundred times that of the Sun, but what makes this one stand out is its bizarre behavior.  In 1837 it suddenly brightened until it was one of the five brightest stars in the night sky, then over the next sixty years faded to the point that it was only visible in binoculars.  Detailed observations have shown that it blew out a huge cloud of material in "The Great Eruption," which is now the Homunculus Nebula.  It's a unique object, which makes it hard to predict its future behavior.  What seems certain is that it'll eventually explode, but there's no telling when that might occur.
The consensus amongst astronomers, however, is that the next likely supernova probably isn't on the list -- that it will be a previously-unknown white dwarf or an unremarkable-looking red giant.  We know so little about supernovae that it's impossible to predict them with any kind of accuracy.  And while this is an exciting prospect, we'd better hope that the next supernova isn't too close.
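If you're curious how bright these candidates would actually look, here's a back-of-the-envelope Python sketch of my own (not from any of the sources above), using the standard distance-modulus relation m = M + 5 log10(d_pc) - 5 and rough typical peak absolute magnitudes (around -17 for a core-collapse supernova, around -19 for a Type Ia):

```python
import math

LY_PER_PARSEC = 3.2616  # light years per parsec

def apparent_magnitude(abs_mag, distance_ly):
    """Apparent magnitude from absolute magnitude via the distance modulus."""
    d_pc = distance_ly / LY_PER_PARSEC
    return abs_mag + 5 * math.log10(d_pc) - 5

# (name, rough peak absolute magnitude, distance from the post in light years)
candidates = [
    ("Betelgeuse (core-collapse)",  -17, 720),
    ("IK Pegasi (Type Ia)",         -19, 154),
    ("Eta Carinae (core-collapse)", -17, 7500),
]

for name, M, d in candidates:
    print(f"{name}: peak apparent magnitude ~ {apparent_magnitude(M, d):.1f}")
# For comparison: Venus at its brightest is about -4.6 and the full Moon
# about -12.7 (lower magnitude = brighter).
```

Under these assumptions, IK Pegasi would briefly outshine the full Moon -- part of why a nearby Type Ia makes astronomers nervous.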

The Homunculus Nebula with Eta Carinae at the center [Image licensed under the Creative Commons ESA/Hubble, Cosmic Fireworks in Ultraviolet Eta Carinae Nebula, CC BY 4.0]

Not only do supernovae produce a lot of light, they generate a tremendous amount of radiation of other kinds, including cosmic rays.  A close supernova could produce enough cosmic rays to wipe out the ozone layer -- leading to a huge influx of ultraviolet light from the Sun, with devastating effects.

Scarily, this may have already happened in Earth's history.  One of the lesser-known mass extinctions occurred at the end of the Devonian Period, 359 million years ago.  Because it is poorly understood, and was dwarfed by the cataclysmic Permian-Triassic Extinction a little over a hundred million years later, it's not one you tend to read about in the paleontology-for-the-layperson books.  Even so, it was pretty significant, wiping out 19% of known families and 50% of known genera, including placoderms (armored fish), cystoids (a relative of the starfish), and graptolites (colonial animals not closely related to any living species).  Most striking were the collapse of reef-forming corals -- reefs didn't begin to form again on any significant scale until the Mesozoic Era, almost two hundred million years later -- and the near-complete wipeout of vertebrates.  The latter left no vertebrate species over a meter long (most of them were under ten centimeters), and again, it was millions of years before any kind of recovery took place.

Fortunately for us, it eventually did, because we're talking about our ancestors, here.

The cause of this catastrophe has been a matter of speculation, but a team led by Brian Fields, an astrophysicist at the University of Illinois, may have found a smoking gun.  In a paper published this week in Proceedings of the National Academy of Sciences, they argue that the most likely cause of the End-Devonian Extinction was a nearby supernova that caused the collapse of the ozone layer, leaving the Earth's surface scorched by ultraviolet light.  This would have triggered a massive die-off of plants -- which had only recently colonized the land -- and worldwide anoxia.

The result?  A mass extinction that hit just about every taxon known.

The idea that a supernova might have been to blame for the End-Devonian Extinction came from the presence of hundreds of thousands of plant spores in sedimentary rock layers that showed evidence of what appeared to be radiation damage.  This isn't conclusive, of course; the Fields et al. team is up front that this is only a working hypothesis.  What they'll be looking for next is isotopes of elements in those same rock layers that are only produced by bombardment with radiation, such as plutonium-244 and samarium-146.  "When you see green bananas in Illinois, you know they are fresh, and you know they did not grow here," Fields said, in an interview in Science Daily.  "Like bananas, Pu-244 and Sm-146 decay over time.  So if we find these radioisotopes on Earth today, we know they are fresh and not from here -- the green bananas of the isotope world -- and thus the smoking guns of a nearby supernova."
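To put rough numbers on the green-banana analogy, here's a quick Python sketch of my own (the half-lives are approximate literature values; the one for Sm-146 in particular has competing published figures) showing why these two isotopes make good tracers:

```python
# Radioactive decay: fraction remaining = 0.5 ** (elapsed time / half-life).
# The point: these half-lives are long enough that a measurable fraction of
# an End-Devonian deposit would survive, but short enough that Earth's
# primordial supply decayed away long ago -- so any found today is "fresh."

def fraction_remaining(elapsed_myr, half_life_myr):
    """Fraction of a radioisotope left after elapsed_myr million years."""
    return 0.5 ** (elapsed_myr / half_life_myr)

END_DEVONIAN_MYR = 359    # age of the extinction layer
EARTH_AGE_MYR = 4540      # age of the Earth

for isotope, half_life in [("Pu-244", 80.6), ("Sm-146", 103.0)]:
    f = fraction_remaining(END_DEVONIAN_MYR, half_life)
    print(f"{isotope}: ~{100 * f:.1f}% of an End-Devonian deposit survives")
    # Earth's age is dozens of half-lives, so effectively none of the
    # original terrestrial supply remains:
    print(f"{isotope}: primordial fraction left "
          f"~ {fraction_remaining(EARTH_AGE_MYR, half_life):.1e}")
```

A few percent of a 359-million-year-old deposit is well within reach of modern mass spectrometry, which is why finding either isotope in those rock layers would be such a strong signal.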

So as much as I'd love to witness a supernova in my lifetime, it'd be nice if it was one well outside of the terrifyingly-named "kill zone" (thought to be about 25 light years or so).  And chances are, there's nothing inside that radius we need to worry about.  If any of the known supernova candidates explode, we'll almost certainly be able to enjoy the fireworks from a safe distance.
