Skeptophilia (skep-to-fil-i-a) (n.) - the love of logical thought, skepticism, and thinking critically. Being an exploration of the applications of skeptical thinking to the world at large, with periodic excursions into linguistics, music, politics, cryptozoology, and why people keep seeing the face of Jesus on grilled cheese sandwiches.

Friday, April 3, 2020

The risk of knowing

One of the hallmarks of the human condition is curiosity.  We spend a lot of our early years learning by exploring, by trial-and-error, so it makes sense that curiosity should be built into our brains.

Still, it comes at a cost.  "Curiosity killed the cat" isn't a cliché for nothing.  The number of deaths in horror movies alone from someone saying, "I hear a noise in that abandoned house, I think I'll go investigate" is staggering.  People will take amazing risks out of nothing but sheer inquisitiveness -- so the gain in knowledge must be worth the cost.

[Image is in the Public Domain]

The funny thing is that we'll pay the cost even when what we gain isn't worth anything.  This was demonstrated by a clever experiment described in a paper by Johnny King Lau and Kou Murayama (of the University of Reading, U.K.), Hiroko Ozono (of Kagoshima University), and Asuka Komiya (of Hiroshima University) that came out two days ago.  Entitled "Shared Striatal Activity in Decisions to Satisfy Curiosity and Hunger at the Risk of Electric Shocks," the paper describes a set of experiments showing that humans will risk a painful shock to find out entirely useless information (in this case, how a card trick was performed).  The cleverest part of the experiments, though, is that the researchers told test subjects ahead of time how much of a chance there was of being shocked -- so they had a chance to decide, "how much is this information worth?"

What they found was that even when told that there was a higher than 50% chance of being shocked, most subjects were still curious enough to take the risk.  The authors write:
Curiosity is often portrayed as a desirable feature of human faculty.  However, curiosity may come at a cost that sometimes puts people in harmful situations.  Here, using a set of behavioural and neuroimaging experiments with stimuli that strongly trigger curiosity (for example, magic tricks), we examine the psychological and neural mechanisms underlying the motivational effect of curiosity.  We consistently demonstrate that across different samples, people are indeed willing to gamble, subjecting themselves to electric shocks to satisfy their curiosity for trivial knowledge that carries no apparent instrumental value.
The researchers added another neat twist -- they used neuroimaging techniques to see what was going on in the curiosity-driven brain, and they found a fascinating overlap with another major driver of human behavior:
[T]his influence of curiosity shares common neural mechanisms with that of hunger for food.  In particular, we show that acceptance (compared to rejection) of curiosity-driven or incentive-driven gambles is accompanied by enhanced activity in the ventral striatum when curiosity or hunger was elicited, which extends into the dorsal striatum when participants made a decision.
So curiosity, then, is -- in nearly a literal sense -- a hunger.  The satisfaction we feel at taking a big bite of our favorite food when we're really hungry causes the same reaction in the brain as having a curiosity satisfied.  And like hunger, we're willing to take significant risks to satisfy our curiosity.  Even if -- to reiterate -- the person in question knows ahead of time that the information they're curious about is technically useless.

I can definitely relate to this.  In me, it mostly takes the form of wasting inordinate amounts of time going down a rabbit hole online because some weird question came my way.  The result is that my brain is completely cluttered up with worthless trivia.  For example, I can tell you the scientific name of the bird you're looking at or why microbursts are common in the American Midwest or the etymology of the word "juggernaut," but I went to the grocery store yesterday to buy three things and came back with only two of them.  (And didn't realize I'd forgotten 1/3 of the grocery order until I walked into the kitchen and started putting away what I'd bought.)

Our curiosity is definitely a double-edged sword.  I'm honestly fine with it, because often, knowing something is all the reward I need.  As physicist Richard Feynman put it, "The chief prize (of science) is the pleasure of finding things out."

So I suspect I'd have been one of the folks taking a high risk of getting shocked to see how the card trick was performed.  Don't forget that the corollary to the quote we started with -- "Curiosity killed the cat" -- is "...but satisfaction brought him back."

*******************************

In the midst of a pandemic, it's easy to fall into one of two errors -- to lose focus on the other problems we're facing, and to decide it's all hopeless and give up.  Both are dangerous mistakes.  We have a great many issues to deal with besides stemming the spread and impact of COVID-19, but humanity will weather this and the other hurdles we have ahead.  This is no time for pessimism, much less nihilism.

That's one of the main messages of Yuval Noah Harari's recent book 21 Lessons for the 21st Century.  He takes a good hard look at some of our biggest concerns -- terrorism, climate change, privacy, homelessness/poverty, even the development of artificial intelligence and how that might impact our lives -- and while he's not such a Pollyanna that he proposes instant solutions for any of them, he looks at how each might be managed, both in terms of combatting the problem itself and changing our own posture toward it.

It's a fascinating book, and worth reading to brace us up against the naysayers who would have you believe it's all hopeless.  While I don't think anyone would call Harari's book a panacea, at least it's the start of a discussion we should be having at all levels, not only in our personal lives, but in the highest offices of government.





Thursday, April 2, 2020

A window on the deep past

When I was a kid, I always enjoyed going on walks with my dad.  My dad wasn't very well educated -- barely finished high school -- but was incredibly wise and had an amazing amount of solid, practical common sense.  His attitude was that God gave us reasoning ability and we had damn well better use it -- that most of the questions you run into can be solved if you just get your opinions and ego out of the way and look at them logically.

The result was that despite never having had a physics class in his life, he was brilliant at figuring out how the world works.  Like the mind-blowing (well, to a ten-year-old kid, at least) idea he explained to me after we saw a guy pounding in a fence post with a sledgehammer.

The guy was down the street from us -- maybe a hundred meters away or so -- and I noticed something weird.  The reverberating bang of the head of the sledge hitting the top of the post was out of sync with what we were seeing.  We'd see the sledge hit the post, then a moment later, bang.

I asked my dad about that.  He thought for a moment, and said, "Well, it's because it takes time for the sound to arrive.  The sound is slower than light is, so you see the hammer hit before you hear it."  He told me about how his father had taught him to tell how close a thunderstorm is by counting the seconds between the lightning flash and the thunderclap, and how the count got shorter the closer the storm was.  He pointed at the guy pounding in the fence post, and said, "So the closer we get to him, the shorter the delay should be between seeing the hammer hit and hearing it."

Which, of course, turned out to be true.

But then, a crazy thought occurred to me.  "So... we're always hearing things in the past?"

"I suppose so," he said.  "Even if you're really close to something, it still takes some time for the sound to get to you."

Then, an even crazier thought.  "The light takes some time, too, right?  A shorter amount of time, but still some time.  So we're seeing things in the past, too?"

He shrugged.  "I guess so.  Light is always faster than sound."  Then he grinned.  "I guess that's why some people appear bright until you hear them talk."

It was some years later that I recognized the implications of this -- that the farther away something is, the further back into the past we're looking.  The Sun is far enough away that the light from it takes eight minutes and twenty seconds to get here, so you are always seeing the Sun not as it is now, but as it was, eight minutes and twenty seconds ago.  The closest star to us other than the Sun is Proxima Centauri, which is 4.3 light years away -- so here, you're looking at a star as it was 4.3 years ago.  There is, in fact, no way to know what it looks like now -- the Special Theory of Relativity showed that the speed of light is the fastest speed at which information can travel.  Any of the stars you see in the night sky might be exploding right now (not that it's likely, mind you), and not only would we have no way to know, the farther away they are, the longer it would take us to find out about it.
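If you want to play with the numbers, the arithmetic is nothing more than distance divided by speed.  Here's a quick back-of-the-envelope sketch in Python -- the 100-meter figure is my guess at the fence-post story, and the speed of sound varies a bit with temperature:

```python
# Look-back times: how "old" the sound or light reaching you is.
SPEED_OF_SOUND = 343.0          # m/s in air at ~20 C (varies with temperature)
SPEED_OF_LIGHT = 299_792_458.0  # m/s, exact by definition

AU = 1.496e11                   # mean Earth-Sun distance, meters
LIGHT_YEAR = 9.4607e15          # meters
SECONDS_PER_YEAR = 3.156e7

def delay(distance_m, speed_m_s):
    """Seconds between the event happening and you perceiving it."""
    return distance_m / speed_m_s

print(f"Sledgehammer 100 m away: sound arrives {delay(100, SPEED_OF_SOUND):.2f} s late")
print(f"Sunlight is {delay(AU, SPEED_OF_LIGHT) / 60:.1f} minutes old")
print(f"Proxima's light is {delay(4.3 * LIGHT_YEAR, SPEED_OF_LIGHT) / SECONDS_PER_YEAR:.1f} years old")
```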

This goes up to some unimaginably huge distances.  Consider quasars, which are peculiar beasts to say the least.  When first discovered in the 1950s, they were such anomalies that they were nicknamed quasi-stellar radio sources mainly because no one knew what the hell they were.  Astrophysicist Hong-Yee Chiu contracted that clumsy appellation to quasar in 1964, and it stuck.

The funny thing about them was that at first glance, they just looked like ordinary stars -- points of light.  Not even spectacular ones -- the brightest quasar has a magnitude just under +13, meaning it's not even visible in small telescopes.  But when astronomers looked at the light coming from them, they found something extraordinary.

The light was wildly red-shifted.  You probably know that red-shift occurs because of the Doppler effect -- just as the sound of a siren from an ambulance moving away from you sounds lower in pitch because the sound waves are stretched out by the ambulance's movement, the light from something moving away from you gets stretched -- and the analog to pitch in sound is frequency in light.  The faster an object is moving away from you, the more its light drops in frequency (moves toward the red end of the spectrum).  And, because of Hubble's law and the expansion of space, the faster an object in deep space is moving away from you, the farther away it is.

So that meant two things: (1) if Hubble's law was being applied correctly, quasars were ridiculously far away (the nearest ones estimated at about a billion light years); and (2) if they really were that far away, they were far and away the most luminous objects in the universe (an average quasar, if placed thirty light years away, would be as bright as the Sun).
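Neither conclusion requires anything fancier than Hubble's law and the inverse-square law.  Here's a rough sketch in Python; the redshift value is a hypothetical stand-in for a nearby quasar rather than a measurement of any particular object, the Hubble constant is the commonly quoted ~70 km/s/Mpc, and the v = cz approximation only holds at low redshift:

```python
# Rough quasar arithmetic: Hubble distance from redshift, and the
# luminosity implied by "as bright as the Sun at thirty light years."

C = 299_792.458           # speed of light, km/s
H0 = 70.0                 # Hubble constant, km/s per megaparsec (approximate)
MPC_PER_GLY = 306.6       # megaparsecs in a billion light years

z = 0.07                  # hypothetical redshift of a nearby quasar
v = C * z                 # recession velocity, km/s (low-z approximation)
d = v / H0                # Hubble's law: distance in megaparsecs
print(f"v ~ {v:,.0f} km/s  ->  d ~ {d / MPC_PER_GLY:.1f} billion light years")

# Inverse-square check: a quasar at 30 light years looking as bright as
# the Sun at 1 AU implies a luminosity of (30 ly / 1 AU)^2 Suns.
AU_PER_LIGHT_YEAR = 63_241
luminosity_in_suns = (30 * AU_PER_LIGHT_YEAR) ** 2
print(f"Implied luminosity: ~{luminosity_in_suns:.1e} Suns")
```

That last number -- a few trillion Suns -- is right in line with the luminosities actually estimated for quasars.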

But what on earth (or outside of it, actually) could generate that much energy?  And why weren't there any nearby ones?  Whatever process resulted in a quasar evidently stopped happening a billion or more years ago -- otherwise we'd see ones closer to us (and therefore, ones that had occurred more recently; remember, farther away in space, further back in time).

Speculation ran wild, mostly because the luminosities and distances were so enormous that it seemed like there must be some other explanation.  Quasars, some said, were red-shifted not because the light was being stretched by the expansion of space, but because it was escaping a gravity well -- so maybe they weren't far away, they were simply off-the-scale massive.  Maybe they were the output-end of a stellar wormhole.  Maybe they were some kind of chain reaction of millions of supernovas all at once.

See?  I told you they didn't look that interesting.  [Image licensed under the Creative Commons ESO, Quasar (24.5 mag ;z~4) in MS 1008 Field, CC BY 4.0]

Further observations confirmed the crazy velocities, and found that they were consistent with the expansion of space -- quasars are, in fact, billions of light years away, receding from us at what in Spaceballs would definitely qualify as ludicrous speed, and therefore had a luminosity that was unlike anything else.  But what could be producing such an energy output?

The answer, it seems, is that what we're seeing is the light emitted as gas and dust makes its last suicidal plunge into a supermassive black hole -- as it spirals in and speeds up, friction heats it until it emits radiation on a scale that boggles the mind.  Take that output and stretch it as space expands, and much of it arrives as the longest-wavelength light there is -- radio waves -- but at a staggering intensity.

All of this comes up because of a series of six papers last week in The Astronomical Journal about the discovery of three quasars that are the most energetic ever found (and therefore, the most energetic objects in the known universe).  The most luminous of the three is called SDSS J1042+1646, which brings up the issue of how astrophysicists name the objects they study.  I'm sorry, but "SDSS J1042+1646" just does not capture the gravitas and magnitude of this thing.  There should be a new naming convention that will give the interested layperson an idea of the scale we're talking about here.  I propose renaming it "Abso-fucking-lutely Enormous Glowing Thing, No, Really, You Don't Even Understand How Big It Is."  Although that's a little cumbersome, I maintain that it's better than SDSS J1042+1646.

But I digress.

Anyhow, the energy output of this thing is 5x10^30 gigawatts.  That's five million trillion trillion gigawatts.  By comparison, your average nuclear reactor puts out one gigawatt.  Even all the stars in the Milky Way put together put out only about a hundredth as much energy as this one quasar.
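If you want to sanity-check the scale, the conversion is a couple of lines of arithmetic (the solar luminosity, ~3.8x10^26 watts, is a standard figure; the rest are the numbers quoted above):

```python
# Scale check on an energy output of 5 x 10^30 gigawatts.
quasar_watts = 5e30 * 1e9      # gigawatts -> watts
sun_watts = 3.8e26             # solar luminosity, watts
reactor_watts = 1e9            # a typical one-gigawatt nuclear reactor

print(f"{quasar_watts / sun_watts:.1e} Suns")            # ~1.3e13
print(f"{quasar_watts / reactor_watts:.0e} reactors")    # 5e30
```

Thirteen trillion Suns.  From one object.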

See?  I told you.  Abso-fucking-lutely enormous.

These quasars have also given astrophysicists some insight into why we don't see any close by.  They are blowing radiation -- and debris -- out of the core of the quasar at such high rates that eventually they run out of gas.  The matter loss slows down star formation, and over time a quasar transforms into an ordinary, stable galaxy.

So billions of years ago, the Milky Way was probably a quasar, and to a civilization on a planet a billion light years away, that's what it would look like now.  If you wanted your mind blown further.

The universe is a big place, and we are by comparison really tiny.  Some people don't like that, but for me, it re-emphasizes the fact that our little toils and troubles down here are minor and transitory.  The glory of what's out there will always outshine anything we do -- which is, I think, a good thing.






Wednesday, April 1, 2020

Hands down

One of the most frustrating arguments -- if I can dignify them by that name -- from creationists is that there are "no transitional fossils."

If evolution happened, they say, you should be able to find fossils of species that are halfway between the earlier form and the (different-looking) later form.  That's actually true; you should find such fossils, and we have.  Thousands of them.  But when informed of this, they usually retort with one of two idiotic responses: (1) that evolution predicts there should be "halfway" forms between any two species you pick, which is what gave rise to Ray "BananaMan" Comfort's stupid "crocoduck" and "doggit" (dog/rabbit blend) photoshop images you can find with a quick Google search if you're in the mood for a facepalm; or (2) that any transitional form just makes the situation worse -- that if you're trying to find an intermediate between species A and species C, and you find it (species B), now you've gone from one missing transitional form to two, one between A and B and the other between B and C.

This always reminds me of the Atalanta paradox of the Greek philosopher Zeno of Elea.  The gist is that motion is impossible, because if the famous runner Atalanta runs a race, she must first reach the point halfway between her starting point and the finish line, then the point halfway between there and the finish line, then halfway again, and so on; and because there are an infinite number of those intermediate points, she'll never reach the end of the race.  Each little bit she runs just leaves an unending number of smaller distances to cross, so she's stuck.

Fortunately for Atalanta she spent more time training as a runner than reading philosophy, and doesn't know about this, so she goes ahead and crosses the finish line anyway.
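(The resolution, for the record, is that infinitely many terms can add up to a finite sum -- and therefore to a finite time.  Here's a quick numerical check in Python, with a made-up race length and speed:)

```python
# Zeno's segments: leg n covers half the remaining distance, so it
# takes (D / 2**n) / v seconds.  The infinite series sums to D / v.

D = 100.0   # race length in meters (made up)
v = 8.0     # Atalanta's speed in m/s (made up)

total_time = sum((D / 2**n) / v for n in range(1, 51))  # 50 legs ~ the limit

print(f"Sum of segment times: {total_time:.6f} s")
print(f"Simple D / v:         {D / v:.6f} s")   # same answer: 12.5 s
```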

But back to evolution.  The problem with the creationists' "transitional fossil" objection is that just about every time paleontologists find a new fossil bed, they discover more transitional fossils, and often find species with exactly the characteristics that had been predicted by evolutionary biologists before the discovery.  And that's the hallmark of a robust scientific model; it makes predictions that line up with the actual facts.  Transitional fossils are an argument for evolution, not against it.

We got another illustration of the power of the evolutionary model with a paper last week in Nature, authored by Richard Cloutier, Roxanne Noël, Isabelle Béchard, and Vincent Roy (of the Université du Québec à Rimouski), and Alice M. Clement, Michael S. Y. Lee, and John A. Long (of Flinders University).  One of the most striking homologies between vertebrates is their limbs -- all vertebrates that have limbs have essentially the same bone structure, with one upper arm bone, two lower arm bones, and a mess of carpals, metacarpals, and phalanges.  Doesn't matter if you're looking at a bat, a whale, a dog, a human, or a frog, we've all got the same limb bones -- and in fact, most of them have not only the same bones, but the same number in the same positions.  (I've never heard a creationist come up with a good explanation for why, if whales and humans don't have a common ancestor, whales' flippers encase a set of fourteen articulated finger bones -- just like we have.)

In any case, it's been predicted for a long time that there was a transitional form between fish and amphibians that would show an intermediate between a fish's fin and an amphibian's leg, but that fossil proved to be elusive.

Until now.

Readers, meet Elpistostege.  As far as why it's remarkable, allow me to quote the authors:
The evolution of fishes to tetrapods (four-limbed vertebrates) was one of the most important transformations in vertebrate evolution.  Hypotheses of tetrapod origins rely heavily on the anatomy of a few tetrapod-like fish fossils from the Middle and Late Devonian period (393–359 million years ago). These taxa—known as elpistostegalians—include Panderichthys, Elpistostege, and Tiktaalik, none of which has yet revealed the complete skeletal anatomy of the pectoral fin.  Here we report a 1.57-metre-long articulated specimen of Elpistostege watsoni from the Upper Devonian period of Canada, which represents—to our knowledge—the most complete elpistostegalian yet found.  High-energy computed tomography reveals that the skeleton of the pectoral fin has four proximodistal rows of radials (two of which include branched carpals) as well as two distal rows that are organized as digits and putative digits.  Despite this skeletal pattern (which represents the most tetrapod-like arrangement of bones found in a pectoral fin to date), the fin retains lepidotrichia (fin rays) distal to the radials.  We suggest that the vertebrate hand arose primarily from a skeletal pattern buried within the fairly typical aquatic pectoral fin of elpistostegalians.  Elpistostege is potentially the sister taxon of all other tetrapods, and its appendages further blur the line between fish and land vertebrates.
Well, that seems like a slam-dunk to me.  An amphibian-like limb bone arrangement -- with fish-like fin rays at the end of it.

No transitional forms, my ass.

[Image licensed under the Creative Commons Placoderm2, Elpistostege watsoni, CC BY-SA 4.0]

Study lead author Richard Cloutier said basically the same thing, but more politely, in an interview with Science Daily: "The origin of digits relates to developing the capability for the fish to support its weight in shallow water or for short trips out on land.  The increased number of small bones in the fin allows more planes of flexibility to spread out its weight through the fin...  The other features the study revealed concern the structure of the upper arm bone, or humerus, which also shows features shared with early amphibians.  Elpistostege is not necessarily our ancestor, but it is the closest we can get to a true 'transitional fossil', an intermediate between fishes and tetrapods."

So there you have it.  Evolution delivers again.  I'm not expecting this will convince the creationists -- probably nothing would -- but at least it's one more fantastic piece of evidence for anyone who's on the fence.  Now y'all'll have to excuse me, because I'm off to the kitchen to get another cup of coffee, and it's going to take me an infinite amount of time to get there, so I better get started.






Tuesday, March 31, 2020

Fungus fracas

I suppose it's kind of a forlorn hope that popular media will ever start doing a better job of reporting on science research.

My most recent example of attempting to find out what was really going on started with an article from Popular Mechanics sent to me by a friend, called "You Should Know About This Chernobyl Fungus That Eats Radiation."  The kernel of the story -- that there is a species of fungus that has evolved extreme radiation tolerance, and apparently now uses high-energy ionizing radiation to power its metabolism -- is really cool, and immediately put me in mind of the wonderful line from Ian Malcolm in Jurassic Park -- "Life finds a way."

There were a few things about the article, though, that made me give it my dubious look:


The first was that the author repeatedly says the fungus is taking radiation and "converting it into energy."  This is a grade-school mistake -- like saying "we turn our food into energy" or "plants convert sunlight into energy."  Nope, sorry, the First Law of Thermodynamics is strictly enforced, even at nuclear disaster sites; no production of energy allowed.  What the fungus is apparently doing is harnessing the energy the radiation already had, and storing it as chemical energy for later use.  The striking thing is that it's able to do this without its tissue (and genetic material) suffering irreparable damage.  Most organisms, upon exposure to ionizing radiation, either end up with permanently mutated DNA or are killed outright.

Apparently the fungus is able to pull off this trick by having huge amounts of melanin, a dark pigment that is capable of absorbing radiation.  In the melanin in our skin, the solar energy absorbed is converted to heat, but this fungus has hitched its melanin absorbers to its metabolism, allowing it to function a bit like chlorophyll does in plants.

Another thing that made me wonder was the author's comment that the fungus could be used to clean up nuclear waste sites.  This put me in mind of a recent study of pillbugs, little terrestrial crustaceans that apparently can survive in soils contaminated with heavy metals like lead, cadmium, and mercury.  Several "green living" sites misinterpreted this, and came to the conclusion that pillbugs are somehow "cleaning the soil" -- in other words, getting rid of the heavy metals entirely.  Of course, the truth is that the heavy metals are still there, they're just inside the pillbug, and when the pillbug dies and decomposes they're right back in the soil where they started.  Same for the radioactive substances in Chernobyl; the fungus's ability to use radiation as a driver for its metabolism doesn't mean it's somehow miraculously destroyed the radioactive substances themselves.

Anyhow, I thought I'd dig a little deeper into the radioactive fungus thing and see if I could figure out what the real scoop was, and I found an MSN article that does a bit of a better job at describing the radiation-to-chemical-energy process (termed radiosynthesis), and says that the scientists investigating it are considering its use as a radiation blocker (not a radiation destroyer).  Grow it on the walls of the International Space Station, where long-term exposure to cosmic rays is a potential health risk to astronauts, and it might not only shield the interior but use the absorbed cosmic rays to fuel its own growth.

Then I saw that the MSN article named the actual species of fungus, Cryptococcus neoformans.  And when I read this name, I said, "... wait a moment."

Cryptococcus neoformans is a fungal pathogen, responsible for a nasty lung infection called cryptococcosis.  It's an opportunist, most often causing problems in people with compromised immune systems, but once you've got it it's hard to get rid of -- like many fungal infections, it doesn't respond quickly or easily to medication.  And if it becomes systemic -- escapes from your lungs and infects the rest of your body -- the result is cryptococcal meningitis, which has a mortality rate of about 20%.

So I'm not really all that sanguine about painting the stuff on the interior walls of the ISS.

Anyhow, all this is not to say the fungus and its evolutionary innovation are not fascinating.  I just wish science reporting in popular media could do a better job.  I know journalists can't put in all the gruesome details and technical jargon, but boiling something down and making it understandable does not require throwing in stuff that's downright misleading.  I probably come off as a grumpy curmudgeon for even pointing this out, but I guess that's inevitable because I am a grumpy curmudgeon.

So while they're at it, those damn journalists should get off my lawn.






Monday, March 30, 2020

All that glitters

If you own anything made of gold, take a look at it now.

I'm looking at my wedding ring, made of three narrow interlocked gold bands.   It's a little scratched up after almost eighteen years, but still shines.


Have you ever wondered where gold comes from?  Not just "a gold mine," but before that.  If you know a little bit of physics, it's kind of weird that the periodic table doesn't end at 26.  The reason is a subtle but fascinating one, and has to do with the binding energy curve.


The vertical axis is a measure of how tightly the atom's nucleus is held together.  More specifically, it's the amount of energy (in millions of electron-volts, per proton or neutron) that it would take to completely disassemble the nucleus into its component protons and neutrons.  From hydrogen (atomic number = 1) up to iron (atomic number = 26), there is a relatively steady increase in binding energy.  So in that part of the graph, fusion is an energy-releasing process (it moves upward on the graph) and fission is an energy-consuming one (it moves downward).  This, in fact, is what powers the Sun; going from hydrogen to helium is a jump of seven million electron-volts per proton or neutron, and that energy release is what produces the light and heat that keeps us all alive.

After iron, though -- specifically after an isotope of iron, Fe-56, with 26 protons and 30 neutrons -- there's a slow downward slope in the graph.  So after iron, the situation is reversed; fusion would consume energy, and fission would release it.  This is why the fission of uranium-235 generates energy, which is how a nuclear power plant works.
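You can reconstruct a point on that curve yourself from standard published atomic masses, using E = mc^2 -- the "missing" mass between a nucleus and its separated parts is the binding energy.  Here's a sketch for iron-56 (mass values rounded; I'm using the hydrogen-atom mass rather than the bare proton so the electron masses cancel):

```python
# Binding energy per nucleon of Fe-56 from its mass defect.
M_HYDROGEN = 1.007825   # mass of a hydrogen-1 atom, in atomic mass units (u)
M_NEUTRON  = 1.008665   # mass of a free neutron, u
M_FE56     = 55.934936  # mass of an iron-56 atom, u
U_TO_MEV   = 931.494    # energy equivalent of 1 u, in MeV (E = mc^2)

Z, N = 26, 30           # protons and neutrons in Fe-56
mass_defect = Z * M_HYDROGEN + N * M_NEUTRON - M_FE56   # in u
binding_energy = mass_defect * U_TO_MEV                 # in MeV

print(f"Total binding energy: {binding_energy:.0f} MeV")    # ~492 MeV
print(f"Per nucleon: {binding_energy / (Z + N):.2f} MeV")   # ~8.79 MeV
```

That 8.79 MeV per nucleon is essentially the peak of the curve, which is exactly why iron is the dividing line.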

It does raise a question, though.  If fusion in stars is energetically favorable -- increasing stability and releasing energy -- up to but not past iron, how do the heavier elements form in the first place?  Going from iron to anything heavier requires an input of energy, meaning those reactions will not happen spontaneously.  They need a (powerful) energy driver.  And yet some higher-atomic-number elements are quite common -- zinc, iodine, and lead come to mind.

Well, it turns out that there are two ways this can happen, and they both require a humongous energy source.  Like, one that makes the core of the Sun look like a wet firecracker.  Those are supernova explosions, and neutron star collisions.  And just last week, two astrophysicists -- Szabolcs Marka of Columbia University and Imre Bartos of the University of Florida -- found evidence that the heavy elements on the Earth were produced in a collision between two neutron stars, on the order of a hundred million years before the Solar System formed.

This is an event of staggering magnitude.  "If you look up at the sky and you see a neutron-star merger 1,000 light-years away," Marka said, "it would outshine the entire night sky."

What apparently happens is when two neutron stars -- the ridiculously dense remnants of massive stellar cores -- run into each other, it is such a high-energy event that even thermodynamically unfavorable (energy-consuming) reactions can pick up enough energy from the surroundings to occur.  Then some of the debris blasted away from the collision gets incorporated into forming stars and planets -- and here we are, with tons of lightweight elements, but a surprisingly high amount of heavier ones, too.

But how do they know it wasn't a nearby supernova?  Those are far more common in the universe than neutron star collisions.  Well, the theoretical yield of heavy elements is known for each, and the composition of the Solar System is far more consistent with a neutron star collision than with a supernova.  And as for the timing, a chunk of the heavy isotopes produced are naturally unstable, so decaying into lighter nuclei is favored (which is why heavy elements are often radioactive; the products of decay are higher on the binding energy curve than the original element was).  Since this happens at a set rate -- most often calculated as a half-life -- radioactive isotopes act like a nuclear stopwatch, analogous to the way radioisotope decay is used to calculate the ages of artifacts, fossils, and rocks.  Backtracking that stopwatch to t = 0 gives an origin of about 4.7 billion years ago, or a hundred million years before the Solar System coalesced.
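The stopwatch arithmetic itself is just exponential decay run backward.  Here's a minimal sketch -- the half-life and surviving fraction below are hypothetical placeholders, since the actual study combined measurements from several different isotopes:

```python
import math

# Nuclear stopwatch: N(t) = N0 * 2**(-t / half_life), so measuring the
# surviving fraction N(t)/N0 lets you solve for the elapsed time t.

def elapsed_years(surviving_fraction, half_life_years):
    """Time elapsed, given the fraction of the isotope that remains."""
    return -half_life_years * math.log2(surviving_fraction)

# Hypothetical: an isotope with a 100-million-year half-life, of which
# only 1/16 of the original amount survives today.
print(f"{elapsed_years(1/16, 1e8):.1e} years")  # 4 half-lives = 4.0e8 years
```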

So next time you look at anything made of heavier elements -- gold or silver or platinum, or (more prosaically) the zinc plating on a galvanized steel pipe -- ponder for a moment that it was formed in a catastrophically huge collision between two neutron stars, an event that released more energy in a few seconds than the Sun will produce over its entire lifetime.  Sometimes the most ordinary things have a truly extraordinary origin -- something that never fails to fascinate me.






Saturday, March 28, 2020

Contagious disinformation

Well, that didn't take long.

All it took was Donald Trump harping on the "Chinese virus" thing for a few days, and all of his MAGA followers took off in a large herd, bleating angrily about how the most important thing was making China pay for causing all this.  I've already seen three people post that COVID stands for "China-Originated Viral Infectious Disease."  Worse, when someone responded that this was incorrect, that it stands for "COronaVIrus Disease," another person piped up, "Who gives a fuck?  It started in CHINA and that's all that matters."

All.  That.  Matters.

Not the fact that we currently have the highest number of cases in the world here in the United States.  Not that we are woefully behind in testing, whatever Trump and his cronies would have you believe.  Not that we're in drastic need of PPE, including masks and gloves, and that some hospitals have substituted plastic garbage bags for protective suits -- and that because Trump is a vindictive toddler, it's looking like what PPE we do have is going to be parceled out according to which states' governors kiss Trump's ass most enthusiastically.

I've said more than once recently that none of this is going to change until some miracle occurs and Fox News decides to end their nightly celebratory circle-jerk over how wonderful Dear Leader is.  Every day they're presenting nothing but lies, spin, and propaganda, and a good 50% of Americans get their news solely from Fox.

And don't even start with what-about-ism.  Yes, I know the other media sources are biased.  Show me one major American news source that lies as consistently and as maliciously as Fox.  Whole studies have shown that Fox News viewers are, across the board, the least aware of the facts compared to viewers of six other news sources -- and compared to those who don't watch the news at all.  That's right: not watching the news leaves you, on average, better informed than watching Fox.

Anyhow, because Trump et al. are now more concerned about getting people pissed off at China than they are about dealing with the problem in our own country, we also have conspiracy theories popping up all over the place that the virus didn't just originate in China, it was created by China.  In some versions, the pandemic was caused by a lab accident in Wuhan; in others, the virus was deliberately introduced into the population, for reasons that remain unclear (largely because it didn't happen, but try to tell the conspiracy theorists that).

[Image is in the Public Domain courtesy of the Centers for Disease Control and Prevention]

In any case, this sort of thing is becoming so widespread that a team led by virologist Kristian Andersen of Scripps just published a study analyzing the genome of the COVID-19 virus, and they found -- beyond a shadow of a doubt -- that the virus is a natural pathogen.  Although it started in some non-human animal (bats and pangolins being the two top contenders), all it took was one jump to a human host to get the ball rolling.

In "The Proximal Origin of SARS-CoV-2," we read the following:
It is improbable that SARS-CoV-2 emerged through laboratory manipulation of a related SARS-CoV-like coronavirus.  As noted above, the RBD [receptor-binding domain] of SARS-CoV-2 is optimized for binding to human ACE2 with an efficient solution different from those previously predicted.  Furthermore, if genetic manipulation had been performed, one of the several reverse-genetic systems available for betacoronaviruses would probably have been used.  However, the genetic data irrefutably show that SARS-CoV-2 is not derived from any previously used virus backbone.  Instead, we propose two scenarios that can plausibly explain the origin of SARS-CoV-2: (i) natural selection in an animal host before zoonotic transfer; and (ii) natural selection in humans following zoonotic transfer.  [Italics mine]
Of course, the claim that it was bioengineered never had much going for it.  Molecular epidemiologist Emma Hodcroft, of the University of Basel, said in an interview with Science News, "Essentially their claim was the same as me taking a copy of the Odyssey and saying, 'Oh, this has the word the in it,' and then opening another book, seeing the word the in it and saying,  'Oh my gosh, it’s the same word, there must be parts of the Odyssey in this other book.'  It was a really misleading claim and really bad science."

What about the claims of China mishandling the response to the epidemic, and then lying about it?  Okay, they probably did.  But the people who are bitching the most about this seem perfectly fine with Donald Trump doing the same damn thing.  "It's another Democrat hoax."  "One day, like a miracle, it will disappear."  "Anyone who needs a test, gets a test... and the tests, they're beautiful."  "Health insurance companies agreed to waive all co-payments for coronavirus treatments, extend insurance coverage to these treatments, and to prevent surprise medical billing."  "[W]hen you have fifteen people, and the fifteen within a couple of days is going to be down to close to zero, that's a pretty good job we've done."

And after all that, he had the gall to say, "I’ve always known this is a real—this is a pandemic.  I felt it was a pandemic long before it was called a pandemic…  I’ve always viewed it as very serious."

But on Fox News apparently Trump can say one thing today and exactly the opposite tomorrow, and the loyal viewers will believe him both times.

Okay, I'm ranting.  But this is killing people.  There seems to be no way to compel Fox to stop lying, even when American citizens are being harmed as a direct result of what they air.  I'm all for freedom of speech and freedom of the press, but I'm also for personal responsibility -- and when your lies cause people to die, there should be some kind of legal recourse available.

But thus far, they've gotten away with it scot-free, and in fact are encouraging the conspiracy theories and anti-Chinese sentiment, probably to draw attention away from the abject failure of our own government to act quickly and responsibly.  The ironic thing is that the success of their own strategies has put their own viewers into the greatest likelihood of harm -- and that even that isn't stopping them from their daily smorgasbord of disinformation.

*****************************

Any guesses as to what was the deadliest natural disaster in United States history?

I'd speculate that if a poll was taken on the street, the odds-on favorites would be Hurricane Katrina, Hurricane Camille, and the Great San Francisco Earthquake.  None of these is correct, though -- the answer is the 1900 Galveston hurricane, which killed an estimated nine thousand people and basically wiped the city of Galveston off the map.  (Galveston was on its way to becoming the busiest and fastest-growing city in Texas; the hurricane was instrumental in shifting that hub to Houston, a move that was never undone.)

In the wonderful book Isaac's Storm, we read about Galveston Weather Bureau director Isaac Cline, who tried unsuccessfully to warn people about the approaching hurricane -- a failure which led to a massive overhaul of how weather information was distributed around the United States, and also spurred an effort toward more accurate forecasting.  But author Erik Larson doesn't make this simply about meteorology; it's a story about people, and brings into sharp focus how personalities can play a huge role in determining the outcome of natural events.

It's a gripping read, about a catastrophe that remarkably few people know about.  If you have any interest in weather, climate, or history, read Isaac's Storm -- you won't be able to put it down.

[Note: if you purchase this book using the image/link below, part of the proceeds goes to support Skeptophilia!]





Friday, March 27, 2020

The light show of a lifetime

Because I am absolutely saturated with the bad news about COVID-19 and our government's complete balls-up of a response -- as I suspect many of us are -- today I'm going to focus on something to look forward to.

During my lifetime, comets have largely not lived up to the hype.  Oh, they're cool, no doubt about it, but compared to accounts of the double-header of "daylight comets" that occurred in 1910 -- the unnamed "Great Comet" that appeared in January, and Halley's Comet in April -- the ones I've seen have been faint, visible to the unaided eye as a vague streak, showing their unearthly beauty only through binoculars or telescopes.  The first comet I remember anticipating, Kohoutek in 1973, fizzled miserably, and the claims that it would be the "Comet of the Century" fell far short of the mark.  Even Halley's reappearance in 1986 was a bit of an anticlimax, with a display that was nowhere near as spectacular as it had been in 1910.

But this time we may just have a winner.

Discovered this past December by the automated search program with the rather terrifying name "Asteroid Terrestrial-impact Last Alert System," Comet C/2019 Y4 (ATLAS) is already brightening rapidly, having gained four magnitudes in only three months.  (It's currently at magnitude +15 -- still far too faint to see with the unaided eye.)  If it keeps brightening at this rate, by mid-May it could reach magnitude -8 -- four magnitudes brighter than Venus.
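Magnitudes are logarithmic -- a difference of five magnitudes is a factor of a hundred in brightness -- so those jumps are bigger than they sound.  A quick check in Python, using the figures quoted above (and assuming the four-magnitude climb started around +19):

```python
# Astronomical magnitudes: 5 magnitudes = a factor of 100 in brightness,
# so the flux ratio between two magnitudes is 100 ** (delta_m / 5).

def brightness_ratio(m_faint, m_bright):
    """How many times brighter the second object is than the first."""
    return 100 ** ((m_faint - m_bright) / 5)

print(f"{brightness_ratio(19, 15):.0f}x brighter over three months")
print(f"{brightness_ratio(15, -8):.1e}x more to reach magnitude -8")
print(f"{brightness_ratio(-4, -8):.0f}x brighter than Venus at -4")
```

That middle number -- more than a billion-fold -- shows how far the extrapolation has to hold.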

If that happens, it could actually be the best comet of the past hundred years.

Still, it's wise to remember the words of Canadian astronomer David Levy, co-discoverer of Comet Shoemaker-Levy 9 in 1993, which made a spectacular collision with the planet Jupiter the following year.  "Comets are like cats," Levy said.  "They have tails and they do whatever they want."

So we really don't know for sure what it's going to do.  Writing for the website Astronomy, Alister Ling says, "The big unknown: Is Y4 ATLAS a lightly powdered rubble pile that produces a meager tail that dissolves into nothingness?  Or does luck strike us with a dust-choked snowball whose tail forms the magnificent sword we see in paintings of old?  A touch of aurora or noctilucent clouds would really top off the light show."

One bit of good news for those of you who, like me, live in the Northern Hemisphere: if ATLAS puts on a grand performance, we've got front-row seats.  The path of the comet against the backdrop of stars makes a swoop through the northern sky, starting near the Big Dipper, peaking in brightness in May as it passes through the constellation of Perseus, and finally disappearing from sight in June near Betelgeuse in Orion.

[Image courtesy of Alison Klesman (via TheSkyX)]

Ling waxes rhapsodic over what we may be in for:
The week of May 25 to 31 is the week when we take the iconic pictures of Y4 and forge our memories of a lifetime. ATLAS is literally diving past the Sun, brightening a magnitude per day. The top half of the tail remains above the horizon all night, drawing our view downward. As minutes flow by, the tail brightens, overcoming the rising oranges and yellows of dawn until it reaches the brilliant head of the comet, lifting off the horizon...  If the sky is a dark transparent blue and ATLAS exceeds our expectations, we might snag the elusive trophy of a historical daylight comet.
I know better than to get my hopes up too high; comets' catlike behavior has caught me too often before.  But even if it's not the brightest object in the night sky, it should give us some fine opportunities for viewing, and even more for aficionados of astrophotography.

Whatever it does, it's nice to have something positive to look forward to.  So keep your eye on the skies, and hope for a light show that will be something to remember.
