Skeptophilia
Fighting Gullibility with Sarcasm, 6 days a week
Tuesday, February 17, 2026
The meatlocker
The last scene of Doctor Faustus where the man raves and implores on the edge of hell is, perhaps, stage fire. The last moments before damnation are not often so dramatic. Often the man knows with perfect clarity that some still possible action of his own will could yet save him. But he cannot make this knowledge real to himself. Some tiny habitual sensuality, some resentment too trivial to waste on a noisy fly, the indulgence of some fatal lethargy, seems to him at that moment more important than the choice between joy and total destruction. With eyes wide open, seeing that the endless terror is just about to begin and yet (for the moment) unable to feel terrified, he watches passively, not moving a finger for his own rescue, while the last links with joy and reason are severed, and drowsily sees the trap close upon his soul. So full of sleep are they at the time when they leave the right way.
Monday, February 16, 2026
The kids are all right
Wiser heads than mine have commented on the laziness, disrespectfulness, and general dissipation of youth. Here's a sampler:
- Parents themselves were often the cause of many difficulties. They frequently failed in their obvious duty to teach self-control and discipline to their own children. -- from an editorial in the Leeds Mercury, 1938
- We defy anyone who goes about with his eyes open to deny that there is, as never before, an attitude on the part of young folk which is best described as grossly thoughtless, rude, and utterly selfish. -- from an editorial in the Hull Daily Mail, 1925
- The children now love luxury; they have bad manners, contempt for authority; they show disrespect for elders and love chatter in place of exercise. Children are now tyrants, not the servants of their households. They no longer rise when elders enter the room. They contradict their parents, chatter before company, gobble up dainties at the table, cross their legs, and tyrannize their teachers. -- Kenneth John Freeman, Cambridge University, 1907
- Never has youth been exposed to such dangers of both perversion and arrest as in our own land and day. Increasing urban life with its temptations, prematurities, sedentary occupations, and passive stimuli just when an active life is most needed, early emancipation and a lessening sense for both duty and discipline, the haste to know and do all befitting man's estate before its time, the mad rush for sudden wealth and the reckless fashions set by its gilded youth -- all these lack some of the regulatives they still have in older lands with more conservative conditions. -- Granville Stanley Hall, The Psychology of Adolescence, 1904
- Youth were never more saucy -- never more savagely saucy -- as now... the ancient are scorned, the honourable are condemned, and the magistrate is not dreaded. -- Thomas Barnes, The Wise Man's Forecast Against the Evil Time, 1624
- Our sires' age was worse than our grandsires'. We, their sons, are more worthless than they; so in our turn we shall give the world a progeny yet more corrupt. -- Horace, Odes, Book III, 20 B.C.E.
- [Young people] are high-minded because they have not yet been humbled by life, nor have they experienced the force of circumstances… They think they know everything, and are always quite sure about it. -- Aristotle, 4th century B.C.E.
This comes up because of a study published in Science Advances, by John Protzko and Jonathan Schooler, called "Kids These Days: Why the Youth of Today Seem Lacking." And its unfortunate conclusion -- unfortunate for us adults, that is -- is that the sense of today's young people being irresponsible, disrespectful, and lazy exists mostly because we don't remember how irresponsible, disrespectful, and lazy we were when we were teenagers. And before you say, "Wait a moment, I was a respectful and hard-working teenager" -- okay, maybe. But so are many of today's teenagers. If you want me to buy that we're in a downward spiral, you'll have to convince me that more teenagers back then were hard-working and responsible, and that, I simply don't believe.
And neither do Protzko and Schooler.
So the whole thing hinges more on idealization of the past, and our own poor memories, than on anything real. I also suspect that a good many of the older adults who roll their eyes about "kids these days" don't have any actual substantive contact with young people, and are getting their impressions of teenagers from the media -- which certainly doesn't have a vested interest in portraying anyone as ordinary, honest, and law-abiding.
Or were you -- like the youth in Aristotle's day -- guilty of thinking you knew everything, and being quite sure about it?
Saturday, February 14, 2026
With a whimper
The death of massive stars, ten or more times the mass of the Sun, is thought to have a predictable -- if violent -- trajectory.
During most of their lifetimes, stars are in a relative balance between two forces. Fusion of hydrogen into helium in the core releases heat energy, which increases the pressure in the core and generates an outward-pointing force. At the same time, the inexorable pull of gravity generates an inward-pointing force. For the majority of the star's life, the two are in equilibrium; if something makes the core cool a little bit, gravity wins for a while and the star shrinks, increasing the pressure and thus the rate of fusion. This heats the core up, increasing the outward force and stopping the collapse.
Nice little example of negative feedback and homeostasis, that. Stars in this long, relatively quiescent phase are on the "Main Sequence" of the famous Hertzsprung-Russell Diagram.
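If you'd like to watch that thermostat work, here's a minimal numerical sketch -- a toy with made-up units and an assumed steep fusion-rate exponent, not a real stellar model:

```python
# A toy thermostat, not a real stellar model: made-up units, and an
# assumed steep temperature dependence for the fusion rate (the real
# exponents vary by fusion process). The point is the negative feedback:
# an over-hot core over-produces power, expands and cools, and settles
# back toward equilibrium.

def fusion_power(T, exponent=4.0):
    # hypothetical steep dependence of fusion rate on core temperature
    return T ** exponent

T = 1.2                       # start 20% hotter than equilibrium (T = 1.0)
equilibrium_power = fusion_power(1.0)
response = 0.05               # how strongly the core expands/contracts
                              # in response to a power imbalance

for step in range(25):
    imbalance = fusion_power(T) - equilibrium_power
    T -= response * imbalance    # excess power -> expansion -> cooling;
                                 # deficit -> contraction -> heating
    print(f"step {step:2d}: core temperature = {T:.4f}")
# T marches steadily back toward 1.0 -- homeostasis in action
```

Perturb the starting temperature in either direction and the loop walks it back to equilibrium; that's the Main Sequence in miniature.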
Once the hydrogen fuel starts to deplete, though, the situation shifts. Gravity wins once again, but this time there's not enough hydrogen-to-helium fusion to counteract the collapse. The core shrinks, raising the temperature to hundreds of millions of kelvins -- enough to fuse helium into carbon. This release of energy causes the outer atmosphere to balloon outward, and the star becomes a red supergiant -- the surface is cool (and thus reddish), but the interior is far hotter than the core of our Sun.
Two famous stars -- Betelgeuse (in Orion) and Antares (in Scorpius) -- are in this final stage of their lives.
Here's where things get interesting, because the helium fuel doesn't last forever, either. The carbon "ash" left behind needs an even higher temperature to fuse into oxygen, nitrogen, and heavier elements, which happens when the previous process repeats itself -- further core collapse, followed by further heating. But this can't go on indefinitely. When the fusion reaction starts to generate iron, the game is up. Iron represents the turnaround point on the curve of binding energy, where fusion stops being an exothermic (energy-releasing) reaction and becomes endothermic (energy-consuming). At that point, the core has nothing left with which to counter the pull of gravity, and the entire star collapses. The outer atmosphere rebounds off the collapsing core, creating a shockwave called a core-collapse (type II) supernova, which releases in a few seconds as much energy as the star did during its entire life on the main sequence. What's left afterward is a super-dense remnant -- either a neutron star or a black hole, depending on its mass.
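You can see the iron turnaround in a quick back-of-the-envelope calculation. This sketch uses the textbook semi-empirical mass formula -- the coefficients are common approximate values, and the pairing term is dropped for simplicity -- to locate the peak of the curve of binding energy:

```python
# Back-of-the-envelope curve of binding energy, using the textbook
# semi-empirical mass formula. Coefficients (in MeV) are common
# approximate values; the pairing term is omitted for simplicity.
aV, aS, aC, aA = 15.75, 17.8, 0.711, 23.7

def binding_energy(A, Z):
    # volume, surface, Coulomb, and asymmetry terms
    return (aV * A
            - aS * A ** (2 / 3)
            - aC * Z * (Z - 1) / A ** (1 / 3)
            - aA * (A - 2 * Z) ** 2 / A)

def most_stable_Z(A):
    # proton number along the beta-stability line for mass number A
    return round(A / (2 + 0.015 * A ** (2 / 3)))

b_per_nucleon = {A: binding_energy(A, most_stable_Z(A)) / A
                 for A in range(10, 240)}
peak = max(b_per_nucleon, key=b_per_nucleon.get)
print(f"binding energy per nucleon peaks near A = {peak}")
# lands in the iron/nickel neighborhood: fusing anything heavier costs
# energy instead of releasing it
```

So far, so good: below the peak, fusion pays; past it, the core's power source is gone.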
Well, that's what we thought happened. But now a paper in Science describing the collapse of a supergiant star in the Andromeda Galaxy has suggested there may be a different fate for at least some massive stars -- that they may go out not with a bang, but with a whimper.
The occurrence that spurred this discovery was so underwhelming that it took astronomers a while to realize it had happened. A star began to glow intensely in the infrared region of the spectrum, and then suddenly -- it didn't anymore. It seemed to vanish, leaving behind a faintly glowing shell of dust. Kishalay De, lead author of the paper, says that what we witnessed was a black hole forming without a preceding supernova. The core ran out of fuel, the outer atmosphere collapsed, and the star itself just kind of... winked out.
"This has probably been the most surprising discovery of my life," De said. "The evidence of the disappearance of the star was lying in public archival data and nobody noticed for years until we picked it out... The dramatic and sustained fading of this star is very unusual, and suggests a supernova failed to occur, leading to the collapse of the star’s core directly into a black hole. Stars with this mass have long been assumed to always explode as supernovae. The fact that it didn’t suggests that stars with the same mass may or may not successfully explode, possibly due to how gravity, gas pressure, and powerful shock waves interact in chaotic ways with each other inside the dying star."Friday, February 13, 2026
The hazard of "just-so stories"
One of the problems with scientific research is that there's a sneaky bias that can creep in -- explaining a phenomenon a certain way because the explanation lines up with a narrative so intuitive that it's never even questioned.
Back in 1978, evolutionary biologist Stephen Jay Gould nicknamed these "just-so stories," after the 1902 book by Rudyard Kipling containing fairy tales about how animals gained particular traits (the most famous of which is "How the Leopard Got His Spots"). Gould was mainly pointing his finger at the relatively new field of evolutionary psychology -- giving straightforward evolutionary explanations for complex human behaviors -- but his stinging criticism can be leveled at a great many other fields, too.
The difficulty is, this bias slips its way in because these explanations seem so damned reasonable. It's not quite confirmation bias -- where we accept thin corroborative evidence for ideas we already hold, and demand ridiculously high standards for counter-evidence that might falsify them. It's almost like confirmation bias, only backwards: we didn't already believe the explanation, but when we hear it, we experience a "wow, I never knew that!" sort of delight and respond with open-armed acceptance.
One good example -- one I had to contend with every single year while teaching high school biology -- was the whole "right-brained versus left-brained personality" thing, which was roundly debunked a long time ago. It's certainly true that our brains are lateralized, and most of us have a physically dominant hemisphere; it's undeniable that some of us are more holistic and creative and others more reductionistic and analytical; and it's also true that the cognitive parts of the right and left brain seem to process information differently. Putting these three together seems natural. The truth is, however, that any connection between brain dominance and personality type is tenuous in the extreme.
But it seems like it should be true, doesn't it? That's the hallmark of a "just-so story."
The reason this topic comes up is a recent paper in the journal Global Ecology and Conservation that challenges one of the most appealing of the "just-so stories" -- that the reintroduction of wolves to Yellowstone National Park caused a "trophic cascade," positively affecting the landscape and boosting species richness and species diversity in the entire region.
The original claim came from research by William Ripple et al., and connected the extirpation of wolves with a correspondingly higher survival rate for elk and deer. This, they said, resulted in overbrowsing of willow and alder, to the point that as older plants died they were not being replaced by saplings. This, in turn, led to higher erosion into streams and silting of the gravel bottoms required for salmon and trout to spawn, and thus a drop in fish populations. Last in the chain, this meant less food for bears, a reduction in survival rates for bear cubs, and a decrease in the numbers of grizzly and black bears.
The reintroduction of wolves -- well, supposedly it undid all that. Within a few years of the establishment of a stable wolf population, the willows and alders rebounded because of higher predation on elk and deer -- leading to a resurgence of trout and salmon and an increase in the bear population.
This all sounds pretty cool, and doesn't it line up with what we'd like to be true? The eco-minded amongst us just love wolves. There's a reason they're featured in every wildlife calendar ever printed.
It's why I almost hate to tell you about the new paper, by Daniel MacNulty, Michael Procko, and T. J. Clark-Wolf of Utah State University, and David Cooper of Colorado State University. Here's the upshot, in their own words:
Ripple et al.... argued that large carnivore recovery in Yellowstone National Park triggered one of the world’s strongest trophic cascades, citing a 1500% increase in willow crown volume derived from plant height data... [W]e show that their conclusion is invalid due to fundamental methodological flaws. These include use of a tautological volume model, violations of key modeling assumptions, comparisons across unmatched plots, and the misapplication of equilibrium-based metrics in a non-equilibrium system. Additionally, Ripple et al. rely on selectively framed photographic evidence and omit critical drivers such as human hunting in their causal attribution. These shortcomings explain the apparent conflict with Hobbs et al., who found evidence for a relatively weak trophic cascade based on the same height data and a long-term factorial field experiment. Our critique underscores the importance of analytical rigor and ecological context for understanding trophic cascade strength in complex ecosystems like Yellowstone.
MacNulty et al. demonstrate that if you re-analyze the same data and rigorously address these flaws, the trophic cascade effect largely vanishes. "Once these problems are accounted for, there is no evidence that predator recovery caused a large or system-wide increase in willow growth," said study co-author David Cooper. "The data instead support a more modest and spatially variable response influenced by hydrology, browsing, and local site conditions."
It's kind of a shame, isn't it? Definitely one of those "it'd be nice if it were true" things. It'll be interesting to see how Ripple et al. respond. I'm reminded of a video on astronomer David Kipping's wonderful YouTube channel The Cool Worlds Lab about his colleague Matthew Bailes -- who in 1991 announced what would have been the first hard evidence of an exoplanet, and then a few months later had to retract the announcement because he and his co-authors had realized there'd been an unrecognized bias in the data. Such admissions are, naturally, deeply embarrassing to make, but to Bailes's credit, he and his co-authors Andrew Lyne and Setnam Shemar owned up and retracted the paper, which was certainly the honest thing to do.
Here, though -- well, perhaps Ripple et al. will be able to rebut this criticism, although having read both papers, it's hard for me to see how. We'll have to wait and see.
Note, too, that MacNulty et al. are not saying that there's anything wrong with reintroducing wolves to Yellowstone -- just that the response of a complex system to tweaking a variable is going to be, well, complex. And we shouldn't expect anything different, however much we like neat tales of How the Leopard Got His Spots.
So that's today's kind of disappointing news from the world of science: a reminder that we have to be careful about ideas with immediate intuitive appeal. Just keep in mind physicist Richard Feynman's wise words: "The first principle is that you must not fool yourself -- and you are the easiest person to fool."
Thursday, February 12, 2026
Echoes of the ancestor
One of the most persuasive pieces of evidence of the common ancestry of all life on Earth is genetic overlap -- and the fact that the percent overlap gets higher when you compare more recently diverged species.
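For a toy sense of what that percent overlap means for a single gene, here's a sketch comparing invented sequence fragments by naive percent identity (real comparisons use proper alignment tools like BLAST; the sequences here are made up purely for illustration):

```python
# What "percent overlap" means in miniature: naive percent identity
# between two pre-aligned DNA sequences. These short fragments are
# invented for the example.
def percent_identity(seq_a: str, seq_b: str) -> float:
    assert len(seq_a) == len(seq_b), "sequences must be pre-aligned"
    matches = sum(a == b for a, b in zip(seq_a, seq_b))
    return 100.0 * matches / len(seq_a)

gene_species_1 = "ATGGCCCTGTGGATGCGC"   # hypothetical gene fragment
gene_species_2 = "ATGGCCCTGTGGATGCGT"   # recently diverged: one change
gene_species_3 = "ATGCCTTTGAGGTTGCGA"   # long diverged: many changes

print(percent_identity(gene_species_1, gene_species_2))   # ~94% identity
print(percent_identity(gene_species_1, gene_species_3))   # ~67% identity
```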
What is downright astonishing, though, is that there is genetic overlap between all life on Earth. Yeah, okay, it's easy enough to imagine there being genetic similarity between humans and gorillas, or dogs and foxes, or peaches and plums; but what about more distant relationships? Are there shared genes between humans... and bacteria?
The answer, amazingly, is yes, and the analysis of these universal paralogs was the subject of a fascinating paper in the journal Cell Genomics last week. Pick any two organisms on Earth -- choose them to be as distantly related to each other as you can, if you like -- and they will still share five groups of genes, used for making the following classes of enzymes:
- aminotransferases
- imidazole-4-carboxamide isomerase
- carbamoyl phosphate synthetases
- aminoacyl-tRNA synthetases
- initiation factor IF2
The first three are connected with amino acid metabolism; the last two, with the process of translation -- which decodes the message in mRNA and uses it to synthesize proteins.
The fact that all life forms on Earth have these five gene groups suggests something wild: that we're looking at genes that were present in LUCA -- the Last Universal Common Ancestor, our single-celled, bacteria-like forebear that lived in the primordial seas an estimated four billion years ago. Since then, two things have happened: the rest of LUCA's genome diverged wildly, under the effects of mutation and selection, so that now we have kitties and kangaroos and kidney beans; and those five gene groups were under such extreme stabilizing selection that they haven't significantly changed, in any of the branches of the tree of life, in millions or billions of generations.
The authors write:
Universal paralog families are an important tool for understanding early evolution from a phylogenetic perspective, offering a unique and valuable form of evidence about molecular evolution prior to the LUCA. The phylogenetic study of ancient life is constrained by several fundamental limitations. Both gene loss across multiple lineages and low levels of conservation in some gene families can obscure the ancient origin of those gene families. Furthermore, in the absence of an extensive diagnostic fossil record, the dependence of molecular phylogenetics on conserved gene sequences means that periods of evolution that predated the emergence of the genetic system cannot be studied. Even so, emerging technologies across a number of areas of computational biology and synthetic biology will expand our ability to reconstruct pre-LUCA evolution using these protein families. As our understanding of the LUCA solidifies, universal paralog protein families will provide an indispensable tool for pushing our understanding of early evolutionary history even further back in time, thereby describing the foundational processes that shaped life as we know it today.
Wednesday, February 11, 2026
Watching the clock
The First Law of Thermodynamics says that the total quantity of energy and mass in a closed system never changes; it's sometimes stated as, "Mass and energy cannot be destroyed, only transformed." The Second Law states that in a closed system, the total disorder (entropy) always increases. As my long-ago thermodynamics professor put it, "The First Law says you can't win; the Second Law says you can't break even."
Hell of a way to run a casino, that.
So far, there doesn't seem to be anything particularly non-intuitive about this. Even from our day-to-day experience, we can surmise that the amount of stuff seems to remain pretty constant, and that if you leave something without maintenance, it tends to break down sooner or later. But the interesting (and less obvious) side starts to appear when you ask the question, "If the Second Law says that systems tend toward disorder, how can a system become more orderly? I can fling a deck of cards and make them more disordered, but if I want I can pick them up and re-order them. Doesn't that break the Second Law?"
It doesn't, of course, but the reason why is quite subtle, and has some pretty devastating implications. The solution to the question comes from asking how you accomplish re-ordering a deck of cards. Well, you use your sensory organs and brain to figure out the correct order, and the muscles in your arms and hands (and legs, depending upon how far you flung them in the first place) to put them back in the correct order. How did you do all that? By using energy from your food to power the organs in your body. And to get the energy out of those food molecules -- especially glucose, our primary fuel -- you broke them to bits and jettisoned the pieces after you were done with them. (When you break down glucose to extract the energy, a process called cellular respiration, the bits left are carbon dioxide and water. So the carbon dioxide you exhale is actually broken-down sugar.)
Here's the kicker. If you were to measure the entropy decrease in the deck of cards, it would be less -- way less -- than the entropy increase in the molecules you chopped up to get the energy to put the cards back in order. Every time you increase the orderliness of a system, it always (1) requires an input of energy, and (2) increases the disorderliness somewhere else. We are, in fact, little chaos machines, leaving behind a trail of entropy everywhere we go, and the more we try to fix things, the worse the situation gets.
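If you want actual numbers for the card trick, here's a quick back-of-the-envelope comparison, assuming room temperature and straight Boltzmann bookkeeping:

```python
import math

k_B = 1.380649e-23  # Boltzmann constant, J/K

# Configurational entropy of a fully shuffled 52-card deck: k_B ln(52!)
deck = k_B * math.lgamma(53)          # lgamma(n + 1) = ln(n!)

# Entropy dumped into the room by dissipating a single joule of body
# heat at (assumed) room temperature: dS = q / T
heat = 1.0 / 300.0

print(f"ordering the deck removes    ~{deck:.1e} J/K")    # ~2e-21 J/K
print(f"dissipating 1 J of heat adds ~{heat:.1e} J/K")    # ~3e-3  J/K
print(f"ratio: about 10^{round(math.log10(heat / deck))}")  # ~10^18
```

Ordering the deck buys back roughly 10^-21 J/K of order; dissipating a single joule of body heat spends roughly 10^-3 J/K. The books balance, overwhelmingly in entropy's favor.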
I've heard people arguing that the Second Law disproves evolution because the evolutionary model claims we're in a system that has become more complex over time, which according to the Second Law is impossible. It's not; and in fact, that statement betrays a fundamental lack of understanding of what the Second Law means. The only reason why any increase in order occurs -- be it evolution, or embryonic development, or stacking a deck of cards -- is because there's a constant input of energy, and the decrease in entropy is offset by a bigger increase somewhere else. The Earth's ecosystems have become more complex in the 4.5 billion year history of life because there's been a continuous influx of energy from the Sun. If that influx were to stop, things would break down.
Fast.
The reason all this comes up is because of a paper in Physical Review X that gives another example of trying to make things better, and making them worse in the process. This one has to do with the accuracy of clocks -- a huge deal to scientists who are studying the rates of reactions, where time needs to be measured to phenomenal precision, on the scale of nanoseconds or better. The problem is, we learn from "Measuring the Thermodynamic Cost of Timekeeping" that the more accurate the clock, the more entropy its workings produce. So, in effect, you can only measure time in a system to the extent you're willing to screw the system up.
[Image licensed under the Creative Commons Robbert van der Steeg, Eternal clock, CC BY-SA 2.0]
The authors write:
All clocks, in some form or another, use the evolution of nature towards higher entropy states to quantify the passage of time. Due to the statistical nature of the second law and corresponding entropy flows, fluctuations fundamentally limit the performance of any clock. This suggests a deep relation between the increase in entropy and the quality of clock ticks... We show theoretically that the maximum possible accuracy for this classical clock is proportional to the entropy created per tick, similar to the known limit for a weakly coupled quantum clock but with a different proportionality constant. We measure both the accuracy and the entropy. Once non-thermal noise is accounted for, we find that there is a linear relation between accuracy and entropy and that the clock operates within an order of magnitude of the theoretical bound.

Study co-author Natalia Ares, of the University of Oxford, summarized their findings succinctly in an article in Science News: "If you want a better clock," she said, "you have to pay for it."
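To get a feel for that trade-off, here's a minimal toy model -- emphatically not the nanomechanical clock from the paper -- in which each tick is the completion of n irreversible sub-steps, and the entropy dissipated per tick is assumed to scale with n:

```python
import numpy as np

rng = np.random.default_rng(42)

# Minimal toy clock, not the experimental device from the paper: each
# tick completes after n irreversible sub-steps with exponential waiting
# times, and we assume the entropy dissipated per tick grows with n.
# Tick regularity, measured as N = mean^2 / variance of the tick
# interval, then grows linearly with n.
for n in (1, 10, 100):
    # sum of n exponentials = gamma distribution; scaled so mean tick = 1
    ticks = rng.gamma(shape=n, scale=1.0 / n, size=200_000)
    accuracy = ticks.mean() ** 2 / ticks.var()
    print(f"sub-steps per tick: {n:3d}  ->  accuracy N ~ {accuracy:.1f}")
```

Double the dissipation, double the accuracy; regular ticks are bought with entropy.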
So, a little like the Heisenberg Uncertainty Principle: the more you try to push things in a positive direction, the more the universe pushes back in the negative direction.
Apparently, even if all you want to know is what time it is, you still can't break even.
So that's our somewhat depressing science for the day. Entropy always wins, no matter what you do. Maybe I can use this as an excuse for not doing housework. Hey, if I make things more orderly here, all it does is mess things up elsewhere, so what's the point?
Nah, never mind. My wife'll never buy it.
Tuesday, February 10, 2026
Falling rock zone
Stones fell like rain in the Qingyang district. The larger ones were four to five catties [a catty is a traditional Chinese unit of mass, equal to about half a kilogram], and the smaller ones were two to three catties. Numerous stones rained in Qingyang. Their sizes were all different. The larger ones were like goose's eggs and the smaller ones were like water-chestnuts. More than ten thousand people were struck dead. All of the people in the city fled to other places.