Skeptophilia (skep-to-fil-i-a) (n.) - the love of logical thought, skepticism, and thinking critically. Being an exploration of the applications of skeptical thinking to the world at large, with periodic excursions into linguistics, music, politics, cryptozoology, and why people keep seeing the face of Jesus on grilled cheese sandwiches.

Thursday, March 13, 2025

Old fake news

Last year I did a post about the remarkable Byzantine Emperor Constantine VII Porphyrogenitus, whose passion for history (coupled with an understanding of how fragile and easily lost books are) led him to compile a 53-volume set of transcripts of the writings of historians of antiquity.  His work preserved accounts for which we have no other copies, so without his tireless efforts, huge chunks of the history of early Europe would now be unknown and unknowable.

And that's even taking into account that of his original 53 volumes, only four of them survived.

So many works of ancient writers are lost forever, some to natural disasters like fire, flood, earthquakes, and volcanic eruptions, but others to deliberate destruction -- often motivated by religious fervor, or by rulers desperate to discredit their rivals and predecessors.  This latter, which was all too common after there'd been conflict over succession, led to the systematic purging of works painting previous regimes in a positive light.

The loss of primary sources makes the job of modern historians hard enough.  But a further complication arises when you consider the question of what happens when one of the documents that did survive is unreliable.

This is exactly the situation with regard to a major source of our knowledge of the later Roman Empire, from the reign of the Emperor Hadrian (117 - 138 C.E.) to the Emperor Marcus Aurelius Carinus (283-285 C.E.).  The document is called the Historia Augusta and seems to have been written during the reign of the Emperor Diocletian (285-305 C.E.).  Diocletian himself was looked upon early in his reign as a usurper -- he wasn't of royal blood, but was a soldier who rose up through the ranks -- so it's no wonder that a writer during his reign would be motivated to dig up all the dirt he could on the preceding dynasties.

"Okay, they may have been royals, but a lot of 'em were loonies," seems to have been the approach.  "Diocletian, on the other hand, will Make Rome Great Again."

Cover of an eighteenth-century edition of the Historia Augusta, from Ettal Abbey, Germany [Image is in the Public Domain]

To be fair, there was a lot to be critical of, especially in the last half of the period the Historia covers.  The fifty-year time period between the assassination of the Emperor Severus Alexander (235 C.E.) and the accession of Diocletian is known to historians as the "Crisis of the Third Century" because it was marked by chaos, lawlessness, and one short-lived ruler after another.  

The problem with the Historia is that for a lot of the period, there's nothing to cross-check it against.  There are chunks of material that have no attestation anywhere else; it's literally the only source that's survived.  There's an ongoing debate amongst historians about its accuracy, and some believe that even many of the sources the Historia cites are themselves made up.  The historian Anthony R. Birley, of Universität Düsseldorf, published an analysis in the journal Classica, "Rewriting Second- and Third-Century History in Late Antique Rome: the Historia Augusta," in which he estimates the total amount of reliable historical information in the document at only seventeen percent -- from a high of thirty-three percent in the section on the life of Marcus Opellius Macrinus all the way down to a flat zero for the accounts of the usurpers Firmus, Saturninus, Proculus, and Bonosus, all of whom immediately preceded Diocletian's rise to wearing the purple.

Probably not a coincidence, that.

Historical research always runs into the problem that accurate records are no more likely to survive than inaccurate ones.  Also, there's the whole "history is written by the victors" thing, which complicates our understanding of any period of history where there was regime change.  But considering the problem of the Historia Augusta has made me wonder how historians of the future will read the documents from the United States of 2025.  Not only are members of the Trump regime lying their asses off about what's going on -- witness House Speaker Mike Johnson's claim that the economy was tanking under President Biden, and that Trump's repeatedly playing Tariff Peekaboo with Canada, Mexico, and the E.U. is somehow going to get it back on track -- they're also actively destroying documents that contain accurate information about what's happening.

My fear is that the Crisis of the Twenty-First Century won't end up any better understood by historians than the Crisis of the Third Century is.

****************************************


Wednesday, October 26, 2022

Sounding off

Ever have the experience of getting into a car, closing the door, and accidentally shutting the seatbelt in the door?

What's interesting about this is that most of the time, we immediately realize it's happened, reopen the door, and pull the belt out.  It's barely even a conscious thought.  The sound is wrong, and that registers instantly.  We recognize when something "sounds off" about noises we're familiar with -- when latches don't seat properly, when the freezer door hasn't completely closed, even things like the difference between a batter's solid hit and a tip during a baseball game.

Turns out, scientists at New York University have just identified a group of neurons in the brain that's devoted to that exact phenomenon.

A research team led by neuroscientist David Schneider trained mice to associate a particular sound with pushing a lever for a treat.  Once the mice had learned it, the sound became as ingrained an expectation in their brains as our own sense of what a closing car door is supposed to sound like.  If after that the tone was varied even a little, or the timing between the lever push and the sound was changed, a part of the mouse's brain began to fire rapidly.

The activated part of the brain is a cluster of neurons in the auditory cortex, but I think of it as the "What The Hell Just Happened?" module.

"We listen to the sounds our movements produce to determine whether or not we made a mistake," Schneider said.  "This is most obvious for a musician or when speaking, but our brains are actually doing this all the time, such as when a golfer listens for the sound of her club making contact with the ball.  Our brains are always registering whether a sound matches or deviates from expectations.  In our study, we discovered that the brain is able to make precise predictions about when a sound is supposed to happen and what it should sound like...  Because these were some of the same neurons that would have been active if the sound had actually been played, it was as if the brain was recalling a memory of the sound that it thought it was going to hear."

As a musician, I find myself wondering if this is why I had such a hard time unlearning my tendency to make a face when I hit a wrong note, when I first started performing on stage.  My bandmates said (rightly) that if it's not a real howler, most mistakes will just zoom right past the audience unnoticed -- unless the musician clues them in by wincing.  (My bandmate Kathy also added that if it is a real howler, just play it that way again the next time that bit of the tune comes around, and the audience will think it's a deliberate "blue note" and be really impressed about how avant-garde we are.) 

My band Crooked Sixpence, with whom I played for an awesome ten years -- l. to r., Kathy Selby (fiddle), me (flute), John Wobus (keyboard)

I found it a hard response to quell, though.  My awareness of having hit a wrong note was so instantaneous that it was almost as if my ears were connected directly to my facial wince-muscles, bypassing my brain entirely.  I did eventually get better, both in the sense of making fewer mistakes and also responding less when I did hit a clam, but it definitely took a while for the flinch response to calm down.

It's interesting to speculate on why we have this sense, and evidently share it with other mammals.  The obvious explanation is that a spike of awareness about something sounding off could be a good clue to the presence of danger -- the time-honored trope in horror movies of one character saying something doesn't seem quite right.  (That character, however, is usually the first one to get eaten by the monster, so the response may be of dubious evolutionary utility, at least in horror movies.)

I find it endlessly fascinating how our brains have evolved independent little subunits for dealing with contingencies like this.  Our sensory processing systems are incredibly fine-tuned, and they can alert us to changes in our surroundings so quickly it hardly involves conscious thought.

Think about that the next time your car door doesn't close completely.

****************************************


Thursday, April 29, 2021

Watching the clock

 If I had to pick the scientific law that is the most misunderstood by the general public, it would have to be the Second Law of Thermodynamics.

The First Law of Thermodynamics says that the total quantity of energy and mass in a closed system never changes; it's sometimes stated as, "Mass and energy cannot be created or destroyed, only transformed."  The Second Law states that in a closed system, the total disorder (entropy) always increases.  As my long-ago thermodynamics professor put it, "The First Law says you can't win; the Second Law says you can't break even."
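(In symbols, and glossing over some technicalities, one conventional way of writing the two laws is

\Delta U = Q - W

for the First Law -- the internal energy of a system changes only by the heat flowing in minus the work it does -- and

\Delta S_{\mathrm{total}} \geq 0

for the Second Law: the total entropy of an isolated system never decreases.)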

Hell of a way to run a casino, that.

So far, there doesn't seem to be anything particularly non-intuitive about this.  Even from our day-to-day experience, we can surmise that the amount of stuff seems to remain pretty constant, and that if you leave something without maintenance, it tends to break down sooner or later.  But the interesting (and less obvious) side starts to appear when you ask the question, "If the Second Law says that systems tend toward disorder, how can a system become more orderly?  I can fling a deck of cards and make them more disordered, but if I want I can pick them up and re-order them.  Doesn't that break the Second Law?"

It doesn't, of course, but the reason why is quite subtle, and has some pretty devastating implications.  The solution to the question comes from asking how you accomplish re-ordering a deck of cards.  Well, you use your sensory organs and brain to figure out the correct order, and the muscles in your arms and hands (and legs, depending upon how far you flung them in the first place) to put them back in the correct order.  How did you do all that?  By using energy from your food to power the organs in your body.  And to get the energy out of those food molecules -- especially glucose, our primary fuel -- you broke them to bits and jettisoned the pieces after you were done with them.  (When you break down glucose to extract the energy, a process called cellular respiration, the bits left are carbon dioxide and water.  So the carbon dioxide you exhale is actually broken-down sugar.)
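In case you want the overall bookkeeping, the net reaction of cellular respiration is the familiar one from biology class:

\mathrm{C_6H_{12}O_6} + 6\,\mathrm{O_2} \longrightarrow 6\,\mathrm{CO_2} + 6\,\mathrm{H_2O} + \text{energy (captured as ATP)}

One glucose in, six carbon dioxides and six waters out, which is exactly why your exhaled breath is carrying away dismantled sugar.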

Here's the kicker.  If you were to measure the entropy decrease in the deck of cards, it would be less -- way less -- than the entropy increase in the molecules you chopped up to get the energy to put the cards back in order.  Every time you increase the orderliness of a system, it always (1) requires an input of energy, and (2) increases the disorderliness somewhere else.  We are, in fact, little chaos machines, leaving behind a trail of entropy everywhere we go, and the more we try to fix things, the worse the situation gets.
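If you want a rough sense of just how lopsided the trade is, here's a back-of-the-envelope estimate (illustrative numbers only).  The configurational entropy of a thoroughly shuffled deck is about

S_{\mathrm{deck}} = k_B \ln(52!) \approx 156\,k_B \approx 2 \times 10^{-21}\ \mathrm{J/K}

while dissipating even a measly ten kilojoules of food energy as body heat at 310 K dumps roughly

S_{\mathrm{surroundings}} \approx \frac{10{,}000\ \mathrm{J}}{310\ \mathrm{K}} \approx 32\ \mathrm{J/K}

into the surroundings -- something like twenty-two orders of magnitude more entropy created than you removed by sorting the cards.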

I've heard people arguing that the Second Law disproves evolution because the evolutionary model claims we're in a system that has become more complex over time, which according to the Second Law is impossible.  It's not; and in fact, that statement betrays a fundamental lack of understanding of what the Second Law means.  The only reason why any increase in order occurs -- be it evolution, or embryonic development, or stacking a deck of cards -- is because there's a constant input of energy, and the decrease in entropy is offset by a bigger increase somewhere else.  The Earth's ecosystems have become more complex over the nearly four-billion-year history of life because there's been a continuous influx of energy from the Sun.  If that influx were to stop, things would break down.

Fast.

The reason all this comes up is because of a paper this week in Physical Review X that gives another example of trying to make things better, and making them worse in the process.  This one has to do with the accuracy of clocks -- a huge deal to scientists who are studying the rate of reactions, where the time needs to be measured to phenomenal precision, on the scale of nanoseconds or better.  The problem is, we learn from "Measuring the Thermodynamic Cost of Timekeeping," the more accurate the clock is, the higher the entropy produced by its workings.  So, in effect, you can only measure time in a system to the extent you're willing to screw the system up.

[Image licensed under the Creative Commons Robbert van der Steeg, Eternal clock, CC BY-SA 2.0]

The authors write:

All clocks, in some form or another, use the evolution of nature towards higher entropy states to quantify the passage of time.  Due to the statistical nature of the second law and corresponding entropy flows, fluctuations fundamentally limit the performance of any clock.  This suggests a deep relation between the increase in entropy and the quality of clock ticks...  We show theoretically that the maximum possible accuracy for this classical clock is proportional to the entropy created per tick, similar to the known limit for a weakly coupled quantum clock but with a different proportionality constant.  We measure both the accuracy and the entropy.  Once non-thermal noise is accounted for, we find that there is a linear relation between accuracy and entropy and that the clock operates within an order of magnitude of the theoretical bound.
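Stripped down to its essentials (and using my own shorthand for the symbols, not the paper's notation), the relationship they measured looks something like

N \equiv \left(\frac{t_{\mathrm{tick}}}{\Delta t_{\mathrm{tick}}}\right)^{2} \propto \frac{\Delta S_{\mathrm{per\ tick}}}{k_B}

where N is the clock's accuracy -- roughly, how many ticks it takes before the timing drifts by a full tick -- and \Delta S_{\mathrm{per\ tick}} is the entropy generated each tick.  Since N scales as one over the timing jitter squared, halving the jitter means producing something like four times the entropy per tick.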

Study co-author Natalia Ares, of the University of Oxford, summarized their findings succinctly in an article in Science News: "If you want a better clock," she said, "you have to pay for it."

So a little like the Heisenberg Uncertainty Principle, the more you try to push things in a positive direction, the more the universe pushes back in the negative direction.  

Apparently, even if all you want to know is what time it is, you still can't break even.

So that's our somewhat depressing science for the day.  Entropy always wins, no matter what you do.  Maybe I can use this as an excuse for not doing housework.  Hey, if I make things more orderly here, all it does is mess things up elsewhere, so what's the point?

Nah, never mind.  My wife'll never buy it.

****************************************

When people think of mass extinctions, the one that usually comes to mind first is the Cretaceous-Tertiary Extinction of 66 million years ago, the one that wiped out all the non-avian dinosaurs and a good many species of other types.  It certainly was massive -- current estimates are that it killed between fifty and sixty percent of the species alive at the time -- but it was far from the biggest.

The largest mass extinction ever took place 251 million years ago, and it destroyed over ninety percent of life on Earth, taking out whole taxa and changing the direction of evolution permanently.  But what could cause a disaster on this scale?

In When Life Nearly Died: The Greatest Mass Extinction of All Time, University of Bristol paleontologist Michael Benton describes an event so catastrophic that it beggars the imagination.  Following researchers to outcrops of rock from the time of the extinction, he looks at what was lost -- trilobites, horn corals, sea scorpions, and blastoids (a starfish relative) vanished completely, but no group was without losses.  Even terrestrial vertebrates, who made it through the bottleneck and proceeded to kind of take over, had losses on the order of seventy percent.

He goes through the possible causes for the extinction, along with the evidence for each, along the way painting a terrifying picture of a world that very nearly became uninhabited.  It's a grim but fascinating story, and Benton's expertise and clarity of writing make it a brilliant read.

[Note: if you purchase this book using the image/link below, part of the proceeds goes to support Skeptophilia!]


Monday, November 4, 2019

The problem with Hubble

In my Critical Thinking classes, I did a unit on statistics and data, and how you tell if a measurement is worth paying attention to.  One of the first things to consider, I told them, is whether a particular piece of data is accurate or merely precise -- two words that in common parlance are used interchangeably.

In science, they don't mean the same thing.  A piece of equipment is said to be precise if it gives you close to the same value every time.  Accuracy, though, is a higher standard; data are accurate if the values are not only close to each other when measured with the same equipment, but agree with data taken independently, using a different device or a different method.

A simple example is that if my bathroom scale tells me every day for a month that my mass is (to within one kilogram either way) 239 kilograms, it's highly precise, but very inaccurate.
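If it helps to see the distinction in action, here's a quick sketch in Python (with made-up numbers, including that unflattering 239) comparing a precise-but-inaccurate instrument to an accurate-but-noisy one:

import numpy as np

rng = np.random.default_rng(42)
true_mass = 85.0  # the actual value, in kilograms (hypothetical)

# Precise but inaccurate: readings cluster tightly around the wrong value.
precise_but_wrong = rng.normal(loc=239.0, scale=0.5, size=30)

# Accurate but less precise: readings scatter more, but center on the truth.
accurate_but_noisy = rng.normal(loc=true_mass, scale=2.0, size=30)

for name, data in [("precise but inaccurate", precise_but_wrong),
                   ("accurate but imprecise", accurate_but_noisy)]:
    print(f"{name}: mean = {data.mean():.1f} kg, "
          f"spread = {data.std():.1f} kg, "
          f"bias = {data.mean() - true_mass:+.1f} kg")

The first set of readings agrees beautifully with itself and is still wildly wrong; the second wobbles around but averages out close to reality.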

This is why scientists always look for independent corroboration of their data.  It's not enough to keep getting the same numbers over and over; you've got to be certain those numbers actually reflect reality.

This all comes up because of some new information about one of the biggest scientific questions known -- the rate of expansion of the entire universe.

[Image is in the Public Domain, courtesy of NASA]

A few months ago, I wrote about some recent experiments that were allowing physicists to home in on the Hubble constant, a quantity that is a measure of how fast everything in the universe is flying apart.  And the news appeared to be good; from a range of between 50 and 500, physicists had been able to narrow down the value of the Hubble constant to between 65.3 and 75.6 kilometers per second per megaparsec.

The problem is, nobody's been able to get closer than that -- and in fact, recent measurements have widened, not narrowed, the gap.

There are two main ways to measure the Hubble constant.  The first is to use information like red shift and Cepheid variables (stars whose intrinsic brightness varies predictably with their period of oscillation, making them a good "standard candle" for determining the distance to other galaxies) to figure out how fast the galaxies we see are receding from each other.  The other is to use the cosmic microwave background radiation -- the leftovers from the radiation produced by the Big Bang -- to determine the age of the universe, and therefore, how fast it's expanding.
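The first method ultimately rests on nothing fancier than Hubble's law,

v = H_0 \, d \qquad\Longrightarrow\qquad H_0 = \frac{v}{d}

where the recession velocity v comes from the galaxy's redshift (for nearby galaxies, v is approximately c times the redshift z) and the distance d comes from standard candles like those Cepheids.  The second method gets at H_0 indirectly, by fitting the pattern of ripples in the microwave background to a cosmological model.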

So this is a little like checking my bathroom scale by weighing myself on it, then comparing my weight as measured by the scale at the gym and seeing if I get the same answer.

And the problem is, the measurement of the Hubble constant by these two methods is increasingly looking like it's resulting in two irreconcilably different values.

The genesis of the problem is that our measurement ability has become more and more precise -- the error bars associated with data collection have shrunk considerably.  And if the two measurements were not only precise, but also accurate, you would expect that our increasing precision would result in the two values getting closer and closer together.

Exactly the opposite has happened.

"Five years ago, no one in cosmology was really worried about the question of how fast the universe was expanding.  We took it for granted," said astrophysicist Daniel Mortlock of Imperial College London.  "Now we are having to do a great deal of head scratching – and a lot of research...  Everyone’s best bet was that the difference between the two estimates was just down to chance, and that the two values would converge as more and more measurements were taken.  In fact, the opposite has occurred.  The discrepancy has become stronger.  The estimate of the Hubble constant that had the lower value has got a bit lower over the years and the one that was a bit higher has got even greater."

The discovery of dark matter and dark energy, the first by Vera Rubin, Kent Ford, and Ken Freeman in the 1970s, and the second by Adam Riess and Saul Perlmutter in the 1990s, accounted for the fact that the rate of expansion seemed wildly out of whack with the amount of observable matter in the universe.  The problem is, since the discovery of the effects of dark matter and dark energy, we haven't gotten any closer to finding out what they actually are.  Every attempt to directly detect either one has resulted in zero success.

Now, it appears that the problems run even deeper than that.

"Those two discoveries [dark matter and dark energy] were remarkable enough," said Riess.  "But now we are facing the fact there may be a third phenomenon that we had overlooked – though we haven’t really got a clue yet what it might be."

"The basic problem is that having two different figures for the Hubble constant measured from different perspectives would simply invalidate the cosmological model we made of the universe," Mortlock said.  "So we wouldn’t be able to say what the age of the universe was until we had put our physics right."

It sounds to me a lot like the situation in the late 1800s, when physicists were trying to determine the answer to a seemingly simple question -- in what medium do light waves propagate?  Every wave has to be moving through something; water waves come from regular motion of water molecules, sound waves from oscillation of air molecules, and so on.  With light waves, what was "waving?"

Because the answer most people accepted was, "something has to be waving even if we don't know what it is," scientists proposed a mysterious substance called the "aether" that permeated all of space, and was the medium through which light waves were propagating.  All attempts to directly detect the aether were failures, but this didn't discourage people from saying that it must be there, because otherwise, how would light move?

Then along came the brilliant (and quite simple -- in principle, anyhow) Michelson-Morley experiment, which failed to find any trace of the aether.  Light traveling in a vacuum appeared to have a constant speed in all frames of reference, which is entirely unlike any other wave ever studied.  And it wasn't until Einstein came along and turned our entire understanding upside down with the Special Theory of Relativity that we saw the piece we'd been missing that made sense of all the weird data.

What we seem to be waiting for is this century's Einstein, who will explain the discrepancies in the measurements of the Hubble constant, and very likely account for the mysterious, undetectable dark matter and dark energy (which sound a lot like the aether, don't they?) at the same time.  But until then, we're left with a mystery that calls into question one of the most fundamental conclusions of modern physics -- the age of the universe.

**********************************

This week's Skeptophilia book recommendation is a fun book about math.

Bet that's a phrase you've hardly ever heard uttered.

Jordan Ellenberg's amazing How Not to Be Wrong: The Power of Mathematical Thinking looks at how critical it is for people to have a basic understanding and appreciation for math -- and how misunderstandings can lead to profound errors in decision-making.  Ellenberg takes us on a fantastic trip through dozens of disparate realms -- baseball, crime and punishment, politics, psychology, artificial languages, and social media, to name a few -- and how in each, a comprehension of math leads you to a deeper understanding of the world.

As he puts it: math is "an atomic-powered prosthesis that you attach to your common sense, vastly multiplying its reach and strength."  Which is certainly something that is drastically needed lately.

[Note: if you purchase this book using the image/link below, part of the proceeds goes to support Skeptophilia!]





Tuesday, December 12, 2017

Wikipedia, accuracy, and the Swanson conversion

I'm of two minds about Wikipedia.

I think it's a great resource for quick lookups, and use it myself for that sort of thing.  A study by Thomas Chesney found that experts generally consider Wikipedia to be pretty accurate, although the same study admits that others have concluded that 13% of Wikipedia entries have errors (how serious those errors are is uncertain; an error in a single date is certainly more forgivable than one that gives erroneous information about a major world event).  Another study concluded that between one-third and one-half of deliberately inserted errors are corrected within 48 hours.

But still.  That means that between one-half and two-thirds of deliberately inserted errors weren't corrected within 48 hours, which is troubling.  Given the recent squabbles over "fake news," having a source that can become contaminated by bias or outright falsehood, and remain uncorrected, is a real problem.

Plus, there's the problem of error sneaking in, as it were, through the back door.  Sometimes claims are posted on Wikipedia (and elsewhere) by people who honestly think what they're stating is correct, and once that happens, there tends to be a snake-swallowing-its-own-tail pattern of circular citations, and before you know it, what was a false claim suddenly becomes enshrined as fact.

As an example of this, consider the strange case of the Swanson conversion.

The Swanson conversion, which sounds like the title of an episode of The Big Bang Theory but isn't, is one step in the reactions of cellular respiration.  Without geeking out on this too extremely -- and my students will attest that I get way too excited about how cool cellular respiration is -- the background on this is as follows.

Cellular respiration, which is the set of reactions by which our cells burn glucose and release energy to power everything we do, has three major steps: glycolysis, the Krebs cycle, and the electron transport chain.  Each of those is made of dozens of sub-reactions, which I will refrain from describing (although like I said, they're extremely cool).  But there's one piece of it that doesn't have an official name, and that's the step that links glycolysis (the first step) to the Krebs cycle (the second step).

[image courtesy of the Wikimedia Commons, and the irony of the source of this image does not escape me]

Again, trying not to be too technical, here, but at the end of glycolysis, the original glucose molecule has been split in two (in fact, "glycolysis" is Greek for "sugar breaking").  The two halves are called pyruvate, and they're three-carbon compounds.  Before they can be thrown into the Krebs cycle, however, they have to lose one carbon (in the form of carbon dioxide), thus forming acetate, which can be introduced into the first step of Krebs.
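For the biochemically inclined, that carbon-losing step looks roughly like this (in the cell, the two-carbon acetate piece actually enters the Krebs cycle riding on a carrier molecule called coenzyme A):

\text{pyruvate (3 carbons)} + \text{CoA} + \text{NAD}^+ \longrightarrow \text{acetyl-CoA (2 carbons)} + \text{CO}_2 + \text{NADH}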

So what's that carbon-losing step called?  Apparently, "the Swanson conversion."  It's in Wikipedia, not to mention many other websites describing the reactions of respiration.

The problem?  The name "Swanson conversion" was given to the linking step by a high school biology teacher named Swanson when his students asked him why that bit of the reaction didn't have a name, and he said, "hell, I dunno.  Let's call it 'the Swanson conversion.'"  And it stuck...

... especially when one of his students posted it to Wikipedia as the correct name.

When Swanson found out, he at first was annoyed, but after discussing it with his students, allowed it to remain as a test of how quickly the error would be corrected.  And... it never was.  In fact, others who have wondered, as my students did, why this step doesn't have a name stumbled on this and thought, "Cool!  Now I know what to call it!" and posted it on their websites.  And now, this name that started out as an inside joke between a biology teacher and his students has become the semi-official name of the step.

Swanson, for his part, says he uses it as an example of how you can't trust what's online without checking your sources.  The problem is, how do you check the sources on something like this?  Once the aforementioned self-referential merry-go-round has been engaged, it becomes damn near impossible to figure out what's correct.  Especially in a case like this, where the correct answer to "what is the name of ____?" is, "There isn't one."  All too easy to say, "Well, I guess this one must be correct, since it's all over the place."

I realize this is a pretty unusual situation, and I'm not trying to impugn the accuracy of Wikipedia as a whole.  I still use it for looking up simple facts -- after all, I'm from the generation during whose childhood, if you wanted to know what year Henry VIII was crowned King of England and didn't have an encyclopedia at home, you had to get in your car and drive to the library to look it up.  I think Wikipedia, errors and all, is a pretty significant step upward.

However, it does mean that we need to keep our brains engaged when we read stuff on the internet -- and, as always, try to find independent corroboration.  Because otherwise, we'll have people believing that one of the reactions of photosynthesis is called "the Bonnet activation."  And heaven knows, we wouldn't want that.