Skeptophilia (skep-to-fil-i-a) (n.) - the love of logical thought, skepticism, and thinking critically. Being an exploration of the applications of skeptical thinking to the world at large, with periodic excursions into linguistics, music, politics, cryptozoology, and why people keep seeing the face of Jesus on grilled cheese sandwiches.

Monday, November 4, 2019

The problem with Hubble

In my Critical Thinking classes, I did a unit on statistics and data, and how you tell if a measurement is worth paying attention to.  One of the first things to consider, I told them, is whether a particular piece of data is accurate or merely precise -- two words that in common parlance are used interchangeably.

In science, they don't mean the same thing.  A piece of equipment is said to be precise if it gives you close to the same value every time.  Accuracy, though, is a higher standard; data are accurate if the values are not only close to each other when measured with the same equipment, but agree with data taken independently, using a different device or a different method.

A simple example: if my bathroom scale tells me every day for a month that my mass is 239 kilograms (to within a kilogram either way), it's highly precise, but very inaccurate.

This is why scientists always look for independent corroboration of their data.  It's not enough to keep getting the same numbers over and over; you've got to be certain those numbers actually reflect reality.
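The distinction is easy to make concrete with a quick calculation (the scale readings below are invented for illustration): precision is how tightly repeated readings cluster, while accuracy is how close their average lands to the true value.

```python
import statistics

# Hypothetical scale readings (kg) over a month; suppose my true mass is 84 kg.
true_mass = 84.0
readings = [238.6, 239.2, 238.9, 239.4, 238.8, 239.1]

spread = statistics.stdev(readings)              # how tightly readings cluster
bias = statistics.mean(readings) - true_mass     # how far they are from reality

print(f"precision (std dev): {spread:.2f} kg")   # under 0.3 kg -> very precise
print(f"bias vs. true mass: {bias:+.1f} kg")     # about +155 kg -> very inaccurate
```

The scale passes the precision test with flying colors and fails the accuracy test spectacularly, which is exactly why repeated agreement with itself proves nothing.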

This all comes up because of some new information about one of the biggest scientific questions known -- the rate of expansion of the entire universe.

[Image is in the Public Domain, courtesy of NASA]

A few months ago, I wrote about some recent experiments that were allowing physicists to home in on the Hubble constant, a quantity that is a measure of how fast everything in the universe is flying apart.  And the news appeared to be good; from a range of between 50 and 500, physicists had been able to narrow down the value of the Hubble constant to between 65.3 and 75.6 kilometers per second per megaparsec.

The problem is, nobody's been able to get closer than that -- and in fact, recent measurements have widened, not narrowed, the gap.

There are two main ways to measure the Hubble constant.  The first is to use information like redshift and Cepheid variables (stars whose pulsation period correlates predictably with their intrinsic brightness, making them a good "standard candle" for determining the distance to other galaxies) to figure out how fast the galaxies we see are receding from each other.  The other is to use the cosmic microwave background radiation -- the leftover radiation from the Big Bang -- to determine the age of the universe, and therefore how fast it's expanding.
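The first method boils down to Hubble's law, v = H0 × d: for a nearby galaxy, recession velocity (from its redshift) divided by distance (from standard candles like Cepheids) gives an estimate of the Hubble constant.  A minimal sketch, using made-up numbers for a single hypothetical galaxy:

```python
# Hubble's law: v = H0 * d.  The redshift and distance below are
# illustrative values for one hypothetical galaxy, not real survey data.

C = 299_792.458  # speed of light, km/s

def hubble_constant(redshift: float, distance_mpc: float) -> float:
    """Estimate H0 in km/s/Mpc for a nearby galaxy (valid only for z << 1)."""
    velocity = redshift * C       # low-redshift approximation: v ~ c * z
    return velocity / distance_mpc

print(hubble_constant(0.0168, 72.0))  # roughly 70 km/s/Mpc
```

In practice the hard part is the distance: the redshift is easy to measure, but the Cepheid "distance ladder" is where most of the uncertainty lives.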

So this is a little like checking my bathroom scale by weighing myself on it, then weighing myself again on the scale at the gym and seeing if I get the same answer.

And the problem is, the measurement of the Hubble constant by these two methods is increasingly looking like it's resulting in two irreconcilably different values.

The genesis of the problem is that our measurement ability has become more and more precise -- the error bars associated with data collection have shrunk considerably.  And if the two measurements were not only precise, but also accurate, you would expect that our increasing precision would result in the two values getting closer and closer together.

Exactly the opposite has happened.

"Five years ago, no one in cosmology was really worried about the question of how fast the universe was expanding.  We took it for granted," said astrophysicist Daniel Mortlock of Imperial College London.  "Now we are having to do a great deal of head scratching – and a lot of research...  Everyone’s best bet was that the difference between the two estimates was just down to chance, and that the two values would converge as more and more measurements were taken.  In fact, the opposite has occurred.  The discrepancy has become stronger.  The estimate of the Hubble constant that had the lower value has got a bit lower over the years and the one that was a bit higher has got even greater."
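How strong has the discrepancy become?  The usual yardstick is how many combined standard deviations apart the two results sit.  Using roughly the figures being reported around this time (CMB method: 67.4 ± 0.5; Cepheid/supernova distance ladder: 74.0 ± 1.4, all in km/s/Mpc):

```python
import math

def tension_sigma(v1: float, e1: float, v2: float, e2: float) -> float:
    """How many combined standard deviations apart two measurements are."""
    return abs(v1 - v2) / math.sqrt(e1**2 + e2**2)

# CMB-based vs. distance-ladder-based H0, approximate published values:
print(f"{tension_sigma(67.4, 0.5, 74.0, 1.4):.1f} sigma")  # about 4.4 sigma
```

A chance fluctuation at that level is extremely unlikely, which is why "it's just statistics" has stopped being a comfortable explanation.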

The discovery of dark matter and dark energy, the first by Vera Rubin, Kent Ford, and Ken Freeman in the 1970s, and the second by Adam Riess and Saul Perlmutter in the 1990s, accounted for the fact that the rate of expansion seemed wildly out of whack with the amount of observable matter in the universe.  The problem is, since the discovery of the effects of dark matter and dark energy, we haven't gotten any closer to finding out what they actually are.  Every attempt to detect either one directly has failed.

Now, it appears that the problems run even deeper than that.

"Those two discoveries [dark matter and dark energy] were remarkable enough," said Riess.  "But now we are facing the fact there may be a third phenomenon that we had overlooked – though we haven’t really got a clue yet what it might be."

"The basic problem is that having two different figures for the Hubble constant measured from different perspectives would simply invalidate the cosmological model we made of the universe," Mortlock said.  "So we wouldn’t be able to say what the age of the universe was until we had put our physics right."
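To see why the age of the universe hangs on this number: the crudest age estimate is the "Hubble time," 1/H0, which ignores how the expansion rate has changed over cosmic history but shows how the inferred age shifts with the value of H0 you trust.

```python
# 1/H0, converted into billions of years.  This is a back-of-the-envelope
# estimate, not the full cosmological calculation.

KM_PER_MPC = 3.0857e19   # kilometers in a megaparsec
SECONDS_PER_GYR = 3.156e16  # seconds in a billion years

def hubble_time_gyr(h0_km_s_mpc: float) -> float:
    """The Hubble time 1/H0, in billions of years."""
    h0_per_sec = h0_km_s_mpc / KM_PER_MPC  # convert H0 to units of 1/s
    return 1.0 / h0_per_sec / SECONDS_PER_GYR

# The two competing H0 values give noticeably different ages:
print(f"{hubble_time_gyr(67.4):.1f} Gyr vs. {hubble_time_gyr(74.0):.1f} Gyr")
```

A difference of a billion years or so in the age of the universe, depending on which measurement you believe, is not a rounding error.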

It sounds to me a lot like the situation in the late 1800s, when physicists were trying to determine the answer to a seemingly simple question -- in what medium do light waves propagate?  Every wave, it was thought, has to be moving through something; water waves come from regular motion of water molecules, sound waves from oscillation of air molecules, and so on.  With light waves, what was "waving?"

Because the answer most people accepted was, "something has to be waving even if we don't know what it is," scientists proposed a mysterious substance called the "aether" that permeated all of space, and was the medium through which light waves were propagating.  All attempts to directly detect the aether were failures, but this didn't discourage people from saying that it must be there, because otherwise, how would light move?

Then along came the brilliant (and quite simple -- in principle, anyhow) Michelson-Morley experiment, which found no trace of the aether.  Light traveling in a vacuum appeared to have the same speed in all frames of reference, which is entirely unlike any other wave ever studied.  And it wasn't until Einstein came along and turned our entire understanding upside down with the Special Theory of Relativity that we saw the piece we'd been missing that made sense of all the weird data.

What we seem to be waiting for is this century's Einstein, who will explain the discrepancies in the measurements of the Hubble constant, and very likely account for the mysterious, undetectable dark matter and dark energy (which sound a lot like the aether, don't they?) at the same time.  But until then, we're left with a mystery that calls into question one of the most fundamental conclusions of modern physics -- the age of the universe.


This week's Skeptophilia book recommendation is a fun book about math.

Bet that's a phrase you've hardly ever heard uttered.

Jordan Ellenberg's amazing How Not to Be Wrong: The Power of Mathematical Thinking looks at how critical it is for people to have a basic understanding and appreciation for math -- and how misunderstandings can lead to profound errors in decision-making.  Ellenberg takes us on a fantastic trip through dozens of disparate realms -- baseball, crime and punishment, politics, psychology, artificial languages, and social media, to name a few -- and how in each, a comprehension of math leads you to a deeper understanding of the world.

As he puts it: math is "an atomic-powered prosthesis that you attach to your common sense, vastly multiplying its reach and strength."  Which is certainly something that is drastically needed lately.

[Note: if you purchase this book using the image/link below, part of the proceeds goes to support Skeptophilia!]
