I'm going to take a short break from Skeptophilia -- my next post will be Thursday, November 18. I'll still be lining up topics during the time I'm away, so keep those suggestions coming!
One of the most misunderstood laws of physics is the Second Law of Thermodynamics.
Honestly, I understand why. It's one of those bits of science that seem simple at first glance, then the more you learn, the weirder it gets. The simplest way to state the Second Law is "systems tend to proceed toward disorder," so on the surface it's so common-sensical that it triggers nothing more than a shrug and, "Well, of course." But a lot of its ramifications are seriously non-intuitive, and a few are downright mindblowing.
The other problem with it is that it exists in multiple formulations that seem to have nothing to do with one another. These include:
- the aforementioned statement that without an energy input, over time, systems become more disordered.
- if you place a warm object and a cool object in contact with each other, energy will flow from the warmer to the cooler; the warmer object will cool off, and the cooler one will heat up, until they reach thermal equilibrium (equal temperatures).
- no machine can run at 100% efficiency (i.e., turning all of its energy input into usable work).
- some processes are irreversible; for example, there's nothing odd about knocking a wine glass off the table and shattering it, but if you were watching and the shards gathered themselves back together and leapt off the floor and back onto the table as an intact wine glass, you might wonder if all you'd been drinking was wine.
The fact that all of these are, at their basis, different ways of stating the same physical law is not obvious.
For me, the easiest way to understand the "why" of the Second Law has to do with a deck of playing cards. Let's say you have a deck in order; each suit arranged from ace to king, and the four suits in the order hearts, spades, diamonds, clubs. How many possible ways are there to arrange the cards in exactly that way?
Duh. Only one, by definition.
Now, let's say you accidentally drop the deck, then pick it up. Unless you flung the deck across the room, chances are, some of the cards will still be in the original order, but some of the orderliness will probably have been lost. Why? Because there's only a single way to arrange the cards in the order you started with, but there are lots of ways to have them mostly out of order. The chances of jumping from the single orderly state to one of the many disorderly states are a near certainty. Then you drop them again (you're having a clumsy day, apparently). Are they more likely to become more disordered or more orderly?
You see where this is going; since at each round, there are way more disorderly states than orderly ones, just by the laws of statistics you're almost certainly going to watch the deck becoming progressively more disordered. Yes, it's possible that you could take a completely randomized deck, toss it in the air, and have it land in ace-through-king, hearts-spades-diamonds-clubs -- but if you're waiting for that to happen by random chance, you're going to have a long wait.
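To make the statistics concrete, here's a quick toy model in Python (my own sketch, not anything rigorous): each "drop" is modeled as a handful of random swaps, and after every drop we count how many cards still sit in their original positions. Watch the order drain away.

```python
import random

def drop(deck, swaps=5):
    """Model one clumsy 'drop' as a handful of random transpositions."""
    deck = deck[:]
    for _ in range(swaps):
        i = random.randrange(len(deck))
        j = random.randrange(len(deck))
        deck[i], deck[j] = deck[j], deck[i]
    return deck

def cards_in_place(deck):
    """Count how many cards remain in their original positions."""
    return sum(1 for pos, card in enumerate(deck) if card == pos)

random.seed(0)
deck = list(range(52))          # 0..51 stands in for the ordered deck
history = [cards_in_place(deck)]
for _ in range(20):             # twenty drops in a row
    deck = drop(deck)
    history.append(cards_in_place(deck))

print(history)                  # starts at 52 and sinks toward the random-deck
                                # expectation of about one card in place
```

Nothing forbids a drop from increasing the count briefly, but the overall trend is one-way: there's one sorted arrangement and roughly 8 × 10^67 others.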
You can, of course, force them back into order by painstakingly rearranging the cards, but that takes an input of energy (in the form of your brain and muscles using up chemical energy to accomplish it). And here's where it gets weird; if you were to measure the decrease in entropy (disorder) in the deck of cards as you rearranged them, it would be outweighed by the increase in entropy of the energy-containing molecules you burned through to do it. The outcome: you can locally and temporarily decrease entropy, but only at the expense of creating more entropy somewhere else. Everything we do makes things more chaotic overall, and any decrease in entropy we see is bought at a greater cost elsewhere. In the end, entropy always wins.
As my long-ago thermodynamics professor told us, "The First Law of Thermodynamics says that you can't win. The Second Law says you can't break even."
Hell of a way to run a casino, that.
[Image is in the Public Domain]
The reason this all comes up is a paper that a friend of mine sent me a link to, which looks at yet another way of characterizing the Second Law; instead of heat transfer or overall orderliness, it considers entropy as a measure of information content. The less information you need to describe a system, the lower its entropy; in the example of the deck of cards, I was able to describe the orderly state in seven words (ace-through-king, hearts-spades-diamonds-clubs). High-entropy states require a lot of information; pick any of the out-of-order arrangements of the deck of cards, and pretty much the only way to describe it is to list each card individually from the top of the deck to the bottom.
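One crude way to play with the description-length idea (my own illustration, not the method from the paper): compare a short "recipe" that generates the ordered deck against the brute-force listing that a shuffled deck requires. This is the spirit of Kolmogorov complexity -- the length of the shortest description of a thing.

```python
import random

# The ordered deck has a complete 15-character recipe:
ordered_description = "list(range(52))"

# A shuffled deck has no shorter recipe than the full listing.
shuffled = random.sample(range(52), 52)
shuffled_description = repr(shuffled)   # always 198 characters for 0..51

print(len(ordered_description), len(shuffled_description))  # 15 vs. 198
```

Fifteen characters versus nearly two hundred, for exactly the same fifty-two cards; the only difference is how much order there is to exploit.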
The current paper has to do with information stored inside machines, and like many formulations of the Second Law, it results in some seriously weird implications. Consider, for example, a simple operation on a calculator -- 2+2, for example. When you press the "equals" sign, and the calculator tells you the answer is four, have you lost information, or gained it?
Most people, myself included, would have guessed that you've gained information; you now know that 2+2=4, if you didn't already know that. In a thermodynamic sense, though, you've lost information. When you get the output (4), you irreversibly erase the input (2+2). Think about going the other way, and it becomes clearer; someone gives you the output (4) and asks you what the input was.
No way to tell. There are, in fact, an infinite number of arithmetic operations that would give you the answer "4". What a calculator does is time-irreversible. "Computing systems are designed specifically to lose information about their past as they evolve," said study co-author David Wolpert, of the Santa Fe Institute.
By reducing the information in the calculator, you're decreasing its entropy (the answer has less information than the input did). And that means that the calculator is increasing entropy more somewhere else -- in this case, it heats up the surrounding air.
And that's one reason why your calculator gets warm when you use it. "There's this deep relationship between physics and information theory," said study co-author Artemy Kolchinsky. "If you erase a bit of information, you have to generate a little bit of heat."
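Kolchinsky's "little bit of heat" has a well-known theoretical floor, the Landauer limit: erasing one bit of information must dissipate at least kT ln 2 joules. Here's the back-of-the-envelope arithmetic (the room-temperature figure is my assumption, not from the study):

```python
import math

k_B = 1.380649e-23              # Boltzmann constant, J/K (exact SI value)
T = 300.0                       # roughly room temperature, in kelvin
E_bit = k_B * T * math.log(2)   # Landauer limit: minimum heat per erased bit

print(f"{E_bit:.3e} J per erased bit")  # about 2.9e-21 J
```

That's an almost absurdly small amount of heat per bit, but a busy processor erases bits by the trillions every second -- and real hardware runs many orders of magnitude above this theoretical minimum, which is why the warmth is noticeable.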
But if everything you do ultimately increases the overall entropy, what does that say about the universe as a whole?
The implication is that the entire universe's entropy was at a minimum at its creation in the Big Bang -- that it started out extremely ordered, with very low information content. Everything that's happened since has stirred things up and made them more chaotic (i.e., requiring more information for a complete description). Eventually, the universe will reach a state of maximal disorder, and after that, it's pretty much game over; you're stuck there for the foreseeable future. This state goes by the cheerful name the "heat death of the universe."
Not to worry, though. It won't happen for a while, and we've got more pressing matters to attend to in the interim.
To end on a positive note, though -- going back to our original discussion of the increase of entropy as stemming from the likelihood of jumping from a disordered state back to an orderly one, recall that the chance isn't zero, it's just really really really small. So once the heat death of the universe has occurred, there is a non-zero chance that it will spontaneously come back together into a second very-low-entropy singularity, at which point the whole thing starts over. Yeah, it's unlikely, but once the universe is in heat death, it's not like it's got much else to do besides wait.
If Monday's post, about the apparent unpredictability of the eruption of the Earth's volcanoes, freaked you out, you should read Robin George Andrews's wonderful new book Super Volcanoes: What They Reveal About the Earth and the Worlds Beyond.
Andrews, a science journalist and trained volcanologist, went all over the world interviewing researchers on the cutting edge of the science of volcanoes -- including those that occur not only here on Earth, but on the Moon, Mars, Venus, and elsewhere. The book is fascinating enough just from the human aspect of the personalities involved in doing primary research, but it also looks at a topic it's hard to imagine anyone not being curious about: the restless nature of geology that has generated such catastrophic events as the Yellowstone supereruptions.
Andrews does a great job not only demystifying what's going on inside volcanoes and faults, but also showing us how little we know (especially in the sections on the Moon and Mars, which have extinct volcanoes scientists have yet to completely explain). Along the way we get the message, "Will all you people just calm down a little?", particularly aimed at the purveyors of hype who have for years made wild claims about the likelihood of an eruption at Yellowstone occurring soon (turns out it's very low) and the chances of a supereruption somewhere causing massive climate change and wiping out humanity (not coincidentally, also very low).
Volcanoes, Andrews says, are awesome, powerful, and fascinating, but if you have a modicum of good sense, nothing to fret about. And his book is a brilliant look at the natural process that created a great deal of the geology of the Earth and our neighbor planets -- plate tectonics. If you are interested in geology or just like a wonderful and engrossing book, you should put Super Volcanoes on your to-read list.
[Note: if you purchase this book using the image/link below, part of the proceeds goes to support Skeptophilia!]