Skeptophilia (skep-to-fil-i-a) (n.) - the love of logical thought, skepticism, and thinking critically. Being an exploration of the applications of skeptical thinking to the world at large, with periodic excursions into linguistics, music, politics, cryptozoology, and why people keep seeing the face of Jesus on grilled cheese sandwiches.
Showing posts with label thermodynamics. Show all posts

Friday, February 2, 2024

Going against the flow

Two of the most extensively-tested laws of physics are the First and Second Laws of Thermodynamics -- and in the nearly two centuries since they were first formulated, there has not been a single exception found.

The First Law is the less shocking one.  It's sometimes called the Law of Conservation of Matter and Energy, and says simply that in a closed system, the total amount of matter and energy does not change.  You can turn one into the other, or change its form, but the total quantity doesn't vary.  Unsurprising, and in fact can seem a little circular given that this is how a closed system is defined in the first place.

The Second Law is where things get interesting.  It can be formulated a variety of ways, but the simplest is that in a closed system, the amount of entropy (disorder) always increases.  If entropy is being decreased somewhere (the system is becoming more orderly) it always requires (1) an input of energy, and (2) that somewhere else entropy is increasing, and that increase is larger than the localized decrease.  An example is the human body.  When you develop from a single fertilized egg cell to an adult, your overall entropy decreases significantly.  But in the process, you are taking the food molecules you eat and (1) extracting their energy, and (2) increasing their entropy monumentally by chopping them up into little pieces and strewing the pieces about.  So you're able to locally decrease your own entropy, but you leave behind a trail of chaos wherever you go.

Or, as my thermodynamics professor in college put it, a lot of years ago: the First Law says you can't win; the Second Law says you can't break even.  Explaining why the United States Patent Office's official policy is that any application that claims to have a working model of a perpetual motion machine goes directly into the trash without being read any further.

The Carnot Heat Engine [Image is in the Public Domain]

All of this is by way of background for a paper that I ran across in Science, called "Heat Flowing From Cold to Hot Without External Intervention by Using a 'Thermal Inductor'," by Andreas Schilling, Xiaofu Zhang, and Olaf Bossen of the University of Zürich.  Because in this paper, the three physicists have demonstrated the passage of heat energy from a colder object to a warmer one, without any external energy input -- something first shown as impossible by French physicist Sadi Carnot in 1824.

The authors write:
The cooling of boiling water all the way down to freezing, by thermally connecting it to a thermal bath held at ambient temperature without external intervention, would be quite unexpected.  We describe the equivalent of a “thermal inductor,” composed of a Peltier element and an electric inductance, which can drive the temperature difference between two bodies to change sign by imposing inertia on the heat flowing between them, and enable continuing heat transfer from the chilling body to its warmer counterpart without the need of an external driving force.
When I read this, I sat up, squinted at my computer screen, and uttered an expression of surprise that I will leave to your imagination.  In my AP Biology class, I always described the Laws of Thermodynamics as two of the most unshakeable laws of science -- two rules that are never, ever broken.  The idea that three scientists in Switzerland had taken a simple Peltier element -- a type of heat pump often found in refrigerators -- and made it run without expending any energy was earthshattering.

But before you dust off your plans for a perpetual motion machine, read the next lines in the paper:
We demonstrate its operation in an experiment and show that the process can pass through a series of quasi-equilibrium states while fully complying with the second law of thermodynamics.  This thermal inductor extends the analogy between electrical and thermal circuits and could serve, with further progress in thermoelectric materials, to cool hot materials well below ambient temperature without external energy supplies or moving parts.
I'm not going to claim I fully understand how this all works, and how despite the system's bizarre behavior it still obeys the Second Law, but apparently the key point is that despite the heat energy flowing the "wrong way," the system still gains entropy overall.

Which, I must say, was a bit of a relief.

It's still a pretty fantastic discovery.  "With this very simple technology, large amounts of hot solid, liquid or gaseous materials could be cooled to well below room temperature without any energy consumption," study co-author Andreas Schilling said, in a press release from Phys.org.  "Theoretically, this experimental device could turn boiling water to ice, without using any energy."

So don't believe any of the hype that I'm already seeing on dubiously-accurate websites, to the effect that "An Exception Has Been Discovered to the Laws of Thermodynamics!  Physicists Dismayed!  Textbooks Will Have to be Rewritten!"  It's a curiosity, sure, and pretty cool, and sounds like it will have a good many applications, but you shouldn't discount everything you learned in physics class quite yet.

****************************************



Saturday, November 4, 2023

Cold snap

After the warmest fall I can remember, we here in upstate New York finally are seeing cooler weather.  I greet this with mixed feelings.  As I've pointed out here many times, the abnormally warm temperatures we've had in the last few years are not good news.  On the other hand, being a transplanted southerner, I can't say I'm fond of the cold, even after forty years of living in higher latitudes.

Our chilly winters, though, are nothing compared to a lot of other places.  My Canadian friends, even the ones who live in the southern parts of that vast country, see cold temperatures the likes of which I've never had to deal with.  The Rocky Mountain region, from Colorado up into Alberta, drops down to dangerous lows, often coupled with howling winds and snow.  Scandinavia, Siberia, Greenland... there are a lot of places on Earth where the cold season is actively trying to kill you.  The lowest temperature ever recorded on the surface of the Earth was -89.2 C, at Vostok Station, Antarctica, cold enough to freeze carbon dioxide into dry ice.

Makes our current 2 C seem like a gentle spring zephyr.

But I wonder if you've ever considered how much colder it can get?

Temperature is a measure of the average molecular motion of a substance.  It is connected to, but not the same as, heat energy.  To prove that to yourself, put a pot of water on the stove and bring it to a boil, set your oven to 212 F (100 C), and then decide which one would be less fun to stick your hand into.  The water and the air in the oven are exactly the same temperature -- i.e., the molecules are moving at the same average speed -- but the water has a great deal more heat energy, because water molecules are so much harder to get moving than air molecules are.
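If you like numbers better than burned hands, here's a back-of-the-envelope sketch in Python of that same comparison.  The masses are illustrative assumptions (a typical pot of water, roughly the air in a home oven), not measurements:

```python
# Heat content above room temperature (20 C): Q = m * c * dT,
# comparing a pot of boiling water to the air in a 100 C oven.

c_water = 4186   # specific heat of liquid water, J/(kg*K)
c_air   = 1005   # specific heat of air, J/(kg*K)

m_water = 2.0    # kg -- a typical pot of water (assumed)
m_air   = 0.1    # kg -- roughly the air in a home oven (assumed)

dT = 100 - 20    # both are 80 K above room temperature

q_water = m_water * c_water * dT
q_air   = m_air * c_air * dT

print(f"Water: {q_water / 1000:.0f} kJ")  # ~670 kJ
print(f"Air:   {q_air / 1000:.0f} kJ")    # ~8 kJ
```

Same temperature, but the water is holding something like eighty times the heat energy of the oven's air -- which is why one of them would cook your hand a great deal faster.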

So logically, there's a minimum temperature: absolute zero, where all molecular motion stops.  This would occur at -273.15 C (0 on the Kelvin scale), but practically speaking, it's impossible to get there.  Even if you could somehow extract all the heat energy from a substance, there's still the zero-point kinetic energy of the atoms' ground states, which can't be removed.  Still, scientists have gotten pretty damn close.  The CUORE laboratory in Italy set a record in 2014, reaching a temperature of 0.006 K, but recently that's been broken on extremely small scales -- two years ago scientists working with an exotic form of matter called a Bose-Einstein condensate got it down to 38 picokelvin.  That's 0.000000000038 degrees above absolute zero.

But that, of course, is all done in a lab setting.  What's the lowest naturally-occurring temperature ever measured?

You might think it's somewhere in deep space, but it's not.  The temperature in deep space varies all over the place; recall that what matters is the average velocity of the atoms in an area, not how much heat energy the region contains.  (The solar corona, for example, can reach temperatures of a million K, which is way higher than the Sun's surface -- there aren't many atoms out there, but the ones there are move like a bat out of hell.)

The coldest known place in the universe, outside of labs down here on Earth, is the Boomerang Nebula, a planetary nebula in the constellation of Centaurus, which has measured temperatures of around 1 K.  The reason why is weird and fascinating.

The Boomerang Nebula [Image is in the Public Domain courtesy of NASA/JPL]

A planetary nebula forms when a red giant star runs out of fuel, and the collapse of the core raises its temperature to a ridiculously high one million kelvins.  This sudden flare-up blows away the outer atmosphere of the star, dissipating it out into space, and leaves the exposed core as a white-hot white dwarf, which will then slowly cool over billions of years.

So how could a flare-up of something that hot trigger temperatures that cold?  What's amazing is that it's the same process that heated up the core, but in reverse -- adiabatic heating and cooling.

Way back in the 1780s, French scientist Jacques Charles discovered that the volume of a gas and its temperature are linked -- at constant pressure, the two are directly proportional, something we now call Charles's Law in his honor.  The adiabatic version of that link is what matters here: when you compress a gas quickly (reduce its volume), with no time for heat to leak out, it heats up, and when you allow a gas to expand rapidly (increase its volume), it cools.  If you've ever noticed that a bicycle pump heats up when you inflate your tire, you've seen adiabatic heating in action.

This all happens because upon compression, the mechanical work of reducing the volume adds kinetic energy to the gas (increasing its temperature); when a gas expands, the opposite occurs, and the temperature falls.  This is how compressors in air conditioners and refrigerators work -- the compression of the coolant gas increases its temperature, and the warmed gas is passed through coils where the heat dissipates.  Then it's allowed to expand suddenly, reducing its temperature enough to cool the interior of a freezer compartment to below zero C.
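For the curious, the adiabatic relationship for an ideal gas can be played with directly: the quantity T·V^(γ-1) stays constant, so the final temperature follows from the volume ratio alone.  A minimal sketch in Python, with illustrative numbers (not the actual values for any particular pump, fridge, or nebula):

```python
# Adiabatic temperature change for an ideal gas:
# T1 * V1^(gamma - 1) = T2 * V2^(gamma - 1)

gamma = 1.4   # heat-capacity ratio for a diatomic gas like air

def adiabatic_T(T1, V1, V2):
    """Final temperature (K) after an adiabatic volume change from V1 to V2."""
    return T1 * (V1 / V2) ** (gamma - 1)

# Compress room-temperature air to half its volume: it warms.
print(adiabatic_T(293.0, 1.0, 0.5))   # ~387 K -- the bike-pump effect

# Let it expand to ten times its volume: it cools dramatically.
print(adiabatic_T(293.0, 1.0, 10.0))  # ~117 K
```

Scale that second calculation up to a stellar atmosphere expanding by many orders of magnitude, and you can see how a nebula blown off a million-kelvin core can end up colder than anything else in the sky.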

This is what's happening in the Boomerang Nebula, but on a much larger scale.  The outer atmosphere of the star is expanding so fast its temperature has dropped to just one degree above absolute zero -- making this peculiar nebula five thousand light years away the coldest spot in the known universe.

So that's our tour of places you wouldn't want to vacation.  Top of the list: the Boomerang Nebula.  Might be pretty to look at, but from a long way away, and preferably while warmly dressed.

****************************************



Wednesday, November 10, 2021

Can't win, can't break even

Dear readers,

I'm going to take a short break from Skeptophilia -- my next post will be Thursday, November 18.  I'll still be lining up topics during the time I'm away, so keep those suggestions coming!

cheers,

Gordon

**********************************

One of the most misunderstood laws of physics is the Second Law of Thermodynamics.

Honestly, I understand why.  It's one of those bits of science that seem simple on first glance, then the more you learn, the weirder it gets.  The simplest way to state the Second Law is "systems tend to proceed toward disorder," so on the surface it's so common-sensical that it triggers nothing more than a shrug and, "Well, of course."  But a lot of its ramifications are seriously non-intuitive, and a few are downright mindblowing.

The other problem with it is that it exists in multiple formulations that seem to have nothing to do with one another.  These include:
  • the aforementioned statement that without an energy input, over time, systems become more disordered.
  • if you place a warm object and cool object in contact with each other, energy will flow from the warmer to the cooler; the warmer object will cool off, and the cooler one will heat up, until they reach thermal equilibrium (equal temperatures).
  • no machine can run at 100% efficiency (i.e., turning all of its energy input into usable work).
  • some processes are irreversible; for example, there's nothing odd about knocking a wine glass off the table and shattering it, but if you were watching and the shards gathered themselves back together and leapt off the floor and back onto the table as an intact wine glass, you might wonder if all you'd been drinking was wine.
The fact that all of these are, at their basis, different ways of stating the same physical law is not obvious.

For me, the easiest way to understand the "why" of the Second Law has to do with a deck of playing cards.  Let's say you have a deck in order; each suit arranged from ace to king, and the four suits in the order hearts, spades, diamonds, clubs.  How many possible ways are there to arrange the cards in exactly that way?

Duh.  Only one, by definition.

Now, let's say you accidentally drop the deck, then pick it up.  Unless you flung the deck across the room, chances are, there will still be some of the cards in the original order, but some of the orderliness will probably have been lost.  Why?  Because there's only a single way to arrange the cards in the order you started with, but there are lots of ways to have them mostly out of order.  The chance of jumping from the single orderly state to one of the many disorderly states is a near certainty.  Then you drop them again (you're having a clumsy day, apparently).  Are they more likely to become more disordered or more orderly?

You see where this is going; since at each round, there are way more disorderly states than orderly ones, just by the laws of statistics you're almost certainly going to watch the deck become progressively more disordered.  Yes, it's possible that you could take a completely random deck, toss the cards in the air, and have them fall into ace-through-king, hearts-spades-diamonds-clubs -- but if you're waiting for that to happen by random chance, you're going to have a long wait.
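If you want a sense of just how long a wait, the counting argument takes only a few lines of Python:

```python
import math

# A 52-card deck has exactly one "ordered" arrangement,
# but 52! possible arrangements in all.
n_arrangements = math.factorial(52)

print(n_arrangements)      # ~8.07 x 10^67 arrangements
print(1 / n_arrangements)  # the odds of shuffling into perfect order
```

That's roughly 8 followed by 67 zeroes -- vastly more shuffles than there have been seconds since the Big Bang.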

You can, of course, force them back into order by painstakingly rearranging the cards, but that takes an input of energy (in the form of your brain and muscles using up chemical energy to accomplish it).  And here's where it gets weird; if you were to measure the decrease in entropy (disorder) in the deck of cards as you rearranged them, it would be outweighed by the increase in entropy of the energy-containing molecules you burned through to do it.  The outcome: you can locally and temporarily decrease entropy, but only at the expense of creating more entropy somewhere else.  Everything we do makes things more chaotic, and any decrease in entropy we see is illusory.  In the end, entropy always wins.

As my long-ago thermodynamics professor told us, "The First Law of Thermodynamics says that you can't win.  The Second Law says you can't break even."

Hell of a way to run a casino, that.

[Image is in the Public Domain]

The reason this all comes up is a paper that a friend of mine sent me a link to, which looks at yet another way of characterizing the Second Law; instead of heat transfer or overall orderliness, it considers entropy as a measure of information content.  The less information you need to describe a system, the lower its entropy; in the example of the deck of cards, I was able to describe the orderly state in seven words (ace-through-king, hearts-spades-diamonds-clubs).  High-entropy states require a lot of information; pick any of the out-of-order arrangements of the deck of cards, and pretty much the only way to describe it is to list each card individually from the top of the deck to the bottom.
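The description-length idea can be made concrete.  Since an arbitrary arrangement of the deck is one of 52! equally likely states, specifying it takes log2(52!) bits of information; a Python one-liner does the bookkeeping:

```python
import math

# Bits needed to pin down one particular arrangement of a 52-card deck:
# log2 of the number of equally likely arrangements.
bits = math.log2(math.factorial(52))

print(f"{bits:.1f} bits")   # ~225.6 bits, about 29 bytes
```

So "seven words" versus 225-odd bits -- the gap between describing the one orderly state and describing a generic disorderly one is the entropy difference, in information-theoretic dress.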

The current paper has to do with information stored inside machines, and like many formulations of the Second Law, it results in some seriously weird implications.  Consider, for example, a simple operation on a calculator -- 2+2, for example.  When you press the "equals" sign, and the calculator tells you the answer is four, have you lost information, or gained it?

Most people, myself included, would have guessed that you've gained information; you now know that 2+2=4, if you didn't already know that.  In a thermodynamic sense, though, you've lost information.  When you get the output (4), you irreversibly erase the input (2+2).  Think about going the other way, and it becomes clearer; someone gives you the output (4) and asks you what the input was.

No way to tell.  There are, in fact, an infinite number of arithmetic operations that would give you the answer "4".  What a calculator does is time-irreversible.  "Computing systems are designed specifically to lose information about their past as they evolve," said study co-author David Wolpert, of the Santa Fe Institute.

By reducing the information in the calculator, you're decreasing its entropy (the answer has less information than the input did).  And that means that the calculator is increasing entropy more somewhere else -- in this case, it heats up the surrounding air.

And that's one reason why your calculator gets warm when you use it.  "There's this deep relationship between physics and information theory," said study co-author Artemy Kolchinsky.  "If you erase a bit of information, you have to generate a little bit of heat."
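That "little bit of heat" has a hard lower bound, known as Landauer's limit: erasing one bit of information dissipates at least k_B·T·ln 2 joules.  At room temperature the number is minuscule, as a quick calculation shows:

```python
import math

k_B = 1.380649e-23   # Boltzmann constant, J/K
T = 300.0            # roughly room temperature, K

# Minimum heat dissipated per erased bit of information (Landauer's limit).
e_per_bit = k_B * T * math.log(2)

print(e_per_bit)     # ~2.87e-21 joules per bit
```

Real calculators run many orders of magnitude above this floor, of course -- but the floor itself can never be beaten, which is the "deep relationship" Kolchinsky is pointing at.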

But if everything you do ultimately increases the overall entropy, what does that say about the universe as a whole?

The implication is that the entire universe's entropy was at a minimum at its creation in the Big Bang -- that it started out extremely ordered, with very low information content.  Everything that's happened since has stirred things up and made them more chaotic (i.e., requiring more information for a complete description).  Eventually, the universe will reach a state of maximal disorder, and after that, it's pretty much game over; you're stuck there for the foreseeable future.  This state goes by the cheerful name the "heat death of the universe."

Not to worry, though.  It won't happen for a while, and we've got more pressing matters to attend to in the interim.

To end on a positive note, though -- going back to our original discussion of the increase of entropy as stemming from the likelihood of jumping from a disordered state back to an orderly one, recall that the chance isn't zero, it's just really really really small.  So once the heat death of the universe has occurred, there is a non-zero chance that it will spontaneously come back together into a second very-low-entropy singularity, at which point the whole thing starts over.  Yeah, it's unlikely, but once the universe is in heat death, it's not like it's got much else to do besides wait.

*********************************************

If Monday's post, about the apparent unpredictability of the eruption of the Earth's volcanoes, freaked you out, you should read Robin George Andrews's wonderful new book Super Volcanoes: What They Reveal About the Earth and the Worlds Beyond.

Andrews, a science journalist and trained volcanologist, went all over the world interviewing researchers on the cutting edge of the science of volcanoes -- including those that occur not only here on Earth, but on the Moon, Mars, Venus, and elsewhere.  The book is fascinating enough just from the human aspect of the personalities involved in doing primary research, but looks at a topic it's hard to imagine anyone not being curious about; the restless nature of geology that has generated such catastrophic events as the Yellowstone Supereruptions.

Andrews does a great job not only demystifying what's going on inside volcanoes and faults, but informing us how little we know (especially in the sections on the Moon and Mars, which have extinct volcanoes scientists have yet to completely explain).  Along the way we get the message, "Will all you people just calm down a little?", particularly aimed at the purveyors of hype who have for years made wild claims about the likelihood of an eruption at Yellowstone occurring soon (turns out it's very low) and the chances of a supereruption somewhere causing massive climate change and wiping out humanity (not coincidentally, also very low).

Volcanoes, Andrews says, are awesome, powerful, and fascinating, but if you have a modicum of good sense, nothing to fret about.  And his book is a brilliant look at the natural process that created a great deal of the geology of the Earth and our neighbor planets -- plate tectonics.  If you are interested in geology or just like a wonderful and engrossing book, you should put Super Volcanoes on your to-read list.

[Note: if you purchase this book using the image/link below, part of the proceeds goes to support Skeptophilia!]


Thursday, April 29, 2021

Watching the clock

If I had to pick the scientific law that is the most misunderstood by the general public, it would have to be the Second Law of Thermodynamics.

The First Law of Thermodynamics says that the total quantity of energy and mass in a closed system never changes; it's sometimes stated as, "Mass and energy cannot be destroyed, only transformed."  The Second Law states that in a closed system, the total disorder (entropy) always increases.  As my long-ago thermodynamics professor put it, "The First Law says you can't win; the Second Law says you can't break even."

Hell of a way to run a casino, that.

So far, there doesn't seem to be anything particularly non-intuitive about this.  Even from our day-to-day experience, we can surmise that the amount of stuff seems to remain pretty constant, and that if you leave something without maintenance, it tends to break down sooner or later.  But the interesting (and less obvious) side starts to appear when you ask the question, "If the Second Law says that systems tend toward disorder, how can a system become more orderly?  I can fling a deck of cards and make them more disordered, but if I want I can pick them up and re-order them.  Doesn't that break the Second Law?"

It doesn't, of course, but the reason why is quite subtle, and has some pretty devastating implications.  The solution to the question comes from asking how you accomplish re-ordering a deck of cards.  Well, you use your sensory organs and brain to figure out the correct order, and the muscles in your arms and hands (and legs, depending upon how far you flung them in the first place) to put them back in the correct order.  How did you do all that?  By using energy from your food to power the organs in your body.  And to get the energy out of those food molecules -- especially glucose, our primary fuel -- you broke them to bits and jettisoned the pieces after you were done with them.  (When you break down glucose to extract the energy, a process called cellular respiration, the bits left are carbon dioxide and water.  So the carbon dioxide you exhale is actually broken-down sugar.)
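The conservation bookkeeping for that reaction checks out exactly -- the mass of glucose plus oxygen going in equals the mass of carbon dioxide plus water coming out.  A quick sanity check in Python, using rounded atomic masses:

```python
# Cellular respiration: C6H12O6 + 6 O2 -> 6 CO2 + 6 H2O
# Atomic masses in g/mol (rounded).
C, H, O = 12.011, 1.008, 15.999

glucose = 6*C + 12*H + 6*O   # ~180.16 g/mol
oxygen  = 6 * (2*O)          # six O2 molecules
co2     = 6 * (C + 2*O)      # six CO2 molecules
water   = 6 * (2*H + O)      # six H2O molecules

# Mass in equals mass out: nothing is created or destroyed,
# only rearranged into smaller, higher-entropy pieces.
print(glucose + oxygen)   # ~372.14 g/mol of reactants
print(co2 + water)        # ~372.14 g/mol of products
```

Nothing is lost; the glucose is just chopped into bits whose entropy is much higher -- which is exactly the trail of chaos the Second Law demands.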

Here's the kicker.  If you were to measure the entropy decrease in the deck of cards, it would be less -- way less -- than the entropy increase in the molecules you chopped up to get the energy to put the cards back in order.  Every time you increase the orderliness of a system, it always (1) requires an input of energy, and (2) increases the disorderliness somewhere else.  We are, in fact, little chaos machines, leaving behind a trail of entropy everywhere we go, and the more we try to fix things, the worse the situation gets.

I've heard people arguing that the Second Law disproves evolution because the evolutionary model claims we're in a system that has become more complex over time, which according to the Second Law is impossible.  It's not; and in fact, that statement betrays a fundamental lack of understanding of what the Second Law means.  The only reason why any increase in order occurs -- be it evolution, or embryonic development, or stacking a deck of cards -- is because there's a constant input of energy, and the decrease in entropy is offset by a bigger increase somewhere else.  The Earth's ecosystems have become more complex in the 4.5 billion year history of life because there's been a continuous influx of energy from the Sun.  If that influx were to stop, things would break down.

Fast.

The reason all this comes up is because of a paper this week in Physical Review X that gives another example of trying to make things better, and making them worse in the process.  This one has to do with the accuracy of clocks -- a huge deal to scientists who are studying the rate of reactions, where the time needs to be measured to phenomenal precision, on the scale of nanoseconds or better.  The problem is, we learn from "Measuring the Thermodynamic Cost of Timekeeping," the more accurate the clock is, the higher the entropy produced by its workings.  So, in effect, you can only measure time in a system to the extent you're willing to screw the system up.

[Image licensed under the Creative Commons Robbert van der Steeg, Eternal clock, CC BY-SA 2.0]

The authors write:

All clocks, in some form or another, use the evolution of nature towards higher entropy states to quantify the passage of time.  Due to the statistical nature of the second law and corresponding entropy flows, fluctuations fundamentally limit the performance of any clock.  This suggests a deep relation between the increase in entropy and the quality of clock ticks...  We show theoretically that the maximum possible accuracy for this classical clock is proportional to the entropy created per tick, similar to the known limit for a weakly coupled quantum clock but with a different proportionality constant.  We measure both the accuracy and the entropy.  Once non-thermal noise is accounted for, we find that there is a linear relation between accuracy and entropy and that the clock operates within an order of magnitude of the theoretical bound.

Study co-author Natalia Ares, of the University of Oxford, summarized their findings succinctly in an article in Science News: "If you want a better clock," she said, "you have to pay for it."

So, a little like the Heisenberg Uncertainty Principle, the more you try to push things in a positive direction, the more the universe pushes back in the negative direction.

Apparently, even if all you want to know is what time it is, you still can't break even.

So that's our somewhat depressing science for the day.  Entropy always wins, no matter what you do.  Maybe I can use this as an excuse for not doing housework.  Hey, if I make things more orderly here, all it does is mess things up elsewhere, so what's the point?

Nah, never mind.  My wife'll never buy it.

****************************************

When people think of mass extinctions, the one that usually comes to mind first is the Cretaceous-Tertiary Extinction of 66 million years ago, the one that wiped out all the non-avian dinosaurs and a good many species of other types.  It certainly was massive -- current estimates are that it killed between fifty and sixty percent of the species alive at the time -- but it was far from the biggest.

The largest mass extinction ever took place 251 million years ago, and it destroyed over ninety percent of life on Earth, taking out whole taxa and changing the direction of evolution permanently.  But what could cause a disaster on this scale?

In When Life Nearly Died: The Greatest Mass Extinction of All Time, University of Bristol paleontologist Michael Benton describes an event so catastrophic that it beggars the imagination.  Following researchers to outcrops of rock from the time of the extinction, he looks at what was lost -- trilobites, horn corals, sea scorpions, and blastoids (a starfish relative) vanished completely, but no group was without losses.  Even terrestrial vertebrates, who made it through the bottleneck and proceeded to kind of take over, had losses on the order of seventy percent.

He goes through the possible causes for the extinction, along with the evidence for each, along the way painting a terrifying picture of a world that very nearly became uninhabited.  It's a grim but fascinating story, and Benton's expertise and clarity of writing makes it a brilliant read.

[Note: if you purchase this book using the image/link below, part of the proceeds goes to support Skeptophilia!]


Monday, April 22, 2019

Going against the flow

Two of the most extensively-tested laws of physics are the First and Second Laws of Thermodynamics -- and in the nearly two centuries since they were first formulated, there has not been a single exception found.

The First Law is the less shocking one.  It's sometimes called the Law of Conservation of Matter and Energy, and says simply that in a closed system, the total amount of matter and energy does not change.  You can turn one into the other, or change its form, but the total quantity doesn't vary.  Unsurprising, and in fact can seem a little circular given that this is how a closed system is defined in the first place.

The Second Law is where things get interesting.  It can be formulated a variety of ways, but the simplest is that in a closed system, the amount of entropy (disorder) always increases.  If entropy is being decreased somewhere (the system is becoming more orderly) it always requires (1) an input of energy, and (2) that somewhere else entropy is increasing, and that increase is larger than the localized decrease.  An example is the human body.  When you go from a single fertilized egg cell to an adult, your overall entropy decreases significantly.  But in the process, you are taking the food molecules you eat and (1) extracting their energy, and (2) increasing their entropy monumentally by chopping them up into little pieces and strewing the pieces about.  So you're able to locally decrease your own entropy, but you leave behind a trail of chaos wherever you go.

Or, as my thermodynamics professor in college put it, a lot of years ago: the First Law says you can't win; the Second Law says you can't break even.  Explaining why the United States Patent Office's official policy is that any application that claims to have a working model of a perpetual motion machine goes directly into the trash without being read any further.

The Carnot Heat Engine [Image is in the Public Domain]

All of this is by way of background for a paper that appeared last week in Science, called "Heat Flowing From Cold to Hot Without External Intervention by Using a 'Thermal Inductor'," by Andreas Schilling, Xiaofu Zhang, and Olaf Bossen of the University of Zurich.  Because in this paper, the three physicists have demonstrated the passage of heat energy from a colder object to a warmer one, without any external energy input -- something first shown as impossible by French physicist Sadi Carnot in 1824.

The authors write:
The cooling of boiling water all the way down to freezing, by thermally connecting it to a thermal bath held at ambient temperature without external intervention, would be quite unexpected.  We describe the equivalent of a “thermal inductor,” composed of a Peltier element and an electric inductance, which can drive the temperature difference between two bodies to change sign by imposing inertia on the heat flowing between them, and enable continuing heat transfer from the chilling body to its warmer counterpart without the need of an external driving force.
When I read this, I sat up, squinted at my computer screen, and uttered an expression of surprise that I will leave to your imagination.  In my AP Biology class, I always described the Laws of Thermodynamics as two of the most unshakeable laws of science -- two rules that are never, ever broken.  The idea that three scientists in Switzerland had taken a simple Peltier element -- a type of heat pump often found in refrigerators -- and made it run without expending any energy was earthshattering.

But before you dust off your plans for a perpetual motion machine, read the next lines in the paper:
We demonstrate its operation in an experiment and show that the process can pass through a series of quasi-equilibrium states while fully complying with the second law of thermodynamics.  This thermal inductor extends the analogy between electrical and thermal circuits and could serve, with further progress in thermoelectric materials, to cool hot materials well below ambient temperature without external energy supplies or moving parts.
I'm not going to claim I fully understand how this all works, and how despite the system's bizarre behavior it still obeys the Second Law, but apparently the key point is that despite the heat energy flowing the "wrong way," the system still gains entropy overall.
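For what it's worth, the "thermal inductor" idea can be captured in a toy model.  Here's a minimal sketch -- all parameter values invented, and the physics drastically simplified: the temperature difference between the two bodies generates a Seebeck voltage, that voltage drives a current through an inductor, and the current in turn pumps Peltier heat between the bodies.  The result behaves like an LC circuit, with the temperature difference ringing right past zero -- which is the "wrong-way" heat flow the paper describes.

```python
# Toy "thermal inductor": a Peltier element wired in series with an
# electrical inductor, connecting two bodies.  The Seebeck voltage
# (S * dT) drives a current through the inductance L; that current in
# turn pumps Peltier heat (Pi * I) between the bodies.  This is the
# thermal analog of an LC circuit: the temperature difference dT
# overshoots zero, meaning heat briefly flows from the colder body to
# the warmer one.  All parameter values here are made up for illustration.

def simulate(dT0=1.0, C=1.0, L=1.0, S=1.0, Pi=1.0, R=0.1,
             dt=0.01, steps=1000):
    """Euler-integrate the coupled thermal/electrical equations."""
    dT, I = dT0, 0.0              # temperature difference; electrical current
    history = []
    for _ in range(steps):
        d_dT = -(Pi / C) * I          # Peltier heat pumping between bodies
        d_I = (S * dT - R * I) / L    # Seebeck EMF minus resistive loss
        dT += d_dT * dt
        I += d_I * dt
        history.append(dT)
    return history

history = simulate()
print(f"min(dT) = {min(history):.2f}")   # negative: dT overshot zero
```

In the real experiment the oscillation is damped by electrical resistance and thermal losses, and the entropy bookkeeping for the whole system still comes out positive, exactly as the authors say.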

Which, I must say, was a bit of a relief.

It's still a pretty fantastic discovery.  "With this very simple technology, large amounts of hot solid, liquid or gaseous materials could be cooled to well below room temperature without any energy consumption," study co-author Andreas Schilling said, in a University of Zurich press release carried by Phys.org.  "Theoretically, this experimental device could turn boiling water to ice, without using any energy."

So don't believe any of the hype that I'm already seeing on dubiously-accurate websites, to the effect that "An Exception Has Been Discovered to the Laws of Thermodynamics!  Physicists Dismayed!  Textbooks Will Have to be Rewritten!"  It's a curiosity, sure, and pretty cool, and sounds like it will have a good many applications, but you shouldn't discount everything you learned in physics class quite yet.

***********************************

This week's Skeptophilia book recommendation is a classic, and is pure fun: Man Meets Dog by the eminent Austrian zoologist and ethologist Konrad Lorenz.  In it, he looks at every facet of the human/canine relationship, and -- if you're like me -- you'll more than once burst out laughing and say, "Yeah, my dog does that all the time!"

It must be said that (as the book was originally written in 1949) some of what he says about the origins of dogs has been superseded by better information from genetic analysis that was unavailable in Lorenz's time, but most of the rest of his Doggy Psychological Treatise still stands.  And in any case, you'll learn something about how and why your pooches behave the way they do -- and along the way, a bit about human behavior, too.

[Note: If you purchase this book using the image/link below, part of the proceeds goes to support Skeptophilia!]






Wednesday, January 20, 2016

Propping up climate denialism

Sometimes it appears that people only want to learn enough science to (1) sound scientific, and (2) prop up whatever ideas they already believed.

A sterling example came up a couple of days ago, thanks to one Ross MacLeod, who over at the sorely-misnamed Principia Scientific International explains to us why the Stefan-Boltzmann equation proves that there's no such thing as anthropogenic climate change.

The Stefan-Boltzmann equation, named after physicists Josef Stefan and Ludwig Boltzmann, says that the power radiated by an object is proportional to the product of its surface area and the fourth power of its absolute temperature, measured in kelvins (not "degrees Kelvin," incidentally -- the Kelvin scale doesn't use degrees).  It's not a hard relationship to comprehend, although it has deep and far-reaching (and difficult) implications for thermodynamics.  In any case, you can see why this equation is of interest to climate scientists, given that the Earth is both absorbing and radiating heat, and the balance between those two rates is what sets its average temperature.

So anyway, MacLeod quotes a NASA publication on climate, which says the following:
When it comes to climate and climate change, the Earth’s radiation budget is what makes it all happen.  Swathed in its protective blanket of atmospheric gases against the boiling Sun and frigid space, the Earth maintains its life-friendly temperature by reflecting, absorbing, and re-emitting just the right amount of solar radiation. To maintain a certain average global temperature, the Earth must emit as much radiation as it absorbs. If, for example, increasing concentrations of greenhouse gases like carbon dioxide cause Earth to absorb more than it re-radiates, the planet will warm up.
Not really all that controversial, you'd think.  But no, MacLeod implies that the atmosphere isn't keeping us warm, it's keeping us cool by protecting us from the boiling hot inferno of space:
The Sun has a surface temperature of 5778 Kelvin and emits of the order of 63,290,000 W/m² over every square metre of the photosphere.  By the inverse square law this staggering power is reduced to ~1368 W/m² at the distance the Earth is from the Sun...  A simple Stefan-Boltzmann calculation establishes this radiation power is capable of easily boiling water at Earth’s orbit – ~120 degrees C.  Even as far away as Mars the solar radiation is capable of inducing a temperature of ~319 Kelvin or ~46 degrees C in any object that absorbs significant quantities of it.
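To be fair, the arithmetic in that paragraph checks out as far as it goes.  Here's a quick sanity check of his numbers (the solar radius and Earth-Sun distance are standard values I'm supplying, not from his post):

```python
# Checking MacLeod's numbers: the Sun's surface flux from the
# Stefan-Boltzmann law, then the inverse-square falloff out to Earth's
# orbit.  His figures for both are approximately right.

SIGMA = 5.670e-8       # Stefan-Boltzmann constant, W / (m^2 K^4)
T_SUN = 5778.0         # photospheric temperature, K
R_SUN = 6.957e8        # solar radius, m
AU = 1.496e11          # mean Earth-Sun distance, m

flux_at_sun = SIGMA * T_SUN ** 4                 # flux leaving the photosphere
flux_at_earth = flux_at_sun * (R_SUN / AU) ** 2  # inverse-square dilution

print(f"photosphere: {flux_at_sun:.3g} W/m^2")   # ~6.32e+07, as he says
print(f"at 1 AU:     {flux_at_earth:.0f} W/m^2") # ~1367, close to his 1368
```

The trouble isn't his numbers; it's what he does with them next.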
He then sneers, "Are the people who write gobbledygook like [the NASA publication] simply too stupid to describe or are they deliberately practicing misinformation to bolster a hypothesis?"

Which sounds like it could have come directly from the Unintentional Irony Department, because he is bolstering his own hypothesis by applying the Stefan-Boltzmann law incorrectly.

As he could have found out by taking a physics class -- or failing that, with a simple Wikipedia search -- to calculate the Earth's energy budget correctly, you have to take into account that the Earth is a sphere: it intercepts sunlight over its circular cross-section, but radiates from its entire surface, which is four times as large.  MacLeod applied the law as if the Earth were a flat disk permanently facing the Sun, and (unsurprisingly) got the wrong answer.  Apply the law correctly, and you come up with an average temperature of about 6 °C -- more than a hundred degrees cooler than his figure.
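If you want to see the two calculations side by side, here's the whole thing in a few lines of Python (ignoring albedo, as the simple blackbody estimate does):

```python
# Blackbody equilibrium temperature of the Earth two ways: first
# (wrongly) treating it as a flat disk facing the Sun, then (correctly)
# spreading the intercepted sunlight over the whole sphere.

SIGMA = 5.670e-8   # Stefan-Boltzmann constant, W / (m^2 K^4)
S0 = 1368.0        # solar constant at Earth's orbit, W / m^2

# Flat disk: absorbs and re-emits over the same area.
T_disk = (S0 / SIGMA) ** 0.25

# Sphere: intercepts sunlight over its cross-section (pi r^2) but
# radiates from its entire surface (4 pi r^2) -- hence the factor of 4.
T_sphere = (S0 / (4 * SIGMA)) ** 0.25

print(f"flat disk: {T_disk - 273.15:.0f} C")    # ~121 C -- MacLeod's answer
print(f"sphere:    {T_sphere - 273.15:.0f} C")  # ~6 C -- the right answer
```

Folding in the Earth's albedo of roughly 0.3 drops the correct figure further, to around -18 °C; the gap between that and the observed average of about 15 °C is precisely the greenhouse effect NASA was describing.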

And yes, you read that right.  A climate change denier who calls the folks at NASA "simply too stupid to describe" apparently thinks that the Earth is flat.

The sad part is that this kind of specious reasoning (if I can even dignify it with that term) convinces people.  Almost no one has the expertise to recognize his argument as wrong on first reading; regrettably few think to check what he's saying against actual science.  All too many people see science-y words ("Ooh!  Stefan-Boltzmann equation!  That sounds complicated!  He must be right.") and swallow the rest of the claim without question.


So instead, let's look at some real science, shall we?  Because on the same day that Mr. MacLeod wrote his absurd piece on Flat Earth physics, a paper was published in Nature Climate Change showing that the amount of anthropogenic heat energy absorbed by the oceans has doubled since 1997.  Oregon State University marine ecologist Jane Lubchenco commented, "These findings have potentially serious consequences for life in the oceans as well as for patterns of ocean circulation, storm tracks and storm intensity."

Jeff Severinghaus, of the Scripps Institution of Oceanography, was even more unequivocal: "This study provides real, hard evidence that humans are dramatically heating the planet."

So once again, the climate change deniers are throwing around scientific-sounding terms to prop up a viewpoint that is contradicted by all of the evidence, and that flies in the face of the consensus of virtually all working climate scientists.

Further indication that when it's expedient, a confident-sounding fast-talker can induce people to believe damn near anything.