Skeptophilia (skep-to-fil-i-a) (n.) - the love of logical thought, skepticism, and thinking critically. Being an exploration of the applications of skeptical thinking to the world at large, with periodic excursions into linguistics, music, politics, cryptozoology, and why people keep seeing the face of Jesus on grilled cheese sandwiches.
Showing posts with label Second Law of Thermodynamics.

Friday, May 30, 2025

Sundrops

It's always a little surprising to find out that phenomena that are (figuratively speaking) right in our own neighborhood are still a mystery.

One example is the temperature of the solar corona.  We aren't usually aware of the solar corona -- its eerie pinkish luminescence is ordinarily lost in the much brighter radiance of the solar photosphere.  But it becomes visible during a total solar eclipse:

The total solar eclipse of 21 August 2017, photographed by Giuseppe Donatiello [Image is in the Public Domain]

The core of the Sun is estimated to have a temperature of about 15,700,000 K; that heat energy reaches the surface (largely through convection) and then is lost to space.  The outer layer, or photosphere -- the part we can see from Earth on a sunny day -- is around 5,800 K, which is still pretty hot.  But the wispy corona that surrounds the sun is around 5,000,000 K.

But how can that be?  Since, presumably, it's obtaining its heat from the photosphere, how can it be hotter than its own heat source?  Doesn't that break the Second Law of Thermodynamics, which says (amongst other things) that heat energy only flows from hotter objects to cooler ones?

Well, one thing to keep in mind -- not that it solves the mystery, or anything, but at least to get the facts straight -- is that temperature and heat energy are not the same thing, although they are clearly related.  Temperature is a measure of the average kinetic energy of molecules, and how much heat energy that corresponds to depends on factors like what the material is made of and how densely packed it is.  When I explained this to my students, I used the example of a pot of water heated to boiling (100 C) and an oven heated to 100 C (212 F).  Now imagine putting one hand in the pot of water and the other in the oven for five seconds.

Wouldn't be the same, would it?  Water holds a great deal more heat energy than air does -- at the same temperature.
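
To put some rough numbers on it, here's a quick back-of-the-envelope comparison in Python.  The pot size, oven volume, and specific heats are round figures I've assumed for illustration -- a sketch, not anything from the original classroom example:

# Rough comparison of heat energy stored in a pot of boiling water versus
# the air in an oven, both at 100 C, measured relative to a 20 C room.
# The masses and specific heats below are assumed round figures.

water_mass_kg = 2.0     # a reasonably full pot of water
air_mass_kg = 0.06      # roughly the mass of air in a 50-liter oven
c_water = 4186          # specific heat of water, J/(kg*K)
c_air = 1005            # specific heat of air, J/(kg*K)
delta_T = 100 - 20      # both are 80 K above room temperature

q_water = water_mass_kg * c_water * delta_T
q_air = air_mass_kg * c_air * delta_T

print(f"Heat stored in the pot of water: {q_water / 1000:.0f} kJ")
print(f"Heat stored in the oven air:     {q_air / 1000:.1f} kJ")
print(f"Ratio: roughly {q_water / q_air:.0f} to 1")

At the same temperature, the water in the pot is holding on the order of a hundred times more heat energy than the air in the oven -- which is why the two hands fare so differently.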

So the five-million-Kelvin temperature of the corona is a measure of how fast its particles are moving.  But still -- something is giving them that much kinetic energy.  So how's it all work?

Now, a new study from the National Solar Observatory has provided one piece of the puzzle -- but in the process, raised more questions.

It appears that packets of extremely hot material are being launched from the surface of the Sun.  When they get away from the turbulent photosphere, the pressure drops, and these "heat bombs" explode, releasing their energy into the corona.  The cooled plasma then recondenses and falls back into the Sun as "coronal raindrops."

Raindrops twenty kilometers wide.

So at least part of the answer is that this launching of plasma from the surface is acting as a heat energy transporter.  But how this process sustains the coronal temperature at a (much) higher value than the surface of the Sun below it is still mysterious, as is the connection between coronal rain and larger-scale phenomena like sunspots, solar prominences, and coronal mass ejections (including the scarily enormous Miyake events).

Like the best science, this study suggests an explanation for some facets of the phenomenon, but leaves a great deal of room for further study.  And points out the fact that we still have many mysteries left to ponder, including about our closest star, something we see every clear day.  Even the familiar can lead us into deep waters fast -- if you ask the right questions.

****************************************


Sunday, May 4, 2025

Reversing the arrow

In my short story "Retrograde," the main character, Eli, meets a woman who makes the bizarre claim that she experiences time running backwards.

She's not like Benjamin Button, who ages in reverse; she experiences everything in reverse.  But from our perspective, nothing seems amiss.  From hers, though... she remembers future events and not past ones:

Hannah gave me a long, steady look.  "All I can say is that we see the same things.  For me, the film runs backwards, that’s all.  Other than that, there’s no difference.  There’s nothing I can do to change the way things unfold, same as with you."

"That’s why you were crying when I came in.  Because of something that for you, had already happened?  What was it?"

She shook her head.  "I shouldn’t answer that, Eli."

"It’s me, isn’t it?  For me, I was just meeting you for the first time.  For you, it was the last time you’d ever see me."  I winced, and rubbed my eyes with the heel of my hand.  "Jesus, I’m starting to believe you.  But that’s it, right?"

Hannah didn’t answer for a moment.  "The thing is—you know, you start looking at things as inevitable.  Like you’re in some sort of film.  The actors seem to have freedom.  They seem to have will, but in reality the whole thing is scrolling by and what’s going to happen is only what’s already written in the script.  You could, if you wanted to, start at the end and run the film backwards.  Same stuff, different direction.  No real difference except for the arrow of time."

Einstein's General Theory of Relativity shows that space and time are inextricably linked -- spacetime -- but doesn't answer the perplexing question of why we can move in any direction through space, but only one direction through time.  You can alter the rate of time's passage, at least relative to some other reference frame, by changing your velocity; but unlike what the characters in "Retrograde" experience, the arrow always points the same way.  

This becomes odder still when you consider that in just about all physical processes, there is no inherent arrow of time.  Look at a video clip of a pool ball bouncing off the side bumper, then run it backwards -- it'd be damn hard to tell which was the actual, forward-running clip.

Hard -- but not impossible.  The one physical law that has an inherent arrow of time is the Second Law of Thermodynamics.  If the clip were long enough, or your measurement devices sensitive enough, you could tell which was the forward clip, because in that one the pool ball would be slowing down as its kinetic energy dissipated through friction with the table surface.  Likewise, water doesn't unspill, glasses unbreak, snowbanks un-avalanche, reassembling in pristine smoothness on the mountainside.  But why this impels a universal forward-moving arrow of time -- and more personally, why it makes us remember the past and not the future -- is still an unanswered question.

"The arrow of time is only an illusion," Einstein quipped, "but it is a remarkably persistent one."

Two recent papers have shed some light on this strange conundrum.  In the first, a team led by Andrea Rocco of the University of Surrey looked at how the equations of the Second Law work on the quantum level, and found something intriguing: introducing the Second Law into the quantum model generated two arrows of time, one pointing into the past and one pointing into the future.  But no matter which time path is taken, entropy still increases as you go down it.

"You’d still see the milk spilling on the table, but your clock would go the other way around," Rocco said.  "In this way, entropy still increases, but it increases toward the past instead of the future.  The milk doesn’t flow back into the glass, which the Second Law of Thermodynamics forbids, but it flows out of the glass in the direction of the past.  Regardless of whether time’s arrow shoots toward the future or past from a given moment, entropy will still dissipate in that given direction."

In the second, from Lorenzo Gavassino of Vanderbilt University et al., the researchers were investigating the mathematics of "closed time-like loops" -- i.e., time travel into the past, followed by a return to your starting point.  And what they found was that once again, the Second Law gets in the way of anything wibbly-wobbly.


Gavassino's model shows that on a closed time-like loop, entropy must peak somewhere along the loop -- so along some part of the loop, entropy has to decrease to return it to where it was when the voyage began.  The equations then imply that one of two things must be true.  Either:
  1. Time travel into the past is fundamentally impossible, because it would require entropy to backpedal; or
  2. If overall entropy can decrease somewhere along the path, it would undo everything that had happened along the entropy-increasing part of the loop, including your own memories.  So you could time travel, but you wouldn't remember anything about it (including that it had ever happened).
"Any memory that is collected along the closed time-like curve," Gavassino said, "will be erased before the end of the loop."

So that's no fun at all.  Lieutenant Commander Geordi LaForge would like to have a word with you, Dr. Gavassino.

Anyhow, that's today's excursion into one of the weirdest parts of physics.  Looks like the Second Law of Thermodynamics is still strictly enforced in all jurisdictions.  Time might be able to run backwards, but you'd never know because (1) entropy will still increase in that direction, and (2) any loop you might take will result in your remembering nothing about the trip.  So I guess we're still stuck with clocks running forwards -- and having to wait to find out what's going to happen in the future at a rate of one minute per minute.

****************************************


Friday, February 14, 2025

Hotspot

Today's topic in Skeptophilia isn't controversial so much as it is amazing.  And shows us once again what a weird, endlessly fascinating universe we live in.

First, though, a bit of a science lesson.

A great many processes in the natural world happen because of the Second Law of Thermodynamics.  The Second Law can be framed in a variety of ways, two of which are: (1) heat always tends to flow from a hotter object to a colder one; and (2) in a closed system, entropy -- disorder -- always increases.  (Why those are two ways of representing the same underlying physical law is subtle, and beyond the scope of this post.)
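
For reference, here's how those two framings usually get written down (standard textbook notation, nothing specific to this post).  When a bit of heat Q leaks from a hot reservoir to a cold one, the total entropy change is positive; and for a closed system as a whole, entropy never decreases:

\[
\Delta S \;=\; \frac{Q}{T_{\text{cold}}} - \frac{Q}{T_{\text{hot}}} \;>\; 0 \quad (T_{\text{hot}} > T_{\text{cold}}), \qquad\qquad \Delta S_{\text{closed}} \;\geq\; 0
\]

The first expression also hints at the connection between the two statements: running the heat the other way, from cold to hot, would make that quantity negative, which the second statement forbids.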

In any case, the Second Law is the driver behind weather.  Just about all weather happens because of heat energy redistribution -- the Sun warms the ground, which heats the air.  Hot air tends to rise, so it does, drawing in air from the sides and creating a low pressure center (and wind).  As the warm air rises, it expands and cools, making water vapor condense -- which is why low pressure tends to mean precipitation.  Condensation releases heat energy back into the rising blob of air, slowing its cooling and keeping it buoyant, but the blob keeps cooling overall as it climbs and expands.  When the air cools enough, it sinks, forming a high pressure center -- and on and on.  (Circular air movement of this type -- what are called convection cells -- can be local or global in reach.  Honestly, a hurricane is just a giant low-pressure convector.  A heat pump, in essence.  Just a fast and powerful one.)

Okay, so that's the general idea, and to any physicists who read this, I'm sorry for the oversimplifications (but if I've made any outright errors, let me know so I can fix them; there's enough nonsense out there based in misunderstandings of the Second Law that the last thing I want is to add to it).  Any time you have uneven heating, there's going to be a flow of heat energy from one place to the other, whether through convection, conduction, or radiation.

But if you think we get some violent effects from this process here on Earth, wait till you hear about KELT-9b.

KELT-9b is an exoplanet about 670 light years from Earth.  But it has some characteristics that would put it at the top of the list of "weirdest planets ever discovered."  Here are a few:
  • It's three times the mass of Jupiter, the largest planet in our Solar System.
  • It's moving at a fantastic speed, orbiting its star in only a day and a half.
  • It's tidally locked -- the same side of the planet is always facing the star, meaning there's a permanently light side and a permanently dark side.
  • It's the hottest exoplanet yet discovered -- the light side has a mean temperature of 4,300 C, which is hotter than some stars.
So the conditions on this planet are pretty extreme.  But as I found out in a paper by Megan Mansfield of the University of Chicago et al. in Astrophysical Journal Letters, even knowing all that didn't stop it from harboring a few more surprises.

Artist's conception of KELT-9b [Image is in the Public Domain courtesy of NASA/JPL]

Tidally-locked planets are likely to have some of the most extraordinary weather in the universe, again because of effects of the Second Law.  Here on Earth, with a planet that rotates once a day, the land surface has an opportunity to heat up and cool down regularly, giving the heat redistribution effects of the Second Law less to work with.  On KELT-9b, though, the same side of the planet gets cooked constantly, so not only is it really freakin' hot, there's way more of a temperature differential between the light side and the dark side than you'd ever get in our Solar System (even Mercury doesn't have that great a difference).

So there must be a phenomenal amount of convection taking place, with the atmosphere on the light side convecting toward the dark side like no hurricane we've ever seen.  But that's where Mansfield et al. realized something was amiss.  Because to account for the temperature distribution they were seeing on KELT-9b, there would have to be wind...

... moving at 150,000 miles per hour.

That seemed physically impossible, so there had to be some other process moving heat around besides simple convection.  The researchers found out what it is -- the heat energy on the light side is sufficient to tear apart hydrogen molecules.

At Earth temperatures, hydrogen exists as a diatomic molecule (H2).  But at KELT-9b's temperatures, the energy tears the molecules apart into monatomic hydrogen, storing that energy as chemical potential energy that is released again when the atoms recombine on the dark side.  So once again we're talking the Second Law -- heat flowing toward the cooler object -- but the carrier of that heat energy isn't just warm air or warm water, but molecules that have been physically torn to shreds.
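
For a sense of scale, here's a rough estimate of how much energy that chemical channel can carry, using standard textbook values for the H-H bond energy and the heat capacity of hydrogen gas.  The numbers are mine, not from Mansfield et al.'s paper -- treat it as a sketch:

# Rough estimate of the heat that dissociating and recombining hydrogen can
# ferry from the day side to the night side.  Bond energy and heat capacity
# are standard textbook values; this is an illustration, not the paper's math.
bond_energy_J_per_mol = 4.36e5    # ~436 kJ/mol to split H2 into 2 H
molar_mass_H2_kg = 2.016e-3       # molar mass of H2, kg/mol

energy_per_kg = bond_energy_J_per_mol / molar_mass_H2_kg   # J stored per kg dissociated

cp_H2 = 1.43e4                    # specific heat of H2 gas, ~14,300 J/(kg*K)
equivalent_delta_T = energy_per_kg / cp_H2

print(f"Energy stored per kg of dissociated H2: {energy_per_kg:.2e} J")
print(f"Equivalent to heating that kilogram by about {equivalent_delta_T:,.0f} K")

So every kilogram of hydrogen that gets torn apart on the day side and recombines on the night side delivers roughly as much heat as fifteen thousand kelvins' worth of ordinary warming would.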

So, fascinating as it is, KELT-9b would not be the place for Captain Picard to take his away team.  But observed from a distance, it must be spectacular -- glowing blue-white from its own heat, whirling around its host star so fast its year is one and a half of our days, one side in perpetual darkness.  All of which goes to show how prescient William Shakespeare was when he wrote, "There are more things in heaven and Earth, Horatio, than are dreamt of in your philosophy."

****************************************

Thursday, December 5, 2024

Disinformation and disorder

I've dealt with a lot of weird ideas over the thirteen years I've been blogging here at Skeptophilia.

Some of them are so far out there as to be risible.  A few of those that come to mind:
  • the "phantom time hypothesis" -- that almost three hundred years' worth of history didn't happen, and was a later invention developed through collusion between the Holy Roman Empire and the Catholic Church
  • "vortex-based mathematics," which claims (1) that spacetime is shaped like a donut, (2) infinity has an "epicenter," and (3) pi is a whole number
  • the planet Nibiru, which is supposed to either usher in the apocalypse or else cause us all to ascend to a higher plane of existence, but which runs into the snag that it apparently doesn't exist
  • a claim that by virtue of being blessed by a priest, holy water has a different chemical structure and a different set of physical properties from ordinary water
  • a claim that gemstones can somehow affect your health through "frequencies"
In this same category, of course, are some things that a lot of people fervently believe, such as homeopathy, divination, and the Flat Earth.

These, honestly, don't bother me all that much, except for the fact that the health-related ones can cause sick people to bypass appropriate medical care in favor of what amounts to snake oil.  But on an intellectual level, they're easily analyzed, and equally easily dismissed.  Once you know some science, you kind of go, "Okay, that makes no sense," and that's that.

It's harder by far to deal with the ones that mix in just enough science that to a layperson, they sound like they could be plausible.  After all, science is hard; I have a B.S. in physics, and most academic papers in the field go whizzing over my head so fast they don't even ruffle my hair.  The problem, therefore, is how to tell when a person is taking (real, but difficult) science, misinterpreting or misrepresenting it, and then presenting it in such an articulate fashion that even to intelligent laypeople, it seems legitimate.

One of the first times I ran into this was the infamous video What the Bleep Do We Know?, from 2004, which is one of the best-known examples of quantum mysticism.  It takes some real, observable effects -- strange stuff like entanglement and indeterminacy and the Heisenberg Uncertainty Principle and the role of the observer in the collapse of the wave function -- and weaves in all sorts of unscientific hand-waving about how "the science says" our minds create the universe, thoughts can influence the behavior of matter, and the matter/energy equivalence formula means that "all being is energy."  Those parts aren't correct, of course; but the film's makers do it incredibly skillfully, describing the scientific bits more or less accurately, then interviewing actual scientists and editing their segments to make it sound like they're in support of the film's fundamentally pseudoscientific message.  (It's worth noting that it was the brainchild of none other than J. Z. Knight, whose Ramtha cult has become notorious for its homophobia, anti-Semitism, anti-Catholicism, and racism.)

I ran into a (much) more recent example of this when I picked up a copy of Howard Bloom's book The God Problem: How a Godless Cosmos Creates at our local Friends of the Library used book sale.  At first glance, it looked right down my alley -- a synthesis of modern cosmology, philosophy, and religion.  And certainly the first few pages and the back cover promised great things, with endorsements from everyone from Barbara Ehrenreich to Robert Sapolsky to Edgar Mitchell.

I hadn't gotten very far into it, however, before I started to wonder.  The writing is frenetic, jumping from one topic to another seemingly willy-nilly, sprinkled with rapid-fire witticisms that in context sound like the result of way too many espressos.  But I was willing to discount that as a matter of stylistic preference, until I started running into weird claims of profound insight, one after another, that turn out on examination to be simple sleight of hand.  We're told, for example, that we should believe his "heresy" that "A is not equal to A," and when he explains it, it turns out that this only works if you define the first A differently from the second one.  Likewise that "one plus one doesn't equal two" -- only if you're talking about the fact that joining two things together can result in the production of something different (such as a proton and an electron coming together to form a neutral hydrogen atom).

So his supposedly earthshattering "heresies" turn out to be something that, if you know a little science, would induce you to shrug your shoulders and say, "So?"

But what finally pissed me off enough that I felt like I needed to address it here was his claim that the Second Law of Thermodynamics is wrong, which he said was a heresy so terrible we should "prepare to be burned at the stake" by the scientific establishment for believing him.  Here's a direct quote:
... the Second Law of Thermodynamics [is] a law that's holy, sacred, and revered.  What is the Second Law?  All things tend toward disorder.  All things fall apart.  All things tend toward the random scramble of formlessness and meaninglessness called entropy.
He then goes into a page-long description of what happens when you put a sugar cube into a glass of water, and ends with:
The molecules of sugar in your glass went from a highly ordered state to a random whizzle [sic] of glucose and fructose molecules evenly distributed throughout your glass.  And that, says the Second Law of Thermodynamics, is the fate of everything in the universe.  A fate so inevitable that the cosmos will end in an extreme of lethargy, a catastrophe called "heat death."  The cosmos will come apart in a random whoozle [sic] just like the sugar cube did.  The notion of heat death is a belief so widespread that it was enunciated by Lord Kelvin in 1851 and has hung around like a catechism.
Then he tells us what the problem is:
But is the Second Law of Thermodynamics true?  Do all things tend to disorder?  Is the universe in a steady state of decline?  Is it moving step by step toward randomness?  Are form and structure steadily stumbling down the stairway of form into the chaos of a wispy gas?...  No.  In fact, the very opposite is true.  The universe is steadily climbing up.  It is steadily becoming more form-filled and more structure-rich.  How could that possibly be true?  Everyone knows that the Second Law of Thermodynamics is gospel.  Including everybody who is anybody in the world of physics, chemistry, and even complexity theory.
*brief pause to scream obscenities*  

*another brief pause to reassure my puppy that he's not the one I'm mad at*

No one, scientist or otherwise, is going to burn Bloom at the stake for this, because what he's claiming is simply wrong.  This is a complete mischaracterization of what the Second Law says.  Whether Bloom knows that, and is deliberately misrepresenting it, or simply doesn't understand it himself, I'm not sure.  What the Second Law says, at least in one formulation, is that in a closed system, the overall entropy always increases -- and the critical italicized bit is the part he conveniently leaves out.  Of course order can be increased, but it's always at the cost of (1) expending energy, and (2) increasing entropy more somewhere else.  A simple example is the development of a human from a single fertilized egg cell, which represents a significant increase in complexity and decrease in entropy.  But the only way that's accomplished is by giving the developing human a continuous source of energy and building blocks (i.e., food), and cellular processes tearing those food molecules to shreds, increasing their entropy.  And what the Second Law says is that the entropy increase experienced by the food molecules is bigger than the entropy decrease experienced by the developing human.  (I wrote a longer explanation of this principle a while back, if you're interested in more information.)

Let's just put it this way.  If what Bloom is saying -- that the Second Law is wrong -- were true, he'd be in line for a Nobel Prize.  There has never, ever been an exception found to the Second Law, despite centuries of testing, and the frustrated desires of perpetual-motion-machine inventors the world over.

A model of a perpetual motion machine -- which, for the record, doesn't work [Image licensed under the Creative Commons Tiia Monto, Deutsches Museum 6, CC BY-SA 4.0]

So Bloom got it badly wrong.  He's hardly the first person to do so.  Why, then, does this grind my gears so badly?

It's that apparently no one on his editorial team, and none of the dozens of people who endorsed his book, even thought to read the fucking Wikipedia page about this fundamental law of physics Bloom is saying is incorrect.  And he certainly sounds convincing; his writing is like a sort-of-scientific-or-something Gish gallop, hurling so many arguments at us all at once that it's all we readers can do to withstand the barrage and stay on our feet.

For me, though, it immediately made me discount anything else he has to say.  If his understanding of a basic scientific law that I've known about since freshman physics, and taught every year to my AP Biology students, is that flawed, how can I trust what he says on other topics about which I might not have as much background knowledge?

And that, to me, is the danger.  It's easy to point out the obvious nonsense like space donuts and gemstone frequencies -- but far harder to recognize pseudoscience that is twisted together with actual science so intricately that you can't see where one ends and the other begins.  Especially if -- as is the case with The God Problem -- it's couched in folksy, jargon-free anecdote that sounds completely reasonable.

I guess the only real solution is to learn enough science to be able to recognize this kind of thing when you see it.  And that takes time and hard work.  But it's absolutely critical, especially in our current political situation here in the United States, where there are people who are deliberately spinning falsehoods for their own malign purposes about such critical issues as health care, gender and sexuality, and the climate.

So it's hard work we all need to be doing.  Otherwise we fall prey to persuasive nonsense -- and are at the mercy of whatever the author of it is trying to sell.

****************************************

Thursday, November 23, 2023

Dreaming the past

My novel In the Midst of Lions opens with a character named Mary Hansard -- an ordinary forty-something high school physics teacher -- suddenly realizing she can see the future.

More than that, really; she now has no reliable way of telling the future from the past.  She "remembers" both of them, and if she has no external context by which to decide, she can't tell if what's in her mind occurred in the past or will occur in the future.  Eventually, she realizes that the division of the passage of time she'd always considered real and inviolable has changed.  Instead of past, present, and future, there are now only two divisions: present and not-present.  Here's how she comes to see things:

In the past two months, it felt like the universe had changed shape.  The linear slow march of time was clean gone, and what was left was a block that was unalterable, the people and events in it frozen in place like butterflies in amber.  Her own position in it had become as observer rather than participant.  She could see a wedge of the block, extending back into her distant past and forward into her all-too-short future.  Anything outside that wedge was invisible...  She found that it completely dissolved her anxiety about what might happen next.  Being not-present, the future couldn’t hurt her.  If pain lay ahead of her, it was as removed from her as her memories of a broken arm when she was twelve.  Neither one had any impact on the present as it slowly glided along, a moving flashlight beam following her footsteps through the wrecked cityscape.

 I found myself thinking about Mary and her peculiar forwards-and-backwards perception while I was reading physicist Sean Carroll's wonderful and mind-blowing book From Eternity to Here: A Quest for the Ultimate Theory of Time, which looks at the puzzling conundrum of what physicists call time's arrow -- why, when virtually all physical laws are time-reversible, there is a clear directionality to our perceptions of the universe.  A classic example is the motion of billiard balls on a table.  Each ball's individual motion is completely time-reversible (at least if you discount friction with the table); if you filmed a ball rolling and bouncing off a bumper, then ran the recording backwards, it would be impossible to tell which was the original video and which was the reversed one.  The laws of motion make no differentiation between time running forward and time running backward.

But.

If you played a video of the initial break of the balls at the beginning of the game, then ran the recording backwards -- showing the balls rolling around and after a moment, assembling themselves back into a perfect triangle -- it would be blatantly obvious which was the reversed video.  The difference, Carroll explains, is entropy, which is a measure of the number of possible ways a system can exist and be indistinguishable on the macro level.  What I mean by this is that the racked balls are in a low-entropy state; there aren't that many ways you can assemble fifteen balls into a perfect equilateral triangle.  On the other hand, after the break, with the balls scattered around the table seemingly at random -- there are nearly an infinite number of ways you can have the balls arranged that would be more or less indistinguishable, in the sense that any of them would be equally likely to occur following the break.  Given photographs of thousands of different positions, not even Commander Data could determine which one was the pic taken immediately after the balls stopped moving.

Sure, it's possible you could get all the balls rolling in such a way that they would come to rest reassembled into a perfect triangle.  It's just extremely unlikely.  The increase in entropy, it seems, is based on what will probably happen.  There are so many high-entropy states and so few low-entropy states that if you start with a low-entropy arrangement, the chances are it will evolve over time into a high-entropy one.  The result is that it is (very) strongly statistically favored that entropy increases over time.  
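
To see just how lopsided that counting is, here's a toy calculation.  (This is my own illustration, not Carroll's: I'm pretending the table is a grid of 5,000 discrete cells, an arbitrary granularity chosen just to make the arithmetic concrete.)

# Toy microstate count: pretend the table is a 100 x 50 grid of cells and each
# of the 15 (distinguishable) balls occupies one cell.  The grid size is an
# arbitrary assumption; only the ratio matters for the argument.
from math import comb, factorial, log10

cells = 100 * 50
balls = 15

# "Scattered" macrostate: the balls sitting anywhere at all on the table.
scattered = comb(cells, balls) * factorial(balls)

# "Racked" macrostate: the balls confined to one fixed triangle of 15 cells,
# in any internal order.
racked = factorial(balls)

print(f"Scattered arrangements: about 10^{log10(scattered):.0f}")
print(f"Racked arrangements:    about 10^{log10(racked):.0f}")
print(f"Ratio: about 10^{log10(scattered) - log10(racked):.0f} to 1")

Something like 10^43 scattered arrangements for every racked one -- which is why you can wait a very long time for the break to run itself backwards.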

The Arrow of Time by artist benpva16 [Image licensed under the Creative Commons BY-NC-ND 3.0 license: creativecommons.org/licenses/b…]

The part of the book that I am still trying to parse is chapter nine, "Information and Life," where he ties the physical arrow of time (an example of which I described above) with the psychological arrow of time.  Why can't we all do what Mary Hansard can do -- see the past and future both -- if the only thing that keeps us knowing which way is forward and which way is backward is the probability of a state's evolution?  After all, there are plenty of cases where entropy can locally go down; a seed growing into a tree, for example.  (This only occurs because of a constant input of energy; contrary to what creationists would have you believe, the Second Law of Thermodynamics doesn't disprove evolution, because living things are open systems and require an energy source.  Turn off the Sun, and entropy would increase fast.)

So if entropy actually explains the psychological arrow of time, why can I remember events where entropy went down -- such as yesterday, when I took a lump of clay and fashioned it into a sculpture?

Carroll's explanation kind of made my mind blow up.  He says that our memories themselves aren't real reflections of the past; they're a state of objects in our environment and neural firings in our brain in the present that we then assemble into a picture of what we think the past was, based on our assumption that entropy was lower in the past than it is now.  He writes:

So let's imagine you have in your possession something you think of as a reliable record of the past: for example, a photograph taken of your tenth birthday party.  You might say to yourself, "I can be confident that I was wearing a red shirt at my tenth birthday party, because this photograph of that event shows me wearing a red shirt."...

[Is] the present macrostate including the photo... enough to conclude with confidence that we were really wearing a red shirt at our tenth birthday party?

Not even close.  We tend to think that [it is], without really worrying about the details too much as we get through our lives.  Roughly speaking, we figure that a photograph like that is a highly specific arrangement of its constituent molecules.  (Likewise for a memory in our brain of the same event.)  It's not as if those molecules are just going to randomly assemble themselves into the form of that particular photo -- that's astronomically unlikely.  If, however, there really was an event in the past corresponding to the image portrayed in the photo, and someone was there with a camera, then the existence of the photo becomes relatively likely.  It's therefore very reasonable to conclude that the birthday party really did happen in the way seen in the photo.

All of those statements are reasonable, but the problem is that they are not nearly enough to justify the final conclusion...  Yes, the photograph is a very specific and unlikely arrangement of molecules.  However, the story we are telling to "explain" it -- an elaborate reconstruction of the past, involving birthday parties and cameras and photographs surviving essentially undisturbed to the present day -- is even less likely than the photo all by itself...

Think of it this way: You would never think to appeal to some elaborate story in the future to explain the existence of a particular artifact in the present.  If we ask about the future of our birthday photo, we might have some plans to frame it or whatnot, but we'll have to admit to a great deal of uncertainty -- we could lose it, it could fall into a puddle and decay, or it could burn in a fire.  Those are all perfectly plausible extrapolations of the present state into the future, even with the specific anchor point provided by the photo here in the present.  So why are we so confident about what the photo implies concerning the past?

The answer, he says, is that we're relying on probability and the likelihood that the past had lower entropy -- in other words, that the photo didn't come from some random collision of molecules, just as our surmise about the billiard balls' past came from the fact that a perfect triangular arrangement is way less likely than a random one.  All we have, Carroll says, is our knowledge of the present; everything else is an inference.  In every present moment, our reconstruction of the past is a dream, pieced together using whatever we're experiencing at the time.

So maybe we're not as different from Mary Hansard, with her moving flashlight beam gliding along and spotlighting the present, as I'd thought.

Mind = blown.

I'm still not completely convinced I'm understanding all the subtleties in Carroll's arguments, but I get enough of it that I've been thinking about it ever since I put the book down.  But in any case, I'd better wrap this up, because...

... I'm running short on time.

****************************************



Tuesday, April 26, 2022

The stubbornly persistent illusion

I was driving through Ithaca, New York a while back, and came to a stoplight, and the car in front of me had a bumper sticker that said, "Time is that without which everything would happen at once."

I laughed, but I kept thinking about it, because in one sentence it highlights one of the most persistent mysteries of physics: why we perceive a flow of time.  The problem is, just about all of the laws of physics, from quantum mechanics to the General Theory of Relativity, are time-reversible; they work equally well run forward or in reverse.  Put another way, most physical processes look the same both ways.  If I were to show you a short video clip of two billiard balls colliding on a pool table, then the same clip backwards, it would be hard to tell which was which.  The Laws of Conservation of Momentum and Conservation of Energy that describe the results of the collision work in either direction.

There are exceptions, though.  The Second Law of Thermodynamics is the most commonly-cited one: closed systems always increase in entropy.  It's why when I put sugar in my coffee in the morning and stir it, the sugar spreads through the whole cup.  If I were to give it one more stir and all the sugar molecules were to come back together as crystals and settle out on the bottom, I'd be mighty surprised.  I might even wonder if someone had spiked the sugar bowl with something other than sugar.

In fact, that's why I had to specify a "short clip" in the billiard ball example.  There is a time-irreversible aspect of such classical physics; as the balls roll across the table, they lose momentum, because a little of the kinetic energy of their motion leaks away as thermal energy due to friction with the surface.  When they collide, a little more is lost because of the sound of the balls striking each other, the (slight) physical deformation they undergo, and so on.  So if you had a sensitive enough camera, or a long enough clip, you could tell which was the forward and which the reverse clip, because the sum of the kinetic energies of the balls in the forward clip would be (slightly) greater before the collision than after it.

But I am hard-pressed to see why that creates a sense of the flow of time.  It can't be solely from our awareness of a movement toward disorder.  When there's an energy input, you can generate a decrease in entropy; it's what happens when a single-celled zygote develops into a complex embryo, for example.  There's nothing in the Second Law that prevents increasing complexity in an open system.  But we don't see those situations as somehow running in reverse; entropy increase by itself doesn't generate anything more than the expected behavior of certain systems.  How that could affect how time is perceived by our brains is beyond me.

The problem of time's arrow is one of long standing.  Einstein himself recognized the seeming paradox; he wrote, "The distinction between past, present, and future is only a stubbornly persistent illusion."  "Persistent" is an apt word; more than sixty years after the great man's death, there was an entire conference on the nature of time, which resolved very little beyond giving dozens of physicists the chance to defend their own views, and in the end convinced no one.

It was, you might say, a waste of time.  Whatever that means.

One of the most bizarre ideas about the nature of time is the one that comes out of the Special Theory of Relativity, and was the reason Einstein made the comment he did: the block universe.  I first ran into the block universe model not from Einstein but from physicist Brian Greene's phenomenal four-part documentary The Fabric of the Cosmos, and it goes something like this.  (I will append my usual caveat that despite my bachelor's degree in physics, I really am a layperson, and if any physicists read this and pick up any mistakes, I would very much appreciate it if they'd let me know so I can correct them.)

One of the most mind-bending things about the Special Theory is that it does away with simultaneity being a fixed, absolute, universal phenomenon.  If we observe two events happening at exactly the same time, our automatic assumption is that anyone else, anywhere in the universe, would also observe them as simultaneous.  Why would we not?  But the Special Theory shows conclusively that your perception of the order of events is dependent upon your frame of reference.  If two individuals are in different reference frames (i.e. moving at different velocities), and one sees the two events as simultaneous, the other will see them as sequential.  (The effect is tiny unless the difference in velocities is very large; that's why we don't experience this under ordinary circumstances.)

This means that past, present, and future depend on what frame of reference you're in.  Something that is in the future for me might be in the past for you.  This can be conceptualized by looking at space-time as being shaped like a loaf of bread; the long axis is time, the other two represent space.  (We've lost a dimension, but the analogy still works.)  The angle you are allowed to slice into the loaf is determined by your velocity; if you and two friends are moving at different velocities, your slice and theirs are cut at different angles.  Here's a picture of what happens -- to make it even more visualizable, all three spatial dimensions are reduced to one (the x axis) and the slice of time perceived moves along the other (the y axis).  A, B, and C are three events, and the question is -- what order do they occur in?

[Image licensed under the Creative Commons User:Acdx, Relativity of Simultaneity Animation, CC BY-SA 4.0]

As you can see, it depends.  If you are taking your own velocity as zero, all three seem to be simultaneous.  But change the velocity -- the velocities are shown at the bottom of the graph -- and the situation changes.  To an observer moving at a speed of thirty percent of the speed of light relative to you, the order is C -> B -> A.  At a speed of fifty percent of the speed of light in the other direction, the order is A -> B -> C.

So the tempting question -- who is right? what order did the events really occur in? -- is meaningless.
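
If you want to watch the reordering fall out of the math, here's a minimal sketch.  The event positions and observer velocities are arbitrary choices of mine, picked to roughly mirror the animation above; the physics is just the Lorentz transformation of the time coordinate.

# Relativity of simultaneity: three events that are simultaneous in our frame
# (same t, different x) get different time orderings for moving observers.
# Event positions and observer velocities are arbitrary illustrative choices.
from math import sqrt

# Work in units where the speed of light c = 1.
# Events A, B, C: all at t = 0 in our frame, spread out along x.
events = {"A": (-1.0, 0.0), "B": (0.0, 0.0), "C": (1.0, 0.0)}   # name: (x, t)

for v in (0.0, 0.3, -0.5):          # observer velocity as a fraction of c
    gamma = 1.0 / sqrt(1.0 - v ** 2)
    # Lorentz transformation of the time coordinate: t' = gamma * (t - v * x)
    times = {name: gamma * (t - v * x) for name, (x, t) in events.items()}
    report = ", ".join(f"t'({name}) = {times[name]:+.2f}" for name in "ABC")
    print(f"Observer at v = {v:+.1f}c:  {report}")

Same three events, three observers, three different verdicts about which happened first -- and none of them is wrong.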

Probably unnecessarily, I'll add that this isn't just wild speculation.  The Special Theory of Relativity has been tested hundreds, probably thousands, of times, and has passed every test to a precision of as many decimal places as you want to calculate.  (A friend of mine says that the papers written about these continuing experiments should contain only one sentence: "Yay!  Einstein wins again!")  Not only has this been confirmed in the lab, the predictions of the Special Theory have a critical real-world application -- without the equations that lead directly to the block universe and the relativity of simultaneity, our GPS systems wouldn't work.  If you want accurate GPS, you have to accept that the universe has some seriously weird features.

So the fact that we remember the past and don't remember the future is still unexplained.  From the standpoint of physics, it seems like past, present, and future are all already there, fixed, trapped in the block like flies in amber.  Our sense of time flowing, however familiar, is the real mystery.

But I'd better wrap this up, because I'm running out of time.

Whatever that means.

**************************************

Wednesday, November 10, 2021

Can't win, can't break even

Dear readers,

I'm going to take a short break from Skeptophilia -- my next post will be Thursday, November 18.  I'll still be lining up topics during the time I'm away, so keep those suggestions coming!

cheers,

Gordon

**********************************

One of the most misunderstood laws of physics is the Second Law of Thermodynamics.

Honestly, I understand why.  It's one of those bits of science that seem simple on first glance, then the more you learn, the weirder it gets.  The simplest way to state the Second Law is "systems tend to proceed toward disorder," so on the surface it's so common-sensical that it triggers nothing more than a shrug and, "Well, of course."  But a lot of its ramifications are seriously non-intuitive, and a few are downright mindblowing.

The other problem with it is that it exists in multiple formulations that seem to have nothing to do with one another.  These include:
  • the aforementioned statement that without an energy input, over time, systems become more disordered.
  • if you place a warm object and cool object in contact with each other, energy will flow from the warmer to the cooler; the warmer object will cool off, and the cooler one will heat up, until they reach thermal equilibrium (equal temperatures).
  • no machine can run at 100% efficiency (i.e., turning all of its energy input into usable work).
  • some processes are irreversible; for example, there's nothing odd about knocking a wine glass off the table and shattering it, but if you were watching and the shards gathered themselves back together and leapt off the floor and back onto the table as an intact wine glass, you might wonder if all you'd been drinking was wine.
The fact that all of these are, at their basis, different ways of stating the same physical law is not obvious.

For me, the easiest way to understand the "why" of the Second Law has to do with a deck of playing cards.  Let's say you have a deck in order; each suit arranged from ace to king, and the four suits in the order hearts, spades, diamonds, clubs.  How many possible ways are there to arrange the cards in exactly that way?

Duh.  Only one, by definition.

Now, let's say you accidentally drop the deck, then pick it up.  Unless you flung the deck across the room, chances are, there will still be some of the cards in the original order, but some of the orderliness will probably have been lost.  Why?  Because there's only a single way to arrange the cards in the order you started with, but there are lots of ways to have them mostly out of order.  The chance of jumping from the single orderly state to one of the many disorderly states is a near certainty.  Then you drop them again (you're having a clumsy day, apparently).  Are they more likely to become more disordered or more orderly?

You see where this is going; since at each round, there are way more disorderly states than orderly ones, just by the laws of statistics you're almost certainly going to watch the deck becoming progressively more disordered.  Yes, it's possible that you could take a completely random deck, toss them in the air, and they'd fall into ace-through-king, hearts-spades-diamonds-clubs -- but if you're waiting for that to happen by random chance, you're going to have a long wait.
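
Here's that thought experiment as a quick simulation -- a sketch, with one obvious modeling assumption: each "drop" is represented by randomly swapping five pairs of cards, which is just my stand-in for whatever a real fumble does to the deck.

# Toy model of repeatedly dropping a deck: each "drop" randomly swaps a few
# cards, and "order" is measured as how many cards still sit in their original
# positions.  Five swaps per drop is an arbitrary modeling assumption.
import random

random.seed(1)
deck = list(range(52))            # positions 0..51, in perfect factory order
original = list(deck)

def order_score(d):
    return sum(1 for i, card in enumerate(d) if card == original[i])

for drop in range(1, 11):
    for _ in range(5):            # each drop scrambles five random pairs
        i, j = random.randrange(52), random.randrange(52)
        deck[i], deck[j] = deck[j], deck[i]
    print(f"After drop {drop:2d}: {order_score(deck):2d} of 52 cards still in place")

The count of cards still in their original positions trends relentlessly downward -- not because ordered arrangements are forbidden, but because there are so vastly many more disordered ones to land in.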

You can, of course, force them back into order by painstakingly rearranging the cards, but that takes an input of energy (in the form of your brain and muscles using up chemical energy to accomplish it).  And here's where it gets weird; if you were to measure the decrease in entropy (disorder) in the deck of cards as you rearranged them, it would be outweighed by the increase in entropy of the energy-containing molecules you burned through to do it.  The outcome: you can locally and temporarily decrease entropy, but only at the expense of creating more entropy somewhere else.  Everything we do makes things more chaotic overall; any decrease in entropy we manage is local, and paid for elsewhere.  In the end, entropy always wins.

As my long-ago thermodynamics professor told us, "The First Law of Thermodynamics says that you can't win.  The Second Law says you can't break even."

Hell of a way to run a casino, that.

[Image is in the Public Domain]

The reason this all comes up is a paper that a friend of mine sent me a link to, which looks at yet another way of characterizing the Second Law; instead of heat transfer or overall orderliness, it considers entropy as a measure of information content.  The less information you need to describe a system, the lower its entropy; in the example of the deck of cards, I was able to describe the orderly state in seven words (ace-through-king, hearts-spades-diamonds-clubs).  High-entropy states require a lot of information; pick any of the out-of-order arrangements of the deck of cards, and pretty much the only way to describe it is to list each card individually from the top of the deck to the bottom.
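
You can put a number on that difference.  The one factory-ordered deck is fully specified by the seven-word rule above, but pinning down an arbitrary shuffled deck means picking one ordering out of 52 factorial -- which costs about log2(52!) bits.  A quick sketch:

# How many bits does it take to specify one particular shuffled deck?  An
# arbitrary permutation of 52 cards is one of 52! equally likely possibilities,
# so it needs about log2(52!) bits of description.
from math import factorial, log2

arrangements = factorial(52)
bits_needed = log2(arrangements)

print(f"Possible orderings of a deck: {float(arrangements):.3e}")
print(f"Bits to specify one arbitrary ordering: about {bits_needed:.0f}")

Roughly 226 bits -- call it 28 bytes -- just to say which shuffled deck you're holding, versus essentially nothing for the ordered one.  That gap is the information-theoretic face of entropy.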

The current paper has to do with information stored inside machines, and like many formulations of the Second Law, it results in some seriously weird implications.  Consider, for example, a simple operation on a calculator -- 2+2, for example.  When you press the "equals" sign, and the calculator tells you the answer is four, have you lost information, or gained it?

Most people, myself included, would have guessed that you've gained information; you now know that 2+2=4, if you didn't already know that.  In a thermodynamic sense, though, you've lost information.  When you get the output (4), you irreversibly erase the input (2+2).  Think about going the other way, and it becomes clearer; someone gives you the output (4) and asks you what the input was.

No way to tell.  There are, in fact, an infinite number of arithmetic operations that would give you the answer "4".  What a calculator does is time-irreversible.  "Computing systems are designed specifically to lose information about their past as they evolve," said study co-author David Wolpert, of the Santa Fe Institute.

By reducing the information in the calculator, you're decreasing its entropy (the answer has less information than the input did).  And that means that the calculator is increasing entropy more somewhere else -- in this case, it heats up the surrounding air.

And that's one reason why your calculator gets warm when you use it.  "There's this deep relationship between physics and information theory," said study co-author Artemy Kolchinsky.  "If you erase a bit of information, you have to generate a little bit of heat."
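
Kolchinsky's "little bit of heat" has a standard number attached to it: Landauer's principle puts the floor at k_B T ln 2 of heat per erased bit.  Here's the arithmetic, with an assumed room temperature of 300 K:

# Landauer's principle: erasing one bit of information releases at least
# k_B * T * ln(2) of heat.  The 300 K room temperature is an assumed round value.
from math import log

k_B = 1.380649e-23     # Boltzmann constant, J/K
T = 300.0              # assumed room temperature, K

heat_per_bit = k_B * T * log(2)
print(f"Minimum heat per erased bit at {T:.0f} K: {heat_per_bit:.2e} J")

# Even erasing a gigabyte sits at a vanishingly small theoretical minimum.
bits_in_a_gigabyte = 8e9
print(f"Theoretical floor for erasing 1 GB: {heat_per_bit * bits_in_a_gigabyte:.2e} J")

Real electronics run many orders of magnitude above this floor, but the floor itself can't be engineered away -- it's the thermodynamic price of forgetting.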

But if everything you do ultimately increases the overall entropy, what does that say about the universe as a whole?

The implication is that the entire universe's entropy was at a minimum at its creation in the Big Bang -- that it started out extremely ordered, with very low information content.  Everything that's happened since has stirred things up and made them more chaotic (i.e., requiring more information for a complete description).  Eventually, the universe will reach a state of maximal disorder, and after that, it's pretty much game over; you're stuck there for the foreseeable future.  This state goes by the cheerful name the "heat death of the universe."

Not to worry, though.  It won't happen for a while, and we've got more pressing matters to attend to in the interim.

To end on a positive note, though -- going back to our original discussion of the increase of entropy as stemming from the likelihood of jumping from a disordered state back to an orderly one, recall that the chance isn't zero, it's just really really really small.  So once the heat death of the universe has occurred, there is a non-zero chance that it will spontaneously come back together into a second very-low-entropy singularity, at which point the whole thing starts over.  Yeah, it's unlikely, but once the universe is in heat death, it's not like it's got much else to do besides wait.

*********************************************

If Monday's post, about the apparent unpredictability of the eruption of the Earth's volcanoes, freaked you out, you should read Robin George Andrews's wonderful new book Super Volcanoes: What They Reveal About the Earth and the Worlds Beyond.

Andrews, a science journalist and trained volcanologist, went all over the world interviewing researchers on the cutting edge of the science of volcanoes -- including those that occur not only here on Earth, but on the Moon, Mars, Venus, and elsewhere.  The book is fascinating enough just from the human aspect of the personalities involved in doing primary research, but looks at a topic it's hard to imagine anyone not being curious about: the restless nature of geology that has generated such catastrophic events as the Yellowstone Supereruptions.

Andrews does a great job not only demystifying what's going on inside volcanoes and faults, but informing us how little we know (especially in the sections on the Moon and Mars, which have extinct volcanoes scientists have yet to completely explain).  Along the way we get the message, "Will all you people just calm down a little?", particularly aimed at the purveyors of hype who have for years made wild claims about the likelihood of an eruption at Yellowstone occurring soon (turns out it's very low) and the chances of a supereruption somewhere causing massive climate change and wiping out humanity (not coincidentally, also very low).

Volcanoes, Andrews says, are awesome, powerful, and fascinating, but if you have a modicum of good sense, nothing to fret about.  And his book is a brilliant look at the natural process that created a great deal of the geology of the Earth and our neighbor planets -- plate tectonics.  If you are interested in geology or just like a wonderful and engrossing book, you should put Super Volcanoes on your to-read list.

[Note: if you purchase this book using the image/link below, part of the proceeds goes to support Skeptophilia!]


Thursday, April 29, 2021

Watching the clock

 If I had to pick the scientific law that is the most misunderstood by the general public, it would have to be the Second Law of Thermodynamics.

The First Law of Thermodynamics says that the total quantity of energy and mass in a closed system never changes; it's sometimes stated as, "Mass and energy cannot be created or destroyed, only transformed."  The Second Law states that in a closed system, the total disorder (entropy) always increases.  As my long-ago thermodynamics professor put it, "The First Law says you can't win; the Second Law says you can't break even."

Hell of a way to run a casino, that.

So far, there doesn't seem to be anything particularly non-intuitive about this.  Even from our day-to-day experience, we can surmise that the amount of stuff seems to remain pretty constant, and that if you leave something without maintenance, it tends to break down sooner or later.  But the interesting (and less obvious) side starts to appear when you ask the question, "If the Second Law says that systems tend toward disorder, how can a system become more orderly?  I can fling a deck of cards and make them more disordered, but if I want I can pick them up and re-order them.  Doesn't that break the Second Law?"

It doesn't, of course, but the reason why is quite subtle, and has some pretty devastating implications.  The solution to the question comes from asking how you accomplish re-ordering a deck of cards.  Well, you use your sensory organs and brain to figure out the correct order, and the muscles in your arms and hands (and legs, depending upon how far you flung them in the first place) to put them back in the correct order.  How did you do all that?  By using energy from your food to power the organs in your body.  And to get the energy out of those food molecules -- especially glucose, our primary fuel -- you broke them to bits and jettisoned the pieces after you were done with them.  (When you break down glucose to extract the energy, a process called cellular respiration, the bits left are carbon dioxide and water.  So the carbon dioxide you exhale is actually broken-down sugar.)

Here's the kicker.  If you were to measure the entropy decrease in the deck of cards, it would be less -- way less -- than the entropy increase in the molecules you chopped up to get the energy to put the cards back in order.  Every time you increase the orderliness of a system, it always (1) requires an input of energy, and (2) increases the disorderliness somewhere else.  We are, in fact, little chaos machines, leaving behind a trail of entropy everywhere we go, and the more we try to fix things, the worse the situation gets.

I've heard people arguing that the Second Law disproves evolution because the evolutionary model claims we're in a system that has become more complex over time, which according to the Second Law is impossible.  It's not; and in fact, that statement betrays a fundamental lack of understanding of what the Second Law means.  The only reason why any increase in order occurs -- be it evolution, or embryonic development, or stacking a deck of cards -- is because there's a constant input of energy, and the decrease in entropy is offset by a bigger increase somewhere else.  The Earth's ecosystems have become more complex in the 4.5 billion year history of life because there's been a continuous influx of energy from the Sun.  If that influx were to stop, things would break down.

Fast.

The reason all this comes up is a paper this week in Physical Review X that gives another example of trying to make things better, and making them worse in the process.  This one has to do with the accuracy of clocks -- a huge deal to scientists who are studying the rate of reactions, where time needs to be measured to phenomenal precision, on the scale of nanoseconds or better.  The problem, we learn from "Measuring the Thermodynamic Cost of Timekeeping," is that the more accurate the clock, the more entropy its workings produce.  So, in effect, you can only measure time in a system to the extent you're willing to screw the system up.

[Image licensed under the Creative Commons Robbert van der Steeg, Eternal clock, CC BY-SA 2.0]

The authors write:

All clocks, in some form or another, use the evolution of nature towards higher entropy states to quantify the passage of time.  Due to the statistical nature of the second law and corresponding entropy flows, fluctuations fundamentally limit the performance of any clock.  This suggests a deep relation between the increase in entropy and the quality of clock ticks...  We show theoretically that the maximum possible accuracy for this classical clock is proportional to the entropy created per tick, similar to the known limit for a weakly coupled quantum clock but with a different proportionality constant.  We measure both the accuracy and the entropy.  Once non-thermal noise is accounted for, we find that there is a linear relation between accuracy and entropy and that the clock operates within an order of magnitude of the theoretical bound.

Study co-author Natalia Ares, of the University of Oxford, summarized their findings succinctly in an article in Science News: "If you want a better clock," she said, "you have to pay for it."
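
To get a feel for what "paying for it" means, here's a minimal sketch.  The exact proportionality constant between accuracy and entropy isn't something I'm taking from the paper -- the factor of 2*k_B below is an illustrative choice on my part -- but it shows how the bill scales:

k_B = 1.380649e-23   # Boltzmann constant, J/K
T = 300.0            # room temperature, in kelvins

# "Accuracy" N here means roughly how many ticks the clock produces before it
# has drifted by about one tick.  The paper finds N is at best proportional to
# the entropy produced per tick; the 2*k_B proportionality constant below is
# an assumption for illustration, not the paper's measured value.
for N in (1e3, 1e6, 1e9):
    S_per_tick = 2 * k_B * N        # assumed minimum entropy per tick, J/K
    Q_per_tick = T * S_per_tick     # heat dissipated per tick at 300 K, J
    print(f"N = {N:.0e}:  entropy/tick ~ {S_per_tick:.1e} J/K,  heat/tick ~ {Q_per_tick:.1e} J")

The absolute numbers are tiny, but the point is the scaling: a thousand-fold better clock costs (at least) a thousand times more entropy per tick, with no way around it.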

So, a little like the Heisenberg Uncertainty Principle: the more you try to push things in a positive direction, the more the universe pushes back in the negative direction.

Apparently, even if all you want to know is what time it is, you still can't break even.

So that's our somewhat depressing science for the day.  Entropy always wins, no matter what you do.  Maybe I can use this as an excuse for not doing housework.  Hey, if I make things more orderly here, all it does is mess things up elsewhere, so what's the point?

Nah, never mind.  My wife'll never buy it.

****************************************

When people think of mass extinctions, the one that usually comes to mind first is the Cretaceous-Tertiary Extinction of 66 million years ago, the one that wiped out all the non-avian dinosaurs and a good many species of other types.  It certainly was massive -- current estimates are that it killed between fifty and sixty percent of the species alive at the time -- but it was far from the biggest.

The largest mass extinction ever took place 251 million years ago, and it destroyed over ninety percent of life on Earth, taking out whole taxa and changing the direction of evolution permanently.  But what could cause a disaster on this scale?

In When Life Nearly Died: The Greatest Mass Extinction of All Time, University of Bristol paleontologist Michael Benton describes an event so catastrophic that it beggars the imagination.  Following researchers to outcrops of rock from the time of the extinction, he looks at what was lost -- trilobites, horn corals, sea scorpions, and blastoids (a starfish relative) vanished completely, but no group was without losses.  Even terrestrial vertebrates, who made it through the bottleneck and proceeded to kind of take over, had losses on the order of seventy percent.

He goes through the possible causes of the extinction and the evidence for each, along the way painting a terrifying picture of a world that very nearly became uninhabited.  It's a grim but fascinating story, and Benton's expertise and clarity of writing make it a brilliant read.

[Note: if you purchase this book using the image/link below, part of the proceeds goes to support Skeptophilia!]


Tuesday, September 22, 2020

Towards zero

Most scientifically literate types know about the impossibility of reaching the temperature of -273.15 C, better known as "absolute zero."  The way most of us would explain why it's impossible goes back to one formulation of the Second Law of Thermodynamics.  Since the Second Law says that heat always flows from a hotter object to a colder one unless you put energy into the system (I always called this "the Refrigerator Principle" in my biology classes), once you get near absolute zero it takes ever more energy to remove that last bit of heat from the object you're cooling, and to go all the way to zero would require (1) an infinite amount of energy to accomplish it, and (2) an infinite heat sink in which to dump the extracted energy.  (Despite having been conjectured for over a century, this way of looking at absolute zero was only rigorously proven from the mathematics of known physical laws just three years ago.)
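
You can see the Refrigerator Principle's problem quantitatively with the ideal-refrigerator formula.  Here's a minimal sketch -- the room temperature and the one-joule figure are just illustrative choices:

# An ideal (Carnot) refrigerator pumping heat Q out of an object at T_cold into
# a room at T_hot needs a minimum amount of work
#     W = Q * (T_hot - T_cold) / T_cold
# which blows up as T_cold approaches absolute zero.
T_hot = 300.0     # temperature of the room, in kelvins
Q = 1.0           # one joule of heat to remove at each stage
for T_cold in (77.0, 4.2, 1e-3, 1e-6, 1e-9):
    W = Q * (T_hot - T_cold) / T_cold
    print(f"T_cold = {T_cold:g} K  ->  minimum work ~ {W:.3g} J")

By the time you're down to a nanokelvin, removing a single joule of heat costs you around three hundred billion joules of work -- and the last step, to exactly zero, costs infinitely much.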

There's another way to look at it, though, which is a little harder to wrap your brain around, because it hinges on one of the most misunderstood laws of physics: the Heisenberg Uncertainty Principle.  Developed in 1927 by German physicist Werner Heisenberg, the Uncertainty Principle does not mean -- as I saw someone claim on a woo-woo website a while back -- that "because of the Uncertainty Principle, we now know that science can't ever prove anything."


What the Uncertainty Principle does tell us is, despite its name, extremely specific.  It says that for a particle, there are pairs of physical quantities called complementary variables, and for such a pair (call 'em A and B), the more we know about variable A, the less we can even theoretically know about variable B.  The most commonly cited pair of complementary variables is position and momentum (or, equivalently for our purposes, velocity).  If we know exactly where a particle is, we have no accessible information about its velocity, and vice versa.
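
For the position/momentum pair, the quantitative version is the famous inequality Δx · Δp ≥ ħ/2, where Δx is the uncertainty in position, Δp the uncertainty in momentum, and ħ the reduced Planck constant.  Squeeze Δx toward zero, and Δp has to balloon to keep the product above that floor.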

Note that the Uncertainty Principle is not about the inaccuracy of our measuring techniques.  It's not that the particle has a specific position and velocity and we just don't know what they are, the way you might watch a car speed by and think, "Okay, it was going so fast that at any given point along the road I don't know exactly how quickly it was moving."  This is a fundamental, built-in feature of the universe.  If I know a particle's position to a high degree of certainty, its velocity becomes correspondingly uncertain, and in fact the particle exists in a superposition of many possible velocities simultaneously.  Reality is inherently blurry, and the more you home in on one piece of it, the blurrier the rest gets.

So what does this have to do with absolute zero?  Well, the temperature of an object is a measure of the average speeds of its constituent particles.  So if you had an object at absolute zero, you'd know the positions of the particles (because they're not moving) to 100% accuracy, and you'd also know their velocities (zero) to 100% accuracy.

About as huge a violation of the Uncertainty Principle as you can get.

The reason all this comes up is a study at the University of Vienna, the subject of a paper last week in Science, in which a 150-nanometer bit of silica (made up of around a hundred million atoms) was cooled down to twelve millionths of a degree above absolute zero.  This turns out to be the true temperature limit for a particle that size: its motion was in the quantum ground state, the lowest energy it can have without violating Heisenberg's principle.  The particle was suspended in an "optical trap" -- the easiest way to describe it is that they levitated it with lasers -- and then allowed to fall freely so the researchers could observe its behavior.
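
To get a sense of why "twelve millionths of a degree" is a limit rather than an engineering shortfall, here's a rough sketch.  The trap frequencies below are my guesses at typical values for this kind of optical trap, not numbers from the paper:

import math

hbar = 1.0546e-34      # reduced Planck constant, J*s
k_B = 1.380649e-23     # Boltzmann constant, J/K

# A levitated nanoparticle in an optical trap oscillates like a tiny pendulum.
# Even in its quantum ground state it keeps zero-point energy hbar*omega/2,
# so the lowest meaningful "temperature" of that motion is roughly
#     T_min ~ hbar * omega / (2 * k_B)
for f_trap in (100e3, 300e3, 500e3):          # assumed trap frequencies, in Hz
    omega = 2 * math.pi * f_trap
    T_min = hbar * omega / (2 * k_B)
    print(f"trap at {f_trap/1e3:.0f} kHz  ->  T_min ~ {T_min*1e6:.1f} microkelvin")

Which lands squarely in the microkelvin range: below that, there's simply no motion left to take away without running afoul of Heisenberg.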

The researchers hope to use such experiments to shed some light on the behavior of gravity in the quantum world, something that has been a dream of physicists for a very long time.  While electromagnetism and the weak nuclear force have been shown to be manifestations of a single "electroweak" force -- and there are good theoretical reasons to think the strong nuclear force joins them at high enough energies -- gravitation has resisted all attempts to incorporate it into a unified theory that could simultaneously explain the gravitational warping of space and the strange behavior of the quantum world.  None of the candidates for such a theory (the most famous contender is string theory) has as yet panned out, but the search continues -- and the particle-cooling techniques described in last week's paper give a bit of hope that physicists will be able to isolate and study systems under conditions that make the mathematics tractable.

So that's our quantum weirdness for today.  Thanks to the friend and long-time loyal reader of Skeptophilia who sent me the link to the study.  I would never claim to understand it at any kind of deep level -- after all, no less a luminary than Richard Feynman famously said, "If you think you understand quantum mechanics, you don't understand quantum mechanics."  The fundamental workings of the universe are counterintuitive and mind-blowingly odd, and that's kind of where we have to leave it.

**********************************

Author Mary Roach has a knack for picking intriguing topics.  She's written books on death (Stiff), the afterlife (Spook), sex (Bonk), and war (Grunt), each one brimming with well-researched facts, interviews with experts, and her signature sparkling humor.

In this week's Skeptophilia book-of-the-week, Packing for Mars: The Curious Science of Life in Space, Roach takes us away from the sleek, idealized world of Star Trek and Star Wars, and looks at what it would really be like to take a long voyage from our own planet.  Along the way she looks at the psychological effects of being in a small spacecraft with a few other people for months or years, not to mention such practical concerns as zero-g toilets, how to keep your muscles from atrophying, and whether it would actually be fun to engage in weightless sex.

Roach's books are all wonderful, and Packing for Mars is no exception.  If, like me, you've always had a secret desire to be an astronaut, this book will give you an idea of what you'd be in for on a long interplanetary voyage.

[Note: if you purchase this book using the image/link below, part of the proceeds goes to support Skeptophilia!]