I remember how stunned I was when I was in high school and found out that all energy release -- from striking a match to setting off a nuclear bomb -- goes back to Einstein's famous equation, that energy is equal to mass times the speed of light squared.
It all hinges on the fact that the Law of Conservation of Mass isn't quite right. If I set a piece of paper on fire inside a sealed box, the oft-quoted line in middle school textbooks -- that if I'd weighed the paper and the air in the box beforehand and then reweighed the ash and the air in the box afterward, they'd have identical masses -- isn't true. The fact is, the box would weigh less after the paper had burned completely.
The reason is that some (a very tiny amount, but some) of the mass of the paper would have been converted to energy according to Einstein's equivalency, and that's where the heat and light of the fire came from. Thus, the box and its contents would have less mass than they started with.
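Just how tiny is that mass loss? Here's a quick back-of-the-envelope calculation. The heat-of-combustion figure is my assumption (roughly that of cellulose, around 17 MJ/kg); the rest is just Einstein's equation run backward:

```python
# How much mass vanishes when a kilogram of paper burns completely?
# E = m * c^2, so the mass converted is m = E / c^2.
# Assumption: paper (mostly cellulose) releases roughly 17 MJ per kg burned.
C = 299_792_458            # speed of light in m/s (exact, by definition)
HEAT_OF_COMBUSTION = 17e6  # J per kg of paper, approximate

paper_kg = 1.0
energy_released = paper_kg * HEAT_OF_COMBUSTION  # joules of heat and light
mass_lost = energy_released / C**2               # kilograms converted to energy

print(f"Energy released: {energy_released:.2e} J")
print(f"Mass converted:  {mass_lost:.2e} kg (~{mass_lost * 1e9:.2f} micrograms)")
```

A whole kilogram of paper loses only a fifth of a microgram of mass — far too small a change for any middle-school scale to detect, which is why the "Law" looked exact for so long.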
The mind-boggling truth is that when you burn a fossil fuel -- oil, coal, or natural gas -- you are re-releasing energy from the Sun that was stored in the tissues of plants in the form of a little bit of extra mass during the Carboniferous Period, three-hundred-odd million years ago.
So to fix the problem with the "Law," we have to account for the shifting back and forth between matter and energy. If you change it to a conservation law of the total -- that the sum of the mass and energy stays constant in a closed system -- it's spot-on. (This is, in essence, the First Law of Thermodynamics -- conservation of energy -- with mass counted as one more form of energy.)
How much energy you can get out of anything depends, then, on only one thing: how much of its mass you can turn into energy. This is the basis of (amongst many other things) what happens in a nuclear power plant. As Henri Becquerel, Marie Skłodowska Curie, Pierre Curie, and others showed in the early twentieth century, the atoms of one element can be turned into the atoms of a different element -- the dream of the alchemists -- and the amount of energy required or released by that process is described by something called the binding energy curve.
Consider, for example, going from uranium (at the far right end of the graph) to any of the mid-weight elements uranium breaks down into when it undergoes nuclear fission. What those are, specifically, isn't that important; they all lie on the flattish part of the curve between iron (Fe, the most tightly bound nucleus) and uranium. Going from uranium to any of those is an upward move on the graph, and thus releases energy. Doesn't look like much, right? Well, that "small" release is what generates the energy in a nuclear power plant -- and in bombs of the type that destroyed Hiroshima.
Now check out the other end of the graph -- the elements for which fusion is the energy-releasing transformation.
Go, for example, from hydrogen-1 (the very bottom left corner of the graph) to helium-4 (the sharp local peak at around 7 MeV), and compare the size of that leap with the one from uranium to any of its fission products. This transition -- hydrogen-1 to helium-4 -- is the one that powers the Sun, and it's what scientists would like to get going in a fusion reactor.
See why? I could sit down and calculate the exact difference in energy release between fission and fusion, but the short answer is: it's huge. Per unit mass of fuel, fusion releases nearly ten times as much energy. Also, the fuel for fusion, hydrogen, is by far the most abundant element in the Solar System; it's kind of everywhere. Not only that, the waste product -- helium -- is completely harmless and inert, compared to fission waste, which remains dangerous for centuries.
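Here's a rough version of that calculation, using approximate binding-energy-per-nucleon values read off the curve (the specific numbers, especially the average for the fission fragments, are my assumptions for illustration):

```python
# Back-of-the-envelope comparison of fission vs. fusion energy yield,
# using approximate binding energies per nucleon (in MeV) from the curve.
BE_PER_NUCLEON = {
    "H-1": 0.0,               # a lone proton has no binding energy
    "He-4": 7.07,
    "U-235": 7.59,
    "fission fragments": 8.4,  # rough average for mid-weight fragments
}

# Fission: the energy released is the gain in binding energy,
# summed over all 235 nucleons of the uranium nucleus.
fission_mev = 235 * (BE_PER_NUCLEON["fission fragments"] - BE_PER_NUCLEON["U-235"])
fission_per_nucleon = fission_mev / 235

# Fusion: four hydrogen-1 nuclei end up as one helium-4 nucleus (4 nucleons).
fusion_mev = 4 * (BE_PER_NUCLEON["He-4"] - BE_PER_NUCLEON["H-1"])
fusion_per_nucleon = fusion_mev / 4

print(f"fission: ~{fission_mev:.0f} MeV per reaction, "
      f"{fission_per_nucleon:.2f} MeV per nucleon of fuel")
print(f"fusion:  ~{fusion_mev:.1f} MeV per reaction, "
      f"{fusion_per_nucleon:.2f} MeV per nucleon of fuel")
print(f"fusion yields roughly {fusion_per_nucleon / fission_per_nucleon:.0f}x "
      f"more energy per unit of fuel mass")
```

Note the twist: each individual fission releases far more energy (~190 MeV) than each fusion (~28 MeV), but a uranium nucleus is 235 nucleons of fuel while four hydrogens are only 4 -- so per kilogram of fuel, fusion wins handily.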
That's why the scientists want so desperately to get fusion going as a viable energy source.
The problem, of course, is practicality. The fusion reactions in the Sun keep going because the heat and pressure in the core are sufficient for hydrogen nuclei to overcome their mutual electrostatic repulsion, crushing them together and driving a chain of reactions (the proton-proton chain) that ends in helium-4 -- and releasing a crapload of energy in the process. Maintaining those conditions in the lab has turned out to be extraordinarily difficult; it has always taken (far) more energy to trigger nuclear fusion than came out of it, and the reactions are self-limiting, collapsing in a split second. It's what gave rise to the sardonic quip, "Practical nuclear fusion is fifty years in the future... and always will be."
Well -- it seems like "fifty years in the future" may have just gotten one step closer.
It was just announced that for the first time ever, scientists at the amusingly-named National Ignition Facility in Livermore, California have created a nuclear fusion reaction that produced more energy than the lasers put into it. This proof of concept is, of course, only a first step, but it demonstrates that practical nuclear fusion might not be the pipe dream it has seemed since its discovery almost a century ago.
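In concrete terms, the widely reported figures from the December 2022 announcement were about 2.05 megajoules of laser energy delivered to the target and about 3.15 megajoules of fusion energy released -- a gain factor Q of just over 1.5:

```python
# Scientific breakeven: gain factor Q = fusion energy out / laser energy in.
# The numbers below are the widely reported figures from the December 2022
# NIF announcement, not measurements of my own.
laser_in_mj = 2.05    # laser energy delivered to the fuel target (MJ)
fusion_out_mj = 3.15  # fusion energy released by the target (MJ)

Q = fusion_out_mj / laser_in_mj
print(f"Q = {Q:.2f}")  # Q > 1: more fusion energy out than laser energy in
```

Worth noting: this bookkeeping counts only the energy the lasers delivered to the target. The facility drew far more energy from the grid to power those lasers, which is part of why this is a proof of concept rather than a power plant.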
"This is a monumental breakthrough," said Gilbert Collins of the University of Rochester in New York, a physicist who has collaborated in other NIF projects but was not involved the current research. "With this achievement, the landscape has changed... comparable to the invention of the transistor or the Wright brothers’ first flight. We now have a laboratory system that we can use as a compass for how to make progress very rapidly."