Fusion energy is, simply put, the opposite of fission energy, which comes from splitting atoms and powers today's nuclear plants and weapons. Fusion occurs constantly in our sun, which produces most of its energy via the nuclear fusion of hydrogen into helium. When nuclei fuse, they form a heavier nucleus, and a small fraction of the original mass is converted into energy in the process.
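That "leftover" energy comes from the tiny mass difference between the fusing nuclei and the heavier nucleus they form, via E = mc². A minimal back-of-the-envelope sketch for the sun's hydrogen-to-helium reaction (the atomic masses here are standard reference values, not figures from this article):

```python
# Rough sketch: energy released when four hydrogen atoms fuse into one
# helium atom, using E = mc^2 expressed in atomic mass units.
M_H = 1.007825      # atomic mass of hydrogen-1, in u (standard reference value)
M_HE = 4.002602     # atomic mass of helium-4, in u (standard reference value)
U_TO_MEV = 931.494  # energy equivalent of 1 u, in MeV

mass_lost = 4 * M_H - M_HE          # ~0.0287 u, about 0.7% of the input mass
energy_mev = mass_lost * U_TO_MEV   # ~26.7 MeV per helium nucleus formed
print(f"{mass_lost:.4f} u lost -> {energy_mev:.1f} MeV released")
```

A fraction of a percent of the mass sounds small, but scaled up to the sun's output it is enormous; the same mass defect is what any fusion power plant would tap.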
Fusion doesn’t produce runaway chain reactions the way fission can, so there’s no need to worry about meltdowns. Nor do fusion reactions produce the large amounts of long-lived radioactive waste that fission reactions do. That’s why it’s such a dreamy source of energy.
So why do we have fission power, but not fusion power? The answer is pretty simple, and very frustrating to a lot of physicists out there: for fusion to occur on Earth, you need a temperature of at least 100 million degrees Celsius, more than six times hotter than the core of the sun. The sun is a natural fusion reactor that makes up for its measly 15 million degrees with the intense pressure created by gravity at its core. Here on Earth, the energy you’d currently need to put in to produce that kind of heat or pressure is much, much higher than what you’d get out in usable energy.
That’s where the term cold fusion, the hope that fusion reactions can occur at relatively low temperatures, comes in. Once a promising theoretical goal, the field was largely written off as pseudoscience in the late 1980s, when electrochemists Stanley Pons and Martin Fleischmann reported that their room-temperature electrolysis experiment had produced so much excess heat, along with nuclear by-products like tritium, that only a nuclear reaction could explain it. The announcement set off a massive wave of cold-fusion experiments, but no one was able to replicate the heat anomaly. A Department of Energy review panel found the evidence unconvincing, and a second panel convened in 2004 reached the same conclusion.
That doesn’t mean scientists aren’t still working on making (hot) fusion a viable energy source. In a New York Times op-ed in November, Stewart C. Prager of the Princeton Plasma Physics Laboratory called the process of creating viable energy from fusion “a grand scientific challenge.”
Today, fusion reactions occur in tokamaks: doughnut-shaped vacuum chambers into which gas is pumped while electricity flows through a central column (the doughnut’s hole). The gas becomes ionized, forming a plasma that is then locked inside the vacuum chamber by magnetic fields (created by massive magnetic coils), mimicking the pressure of the sun’s core. Radio waves and microwaves fired into the plasma raise its temperature, and at around 100 million degrees fusion can occur. The main barrier to a sustained reaction, other than the high cost of the electricity needed to heat the plasma, is finding a material that can withstand that much heat for more than a few seconds.
Prager points out that reaction outputs have come a long way in the past few decades, from milliwatts in the 1970s to 16 megawatts, though that record, which returned about 65 percent of the power put in, was set in 1997. He believes that more time and support are all scientists need to make this “nearly ideal energy source” a reality. Then again, claims that net fusion power is just around the corner have been made for decades.
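The 65 percent figure is a ratio of fusion power out to heating power in, which fusion researchers call Q; a reactor only breaks even at Q = 1. A quick sketch of the arithmetic, assuming a heating input of about 24 megawatts (that input figure is an assumption consistent with the percentage quoted above, not a number stated in this article):

```python
# Sketch of the fusion gain factor Q for the 1997 record shot.
# Q = fusion power produced / heating power supplied; Q >= 1 is "breakeven".
p_out_mw = 16.0   # fusion power produced (the record cited above)
p_in_mw = 24.0    # heating power supplied -- an assumed figure

q = p_out_mw / p_in_mw
print(f"Q = {q:.2f}")  # ~0.67, still short of breakeven at Q = 1
```

In other words, even the record-setting shot consumed more power than it produced, which is why "net fusion power" remains the milestone everyone is chasing.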
Still, Steve Cowley, director of the Culham Centre for Fusion Energy, tells PM that the main barrier is getting government funding. “It’s expensive research that can only be done at large scales,” he says, “and nobody sees the need right now. Every time there’s talk about climate change, funding goes up for a while.” But not enough to get the first power plants built.
“For $20 billion in cash,” he says, “I could build you a working reactor. It would be big, and maybe not very reliable, but 25 years ago we didn’t even know if we’d be able to make fusion work. Now, the only question is whether we’ll be able to make it affordable.”