Nuclear Energy – White Knight or Dangerous Fantasy?


This article is by Stephanie Cooke, author of In Mortal Hands – A Cautionary History of the Nuclear Age, published in April.

 

The idea of nuclear energy as a white knight that will save the world from the evils inflicted by carbon-based fuels is nothing new. In 1965, the industry newsletter Nucleonics Week stated that compared with coal, “the one issue on which nuclear power can make an invincible case is the air pollution issue.” After that, utilities increasingly looked to nuclear as a solution to the problem. Then, in 1973, the year of the first Arab oil embargo, President Richard Nixon launched an energy plan called Project Independence to wean Americans off imported energy.

That year, the Atomic Energy Commission predicted that by the turn of the century one thousand reactors would be turning the lights on in homes and businesses across the country. Nuclear orders were booming. But after 1973, orders declined sharply as the 7 percent annual growth in electricity demand typical of the previous two decades plummeted by more than half, to about 3 percent.

At the same time, no one really knew whether nuclear energy was safe. A number of nasty accidents at reactors in the 1950s and early 1960s had either been hushed up or conveniently forgotten. Most of these involved experimental government-run reactors, but the most famous, apart from the disastrous Windscale reactor fire in Britain, ended with the permanent shutdown of the ill-fated Fermi breeder reactor near Detroit; that reactor was the product of a private utility venture.

In their current enthusiasm for nuclear energy, boosters have tended to overlook or dismiss the dark days of nuclear’s more recent past. The British scientist James Lovelock, for example, has dismissed the Three Mile Island accident as “a joke.” In reality, it was anything but. When the reactor tripped in the early morning hours of March 28, 1979, the world’s most highly trained nuclear operators (they had been part of Admiral Hyman Rickover’s nuclear navy corps) stared for more than two hours at a bank of blinking lights, unable to decipher their meaning, while approximately half the core melted. Then, for more than two days, while state and federal officials weighed the pros and cons of a general evacuation order, the nation’s leading nuclear experts debated whether a hydrogen bubble inside the reactor would lead to an explosion.

Luckily, the worst did not happen, but that is not the point: scientists and engineers did not have answers when they should have, and thousands of lives, even as far away as Washington, Baltimore and Philadelphia, according to one Nobel Prize-winning biologist, were put at risk.

Things, of course, did not turn out so well at Chernobyl, whose history is also being rewritten by rose-tinted optimists. The return of vegetation and animal life to the region immediately around the reactor, or a death toll of “only 56” (the number is disputed), is hailed as evidence that the damage was not so bad after all. But what about the thousands of excess cancer deaths predicted by the World Health Organization in a 2006 report? Or the thousands of childhood thyroid cancers already documented? Or, to quote the WHO study, “the massive relocations, loss of economic stability and long-term threats to health in current, and, possibly, future generations, that resulted in an increased sense of anomie and diminished sense of physical and emotional balance.” The disruption caused by Chernobyl is measured in decades, not days, or weeks, or years, and in the mental and physical health of millions of people.

When weighing electricity-generating options for the future, it seems sensible to opt for those that will churn out kilowatt-hours as reliably as possible and with the least possible harm to people and the biosphere. Nuclear energy’s merits are that it provides electricity without significantly adding to the planet’s carbon footprint and that, when functioning properly, it generates a reasonably reliable supply of baseload electricity. But what about its downside?

To begin with, even without accidents, nuclear energy leaves an environmental footprint lasting upwards of one million years, the length of time the U.S. Environmental Protection Agency says spent nuclear fuel (waste) must be kept from seeping into the air or groundwater. But the ‘footprint’ begins with the mining and milling of uranium for reactor fuel, a process that produces tailings and other residues containing cancer-causing radioactive elements. These in turn generate decay products with half-lives of 80,000 years or more. Even buried under several feet of soil (and probably left in the open air in some countries where mining takes place), there is no guarantee these tailings will remain secure or be prevented from leaching into the air or groundwater over even a relatively short period of time.

Reactors themselves are big heat generators that require large amounts of water for their cooling systems. During unexpected droughts, particularly in the United States and Europe, operators have been forced to cut back reactor operations at various times, and there is growing concern about the willingness of regulators to allow utilities to override their own norms on the temperature of water released into rivers and lakes. Older plants with once-through cooling systems, such as the Oyster Creek plant in New Jersey, are damaging aquatic life. Moreover, nuclear plants discharge large quantities of non-radioactive carcinogenic material, such as hydrazine and chromium, used as anti-corrosive agents in the reactors. There have also been reported leaks of tritium, and concern that more are going unreported.

When it comes to nuclear safety, the absence of a serious accident since TMI and Chernobyl has lulled many into thinking that reactors are better managed these days. To be sure, many improvements have been made, but that view overlooks the fact that hundreds of “events” are reported each year (and generally not made public). The list of what can go wrong includes primary coolant leaks, fuel degradation, fires and explosions, blackouts, hurricanes, tornadoes, floods and security breaches.

In July 2007, a strong earthquake shut down the Kashiwazaki-Kariwa nuclear complex, the world’s largest, consisting of seven reactors, forcing the complex’s owner, Tokyo Electric Power Co., to buy costly replacement fossil fuels. The loss of power from the plant over the following two years was also a major contributor to a one percent decline in nuclear energy’s share of worldwide electricity output (now about 15%), and to extremely low capacity factors in Japan. Just last week another Japanese utility faced the possibility of an extended nuclear plant outage after a 6.5-magnitude earthquake rocked an area near Tokyo. In the United States, there have been a number of worrying incidents, including the televised video of security guards sleeping on the job at the Peach Bottom plant in Pennsylvania and the discovery in 2002 of a rust hole at the top of the reactor pressure vessel at the Davis-Besse plant in Ohio – a situation that, had the unit been operating at the time, might have led to a steam explosion, triggering a runaway chain reaction.

From the time of President Dwight D. Eisenhower’s famous Atoms for Peace speech in 1953 (and even before), nuclear scientists were encouraged to apply their knowledge to the development of civilian nuclear energy, with the expectation that they would eventually find solutions to its many challenges. These, of course, include safety and the issue of how to dispose of the waste. But perhaps the toughest question concerns how to stop the mounting threat of proliferation. The largesse that flowed from Atoms for Peace led to an atmosphere of tolerance that allowed countries all over the world to acquire the technology and expertise necessary for bomb-making. The secret Israeli and Indian nuclear weapons programs took root in the Atoms for Peace firmament.

While attempts were made to apply the brakes with the establishment of the International Atomic Energy Agency in 1957, the Non-Proliferation Treaty in 1968, and the London Suppliers Group in 1974, they were not enough to stop proliferators intent on developing nuclear weapons. Most notorious among them was A.Q. Khan, who stole uranium enrichment technology from a civilian establishment in Holland; his effort ensured Pakistan’s success in developing its first nuclear weapons. Khan then went on to sell the technology to other countries, including Iran, Iraq, Libya and North Korea. But apart from the American, Russian and Chinese programs, every other effort to develop nuclear weapons, including those of Britain and France, benefited from the advent of civilian nuclear energy because of the cover it provided as well as the access it gave to nuclear fuel, equipment and technology.

Today’s talk of a nuclear renaissance is in part a legitimate attempt to provide an alternative to carbon-based fuels, although for many reasons nuclear energy is unlikely to fulfil its hoped-for promise. Trillions of dollars have been poured into nuclear development over the past fifty years, and nuclear reactors today account for just 15% of worldwide electricity output. Yucca Mountain in Nevada was never an ideal solution for a permanent waste repository in the United States, and now the Obama Administration has essentially ruled it out. That puts those of us in the U.S. back at the question we finally tried to resolve in 1982 with the passage of a comprehensive nuclear waste policy act. A blue-ribbon commission is being set up to study the question – all over again. In the meantime, the answers to the many other serious challenges posed by nuclear energy – proliferation probably most of all – still elude us.