“For all sad words of tongue and pen, The saddest are these, ‘It might have been.’”
— John Greenleaf Whittier
It’s been obvious for a long time that global warming would be a lot less of an urgent problem if environmentalists hadn’t killed nuclear power in the 1970s. Now an Australian researcher has totaled up the costs of anti-nuclear environmentalism.
Peter Lang, a researcher at Australian National University and a strong advocate of nuclear energy, calculates that if nuclear energy development hadn’t been derailed in the 1970s by environmental opposition and government over-regulation, the United States would be getting 80 percent of its electricity from carbon-free nuclear energy today, instead of the approximately 20 percent it currently gets.
(According to the U.S. Energy Information Administration, in 2017, 62 percent of U.S. electricity was produced with fossil fuels. Renewables accounted for 17 percent, of which 7.5 percent came from hydropower and 7.6 percent from wind and solar.)
France, which made an all-in commitment to nuclear energy in the 1970s, currently gets about 80 percent of its electricity from nuclear power.
Lang found that in the late 1960s and the 1970s, the deployment rates of nuclear power plants and the learning curves reflecting engineering and technological advances in the nuclear power industry changed from showing “rapidly falling costs and accelerating deployment to rapidly rising costs and stalled deployment.”
“Had the early rates continued, nuclear power could now be around 10 percent of its current cost,” he said, and by 2015, nuclear power could have replaced up to 100 percent of coal-generated and 76 percent of gas-generated electricity, thereby avoiding up to 540,000 deaths from fossil-fuel-produced pollution and the addition of 11 billion tons of carbon dioxide to the atmosphere in that year alone.
Among Lang’s other findings:
• Had nuclear plant deployment continued at mid-1970s rates, over the last 30 years more than 186,000 terawatt-hours of fossil-fuel-generated electricity could have been displaced by carbon-free electricity — avoiding the release of 174 billion tons of CO2 and 9.5 million air pollution deaths.
• Cumulative global CO2 emissions 1985-2015 would have been 18 percent lower, and annual CO2 emissions would have been one-third less.
• The construction cost of the Oyster Creek Nuclear Generating Station in New Jersey, the oldest operating nuclear plant in the U.S., measured in 2017 dollars was $595 million. America’s newest nuclear plant, at Watts Bar in Tennessee, opened in 2016. It cost more than $7 billion, and construction lasted more than 10 years.
According to an article on Lang’s study in the libertarian publication Reason, the fall of nuclear power in the United States began with a 1971 decision by the U.S. Court of Appeals for the D.C. Circuit in Calvert Cliffs’ Coordinating Committee, Inc. v. Atomic Energy Commission.
The case, which is considered the first major judicial interpretation of the National Environmental Policy Act (NEPA), required the U.S. Atomic Energy Commission (AEC) to comply with a mandate to prepare environmental impact statements for all proposed new nuclear power plants.
There’s nothing wrong with requiring environmental impact statements. There’s plenty wrong with requiring environmental impact statements that take years to complete and hold up projects through procedural delay rather than substantive findings.
At any rate, the AEC reacted to the Calvert Cliffs decision by suspending all licensing for nuclear power plants for 18 months while it devised new rules. Carnegie Mellon historian Andrew Ramey maintains the Calvert Cliffs ruling was “the opinion which had the most far-reaching and detrimental effect on the development of nuclear power.”
The Energy Reorganization Act of 1974 abolished the AEC and handed its regulatory powers to the newly created Nuclear Regulatory Commission, which focused almost exclusively on safety. This resulted in lengthening the construction times for nuclear power plants from four to 14 years.
The tightening regulations meant orders for new nuclear reactors almost completely dried up even before the 1979 Three Mile Island meltdown. Utilities, which still had to meet the country’s rapidly growing demand for electricity, turned to coal. Nearly 60 planned nuclear power plants were canceled.
And U.S. coal consumption, which was 390 million tons in 1974, grew to more than a billion tons, most of it going for electric power production.
Today, coal use in the U.S. is down to 677 million tons a year, not because of the use of renewables, but because the glut of natural gas produced by fracking has allowed gas to replace coal in electricity production.
Environmentalists argue that renewables can replace both fossil fuels and nuclear power. And the use of wind and solar power is increasing, both in the U.S. and globally. But so is fossil fuel use — and the amount of CO2 annually released into the atmosphere.
In other words, environmental and anti-nuclear activists are as responsible for global warming as any oil company or OPEC member.
Be careful what you wish for, because you just might get it. Good and hard.
(Peter Lang’s paper “Nuclear Power Learning and Deployment Rates; Disruption and Global Benefits Forgone” appeared in Energies – Open Access Journal of Energy Research, Engineering, and Policy. www.mdpi.com/journal/energies)
This opinion column does not necessarily reflect the views of Boulder Weekly.