Supercomputers, Climate Models and 40 Years of the World Climate Research Programme

The world’s ability to adapt to climate change is on the agenda as the venerable international climate research program charts its next decade.

Computer models are becoming detailed enough to account for eddies, key components of ocean circulation. This visualization shows oxygen levels at the ocean's surface off Newfoundland and Nova Scotia. Red indicates more oxygen. Credit: Mariona Claret/University of Washington


CHEYENNE, Wyoming — On the western rim of the Great Plains, a futuristic building looks out of place on the treeless range. The NCAR-Wyoming Supercomputing Center is indeed remote, two continents away from the weather forecasts it produces each day for Antarctica and millions of miles from the space weather it tracks.

And yet, it is surprisingly connected to the rapidly evolving story of Earth’s changing climate.

The supercomputer inside the concrete and steel building, dubbed Cheyenne, is one in a global network of high-capacity computers that run vast sets of calculations to simulate the workings of our planet.

Since the 1970s, when early computer models of the climate were emerging, they’ve been remarkably successful at projecting global temperature changes as greenhouse gas emissions rise. Over the years, the computers have become faster and more powerful, enabling the models to incorporate more of the influences on the climate and zoom in with finer-scale resolution to depict local climate changes.

Accurately gauging the risks posed by climate change at a local level is among the holy grails of climate modeling, and world-changing advances likely await the next generation of ultrafast supercomputers. At only three years old, Cheyenne is already set to be replaced by a successor with triple its speed or better.



Antonio Busalacchi is president of the University Corporation for Atmospheric Research, the consortium that oversees NCAR, which operates the Wyoming supercomputer in partnership with the National Science Foundation. He is leading the team working to procure the next Cheyenne.

Their goal for the new supercomputer is to understand our changing climate in ways that will help the world prepare for and adapt to the effects of rising global temperatures before it’s too late. That could mean knowing how much sea level will rise along each mile of coastline in 10, 20 or 50 years, how changes in precipitation patterns will affect water availability and drought in every community in the West, or the precise path and intensity of hurricanes before they strike. It’s what Busalacchi describes as “actionable science.”

“In order to deliver on that promise, we need to have better predictive tools,” he said. “We’re really at sort of a juncture in our future of being able to predict the Earth as a coupled system to help answer these pressing questions that society is asking of us.”

40 Years of the World Climate Research Programme

Busalacchi is the former science steering committee leader at the World Climate Research Programme, which helps set future climate research priorities and coordinates international research among thousands of scientists to target the most pressing climate science problems at hand.

This weekend, he will join 150 of the world’s leading climate experts at a symposium in San Francisco to celebrate the program’s 40th anniversary and to plan its research strategy for the next decade. It coincides with the centennial of the American Geophysical Union (AGU), which hosts the largest gathering of Earth, space and climate scientists. Like the supercomputer upgrade in Wyoming, the meeting is taking place at a critical juncture in the history of climate science, with a growing sense of looming crisis around the world.

The World Climate Research Programme has laid a path for innovation in climate modeling since it was founded in 1980 by the World Meteorological Organization and the International Council for Science. It formed with a long-range objective to seek “a better understanding of the climate system and the causes of climate variability and change.”

The program has since helped spawn some of the world’s most important climate research, from efforts to understand monsoons in Africa to future sea ice cover in the Arctic and adapting food systems to cope with global change. It targets information gaps that no single country is likely to fill on its own by bringing together scientists and computing power from around the world.

Computer modeling is at the heart of much of that work.

Graphic: The Growth of Climate Modeling

Climate models are based on the physics, chemistry and biology of the natural world and assumptions about factors that have affected and will affect the climate, such as levels of greenhouse gas emissions. It turns out they have been remarkably accurate in projecting global temperature rise dating back to the 1970s, when computing power was a fraction of what scientists have to work with today.

Earlier this week, a group of scientists published a peer-reviewed paper comparing the early climate models published between 1970 and the mid-2000s with what actually happened. They found that 14 of the 17 early models’ projections about temperature change as emissions rise were almost indistinguishable from the observed record.

Today’s computer models are far more complex, requiring supercomputers to account for everything from the forces melting Antarctica’s ice to the impact of vegetation on temperature and moisture. But there are still uncertainties, such as how aerosols affect cloud formation, which in turn influences temperature, and how and when tipping points such as the loss of sea ice or the thawing of permafrost will trigger faster global changes.

The next generation models—running on even more powerful supercomputers—are being designed to incorporate more detail to help answer increasingly difficult questions.

From Billions of Computations to Quintillions

Busalacchi started his career as an oceanography graduate student, and in the late 1970s, his research took him to NCAR’s Boulder campus to use the first non-classified supercomputer.

The supercomputer he worked on as a graduate student performed about 160 million computations a second. Considered revolutionary at the time, it could model oceans, land, vegetation or the atmosphere—but not all at the same time.

In contrast, the world’s soon-to-be fastest computer, coming online at the Oak Ridge National Laboratory in 2021, will perform 1.5 quintillion computations per second. Not only will computers of the future be capable of processing information on everything from the upper atmosphere to ocean currents and all the details in between—such as sea spray, dust, ice sheets and biogeochemical cycles—but they will be able to do so while capturing the ways humans influence the climate and how climate change influences humans.
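The scale of that leap is easy to check with the article's own figures. A minimal back-of-the-envelope sketch, assuming the two machines' "computations per second" can be compared directly as raw operation rates:

```python
# Rough speedup implied by the article's numbers (an assumption: the two
# "computations per second" figures are treated as directly comparable).
early_rate = 160e6        # late-1970s NCAR machine: 160 million ops/sec
exascale_rate = 1.5e18    # planned Oak Ridge machine: 1.5 quintillion ops/sec

speedup = exascale_rate / early_rate
print(f"Speedup factor: {speedup:.3g}")  # about 9.4 billion times faster
```

In other words, the planned exascale machine would do in one second what the 1970s machine would have needed roughly 300 years to compute.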

Infographic: The Difference Resolution Makes in Climate Modeling

Busalacchi uses the first report of the Intergovernmental Panel on Climate Change as an example of how earlier climate science offered only fuzzy portrayals of complex climate systems. He recalls how models, based on the data computers were able to process at the time, generated maps with grid squares so big they represented much of Western Europe and lacked sufficient detail to show the British Isles or the Alps.

“Then, over the succeeding decades, the resolution got smaller and smaller and smaller,” he said, eventually enabling scientists to distinguish the region’s mountain ranges and river valleys.
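Why finer resolution demands so much more computing is a matter of simple geometry: halving the grid spacing quadruples the number of surface cells a model must track, before even counting vertical layers and shorter time steps. A minimal illustrative sketch, using assumed grid spacings rather than any model's actual configuration:

```python
# Illustrative only: how shrinking grid spacing multiplies the number of
# surface cells a global model must track. The spacings below are assumed
# for illustration, not the actual grids of any IPCC-era model.
EARTH_SURFACE_KM2 = 5.1e8  # approximate surface area of Earth

for dx_km in (500, 250, 100, 25):
    cells = EARTH_SURFACE_KM2 / (dx_km * dx_km)
    print(f"{dx_km:>3} km grid -> ~{cells:,.0f} surface cells")
```

And because finer grids also require more vertical levels and smaller time steps to stay numerically stable, the total cost grows far faster than the cell count alone.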

Global Race for High-Performance Computing

While many programs focused on the environment have been targeted for funding cuts under the Trump administration, that hasn’t been true for high-performance computers needed for climate research.

Busalacchi said the National Science Foundation, the largest source of NCAR’s funding, has provided around $99 million annually over the past few years to cover its research, 1,300 employees and the NCAR-Wyoming Supercomputing Center. And there’s not much concern about funding for the $30-40 million next-generation Cheyenne.

One factor driving the momentum behind high-performance computing is fierce international competition. According to the latest top-500 list for supercomputers, U.S. national laboratories host the world’s two most powerful supercomputers, with China holding the next two spots. It’s a contest that’s so intense, NCAR isn’t really trying to keep up in Wyoming. Cheyenne started in 20th place when it came online three years ago; last month it was 44th.

John Holdren, a Harvard University professor and science advisor to former President Barack Obama, says there’s another important reason supercomputing gets funding support: It’s not only scientists who prize it but business and government leaders, too, who want to use it for studying genomics, energy and other complex scientific problems.

“The reason we need even better computing than we already have, is that—for the purposes of adaptation, for taking steps that reduce the damage from the changes in climate that we can no longer avoid—we need more localized information,” he said.

“We’ve only recently gotten models good enough to reliably say how what happens in South Florida is going to differ from what happens in North Florida, how what happens in Iowa is going to differ from what happens in Nebraska,” Holdren said. “So, just in the climate change domain, there’s a very strong argument for continuing to increase the computing power.”

NCAR will be looking for a replacement for Cheyenne this spring, once major vendors have new technologies in the pipeline.

“We want to make sure that when that procurement goes out, we can take advantage of the latest and greatest technology with respect to high-performance computing,” Busalacchi explained.

The NCAR-Wyoming Supercomputing Center in Cheyenne, Wyoming. Credit: Judy Fahys

The fact that even the next Wyoming supercomputer won’t have a spot at the top of the world top-500 list doesn’t trouble Anke Kamrath, director of NCAR’s Computational and Information Systems Laboratory.

She calls the supercomputer contest a fight over “macho-flops” that doesn’t really capture all of the qualities that make a supercomputer valuable. Avoiding data bottlenecks is important too, as is data storage.

‘Trying to Minimize What We Don’t Know’

At the NCAR-Wyoming Supercomputing Center this fall, visitors peered through a viewing window to see the wiry guts of Cheyenne’s 1.5 petaflop predecessor, Yellowstone. Before being dismantled for sale as surplus government property, Yellowstone was a third as powerful as Cheyenne while taking up more than twice as much space.

During its short life, Cheyenne’s processors have whirred dutifully through nearly 20 million jobs. Some 850 projects are currently underway, moving data along wires hidden under the floor to storage stacks that scientists around the world are accessing via the web.

Busalacchi said the modeling, observations and high-performance computing are all essential tools that climate scientists need to address the urgent challenges ahead.

“We’re trying to minimize what we don’t know,” he said.

Correction: This story has been updated to correct the description of the 1970s-era computer to 160 million computations per second.