December 7, 2007

Behind the Scenes of the IPCC: Supercomputers, Teamwork, and the Nobel Prize

The 2007 Nobel Peace Prize was awarded to the Intergovernmental Panel on Climate Change (IPCC) and former Vice President Al Gore, "for their efforts to build up and disseminate greater knowledge about man-made climate change, and to lay the foundations for the measures that are needed to counteract such change." The prize was particularly meaningful for the National Center for Atmospheric Research (NCAR), where over 40 of these Nobel laureates work to develop and interpret the climate models that feed into the IPCC reports.

More goes into generating these models than meets the eye. The accomplishments of the IPCC are the result of decades of dedicated efforts by scientists, programmers, engineers, and support staff. They are also the culmination of intense planning, collaboration, and ingenuity. And along the way, the process engendered a true sense of community, with a recognition that this kind of achievement comes from many parts working together, and many small but vital pieces falling into place.

One key component was developing the Community Climate System Model (CCSM). The CCSM, run on some of the world’s most powerful supercomputers, simulates the many interconnected events that drive Earth’s climate. These include changes in the atmosphere and oceans, the ebb and flow of sea ice, and the subtle impacts of forests and rivers.

The CCSM is unique among powerful climate models. Primarily supported by the National Science Foundation (NSF) and the Department of Energy (DOE), with additional support from the National Aeronautics and Space Administration (NASA), it belongs to the entire community of climate scientists, rather than to a single institution. Hundreds of specialists from across the United States and overseas collaborate on improvements to the CCSM. The model’s underlying computer code is freely available on the Web. As a result, scientists throughout the world can use the CCSM for their climate experiments.

Another crucial element was computing capability and capacity. As NCAR developed a timeline to provide model data to the community, managers at NCAR’s Computational and Information Systems Laboratory (CISL) quickly recognized the need to move up the acquisition of the next supercomputer to complete the intensive model runs on time. Doing that meant advancing the normal procurement schedule by over a year and adding several million dollars’ worth of additional compute nodes dedicated to the IPCC production runs.

CISL also worked diligently to minimize the risk of computer downtime, conducting a series of risk reduction exercises that studied different scenarios to address potential problems and threats. Tom Bettge led the initiative, and it paid off when hackers attacked a number of government supercomputers just as the main IPCC runs were going into production. The attack led NCAR to temporarily shut down all external access to its supercomputers, but the IPCC simulations, already configured to operate under such conditions, continued to run uninterrupted. This was particularly important, because NCAR was one of three U.S. institutions (along with Oak Ridge National Laboratory and the National Energy Research Scientific Computing Center) to provide dedicated computer time for the models.

“NCAR recognized what it would take to do these model runs and made those resources available in the form of dedicated computer time,” says project manager Lawrence Buja. “Dedicated time is a two-edged sword. When you have sole access to a significant portion of a supercomputer, you can do remarkable things. But, during that same time, nobody else can run much developmental research. It was a huge responsibility, because if our models stop running, we become accountable not only for the wasted resource, but the missed research opportunity.”

To ensure that valuable computing time was never lost due to code problems, Buja ran automated monitoring programs that continuously checked the status of the IPCC runs. If any of the simulations ran into a problem, the monitor program automatically called Buja’s cell phone to notify him. “After a few late-night calls, I was very motivated to ensure that the IPCC run environment was as robust and fault-tolerant as it could be,” he says.
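The watchdog Buja describes can be sketched in a few lines. Everything below is illustrative rather than NCAR's actual tooling: the job names, the one-hour stall threshold, and the `find_stalled`/`alert` helpers are assumptions, and a real monitor would page a phone rather than print to the console.

```python
# Minimal sketch of an automated run monitor, in the spirit of the
# system described above. Names and thresholds are hypothetical.
import time
from dataclasses import dataclass


@dataclass
class Job:
    name: str
    last_output_time: float  # epoch seconds of the run's most recent output


def find_stalled(jobs, now, timeout=3600.0):
    """Return jobs whose output has not advanced within `timeout` seconds."""
    return [job for job in jobs if now - job.last_output_time > timeout]


def alert(job):
    # A production monitor would dial out or page here; we just print.
    print(f"ALERT: {job.name} appears stalled")


if __name__ == "__main__":
    now = time.time()
    jobs = [
        Job("ipcc_scenario_run_a", now - 120),   # wrote output 2 minutes ago
        Job("ipcc_scenario_run_b", now - 7200),  # silent for 2 hours
    ]
    for job in find_stalled(jobs, now):
        alert(job)
```

In practice such a loop would run from a scheduler every few minutes, with the "last output" timestamp read from the simulation's log or restart files.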

Buja’s philosophy did more than ensure productivity for the climate model runs. It set a new performance standard for the organization. But Buja credits the effectiveness of the CISL computing and data support infrastructure for the success of the IPCC production effort. “Scoping the required computing and data resources with CISL managers was just the first step in a process that involved many groups across the organization,” says Buja. “ESS and NETS provided a stable environment. HSS worked with us one-on-one to keep the simulations running as smoothly and efficiently as possible. And MSS and VETS helped deliver the data to the global science community. This was a true team effort.”

In creating and providing climate model data to the global scientific community, and in particular to the IPCC, NCAR met and exceeded its goals: the CCSM became the single largest contributor of model data to the IPCC of any group in the world.



The Computational and Information Systems Laboratory (CISL) is part of the National Center for Atmospheric Research (NCAR) in Boulder, Colorado. NCAR is operated by the University Corporation for Atmospheric Research under the primary sponsorship of the National Science Foundation.