DOE SciDAC Review Office of Science
CLIMATE: The World in Balance
Exascale Supercomputing Advances Climate Research
DOE Office of Science leadership computing systems have gone from 7 teraflop/s at the NERSC Center in 2003 to almost 3 petaflop/s at ORNL in 2009. By 2018 DOE supercomputers will achieve exaflop/s levels of performance. This million-fold increase over 16 years continues to accelerate advancements in fields from astrophysics and biology to medicine and nanotechnology. In climate science it may spur a quantum leap forward as supercomputers enable the largest climate datasets, highest-resolution climate models, and speediest, most complex climate simulations ever. With no less than the world at stake (observed emissions are above the worst-case scenario considered by the most recent IPCC assessment), exascale computing will help validate climate models and improve climate predictions. That means global stakeholders can ask "what if?" and get the best answers science can deliver.
 
Since the Industrial Revolution, human activities have been releasing greenhouse gases into the atmosphere, mainly carbon dioxide from fossil fuels and nitrous oxide and methane from farming. This century Earth's average temperature may rise by 1–6 degrees Celsius (2–12 degrees Fahrenheit) as a result, according to the United Nations' Intergovernmental Panel on Climate Change (IPCC), which shared a 2007 Nobel Prize with former vice president Al Gore. The expected consequences of atmospheric carbon levels that have risen more than 30% since the invention of the steam engine include melting glaciers, rising sea levels, and intensified heat waves, floods, and droughts.
Supercomputers help stakeholders explore their options and plan accordingly. In few fields are high-performance computing (HPC) systems as crucial as they are in climate research. Scientists incorporate what is known about climate into mathematical models and set the calculations in motion in simulations that reflect the planet's past and present and probe its future. The Advanced Scientific Computing Research (ASCR) program develops and deploys computational and networking tools that enable scientists to model, simulate, analyze, and predict phenomena important to the Department of Energy (DOE) Office of Science. ASCR created a roadmap to increase America's scientific computing capacity a million-fold, from teraflop/s (trillions of floating point operations, or calculations, per second) in 2004, to petaflop/s (quadrillions of floating point operations per second) in 2008, to tens of petaflop/s by 2011, to hundreds of petaflop/s by 2015, and on to exaflop/s (quintillions of floating point operations per second) by 2018. Exascale computing will generate the world's largest climate datasets with unprecedented speed, enabling high-resolution models that will improve predictive skill, add new science capabilities, and expand explorations of scenarios of interest to stakeholders (sidebar "Interagency Climate Research Partnerships Leverage America's Supercomputing Investments" p24).
J. DANIEL AND E. BRIGHT, ORNL
Figure 1. Coupled population and climate simulations aid humanitarian relief efforts. ORNL's Global Population Project helps thousands of experts coordinate disaster response, humanitarian relief, sustainable development, and environmental protection. To improve assessment of regional impacts of climate change, DOE's Scientific Discovery through Advanced Computing program is developing an ultra-high-resolution version of the CCSM tailored for the world's fastest supercomputers, such as Jaguar at ORNL and Dawn at LLNL. Scientist Mark Taylor at Sandia National Laboratories and colleagues ran a simulation of Earth's atmosphere that used the CCSM and 58,000 of Dawn's processors. To explore how climate change might put people at risk, the CCSM climate output was overlaid onto the LandScan population dataset. Simulation of the global water vapor distribution, which illustrates features of the hydrological cycle and planetary radiation budget, is shown here for December and January. It may reveal regions at risk for droughts and heat waves and guide stakeholders in taking action.
What if, for example, global warming shrinks glaciers, which supply much of Earth's fresh water for drinking, bathing, farming, and generating electricity? In the Himalayas alone, glaciers feed rivers supporting 2.4 billion people — one-third of the world's population. Himalayan glaciers could shrink significantly by 2035, according to the IPCC's fourth assessment report, AR4. Floods first and droughts later could ensue in China, India, Pakistan, Bangladesh, Myanmar, and Nepal. Climate change may alter lives and livelihoods by spurring famine, disease, migration, and war. Investigating possible scenarios with extreme-scale supercomputers may help stakeholders devise strategies for reducing risks through mitigation — doing what we can to reduce further warming, such as decreasing greenhouse gas emissions — and through adaptation, or coping with current climate-change impacts such as intensified heat waves (figure 1).
Another case in which ultrascale computing may lend insight is determining sea-level rise as polar ice melts. According to a 2008 United Nations Human Settlements Program report, 3,351 coastal cities — home to 10% of the world's population — are less than 10 meters (33 feet) above sea level. The IPCC estimates sea-level rise by the end of the 21st century will be 0.2 to 0.6 meters (0.6 to 1.9 feet). The more detailed simulations enabled at exascale may increase the level of certainty of such conclusions.
"Ultrascale systems will continue to accelerate the Department of Energy's mission of breakthrough science," says Jeff Nichols, associate laboratory director for computing and computational sciences at Oak Ridge National Laboratory (ORNL). The lab is home to Jaguar, the world's fastest supercomputer for civilian scientific research. "With increased computational capability, the scientific research community can obtain results faster, better understand complexities, and provide critical information to policymakers."
Yesterday's discoveries were built on the twin pillars of theory and experiment, but today's advances also rest on a third pillar — simulation. "High-performance computing is essential to progress in climate science," says James J. Hack, director of the National Center for Computational Sciences (NCCS), which features diverse computing resources including the Oak Ridge Leadership Computing Facility (OLCF) that houses Jaguar. "In many cases, simulation is the only way to get the answer. You can't pump the atmosphere full of gases and wait a hundred years to see what happens. At the petascale, simulations that once took months now take days."
"Ultrascale systems will continue to accelerate the Department of Energy’s mission of breakthrough science."
AR4 concluded that global warming is definitely happening and humans probably caused most of it since the mid-20th century. HPC systems at Lawrence Berkeley National Laboratory's (LBNL) National Energy Research Scientific Computing Center (NERSC) and ORNL's OLCF, working at the speed of teraflop/s, provided more than half of the joint DOE/National Science Foundation (NSF) data contribution to AR4. The DOE/NSF data contributed to a repository that scientists worldwide accessed to write approximately 300 articles that were published in peer-reviewed journals. The IPCC, which neither conducts research nor monitors climate but instead evaluates published scientific literature, cited those articles in AR4.
OLCF; NERSC; ALCF
Figure 2. DOE Office of Science major computing facilities. Enabling world-class research and scientific advancements, DOE Office of Science resources include: (top) Jaguar, a 2.6 petaflop/s Cray XT system at ORNL; (middle) Franklin, a 352 teraflop/s Cray XT system at LBNL; and (bottom) Intrepid, a 557 teraflop/s IBM Blue Gene/P HPC system at ANL.
Some of DOE's largest computing systems — Jaguar, a Cray XT system at ORNL; Franklin, a Cray XT system at NERSC; and Intrepid, an IBM Blue Gene/P system at Argonne National Laboratory (ANL) (figure 2) — will generate the world's largest datasets for studies that support the upcoming Fifth Assessment Report (AR5). The repository that scientists access to produce papers cited in AR5, expected in 2014, will include data from the OLCF and Argonne Leadership Computing Facility (ALCF), as well as climate-change attribution studies (that is, studies able to attribute a response to a specific forcing agent, such as atmospheric carbon dioxide) conducted at NERSC.
In 2008 through DOE's Innovative and Novel Computational Impact on Theory and Experiment (INCITE) program, the open scientific community gained access to its first petascale system. Upgrades since then have boosted Jaguar to a peak calculating speed of 2.6 petaflop/s for the combined system (2.332 petaflop/s from the XT5 component and 0.263 petaflop/s from the XT4 portion). The more than a quarter-million AMD Opteron processing cores working in concert can do in a day what it would take every person on Earth working together several hundred years to do — assuming each person could complete one calculation per second.
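A quick back-of-the-envelope check makes that comparison concrete. The sketch below assumes a sustained rate of about 1.75 petaflop/s and a world population of roughly 6.8 billion; both figures are illustrative assumptions rather than numbers from this article.

```python
# Back-of-the-envelope check of the "every person on Earth" comparison.
# Assumed (not from the article): sustained performance of ~1.75e15 flop/s
# and a world population of ~6.8 billion, each doing 1 calculation per second.

sustained_flops = 1.75e15                 # flop/s, assumed sustained rate for Jaguar
seconds_per_day = 86_400
ops_per_day = sustained_flops * seconds_per_day   # ~1.5e20 operations in one day

population = 6.8e9                        # assumed world population
human_ops_per_second = population * 1.0   # 1 calculation per person per second

seconds_needed = ops_per_day / human_ops_per_second
years_needed = seconds_needed / (365.25 * 86_400)
print(f"{years_needed:.0f} years")        # roughly 700 years, i.e. "several hundred"
```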
 
Petascale Projects, Exascale Ambitions
Immediately after Jaguar's petascale upgrade but before INCITE awardees began their runs, the OLCF granted early access to several computationally taxing climate projects.
One of these, a National Aeronautics and Space Administration (NASA) project headed by Max Suarez, simulated weather features more than 3 kilometers (2 miles) across — a resolution so fine it shows the detailed structure of cloud systems. In comparison, simulations cited in AR4 looked at weather features at roughly 140 kilometer (87 mile) resolution. Cloud processes regulate the flow of radiation in the upper atmosphere, control precipitation, and exert other important effects. Yet they are among the least understood of atmospheric processes. The petascale simulation may enrich understanding of how clouds affect climate (figure 3) and improve capabilities for predicting weather and simulating climate change.
STOCK PHOTO
Figure 3. Exascale computing will allow models to include new physics aimed at understanding how clouds affect climate.
A project led by V. Balaji of the National Oceanic and Atmospheric Administration's (NOAA's) Geophysical Fluid Dynamics Laboratory (GFDL) at Princeton University modeled natural and forced climate variability at a resolution high enough to realistically depict physical processes. The model explored the limits of climate prediction, the ability of the ocean's "memory" to modulate climate change, and the behavior of hurricanes in a warmer world. The simulations may provide insight into climate variability over decades and improve prediction of regional events. Balaji's early access to Jaguar at the OLCF, as well as Office of Science Director awards on Franklin at NERSC in 2007 and 2008, helped him prepare for his INCITE project. Called CHiMES (Coupled High-Resolution Modeling of the Earth System), the project uses 20 million processor hours to conduct a first-of-its-kind, high-resolution, multi-century suite of simulations to parse and predict both natural and forced variability in the coupled climate system. It is also producing a series of groundbreaking simulations of how hurricane counts and intensities might evolve in a changing climate. GFDL models on DOE computers have yielded a series of recently published articles that are already being cited in policy debates around climate extremes.
To mitigate emissions, Peter Lichtner of Los Alamos National Laboratory (LANL) has led a project, still ongoing, to weigh the long-term risks and rewards of capturing carbon from the atmosphere and storing it underground in saline aquifers. The project investigates the behavior of "supercritical" carbon dioxide, which is dense like a liquid but lighter than the aquifer brine into which it is injected. As supercritical carbon dioxide dissolves into the brine, the brine becomes heavier and more acidic and starts to sink, resulting in a convection current with "fingers" of sinking brine. Through simulations Lichtner's team, which includes Glenn Hammond of Pacific Northwest National Laboratory (PNNL) and Richard Mills of ORNL, is exploring this poorly understood process, which may speed the dissolution of carbon dioxide and play an important role in its ultimate fate. "Once we understand how supercritical carbon dioxide behaves underground, sequestration can be an important method for reducing greenhouse gases in the atmosphere," Lichtner says.
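The onset of this kind of density-driven fingering is commonly characterized with a Rayleigh–Darcy number, which compares buoyancy-driven flow through the porous rock with diffusion of the dissolved gas. The sketch below illustrates the estimate; every parameter value in it is an assumed, representative number, not a figure from Lichtner's project.

```python
# Illustrative estimate of the Rayleigh-Darcy number that governs the onset of
# density-driven convection ("fingering") of CO2-enriched brine in an aquifer.
# All parameter values are assumed, representative numbers for illustration only.

g    = 9.81     # gravity, m/s^2
drho = 10.0     # density increase of brine after CO2 dissolution, kg/m^3
k    = 1e-13    # aquifer permeability, m^2
H    = 100.0    # thickness of the brine layer, m
phi  = 0.2      # porosity
mu   = 5e-4     # brine viscosity, Pa*s
D    = 2e-9     # diffusivity of dissolved CO2 in brine, m^2/s

Ra = (drho * g * k * H) / (phi * mu * D)
Ra_critical = 4 * 3.14159265**2          # ~39.5 for convective onset in a porous layer

print(f"Ra = {Ra:.0f}; convective fingering expected: {Ra > Ra_critical}")
```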
Simulations may provide insight into climate variability over decades and improve prediction of regional events.
Now that those projects have paved the way for the larger scientific community to make the most of the resources, in 2010 the Leadership Computing Facilities (LCF) will award 1.6 billion processor hours on Intrepid and Jaguar to 66 research projects through the INCITE program, seven of which relate to climate change.
Crucial to climate research is an INCITE project led by Warren Washington of the National Center for Atmospheric Research (NCAR) in which a large community of scientists has used more than 20 million processor hours at NERSC and almost 50 million hours at the OLCF, and plans to use an additional 7.5 million hours at the ALCF to execute the next-generation Community Climate System Model (CCSM), a megamodel that couples independent models describing Earth's atmosphere, oceans, lands, and sea ice. DOE, NSF, and NASA fund development of the model, which contributes to the science missions of all three agencies. The researchers aim to develop a Climate-Science Computational End Station to solve grand challenges in climate science. End station users will try to predict future climate using emission scenarios that might result from adopting different energy policies. Climate community studies based on the project's simulations may improve the scientific basis, accuracy, and fidelity of climate models. Validating that models correctly depict the past improves confidence that simulations can accurately forecast responses to forcing agents.
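To make the idea of a coupled megamodel concrete, the schematic below sketches a driver loop in which independent component models exchange fluxes through a coupler once per coupling interval. The class names and exchanged fields are hypothetical illustrations, not the actual CCSM coupler design.

```python
# Schematic of a coupled climate model driver: independent component models
# advance in time and exchange boundary fluxes through a coupler.
# Class names and fields are hypothetical; this is not the CCSM architecture.

class Component:
    """A stand-in for an atmosphere, ocean, land, or sea-ice model."""
    def __init__(self, name):
        self.name = name
        self.state = {}                       # prognostic fields would live here

    def step(self, forcing, dt):
        # A real component integrates its own equations over the interval;
        # here we just record the forcing it received from the coupler.
        self.state["last_forcing"] = forcing
        return {f"{self.name}_flux": 1.0}     # fluxes handed back to the coupler

def run_coupled(n_intervals, dt=3600.0):
    atm, ocn, lnd, ice = (Component(n) for n in ("atm", "ocn", "lnd", "ice"))
    fluxes = {}                               # fields mediated by the coupler
    for _ in range(n_intervals):
        new_fluxes = {}
        for comp in (atm, ocn, lnd, ice):     # each component advances one interval
            new_fluxes.update(comp.step(fluxes, dt))
        fluxes = new_fluxes                   # exchange at the coupling boundary
    return fluxes

run_coupled(n_intervals=24)                   # e.g., one simulated day at hourly coupling
```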
Understanding the influence of the ocean on climate is the objective of several projects. Paola Cessi of the Scripps Institution of Oceanography at the University of California–San Diego, leads a project that has used 4.5 million processor hours at NERSC and more than 750,000 hours at the OLCF, with plans for an additional five million processor hours at the ALCF. With models based on conservation principles of heat, salinity, and momentum, the researchers will study how ocean circulation affects climate dynamics. The work may shed light on the feasibility of storing carbon dioxide in the oceans. Similarly, the research team of Susan Kurien (LANL), Leslie Smith (University of Wisconsin–Madison), Mark Taylor (Sandia National Laboratories), and Ramesh Balakrishnan (ANL) will use 25 million processor hours at the ALCF to study the effects of the small fluid dynamics scales, which are largely omitted from current climate models, on larger-scale resolved features such as global and regional ocean currents. The research will focus on idealized systems with rotation and stratification parameters that are relevant to real ocean and climate dynamics. An important past project, by Synte Peacock and Frank Bryan at NCAR and Mathew Maltrud at LANL, used Jaguar to carry out the first century-long global eddying ocean simulation. After the ocean absorbs such chemicals as carbon dioxide and chlorofluorocarbons from the atmosphere, it will store these substances for hundreds to thousands of years and circulate them around the globe, transporting them in complex and unknown ways (figure 5).
J. DANIEL, NCCS
Figure 5. The POP2 model shows ocean temperatures at 5 meters (16 feet) in this simulation of heat transport.
Paleoclimate is another hot area for petascale research. In 2009 Science magazine published results of a simulation of abrupt climate change, led by Zhengyu Liu at the University of Wisconsin and Bette Otto-Bliesner at NSF's NCAR. Most climate simulations are discontinuous, amounting to snapshots of century-sized time slices taken every 1,000 years or so. Such simulations can miss transitions occurring over centuries or millennia. Oak Ridge supercomputers used four million processor hours to run the CCSM in a continuous simulation that stitched together an uninterrupted stream of climate snapshots and recovered Earth's history as a motion picture. The simulation began with ice sheets melting in North America and Eurasia 19,000 years ago. By 17,000 years ago, melting glaciers had dumped so much fresh water into the North Atlantic that the overturning ocean circulation stopped and Greenland cooled. Freshwater flux continued until about 14,500 years ago, then virtually stopped. Over subsequent centuries, Greenland's temperature rose by 15 degrees Celsius (27 degrees Fahrenheit) and sea level rose about 5 meters (16 feet). Based on the simulations, researchers proposed a new mechanism to explain events during Earth's most recent period of natural global warming. The research may illuminate the effect of today's glacial melting in Greenland and Antarctica on tomorrow's oceans (figure 6, p26).
STOCK PHOTO
Figure 6. What if glaciers in Greenland and Antarctica melt? Exascale systems may elucidate the effect on ocean circulation.
Diving not so far into the past but employing much higher resolution, Gilbert Compo of the University of Colorado leads the Surface Input Reanalysis for Climate Applications (SIRCA) project, which reconstructs past weather and weather extremes, such as hurricanes and heat waves, to improve climate models predicting future extreme weather. With 1.1 million hours on Jaguar at the OLCF and four million hours on Franklin at NERSC, the researchers will use historical observations to reconstruct global weather conditions in 6-hour intervals from 1871 to the present, with 56 equally likely "weather maps" for each interval. These maps discern details 200 kilometers (124 miles) apart. The petascale Jaguar will enable SIRCA to reconstruct global weather conditions in 6-hour intervals starting farther back in time, from 1850 to the present, with 56 weather maps for each interval at finer-grained detail (100 kilometer, or 62 mile, resolution). Plus, the researchers can zoom in on hurricanes, severe storms, and floods to generate maps showing details 60 kilometers (37 miles) apart. Zooming in on extreme weather requires 100 times the computing power of a conventional simulation. The upcoming tens-of-petaflop/s systems may enable the project to reconstruct all of 1850 to the present at this zoomed-in level of detail. The ambitious reanalysis project should be done by 2012. "These maps will give us the ability to examine how our state-of-the-art models simulate weather events," says Compo, who has used more than 6.5 million processor hours at NERSC to reproduce major weather events such as the Galveston hurricane of 1900 and the 1930s Dust Bowl. "It should give us confidence in our understanding of how our weather will change in future decades."
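The scale of such a reconstruction is easy to underestimate. The short calculation below counts the analysis times and ensemble "weather maps" the stated design implies; the 2009 end year is an assumption for illustration.

```python
# Rough count of analysis fields implied by the SIRCA design described above:
# one analysis every 6 hours from 1871 onward, with 56 ensemble members each.
# The 2009 end year is an assumption for illustration.

start_year, end_year = 1871, 2009
years = end_year - start_year + 1
intervals = years * 365.25 * 24 / 6          # 6-hourly analysis times
ensemble_members = 56

total_maps = intervals * ensemble_members
print(f"~{intervals:,.0f} analysis times, ~{total_maps/1e6:.1f} million weather maps")
```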
Using two million hours at the OLCF and seven million hours at NERSC, David Randall of Colorado State University and his colleagues will study clouds in the global atmosphere to improve capabilities for weather prediction and climate-change simulation. When rising air expands and cools, clouds can form, and the dynamics of moving air affect processes on scales both large (for example, the generation of thunderstorms and the scattering of solar and terrestrial radiation) and small (such as the birth of cloud drops, ice crystals, and aerosols). Processes interact with each other over timescales of a few minutes. They also interact with larger-scale circulation systems. As computers approach exascale and horizontal resolution of a few kilometers becomes possible, global cloud resolving models (GCRMs) will be able to discern the growth and decay of individual large clouds. GCRMs are ideal vehicles for the implementation of advanced parameterizations of microphysics, turbulence, and radiation.
Based on the simulations, researchers proposed a new mechanism to explain events during Earth’s most recent period of natural global warming.
STOCK PHOTO
Figure 7. Scientists are using supercomputers to study climate response to aerosols from volcanic eruptions.
The influence of atmospheric aerosols is also important. Led by Kate Evans, scientists at DOE's ORNL and NSF's NCAR aim to conduct a petascale simulation of climate response to aerosols from volcanic eruptions (figure 7). After a large eruption, sulfur dioxide gas circulates Earth within weeks, interacting with oxygen to form a sulfate aerosol that can stay in the upper reaches of the atmosphere for up to several years. In this scenario, reflection of solar radiation off the additional aerosol particles causes short-term cooling. Project members have performed early simulations to configure the Community Atmosphere Model to meet petascale demands, enabling calculation of temperatures and pressures at millions of points around the globe every 150 seconds from 1978 to the present. If observational and simulation data match well, the model's response to short-term effects of volcanic aerosols may increase confidence in its ability to predict a longer-term response to human-induced aerosol production.
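A 150-second time step sustained over three decades adds up quickly. The short calculation below, which assumes the run extends through 2009, gives a sense of how many model steps such a simulation implies.

```python
# How many 150-second time steps does a 1978-to-present atmospheric run take?
# The 2009 end year is assumed for illustration.

years = 2009 - 1978 + 1
seconds = years * 365.25 * 86_400
dt = 150.0                                      # model time step, seconds

steps = seconds / dt
print(f"~{steps/1e6:.1f} million time steps")   # roughly 6.7 million
```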
 
As computers approach exascale and horizontal resolution of a few kilometers becomes possible, global cloud resolving models will be able to discern the growth and decay of individual large clouds.
What Exascale Computing Would Mean to Climate Science
While climate research at the petascale is impressive even in its nascent stages, the possibilities at exascale are even more promising. ASCR's terascale-to-exascale roadmap anticipates exascale computing — a thousand times today's capability — by 2018. Among the scientists clamoring loudest for exascale computing are climate researchers, for whom exaflop/s will enable models and simulations with enhanced fidelity and increased complexity.
"The climate problem has an insatiable appetite for computing," says Hack, who helped lead development of the Community Atmosphere Model, which much of America's climate science community uses. "The realistic representation of the Earth system is paced by computing."
Exascale computing will improve the ability to accomplish a number of tasks, including those outlined in the following subsections.
 
Validate Models
Climate modelers try to provide theoretical explanations of phenomena and represent them mathematically. Before a model is accepted as a plausible representation of reality, scientists compare it with observed phenomena and determine its accuracy — a process called validation. Climate scientists are constantly working to improve the validation processes for their models. Recently DOE and NSF's NCAR selected ORNL as a computational end station for climate science. That means Jaguar will run crucial simulations using the CCSM and generate the largest set of publicly available climate data anywhere. By providing more points of validation, this enormous dataset will help the climate science community improve the next-generation CCSM. The data will be placed in a repository for analysis, publication, and citation in AR5. The comprehensive and detailed studies enabled by Jaguar will improve the level of certainty for future IPCC conclusions.
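Validation ultimately rests on quantitative comparison of simulated fields with observations. The minimal sketch below shows two of the most common skill measures, mean bias and root-mean-square error; the sample values are invented for illustration.

```python
# Minimal sketch of two metrics commonly used to validate model output against
# observations: mean bias and root-mean-square error (RMSE).
# The sample values below are invented for illustration.
import math

def bias(model, obs):
    return sum(m - o for m, o in zip(model, obs)) / len(obs)

def rmse(model, obs):
    return math.sqrt(sum((m - o) ** 2 for m, o in zip(model, obs)) / len(obs))

# e.g., simulated vs. observed surface temperatures (degrees C) at a few points
simulated = [14.2, 15.1, 13.8, 16.0]
observed  = [14.0, 15.4, 13.5, 16.3]
print(f"bias = {bias(simulated, observed):+.2f} C, "
      f"rmse = {rmse(simulated, observed):.2f} C")
```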
 
“The climate problem has an insatiable appetite for computing. The realistic representation of the Earth system is paced by computing.”

JAMES J. HACK

Push the Limits of Predictive Skill
The predictive skill of a simulation depends on time and space scales within the system probed. A water resources manager may want to predict rains over a river basin, but if the width of the river basin is much shorter than the length scale that the numerical model can resolve, the simulation has no chance of success. "Exascale is going to get us to the point where we can run models at a fine enough resolution that we're resolving scales of motion that are of interest to the stakeholder community," Hack says. "We won't be able to properly address these kinds of problems without ultrascale machines." Advances in computing power are enabling climate simulations of unprecedented resolution, opening the door for researchers to analyze climate change at the regional level. Leading climate researchers have simulated cells 100 kilometers (62 miles) wide for both atmosphere and ocean. As they take the next step, to cells about 25 kilometers (15 miles) wide, their understanding of both natural climate variability (for example, changes caused by variations in planetary orbit, solar output, and volcanic eruptions) and forced variability (such as changes caused by human activity) may improve. At a resolution of 10 kilometers (6 miles) in the ocean, for example, researchers begin to see the influence of ocean eddies that transport heat poleward and are important to ocean mixing processes. At a resolution of 25 kilometers (15 miles) in the atmosphere, researchers can incorporate many of the organized mesoscale processes that regulate Earth's water cycle.
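A simple rule of thumb explains why each jump in resolution is so expensive: refining the horizontal grid by a factor r multiplies the number of grid columns by roughly r squared and, because the time step must shrink along with the grid spacing, the total cost by roughly r cubed. The sketch below applies that rule of thumb; it is an approximation, not a measured cost model.

```python
# Rule-of-thumb cost scaling for refining a climate model's horizontal grid:
# r times finer spacing -> ~r^2 more columns and ~r times more time steps,
# so roughly r^3 more work overall. This ignores vertical levels, physics cost,
# and parallel efficiency; it is an approximation for illustration only.

def relative_cost(old_dx_km, new_dx_km):
    r = old_dx_km / new_dx_km            # refinement factor
    return r ** 3

for new_dx in (25, 10):
    print(f"100 km -> {new_dx} km grid: ~{relative_cost(100, new_dx):.0f}x the cost")
# 100 -> 25 km: ~64x; 100 -> 10 km: ~1000x
```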
 
Build Biogeochemistry into Simulations
Exascale computing may allow researchers to include more biogeochemistry in their simulations. In a 2009 article in the journal Biogeosciences, researchers at ORNL, NCAR, NOAA, and several universities reported including fundamental ecological interactions between carbon and nitrogen cycles in their simulations of global climate change. Lead author Peter Thornton of ORNL said the rate of change over the next century could be higher than previously thought when plant nutrients are included in the model. Plant growth is limited by the nutrient supply. Less growth means less vegetation to remove carbon dioxide from the atmosphere. While reduced growth is partially offset by increased availability of nutrients due to accelerated decomposition at higher temperatures, the authors combined the effects and found less new growth would occur than expected, meaning higher atmospheric carbon dioxide levels would result.
 
Supporting decision makers who assess strategies for mitigating or adapting to climate change will take high-resolution models that can reliably predict changes over 20 or 30 years.
Analyze New Processes
Scientists may be able to build new physics into models for exascale simulations. "Exascale will get us to the point where we can develop global cloud system resolving models," Hack says. Improved understanding of cloud processes is the focus of extreme-scale computing work by Suarez and Randall. Global water vapor distribution influences the hydrological cycle and planetary radiation budget. Exascale computing would enable more process fidelity, allowing higher resolution and more accurate exploration of organization within modeled systems. "For the climate system, you solve a continuous system of equations which are discretized so they can be solved numerically. The discretization process is an approximation," Hack says. If a cloud is 1 kilometer (0.6 mile) wide, it will take several grid points to accurately represent the cloud's features. "If someone has a global model with a [resolution] of 1 kilometer, it does not mean that they can represent 1 kilometer features. They can represent, say, a 10 kilometer (6 mile) feature," he explains. Coarse grids may miss features that finer meshes catch. Case in point: Tennessee is 708 kilometers (440 miles) wide, but the state is represented by only two to three pixels in most global climate models.
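The arithmetic behind Hack's point is simple, as the sketch below shows. The choice of eight grid points per resolved feature and the 250-kilometer grid spacing used for the Tennessee example are illustrative assumptions consistent with the figures quoted above.

```python
# Simple arithmetic behind the "effective resolution" point above: a feature
# needs several grid points across it to be represented well. The choice of
# 8 points per feature and the 250 km grid spacing are assumptions.

points_per_feature = 8

def smallest_resolvable_feature(grid_spacing_km):
    return grid_spacing_km * points_per_feature

def cells_across(feature_width_km, grid_spacing_km):
    return feature_width_km / grid_spacing_km

print(smallest_resolvable_feature(1.0))   # a 1 km grid resolves features ~8-10 km wide
print(cells_across(708, 250))             # Tennessee on a ~250 km grid: ~2-3 cells
```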
 
Enable Decadal Forecasts
Supporting decision makers who assess strategies for mitigating or adapting to climate change will take high-resolution models that can reliably predict changes over 20 or 30 years. The goal of decadal prediction (figure 8) poses computational challenges. How do scientists choose the starting conditions for a simulation if observational records are limited? How do they best couple models of oceans, land, atmosphere, cryosphere — and all the associated biogeochemistry — to ensure forecasts do not drift as a simulation runs? How do they address unpredictable components, such as volcanic eruptions, and still end up with a simulation that produces a reliable forecast? The Coupled Model Intercomparison Project Phase 5 (CMIP5), a standard experimental protocol for studying the output of coupled atmosphere–ocean general circulation models, will support the IPCC AR5 activity. CMIP5 provides an infrastructure to aid climate model diagnosis, validation, intercomparison, documentation, and data access. It enables a diverse community of scientists to systematically analyze the models, facilitating improvements. The Program for Climate Model Diagnosis and Intercomparison (PCMDI) archives CMIP5 data and provides other support. The coupled models allow the simulated climate to adjust to changes in climate forcing, such as increasing atmospheric carbon dioxide. The CMIP5 experiments aim to improve the accuracy of predictions made for the near term (decades) or long term (centuries). An enhanced understanding of ocean dynamics may improve decadal predictions, as much of the "memory" built into the dynamics of the climate system takes place in the seas.
 
STOCK PHOTO
Figure 8. Where will droughts make crops unsustainable? Exascale computing may enable decadal forecasts.
Explore Tipping Points
A broad community of researchers explores large-scale change that happens more quickly than that brought on by forcing mechanisms. One such project, led by William Collins of LBNL, is called IMPACTS (Investigation of the Magnitudes and Probabilities of Abrupt Climate Transitions) and explores destabilization of marine ice sheets in West Antarctica, interactions between biosphere and atmosphere that could cause megadroughts in North America, and more. Another important project used ORNL's Jaguar, the world's fastest supercomputer for open science, and Phoenix, an earlier machine that provided data for the IPCC AR4, for the first continuous paleoclimate simulation, revealing a transient period of abrupt climate change. Liu and Otto-Bliesner led an interdisciplinary, multi-institution research group attempting the world's first continuous simulation of 21,000 years of Earth's climate history, from the last glacial maximum to the present, in a state-of-the-art climate model. The group will also extend the simulation 200 years into the future to forecast climate. So is the disastrous outcome of the movie The Day After Tomorrow our fate? "The current forecast predicts the ocean-overturning current is likely to weaken but not stop over the next century," Liu says. "However, it remains highly uncertain whether abrupt changes will occur in the next century because of our lack of confidence in the model's capability in simulating abrupt changes. Our simulation is an important step in assessing the likelihood of predicted abrupt climate changes in the future because it provides a rigorous test of our model against the major abrupt changes observed in the recent past." Exascale simulations may enhance explorations of forcings that either lead to a reversible state for the global climate system or drive the global climate system to a new state (figure 9).
 
J. DANIEL, NCCS
Figure 9. Abrupt climate change. Running NCAR's CCSM3 model, the simulation shows deglaciation during the Bolling–Allerod, Earth's most recent period of natural global warming.
Assess Geoengineering
Geoengineering is deliberate manipulation of physical, chemical, or biological aspects of the Earth system to counteract consequences of greenhouse gas emissions. Examples include reducing atmospheric gases by fertilizing the oceans and planting forests as well as cooling the planet by reflecting sunlight via particles spewed high in the atmosphere, mirrors launched into space, or clouds altered in number or radiative characteristics. Conducting geoengineering experiments in the real world may have adverse effects or unintended consequences. Asking "what if?" in the virtual world seems safer, so supercomputers may have an important role to play in assessing high-risk actions. Is geoengineering a dangerous distraction or a strategy on par with adaptation and mitigation? In July 2009 the American Meteorological Society issued a policy statement that geoengineering the climate requires more research, cautious consideration, and appropriate restrictions.
 
Better Understand Oscillations
The model improvements that would come with exascale computing could advance understanding of climate oscillations, such as El Niño, an oscillation of the ocean–atmosphere system in the tropical Pacific that has important consequences for weather worldwide. Exascale simulations could improve predictions of decadal oscillations, such as the Pacific Decadal Oscillation, which has phase shifts every 20–30 years. The climate community currently lacks estimates of the limits of predictability and a full understanding of sources of predictability.
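Such oscillations are typically tracked with simple indices computed from simulated or observed fields. The sketch below computes an area-weighted sea-surface-temperature anomaly over the Niño 3.4 region (5°S–5°N, 170°W–120°W) that is commonly used to monitor El Niño; the input grid and anomaly values are placeholders.

```python
# Sketch of an El Nino index: the average sea-surface-temperature anomaly over
# the Nino 3.4 region (5S-5N, 170W-120W). Input arrays are placeholders; real
# use would read gridded SST anomalies from model output or observations.
import numpy as np

def nino34_index(sst_anom, lats, lons):
    """Area-weighted mean SST anomaly over the Nino 3.4 box."""
    lat_mask = (lats >= -5.0) & (lats <= 5.0)
    lon_mask = (lons >= 190.0) & (lons <= 240.0)      # 170W-120W in 0-360 longitude
    box = sst_anom[np.ix_(lat_mask, lon_mask)]
    weights = np.cos(np.deg2rad(lats[lat_mask]))[:, None]
    return float((box * weights).sum() / (weights * np.ones_like(box)).sum())

# Placeholder 1-degree grid with a synthetic warm anomaly in the tropics
lats = np.arange(-89.5, 90.0, 1.0)
lons = np.arange(0.5, 360.0, 1.0)
sst_anom = np.zeros((lats.size, lons.size))
sst_anom[(lats > -5) & (lats < 5), :] = 1.5           # synthetic El Nino-like warmth
print(f"Nino 3.4 index: {nino34_index(sst_anom, lats, lons):+.2f} C")
```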
 
Exascale simulations may enhance explorations of forcings that either lead to a reversible state for the global climate system or drive the global climate system to a new state.
Increase Capacity
Exascale systems will allow addition of components to simulate features with important climate effects, such as aerosols, clouds, and ice sheets. A weighty step in that direction is a new initiative of DOE's Scientific Discovery through Advanced Computing (SciDAC) program to deliver accurate, high-resolution ice sheet simulation capabilities. Through six parallel yet complementary multi-institutional efforts, the Ice Sheet Initiative for CLimate ExtremeS, or ISICLES, program addresses the need for advanced dynamical ice sheet modeling at the petascale. The program uses ASCR's tools and research capabilities to accelerate scientific and computational breakthroughs in state-of-the-science climate models. A top priority is the ability to reproduce observational ice sheet parameters at a scale that resolves the separation of the ice sheet from land to create a shelf over the ocean. Results from efforts to improve ice sheet modeling will also improve capabilities that apply to other climate and multiphysics applications within the SciDAC portfolio.
 
Put People in the Mix
Integrated assessment models are the primary tool for describing the human Earth system — the anthropogenic greenhouse gases and other emissions and land-use and land-cover changes that cause climate change. By comparison, Earth system models are the primary scientific tools for examining the climate, biogeophysical, and biogeochemical impacts of changes to the radiative properties of Earth's atmosphere. A five-year project of Jae Edmonds of PNNL, John Drake of ORNL, and Bill Collins of LBNL is the first to link Earth system modeling, which requires supercomputing, with economy and policy. By integrating the economic and human dimensions of integrated assessment models with fully coupled Earth system models, the researchers will improve climate predictions and enhance scientific understanding of climate impacts and adaptation opportunities. "Exascale machines could give us the capability, in time, to develop more complete ensemble calculations and potentially enable much higher spatial resolution in integrated Earth system models, which include state-of-the-art representations of both human and natural systems," explains Edmonds, who has been a lead author on all major IPCC assessments and now serves on an IPCC steering committee about new integrated assessment scenarios.
 
C. DOUTRIAUX, LLNL
Figure 10. Self-generation of a category 4 tropical cyclone on day 0 (a), day 2 (b), day 4 (c), and day 6 (d) from the Ultra-High-Resolution Community Climate System Model Simulation, run on the Atlas supercomputer at LLNL by a team of scientists from DOE laboratories and NCAR. Grid spacing is 0.25 degree for the atmosphere and 0.1 degree for the ocean. Colors represent sea-surface temperature; contour lines depict surface pressure. At this resolution the phenomenon of cold-water upwelling produced by the storm's winds was realistically simulated, appearing as a cold-water "wake" behind the storm track.
Challenges Ahead
Exascale systems will bring big boons (sidebar "Exascale Computing Could Spur Integrated Climate Assessments" p30), but they will also bring big challenges. On November 6–7, 2008, NCAR's Warren Washington chaired a workshop titled "Scientific Grand Challenges: Challenges in Climate Change Science and the Role of Computing at the Extreme Scale." The report (see Further Reading) details how next-generation models and the greater computing capabilities they require will challenge current frameworks for computation, communication, and analysis. To keep pace with evolving scientific and computational infrastructures, the authors recommend immediate, proactive investments in data management and algorithms.
Exascale systems will allow addition of components to simulate features with important climate effects, such as aerosols, clouds, and ice sheets.
In the area of data management, challenges are daunting. Climate model datasets are growing faster than datasets of any other scientific field and are expected to reach hundreds of exabytes by 2020. For access by the world's scientists, the data must be copied and stored at many sites. Distributed data systems challenge system architectures, software applications, and networking infrastructures.
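The networking side of the challenge is easy to quantify in rough terms. The sketch below estimates how long replicating a single large dataset to another site would take over a dedicated link; the 1-petabyte size, 10-gigabit-per-second link, and 80% efficiency are illustrative assumptions.

```python
# Rough estimate of the time needed to replicate a climate dataset to another
# site over a dedicated network link. The dataset size, link speed, and
# efficiency are illustrative assumptions, not figures from the article.

dataset_bytes = 1e15          # 1 petabyte
link_bits_per_s = 10e9        # 10 Gb/s dedicated link
efficiency = 0.8              # assumed achievable fraction of line rate

seconds = dataset_bytes * 8 / (link_bits_per_s * efficiency)
print(f"~{seconds / 86_400:.1f} days per petabyte")   # about 12 days
```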
Algorithms are tricky too. They will have to exploit yet-to-be-developed system capabilities for detecting faults and recovering from them. They will need to be decomposable into a very large number of independent tasks of approximately the same cost to keep all processors busy. Coordination between tasks will need to occur infrequently so that time spent communicating does not prevent processors from staying busy. In particular, global communication, in which all processes communicate before processing continues, is expensive at scale and needs to occur even less frequently. Algorithm design will likely become even more complex as computing moves toward the exascale. Proposed exascale systems are making greater use of extensions, such as floating point accelerators, to improve application performance. The accelerators introduce asymmetries and heterogeneities into the target processor architectures, favoring certain types of computations over others.
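As a concrete illustration of those constraints, the sketch below decomposes a one-dimensional diffusion problem into equal-cost subdomains, exchanges only nearest-neighbor halo values each step, and performs the expensive global reduction only occasionally. It assumes the mpi4py library is available and is not drawn from any climate code.

```python
# Illustration of the algorithm-design constraints described above, using a
# 1-D diffusion toy problem with periodic boundaries: equal-sized subdomains
# keep every process busy, communication is mostly nearest-neighbor halo
# exchange, and the costly global reduction runs only every GLOBAL_EVERY steps.
# Assumes mpi4py is installed; run with, e.g.: mpirun -n 4 python sketch.py
from mpi4py import MPI
import numpy as np

comm = MPI.COMM_WORLD
rank, size = comm.Get_rank(), comm.Get_size()

N_LOCAL = 1000                    # same cost per process -> good load balance
GLOBAL_EVERY = 100                # do global communication infrequently
u = np.random.rand(N_LOCAL)

left = (rank - 1) % size          # periodic ring of subdomains
right = (rank + 1) % size

for step in range(1, 1001):
    # Nearest-neighbor halo exchange: cheap, point-to-point communication.
    halo_left = comm.sendrecv(u[-1], dest=right, source=left)
    halo_right = comm.sendrecv(u[0], dest=left, source=right)
    padded = np.concatenate(([halo_left], u, [halo_right]))
    u = u + 0.1 * (padded[:-2] - 2.0 * u + padded[2:])   # explicit diffusion step

    # Global reduction only occasionally, since allreduce is costly at scale.
    if step % GLOBAL_EVERY == 0:
        total = comm.allreduce(float(u.sum()), op=MPI.SUM)
        if rank == 0:
            print(f"step {step}: global sum = {total:.3f}")
```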
 
Contributor Dawn Levy is a science writer at the NCCS.
 
Further Reading
Scientific Grand Challenges: Challenges in Climate Change Science and the Role of Computing at the Extreme Scale
http://www.science.doe.gov/ascr/ProgramDocuments/Docs/ClimateReport.pdf