Climate Models and Global Climate Change
(Released December 2007)

 
  by Christopher Readinger  

History of Climate Models


Scientists employ computer models to assist in a wide variety of tasks, including forecasting day-to-day weather, analyzing local severe weather events, predicting future climates, and even modeling the atmospheres of other planets. Shortly after the invention of the computer, however, scientists' goals were more humble, since they rarely had much more than a few bytes of memory to work with and had to spend a significant amount of time repairing hardware.

[Image: The ENIAC (Electronic Numerical Integrator and Computer)]

In the mid-20th century, as the idea arose that computers could perform the myriad calculations needed to simulate atmospheric motion, scientists attempted to apply the established laws of physics and fluid dynamics to recreate large-scale atmospheric circulation. After several attempts they soon learned that the atmosphere was much more complex than their simple models could handle. They were greatly limited by computer technology and, more importantly, by gaps in their knowledge of how climatic processes interact and how they influence climate. On one of the early forecast models, run on ENIAC (the Electronic Numerical Integrator and Computer), one of the first computers, the modelers found that a two-dimensional simulation with grid points 700 km apart and 3-hour time steps could produce a 24-hour forecast in about 24 hours, meaning the model could just keep up with the weather rather than provide useful forecasts days in advance. Models were indeed simple compared to today's; for example, after several failed attempts to create a basic representation of large-scale atmospheric flow, scientists at Princeton University's Geophysical Fluid Dynamics Laboratory (GFDL) created a model that incorporated large eddies, making the simulation much more representative. This experiment was deemed a major success, and the model is considered the first true GCM. It showed scientists just how significant transient disturbances and smaller-scale processes are in influencing the transport of energy and momentum throughout the atmosphere.

With this success, research groups around the country began to develop their own models, including at UCLA, Lawrence Livermore National Laboratory (LLNL), and the National Center for Atmospheric Research (NCAR), further adding to the resources devoted to accurately forecasting weather and modeling the climate system. With a greater number of scientists working on the problem, more was learned about the climate system and progress accelerated. In addition, the rapid increase in computer power, from the few bytes of memory the first modelers had to work with to kilobytes, megabytes, and gigabytes, enabled the creation of much more complex models.

Even with drastic advances in technology and scientific knowledge, climatologists still have to make many compromises in realistically representing the Earth. For example, until recently most models focused only on atmospheric circulation (AGCMs), whereas we now know that the oceans, the cryosphere (glaciers, ice sheets, sea ice, snow cover), and the land surface play extremely important roles in shaping our climate. Today most models contain a self-contained oceanic component that actively interacts with the model atmosphere; these are called Atmosphere-Ocean coupled models, or AOGCMs.
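To make the idea of coupling concrete, the short Python sketch below shows one schematic way an atmosphere component and an ocean component might exchange boundary information at each coupling step. The component classes, variable names, and numbers are invented for illustration only; real AOGCMs exchange many more fields (heat, freshwater, momentum) through dedicated coupling software.

    class ToyOcean:
        """Hypothetical ocean component tracking a single sea surface temperature (K)."""
        def __init__(self, sst=288.0):
            self.sst = sst
        def step(self, surface_heat_flux):
            # Warm or cool slightly in response to the atmospheric heat flux (arbitrary scale).
            self.sst += 0.001 * surface_heat_flux

    class ToyAtmosphere:
        """Hypothetical atmosphere component tracking a single near-surface temperature (K)."""
        def __init__(self, temp=285.0):
            self.temp = temp
        def step(self, lower_boundary_sst):
            # Relax the air temperature toward the underlying sea surface temperature.
            self.temp += 0.1 * (lower_boundary_sst - self.temp)
        def surface_heat_flux(self):
            # Flux into the ocean when the air is warmer than a reference value (arbitrary scale).
            return 10.0 * (self.temp - 286.0)

    def run_coupled(atmosphere, ocean, n_steps):
        # The essence of an AOGCM: each component feels the other's state every coupling step.
        for _ in range(n_steps):
            atmosphere.step(lower_boundary_sst=ocean.sst)
            ocean.step(surface_heat_flux=atmosphere.surface_heat_flux())
        return atmosphere.temp, ocean.sst

    print(run_coupled(ToyAtmosphere(), ToyOcean(), n_steps=100))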

While early modelers made significant progress, the models still had problems reliably forecasting climatic trends or oscillations. Because the model resolution was extremely coarse, many processes had to be parameterized. Model resolution is analogous to photographic resolution: it is a measure of how fine a level of detail can be distinguished. In computer models, resolution matters both for small-scale disturbances like thunderstorms and cyclones and for an accurate representation of the Earth itself. For example, in early GCMs the land surface resolution was so coarse that peninsulas and islands such as Florida and the UK did not exist, and the Great Lakes were treated as land. While extremely fine resolution may be ideal, a balance must always be struck between model resolution and the computer power available; if a model takes months to run, it is of little use to modelers trying to do experiments. The computer power/resolution balance can be thought of as follows: for every doubling in spatial resolution (horizontal and vertical) there is an eightfold increase in grid points to solve for, and very often, to keep the model mathematically stable, the time step must be halved as well, meaning you would need roughly 16 times more computer power just to double your model resolution.
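This scaling can be written out as a quick back-of-the-envelope calculation. The Python sketch below simply assumes that cost grows with the number of grid points (spacing refined in all three dimensions) and with the number of time steps (a stability constraint ties the time step to the grid spacing); it is a simplification rather than a statement about any particular model.

    def relative_cost(resolution_factor):
        # How much more computer power is needed when grid spacing shrinks by
        # resolution_factor in the horizontal and vertical, assuming the time
        # step must shrink by the same factor to stay numerically stable.
        grid_points = resolution_factor ** 3   # refinement in x, y, and z
        time_steps = resolution_factor         # halved spacing -> halved time step
        return grid_points * time_steps

    print(relative_cost(2))   # doubling resolution: 8 times the grid points x 2 steps = 16
    print(relative_cost(4))   # quadrupling resolution: 256 times the computer power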

Parameterization is the process whereby limited model resolution forces climatologists to simplify calculations. It is the recognition that, while an important process is at work and its magnitude is roughly known, it cannot possibly be modeled explicitly, so it must be represented as realistically as possible in terms of the quantities the model does resolve. One important example of a parameterized process is convective clouds and thunderstorms. Thunderstorms, while extremely important in the atmosphere for transporting heat and water vapor, are extremely small on the global scale: a typical GCM grid box is 100-300 km across, while a typical thunderstorm is around 1 km across. Therefore convection must be treated in a much simpler way. While it seems unlikely, and perhaps unnecessary, that individual convective clouds will ever be resolved explicitly in a GCM, parameterizations have also evolved over time and have become better at calculating the influence convection has on the atmosphere.
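As a very rough illustration of what a parameterization looks like in practice, the toy Python function below estimates the bulk effect of unresolved convection on a single grid column from grid-mean quantities alone. The trigger threshold, scaling factors, and variable names are all invented for this sketch and do not correspond to any scheme used in an actual GCM.

    def toy_convection_scheme(column_mean_humidity, threshold=0.8):
        # Toy convective parameterization: real thunderstorms (~1 km) cannot be
        # resolved on a 100-300 km grid, so their bulk heating and drying are
        # estimated from the grid-mean relative humidity (0-1).
        # Returned units: heating in K/day, drying as fraction of moisture per day.
        if column_mean_humidity < threshold:
            return 0.0, 0.0                      # column too dry: no convection triggered
        intensity = (column_mean_humidity - threshold) / (1.0 - threshold)
        heating_k_per_day = 5.0 * intensity      # latent heat released aloft (made-up scale)
        drying_per_day = 0.2 * intensity         # moisture rained out (made-up scale)
        return heating_k_per_day, drying_per_day

    print(toy_convection_scheme(0.9))   # moist column: convection adjusts the grid box
    print(toy_convection_scheme(0.5))   # dry column: no convective tendency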

In the late 20th century, as models and computers became more complex and powerful, model design began to diverge into several subcategories focusing on different aspects of weather and climate, including Numerical Weather Prediction (NWP) models, regional-scale models, and mesoscale models. These all differ from GCMs in the aspects of the atmosphere they emphasize. For instance, NWP models use a much smaller horizontal domain (the North American continent, for example) and attempt to forecast small changes in weather over short periods of time (a few hours or days). They are highly sensitive to initial conditions: the meteorological data fed into the model have a dramatic influence on the output. These models deal with what is called the "initial value" problem, in that small errors in the initial data grow, so the simulation diverges from reality over time. Climate models are less dependent upon initial conditions and instead must deal with the "boundary value" problem: once the general circulation of the atmosphere has been established, it is difficult to reproduce realistic climatic disturbances such as interannual oscillations (ENSO, PDO, NAO) or climatic trends caused by external forcings.
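The sensitivity to initial conditions can be illustrated with the Lorenz (1963) system, a three-variable model that is not a weather model but is the standard classroom demonstration of why two nearly identical starting states diverge. The Python sketch below uses a simple Euler integration with illustrative step sizes purely to show the growth of a tiny initial difference.

    def lorenz_step(state, dt=0.005, sigma=10.0, rho=28.0, beta=8.0 / 3.0):
        # One forward-Euler step of the Lorenz (1963) equations.
        x, y, z = state
        dx = sigma * (y - x)
        dy = x * (rho - z) - y
        dz = x * y - beta * z
        return (x + dt * dx, y + dt * dy, z + dt * dz)

    a = (1.0, 1.0, 1.0)
    b = (1.0, 1.0, 1.0 + 1e-6)       # a perturbation of one part in a million
    for step in range(1, 4001):
        a, b = lorenz_step(a), lorenz_step(b)
        if step % 1000 == 0:
            separation = sum((u - v) ** 2 for u, v in zip(a, b)) ** 0.5
            print(f"step {step}: separation = {separation:.2e}")
    # The separation grows by orders of magnitude: the "initial value" problem that
    # limits weather forecasts to days, in contrast to the "boundary value" problem
    # faced by climate models.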

Slowly but surely, models have been developed, refined, and tested against real-world observations, to the point that many scientists say the modern GCM was established in the 1990s. While some weaknesses persist, such as biases that make one region too rainy or another too warm, atmospheric scientists have been able to include more and more climatic processes and to simulate the climate better as we learn more about our environment and about the importance of the terms in the equations.

Go To Construction of a Modern Climate Model

© 2007, ProQuest LLC. All rights reserved.