Climate models
The behaviour of the climate system, its components and their interactions,
can be studied and simulated using tools known as climate models. These are
designed mainly for studying climate processes and natural climate variability,
and for projecting the response of the climate to human-induced forcing. Each
component or coupled combination of components of the climate system can be
represented by models of varying complexity.
The nucleus of the most complex atmosphere and ocean models, called General Circulation Models (Atmospheric General Circulation Models (AGCMs) and Ocean General Circulation Models (OGCMs)), is based upon physical laws describing the dynamics of atmosphere and ocean, expressed by mathematical equations. Since these equations are non-linear, they need to be solved numerically by means of well-established mathematical techniques. In current atmosphere models, the equations are solved on a three-dimensional grid of points over the globe, with a horizontal resolution typically of 250 km and some 10 to 30 levels in the vertical. A typical ocean model has a horizontal resolution of 125 to 250 km and a resolution of 200 to 400 m in the vertical. Their time-dependent behaviour is computed by taking time steps typically of 30 minutes. The impact of the spatial resolution on the model simulations is discussed in Section 8.9 of Chapter 8.
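The scale of such a computation can be illustrated with rough arithmetic using the resolutions and time step quoted above; the helper function and cell counts below are purely illustrative, not any model's actual code:

```python
import math

EARTH_RADIUS_KM = 6371.0

def grid_cells(horizontal_res_km, levels):
    """Approximate number of grid cells covering the globe for a given
    horizontal resolution (km) and number of vertical levels."""
    surface_km2 = 4 * math.pi * EARTH_RADIUS_KM ** 2
    columns = surface_km2 / horizontal_res_km ** 2
    return columns * levels

atmos_cells = grid_cells(250, 20)   # ~250 km grid, 20 of the 10-30 levels
steps_per_year = 365 * 24 * 2       # 30-minute time steps in one model year
```

At this resolution the atmosphere alone amounts to well over a hundred thousand grid cells, each updated more than 17,000 times per simulated year, which is one reason such simulations require the most capable computers available.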
Models of the various components of the climate system may be coupled to produce increasingly complex models. The historical development of such coupled climate models is shown in Box 3 of the Technical Summary. Processes taking place on spatial and temporal scales smaller than the model’s resolution, such as individual clouds or convection in atmosphere models, or heat transport through boundary currents or mesoscale eddies in ocean models, are included through a parametric representation in terms of the resolved basic quantities of the model. Coupled atmosphere-ocean models, including such parametrized physical processes, are called Atmosphere-Ocean General Circulation Models (AOGCMs). They are combined with mathematical representations of other components of the climate system, sometimes based on empirical relations, such as the land surface and the cryosphere. The most recent models may include representations of aerosol processes and the carbon cycle, and in the near future perhaps also the atmospheric chemistry. The development of these very complex coupled models goes hand in hand with the availability of ever larger and faster computers to run the models. Climate simulations require the largest, most capable computers available.
A realistic representation of the coupling between the various components of the climate system is essential. In particular the coupling between the atmosphere and the oceans is of central importance. The oceans have a huge heat capacity and a decisive influence on the hydrological cycle of the climate system, and store and exchange large quantities of carbon dioxide. To a large degree the coupling between oceans and atmosphere determines the energy budget of the climate system. There have been difficulties modelling this coupling with enough accuracy to prevent the model climate from drifting unrealistically away from the observed climate. Such climate drift may be avoided by adding an artificial correction to the coupling, the so-called “flux adjustment”. The evaluation in Chapter 8 of recent model results identifies improvements since the SAR, to the point that there is a reduced reliance on such corrections, with some recent models operating with minimal or no adjustment.
For various reasons, discussed in Section 8.3 of Chapter 8, it is important to also develop and use simpler models than the fully coupled comprehensive AOGCMs, for example to study only one or a specific combination of components of the climate system or even single processes, or to study many different alternatives, which is not possible or is impractical with comprehensive models. In IPCC (1997) a hierarchy of models used in the IPCC assessment process was identified and described, differing in such aspects as the number of spatial dimensions, the extent to which physical processes are explicitly represented, the level to which empirical parametrization is involved, and the computational costs of running the models. In the IPCC context, simple models are also used to compute the consequences of greenhouse gas emission scenarios. Such models are tuned to the AOGCMs to give similar results when globally averaged.
Projections of climate change
Climate models are used to simulate and quantify the climate response to present
and future human activities. The first step is to simulate the present climate
for extended simulation periods, typically many decades, under present conditions
without any change in external climate forcing.
The quality of these simulations is assessed by systematically comparing the simulated climate with observations of the present climate. In this way the model is evaluated and its quality established. A range of diagnostic tools has been developed to assist the scientists in carrying out the evaluation. This step is essential to gain confidence in and provide a baseline for projections of human-induced climate change. Models may also be evaluated by running them under different palaeoclimate (e.g. Ice Age) conditions. Chapter 8 of this report presents a detailed assessment of the latest climate models of various complexity, in particular the AOGCMs. Once the quality of the model is established, two different strategies have been applied to make projections of future climate change.
The first, so-called equilibrium, method is to change, e.g. double, the carbon dioxide concentration and to run the model again to a new equilibrium. The differences between the climate statistics of the two simulations provide an estimate of the climate change corresponding to the doubling of carbon dioxide, and of the sensitivity of the climate to a change in the radiative forcing. Taking the difference reduces the effect of systematic errors common to both simulations. If combined with simple slab ocean models, this strategy is relatively cheap because it does not require long runs to reach equilibrium. However, it does not provide insight into the time dependence of climate change.
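The arithmetic behind such an estimate can be sketched with the widely used simplified expression for carbon dioxide radiative forcing, F = 5.35 ln(C/C0) W m-2; the climate sensitivity parameter below is an illustrative assumption, not a value assessed in this Report:

```python
import math

def co2_forcing(c_ppm, c0_ppm=280.0):
    """Radiative forcing (W m-2) from the simplified expression
    F = 5.35 * ln(C/C0)."""
    return 5.35 * math.log(c_ppm / c0_ppm)

f_2x = co2_forcing(560.0)   # doubling from a pre-industrial 280 ppm: ~3.7 W m-2
lam = 0.8                   # assumed sensitivity parameter, K per (W m-2)
dT_2x = lam * f_2x          # illustrative equilibrium warming for 2xCO2
```

With these assumed numbers the doubling forcing is about 3.7 W m-2; the equilibrium warming then scales directly with whatever sensitivity the model exhibits.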
The second, so-called transient, method, common nowadays with improved computer resources, is to force the model with a greenhouse gas and aerosol scenario. The difference between such a simulation and the original baseline simulation provides a time-dependent projection of climate change.
This transient method requires a time-dependent profile of greenhouse gas and aerosol concentrations. These may be derived from so-called emission scenarios. Such scenarios have been developed, by the IPCC among others, on the basis of various internally coherent assumptions concerning future socio-economic and demographic developments. In the SAR the IPCC Scenarios IS92 were used (IPCC, 1994). The most recent IPCC emission scenarios are described in the IPCC Special Report on Emission Scenarios (Nakićević et al., 2000). Different assumptions concerning e.g. the growth of the world population, energy intensity and efficiency, and economic growth lead to considerably different emission scenarios. For example, the two extreme estimates in the IPCC IS92 scenarios of the carbon dioxide emission by 2100 differ by a factor of 7. Because scenarios by their very nature should not be regarded or used as predictions, the term “climate projections” is used in this Report.
Transient simulations may also be based on artificially constructed, so-called idealised, scenarios. For example, scenarios have been constructed assuming a gradual increase of greenhouse gas concentrations followed by stabilisation at various levels. Climate simulations based on such idealised scenarios may provide insight into the climate response to potential policy measures leading to a stabilisation of the GHG concentrations, which is the ultimate objective of the United Nations Framework Convention on Climate Change (UNFCCC) as formulated in its Article 2. See Section 3 of Chapter 9 for an assessment.
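A transient run under such an idealised scenario can be sketched with a zero-dimensional energy balance model. The 1% per year growth to doubling followed by stabilisation is a standard idealised scenario, but the heat capacity and feedback parameters here are invented for illustration:

```python
import math

C_HEAT = 8.0   # effective heat capacity, W yr m-2 K-1 (assumed)
LAM = 1.2      # climate feedback parameter, W m-2 K-1 (assumed)

def forcing(conc_ppm, c0_ppm=280.0):
    """Simplified CO2 radiative forcing, W m-2."""
    return 5.35 * math.log(conc_ppm / c0_ppm)

conc, temp = 280.0, 0.0
history = []
for year in range(300):
    if conc < 560.0:
        conc *= 1.01                          # 1%/yr growth until doubling
    # one-year Euler step of C dT/dt = F - LAM * T
    temp += (forcing(conc) - LAM * temp) / C_HEAT
    history.append(temp)
```

The warming at the moment concentrations stabilise (around year 70) is well below the equilibrium value forcing/LAM that the run eventually approaches, illustrating why transient projections differ from equilibrium ones.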
Projections from present models show substantial agreement, but at the same time there is still a considerable degree of ambiguity and difference between the various models. All models show an increase in the globally averaged equilibrium surface temperature and global mean precipitation. In Chapter 9 the results of various models and intercomparison projects are assessed. Model results are more ambiguous about the spatial patterns of climate change than about the global response. Regional patterns depend significantly on the time dependence of the forcing, the spatial distribution of aerosol concentrations and details of the modelled climate processes. Research tools have been developed to generate more reliable regional climate information. These tools and their results are presented and assessed in Chapter 10.
To study the impact of climate change, a plausible and consistent description of a possible future climate is required. The construction of such climate change scenarios relies mainly on results from model projections, although sometimes information from past climates is used. The basis for and development of such scenarios is assessed in Chapter 13. Global and regional sea-level change scenarios are reviewed in Chapter 11.
Predictability, global and regional
In trying to quantify climate change, there is a fundamental question to be
answered: is the evolution of the state of the climate system predictable? Since
the pioneering work by Lorenz in the 1960s, it is well known that complex non-linear
systems have limited predictability, even though the mathematical equations
describing the time evolution of the system are perfectly deterministic.
The climate system is, as we have seen, such a non-linear complex system with many inherent time-scales. Its predictability may depend on the type of climate event considered, the time and space scales involved, and whether internal variability of the system or variability from changes in external forcing is involved. Internal variations caused by the chaotic dynamics of the climate system may be predictable to some extent. Recent experience has shown that the El Niño-Southern Oscillation (ENSO) phenomenon may possess a fair degree of predictability for several months or even a year ahead. The same may be true for other events dominated by the long oceanic time-scales, such as perhaps the North Atlantic Oscillation (NAO). On the other hand, it is not known, for example, whether the rapid climate changes observed during the last glacial period are at all predictable, or are unpredictable consequences of small changes resulting in major climatic shifts.
There is evidence to suggest that climate variations on a global scale resulting from variations in external forcing are partly predictable. Examples are the mean annual cycle and short-term climate variations from individual volcanic eruptions, which models simulate well. Regularities in past climates, in particular the cyclic succession of warm and glacial periods forced by geometrical changes in the Sun-Earth orbit, are simulated by simple models with a certain degree of success. The global and continental scale aspects of human-induced climate change, as simulated by the models forced by increasing greenhouse gas concentration, are largely reproducible. Although this is not absolute proof, it provides evidence that such externally forced climate change may be predictable, if its forcing mechanisms are known or can be predicted.
Finally, global or continental scale climate change and variability may be more predictable than regional or local scale change, because the climate on very large spatial scales is less influenced by internal dynamics, whereas regional and local climate is much more variable under the influence of the internal chaotic dynamics of the system. See Chapter 7 for an assessment of the predictability of the climate system.
Rapid climate change
A non-linear system such as the climate system may exhibit rapid climate change
as a response to internal processes or rapidly changing external forcing. Because
the probability of their occurrence may be small and their predictability limited,
they are colloquially referred to as “unexpected events” or “surprises”.
The abrupt events that took place during the last glacial cycle are often cited
as an example to demonstrate the possibility of such rapid climate change. Certain
possible abrupt events as a result of the rapidly increasing anthropogenic forcing
could be envisioned. Examples are a possible reorganization of the thermohaline
ocean circulation in the North Atlantic resulting in a more southerly course
of the Gulf Stream, which would have a profound influence on the climate of
Western Europe, a possible reduction of upper-level ocean cycling in the Southern
Ocean, or a possible but unlikely rapid disintegration of part of the Antarctic
ice sheet with dramatic consequences for the global sea level.
More generally, with a rapidly changing external forcing, the non-linear climate system may experience as yet unenvisionable, unexpected, rapid change. Chapter 7, in particular Section 7.7, of this Report reviews and assesses the present knowledge of non-linear events and rapid climate change. Potential rapid changes in sea level are assessed in Chapter 11.
Observing the climate
The question naturally arises whether the system has already undergone human-induced
climate change. To answer this question, accurate and detailed observations
of climate and climate variability are required. Instrumental observations of
land and ocean surface weather variables and sea surface temperature have been
made increasingly widely since the mid-19th century. Recently, ships’ observations
have been supplemented by data from dedicated buoys. The network of upper-air
observations, however, only became widespread in the late 1950s. The density
of observing stations always has been and still is extremely inhomogeneous,
with many stations in densely populated areas and virtually none in huge oceanic
areas. In recent times special earth-observation satellites have been launched,
providing a wide range of observations of various components of the climate
system all over the globe. The correct interpretation of such data still requires
high quality in situ and surface data. The longer observational records suffer
from changes in instrumentation, measurement techniques, exposure and gaps due
to political circumstances or wars. Satellite data also require compensation
for orbital and atmospheric transmission effects and for instrumental biases
and instabilities. Earlier the problems related to urbanisation were mentioned.
To be useful for the detection of climate change, observational records have
to be adjusted carefully for all these effects.
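As a toy illustration of the kind of adjustment involved: a documented instrument change can introduce a step in a record, which may be corrected by shifting the later segment to match the earlier mean. Real homogenisation is far more careful, and the record and break point below are invented:

```python
def adjust_for_break(series, break_index):
    """Shift the segment after a known break so its mean matches the
    mean of the segment before it (a crude homogenisation step)."""
    before, after = series[:break_index], series[break_index:]
    offset = sum(after) / len(after) - sum(before) / len(before)
    return before + [x - offset for x in after]

raw = [14.1, 14.0, 14.2, 14.9, 15.0, 14.8]   # jump after index 3 (new sensor)
homogenised = adjust_for_break(raw, 3)
```

After adjustment the artificial step is removed while the variability within each segment is preserved; in practice the break date, its cause, and the size of the shift all have to be established with care.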
Concern has been expressed about the present condition of the observational
networks. The number of upper-air observations, surface stations and observations
from ships is declining, partly compensated for by an increasing number of satellite
observations. An increasing number of stations are being automated, which may
have an impact on the quality and homogeneity of the observations. Maintaining
and improving the quality and density of existing observational networks is
essential for obtaining information of the necessary high standard. In order to implement and
improve systematic observations of all components of the climate system, the
World Meteorological Organization and the Intergovernmental Oceanographic Commission
have established a Global Climate Observing System (GCOS). Initially GCOS uses
existing atmospheric, oceanic and terrestrial networks. Later GCOS will aim
to amplify and improve the observational networks where needed and possible.
Observations alone are not sufficient to produce a coherent and global picture
of the state of the climate system. So-called data assimilation systems have
been developed, which combine observations and their temporal and spatial statistics
with model information to provide a coherent quantitative estimate in space
and time of the state of the climate system. Data assimilation also allows the
estimation of properties which cannot easily be observed directly but which
are linked to the observations through physical laws. Some institutions have
recently reanalysed several decades of data by means of the most recent and
most sophisticated version of their data assimilation system, avoiding in this
way inhomogeneities due to changes in their system. However, inhomogeneities in
these reanalyses may still exist due to changing sources of information, such
as the introduction of new satellite systems.
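The analysis step at the heart of such a system can be illustrated in scalar form: a model background estimate is combined with an observation, weighted by their respective error variances. The numbers below are made up for illustration:

```python
def analysis(background, obs, var_b, var_o):
    """Variance-weighted combination of a model background value and an
    observation (a scalar version of optimal interpolation)."""
    gain = var_b / (var_b + var_o)
    value = background + gain * (obs - background)
    variance = (1.0 - gain) * var_b
    return value, variance

x_a, var_a = analysis(background=15.0, obs=15.6, var_b=1.0, var_o=0.25)
```

The analysis lies between the two estimates, closer to the more accurate one, and its error variance is smaller than that of either input; in a full system the same principle is applied to millions of observations linked to the model state through physical laws.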
The 20th century
Historically, human activities such as deforestation may have had a local or
regional impact, but there is no reason to expect any large human influence
on the global climate before the 20th century. Observations of the global climate
system during the 20th century are therefore of particular importance. Chapter
2 presents evidence that there has been a mean global warming of 0.4 to
0.8°C of the atmosphere at the surface since the late 19th century. Figure
2.1 of Chapter 2 shows that this increase took place
in two distinct phases, the first one between 1910 and 1945, and recently since
1976. Recent years have been exceptionally warm, with a larger increase in minimum
than in maximum temperatures possibly related, among other factors, to an increase
in cloud cover. Surface temperature records indicate that the 1990s are likely
to have been the warmest decade of the millennium in the Northern Hemisphere,
and 1998 is likely to have been the warmest year. For instrumentally recorded
history, 1998 has been the warmest year globally. Concomitant with this temperature
increase, sea level has risen during the 20th century by 10 to 20 cm and there
has been a general retreat of glaciers worldwide, except in a few maritime regions,
e.g. Norway and New Zealand (Chapter 11).
Regional changes are also apparent. The observed warming has been largest over
the mid- and high-latitude continents in winter and spring. Precipitation trends
vary considerably geographically and, moreover, data in most of the Southern
Hemisphere and over the oceans are scarce. From the data available, it appears
that precipitation has increased over land in mid- and high latitudes of the
Northern Hemisphere, especially during winter and early spring, and over most
Southern Hemisphere land areas. Over the tropical and the Northern Hemisphere
subtropical land areas, particularly over the Mediterranean region during winter,
conditions have become drier. In contrast, over large parts of the tropical
oceans rainfall has increased.
There is considerable variability of the atmospheric circulation at long time-scales.
The NAO for example, with its strong influence on the weather and climate of
extratropical Eurasia, fluctuates on multi-annual and multi-decadal time-scales,
perhaps influenced by varying temperature patterns in the Atlantic Ocean. Since
the 1970s the NAO has been in a phase that gives stronger westerly winds in
winter. Recent ENSO behaviour seems to have been unusual compared to that of
previous decades: there is evidence that El Niño episodes since the mid-1970s
have been relatively more frequent than the opposite La Niña episodes.
There are suggestions that the occurrence of extreme weather events has changed
in certain areas, but a global pattern is not yet apparent. For example, it
is likely that in many regions of the world, both in the Northern and Southern
Hemisphere, there has been a disproportionate increase in heavy and extreme
precipitation rates in areas where the total precipitation has increased. Across
most of the globe there has been a decrease in the frequency of much below-normal
seasonal temperatures.
A detailed assessment of observed climate variability and change may be found
in Chapter 2, and of observed sea level change in Chapter
11. Figure 2.39 of Chapter
2 summarises observed variations in temperature and the hydrological cycle.
Detection and attribution
The fact that the global mean temperature has increased since the late 19th
century and that other trends have been observed does not necessarily mean that
an anthropogenic effect on the climate system has been identified. Climate has
always varied on all time-scales, so the observed change may be natural. A more
detailed analysis is required to provide evidence of a human impact.
Identifying human-induced climate change requires two steps. First it must be
demonstrated that an observed climate change is unusual in a statistical sense.
This is the detection problem. For this to be successful one has to know quantitatively
how climate varies naturally. Although estimates have improved since the SAR,
there is still considerable uncertainty in the magnitude of this natural climate
variability. The SAR concluded nevertheless, on the basis of careful analyses,
that “the observed change in global mean, annually averaged temperature
over the last century is unlikely to be due entirely to natural fluctuations
of the climate system”.
Having detected a climatic change, the most likely cause of that change has
to be established. This is the attribution problem. Can one attribute the detected
change to human activities, or could it also be due to natural causes? Also
attribution is a statistical process. Neither detection nor attribution can
ever be “certain”, but only probable in a statistical sense. The attribution
problem has been addressed by comparing the temporal and spatial patterns of
the observed temperature increase with model calculations based on anthropogenic
forcing by greenhouse gases and aerosols, on the assumption that these patterns
carry a fingerprint of their cause. In this way the SAR found that “there
is evidence of an emerging pattern of climate response to forcing by greenhouse
gases and sulphate aerosols in the observed climate record”. Since the
SAR new results have become available which tend to support this conclusion.
The present status of the detection of climate change and attribution of its
causes is assessed in Chapter 12.
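The pattern comparison behind such fingerprint studies can be sketched as a least-squares projection of the observed change onto a model-simulated response pattern; a scaling amplitude near one is then consistent with the modelled cause. The five-point patterns below are invented toy numbers, not data from this Report:

```python
def fingerprint_scaling(observed, modelled):
    """Least-squares amplitude of the modelled pattern in the observations:
    beta = <obs, model> / <model, model>."""
    num = sum(o * m for o, m in zip(observed, modelled))
    den = sum(m * m for m in modelled)
    return num / den

model_pattern = [0.2, 0.5, 0.9, 0.4, 0.1]       # hypothetical response pattern
obs_pattern = [0.25, 0.45, 0.95, 0.35, 0.15]    # hypothetical observed change
beta = fingerprint_scaling(obs_pattern, model_pattern)
```

In practice the uncertainty of the estimated amplitude, computed against the natural variability of the climate system, determines whether the detected change can be attributed to the assumed forcing.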