Frameworks for Dealing with Climate and Economic Uncertainties in Integrated Assessment Models [1]
Lydia PRIEG, Dmitry YUMASHEV
University of Cambridge, Lancaster University Management School
[1] This work is part of the ICE-ARC project funded by the European Union 7th Framework Programme, grant number 603887, ICE-ARC contribution number 086.
Integrated assessment models (IAMs) connect physical and social science models to address cross-disciplinary questions, such as how climate affects economies. There are different types of
IAMs. One category of IAMs, for example, derives scenarios for future population,
economies, technology and greenhouse gas (GHG) emissions, and explores how
these may influence climate variables and, subsequently, the biosphere. These IAMs
tend to be large, complex and computationally intensive. As a result, they are
typically deterministic, as there is often insufficient information available to define
probability distributions for the thousands of parameters of the model. Even if this
could be done, computational power is often not sufficient to run large numbers of
simulations with varying parameters. Examples of such IAMs include the Integrated
Model to Assess the Global Environment (IMAGE) (van Vuuren et al., 2011, p. 6) and
the Global Change Assessment Model (GCAM) (Wise et al., 2009).
A different category of IAMs primarily estimates the economic costs and benefits of
climate change, and then uses cost-benefit analysis (CBA) to assess the relative
desirability of different GHG emissions as well as adaptation policies. These IAMs
tend to be much smaller and simpler, which means that uncertainties can be explored
via Monte Carlo analysis [2]. IAMs in this group sometimes use GHG and
socioeconomic scenarios produced by IAMs in the previous category as exogenous
inputs, and then generate their own estimates of temperature, sea-level rise (SLR)
and the associated economic impacts. Alternatively, they may generate the input
scenarios themselves using simple internal models, and then use these in other
components of the model. Popular examples include the Dynamic Integrated
Climate-Economy model (DICE) (Nordhaus, 2017), the Climate Framework for
Uncertainty, Negotiation and Distribution (FUND) (Anthoff and Tol, 2014) and
Policy Analysis of the Greenhouse Effect (PAGE) (Hope, 2013).
[2] Some IAMs in this category are also deterministic.
This second group of IAMs will often focus on estimating the social cost of carbon
(SCC), which is the discounted [3] economic impact of emitting an additional tonne of carbon today. William Nordhaus, the creator of DICE, for example, called the SCC
“The most important single economic concept in the economics of climate change”
(Nordhaus, 2017). The SCC is calculated by running the same climate and socioeconomic scenario, for example business as usual or 2°C in 2100, twice, with one run assuming additional CO2 emissions in the starting period. The SCC will,
thus, be different for different scenarios, although the variation is often limited (Hope
and Newbery, 2006). With recognition of its limitations, the SCC is sometimes
cautiously put forward as the possible tax that should be applied on carbon
emissions to correct for associated climate change externalities, i.e. the costs from
carbon emissions not included in the prices of goods and services produced by
carbon-emitting industries. For example, see Anthoff and Tol (2013) and Hope and Newbery (2006).
[3] ‘Discounting’ is a process that adjusts the values of cash flows occurring at different points in time so that they are commensurable.
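To make the pulse-difference calculation concrete, the sketch below computes an SCC in a toy climate-economy model. The linear cumulative-emissions temperature response, quadratic damage function, flat emissions scenario and constant discount rate are all illustrative assumptions, not the specification of DICE, FUND or PAGE.

```python
import numpy as np

# Toy SCC calculation: run the same scenario twice, once with an extra
# emissions pulse in the starting period, and discount the damage difference.
# All functional forms and numbers are illustrative assumptions only.

YEARS = np.arange(2020, 2201)       # annual time steps to 2200
GDP0, GROWTH = 80e12, 0.02          # gross world product ($) and its growth rate
TCRE = 0.0005                       # assumed warming (deg C) per GtCO2 emitted
DAMAGE_COEF = 0.005                 # assumed GDP fraction lost per (deg C)^2
DISCOUNT = 0.03                     # assumed constant discount rate

def discounted_damages(emissions_gtco2):
    """Total discounted damages ($) under a cumulative-emissions response."""
    warming = TCRE * np.cumsum(emissions_gtco2)            # deg C above today
    gdp = GDP0 * (1 + GROWTH) ** np.arange(len(YEARS))
    annual_loss = DAMAGE_COEF * warming**2 * gdp           # quadratic damages
    discount = (1 + DISCOUNT) ** -np.arange(len(YEARS))
    return np.sum(annual_loss * discount)

baseline = np.full(len(YEARS), 40.0)   # flat 40 GtCO2/yr scenario (assumed)
pulsed = baseline.copy()
pulsed[0] += 1.0                       # extra 1 GtCO2 in the starting period

scc = (discounted_damages(pulsed) - discounted_damages(baseline)) / 1e9
print(f"Illustrative SCC: ${scc:.0f} per tonne of CO2")
```

Rerunning the same calculation under a different emissions scenario gives a different SCC, which is the scenario dependence noted above.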
Modelling how two highly complex systems, the climate and the macroeconomy, will evolve individually, let alone when they interact with one another, is plagued with uncertainties. As will be explored in section 4 of this
chapter, results generated by IAMs in the first category are typically used in a
manner that reflects the deep uncertainties present. The builders and users of IAMs
in the second category, however, place greater faith in the ability to quantify
uncertainties, and their results are generally interpreted in this light. For example,
cost-optimal GHG emissions trajectories and the mean SCC are often estimated.
This chapter looks at the frameworks utilised to analyse the outputs of IAMs, particularly those in the second category, and at alternative frameworks for exploring their results that are more appropriate for decision-making under deep uncertainty. Section 1 summarises sources of uncertainty and how they are quantified. Section 2 highlights broader problems with modelling and predicting complex nonlinear systems. By reflecting on why models are built, section 3 explores how imperfect models can still have academic value if used appropriately. Section 4 suggests alternative frameworks for communicating and using results generated by IAMs in the second category in light of the preceding analysis. A final section concludes.
1. Uncertainty in IAMs
One faces risk when all possible outcomes and their probabilities are known. An example is rolling a die. Uncertainty, on the other hand, occurs when one’s
knowledge of either the outcomes or their probabilities is limited. Quantifying
uncertainty means estimating how model outcomes may be affected by uncertainties
surrounding model parameters and structure, observational errors and numerical
approximations of solutions (Beven, 2009). Where observational outcome data is
available, regression, Bayesian, Markov Chain Monte Carlo (MCMC) and
Generalised Likelihood Uncertainty Estimation (GLUE) methods are often deployed.
This is known as inverse uncertainty quantification, as the model and associated
uncertainties are inferred from the data. Where data is sparse or unavailable, one
instead typically uses sensitivity analysis or forward uncertainty propagation, for
example, Monte Carlo simulations, to explore uncertainty in model output variables
given specified uncertainty ranges of the input variables. Here one has to estimate or
assume the range of plausible forms or values that, respectively, equations and
parameters may take, and often also the associated probability density functions
(PDFs), which capture the relative likelihood of the different options. By varying the
model across multiple simulations in accordance with this information, one can
explore how model outcomes are affected by model uncertainties. A good overview
of inverse and forward uncertainty quantification methods can be found in Beven (2009).
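As a minimal sketch of forward propagation, the code below pushes two assumed input PDFs through a simple response function via Monte Carlo sampling; the lognormal climate sensitivity, triangular damage exponent and toy damage function are stand-in assumptions chosen only to illustrate the mechanics.

```python
import numpy as np

rng = np.random.default_rng(42)
N = 100_000  # number of Monte Carlo draws

# Assumed input PDFs (illustrative stand-ins, not calibrated values):
climate_sensitivity = rng.lognormal(mean=np.log(3.0), sigma=0.4, size=N)
damage_exponent = rng.triangular(left=1.5, mode=2.0, right=3.0, size=N)

# Simple response function mapping the sampled inputs to an output of interest:
forcing_ratio = 1.5                                 # assumed CO2 ratio by 2100
warming = climate_sensitivity * np.log2(forcing_ratio)
gdp_loss_pct = 0.5 * warming**damage_exponent       # toy damage function (% GDP)

# The propagated output distribution can now be summarised:
p5, p95 = np.percentile(gdp_loss_pct, [5, 95])
print(f"mean loss {gdp_loss_pct.mean():.2f}% of GDP, 5th-95th range {p5:.2f}%-{p95:.2f}%")
```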
Deep uncertainty, sometimes called Knightian uncertainty, occurs when quantifiable
information is so sparse that the PDF associated with an uncertainty cannot be
ascertained. An example is the Intergovernmental Panel on Climate Change’s
(IPCC’s) Representative Concentration Pathways (RCPs) (IPCC, 2014). These GHG
atmospheric concentration scenarios do not have associated probabilities, as there is
insufficient information to estimate the relative likelihood of the socioeconomic
assumptions underpinning these scenarios. Deep uncertainty is a problem for
methods such as CBA that rely on calculating expected values, and so require PDFs
or exact knowledge. Our understanding of climate change is highly uncertain, both
from an environmental and economic perspective, and much effort has been made to
quantify uncertainty. For example, it is uncertain how global mean surface
temperature (GMST) may respond to increases in GHG concentration, with the
effects of complex feedback processes, such as those involved in cloud formation,
being especially confounding. Attempts to estimate this climate sensitivity have used
a variety of tools, including model simulations, instrumental records of GHG
atmospheric concentrations and temperature, and paleo records. Other common
examples of climate-related uncertainty associated with GMST increases include
changes in the tropopause height that affect the water vapour feedback, ocean
circulation responses to freshwater discharges from melting ice sheets (Greenland
and Antarctica) that could affect heat transfer between latitudes, carbon emissions
from thawing permafrost on land and sub-sea that will amplify anthropogenic GHG
emissions, changes in the surface albedo (reflectivity) driven by the decline of the sea
ice and land snow covers that increase solar absorption, and greater variability in the
Northern jet stream due to Arctic amplification that affects extreme weather patterns
in mid-latitude regions. An overview of climate uncertainties can be found in IPCC (2014).
On the socioeconomic side of IAMs, economic and demographic projections are
uncertain, particularly given the time-scale over which climate change damages are
likely to occur. Commonly used IAM damage functions, which estimate the annual
cost or benefit to an economy or sectors within it arising from changes in annual
mean temperatures, are highly speculative. While they theoretically draw upon a
range of external damage studies, the latter themselves are often exploratory rather
than confident estimates. Moreover, there are very few damage studies that estimate
the economic consequences of an increase greater than 3°C in GMST (Journal of
Economic Perspectives, 2015; Nordhaus and Sztorc, 2013). IAMs, however, often look
at scenarios where GMST increases by as much as 6°C. So how do IAMs estimate
damages at such levels?
As explored in Prieg and Yumashev (2019), approaches include performing
regression analysis over the range in which damage estimates are available and then
extrapolating the established relationship to higher temperatures, or using expert
guesses of functional forms that can then be calibrated based on one or two damage
estimates from lower temperature changes. Expert guesses are also sometimes used
for smaller temperature changes, where damage estimates are not available, or where
it is believed that existing damage estimates need to be adjusted to reflect omitted factors.
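The sketch below illustrates the extrapolation step: two functional forms are fitted to the same handful of invented low-temperature damage estimates and then evaluated at 6°C, far outside the calibration range. The data points are hypothetical, not taken from any published study.

```python
import numpy as np

# Hypothetical damage estimates (deg C warming, % GDP lost), standing in for
# the sparse published studies that cover warming below about 3 deg C.
dT = np.array([1.0, 2.0, 2.5, 3.0])
loss = np.array([0.5, 1.2, 1.8, 2.5])

# Fit two candidate functional forms over the calibrated range...
quadratic = np.polyfit(dT, loss, deg=2)
linear = np.polyfit(dT, loss, deg=1)

# ...then extrapolate both to 6 deg C, well beyond the data.
for name, coeffs in [("quadratic", quadratic), ("linear", linear)]:
    print(f"{name:9s} damage at 6 deg C: {np.polyval(coeffs, 6.0):4.1f}% of GDP")
```

Both forms fit the calibration points comparably well, yet disagree substantially at 6°C, which is precisely the range IAMs need.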
A recent wave of damage studies has used econometrics to estimate the relationship
between temperature and total economic growth across a large number of countries.
Burke et al. (2015), for example, use regression analysis on data from 166 countries
between 1960 and 2010 to estimate the effect of temperature on growth in real GDP
per capita, specifically whether deviations from growth trends appear to be driven
by deviations from temperature trends (inter-annual variability). Individual response
functions are generated for the different countries and then combined to form a
global response function. Taking only the first or the last 20 years of the time period
generates essentially the same results, which suggests that technological
advancements and increased wealth have not altered the relationship. This could be
due to effective autonomous adaptation to the relatively mild temperature changes
that have occurred so far. While year-on-year fluctuations in temperature in any
given country cover only a moderate range (typically ±1°C), the wide range of
countries studied have very different average temperatures, so, by placing all the
countries on the same curve, the effects of a wide range of temperature increases can
be estimated.
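The sketch below illustrates the pooled-response idea with a quadratic growth response peaking near 13°C, roughly the shape that Burke et al. (2015) report; the coefficients are illustrative stand-ins rather than the published estimates.

```python
import numpy as np

def growth_response(temp_c, b1=0.0127, b2=-0.0005):
    """Assumed quadratic effect of annual mean temperature on GDP growth.
    The coefficients are illustrative stand-ins shaped like the global
    response in Burke et al. (2015), which peaks at roughly 13 deg C."""
    return b1 * temp_c + b2 * temp_c**2

# Countries sitting at different points on the same curve respond very
# differently to an identical +1 deg C of warming:
for label, avg_t in [("cool (8 C)", 8.0), ("near-optimal (13 C)", 13.0), ("hot (25 C)", 25.0)]:
    delta = growth_response(avg_t + 1.0) - growth_response(avg_t)
    print(f"{label:20s} growth change from +1 deg C: {delta * 100:+.2f} pp/yr")
```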
The Burke et al. (2015) damage function has so far been incorporated into one widely
recognised IAM, PAGE (Yumashev et al., 2019) [4]. We expect that a similar approach
will be adopted by other IAMs in the near future, since econometric studies provide
a more compelling way to establish and calibrate damage functions than the methods
currently deployed. Econometric analysis, however, will not resolve all damage
function uncertainties. The Burke et al. (2015) damage function, for example, assumes
that all countries can be placed on the same curve. In reality, economies and the
environment may struggle to adapt to large increases in average temperature levels
that occur over a relatively short time-scale or with high volatility. The Burke et al.
(2015) analysis also tells us nothing about what will happen if countries at the top
end of the average temperature spectrum experience a large increase in temperature, and unfortunately these countries tend to be among the poorest.
[4] A new version of the PAGE model, called PAGE-ICE, implements a modification of the Burke et al. (2015) impact function. PAGE-ICE has been made publicly available as part of the study “Climate policy implications of nonlinear decline in Arctic land permafrost and other cryosphere elements” by Yumashev et al. (2019).
In addition, IAMs typically assume that all climate damages are repaired by the next
time-step, which may be overly optimistic, particularly as resources will need to be
diverted from other activities towards repairing them. In other words, there may be
growth effects, rather than, or in addition to, level effects (Piontek et al., 2019), even
though the econometric analysis with 5-year lags by Burke et al. (2015) could not
establish the prevalence of the growth effects with sufficient confidence. Moreover,
while econometrics may work for a total economic damage function, there may be
insufficient sector-level data for a wide variety of countries to establish sector-level
damage functions in this manner [5]. Depending on the research question being
addressed, a sector-level approach may be required; for example, if one hopes to
estimate the extent that different sectors are vulnerable to climate change, or how
damages to individual sectors propagate through the economy (multiplier effects).
[5] There are, however, many existing econometric studies exploring the relationship between temperature and labour productivity or agricultural output.
As discussed, IAMs in the first category are large and complex, which means they are
typically deterministic. To explore uncertainty, different models produced by
different modelling teams are often employed to generate climate and biophysical
results for the same scenario. The same approach is used in climate model
simulations, such as the Coupled Model Intercomparison Project Phase 5 (CMIP5).
Probabilities are not assigned to the results for different scenarios; the latter are
merely used to get an idea of the range of outcomes that may occur. For example, this
is the approach taken by the IPCC in the Assessment Reports. IAMs in the second
category, meanwhile, are smaller and simpler, and so the effect of uncertainty on a
model’s output can be explored. While they typically use the non-probabilistic GHG
and socioeconomic scenarios generated by IAMs in the first category as exogenous
inputs, for each of these scenarios they attempt to quantify the uncertainty around
their output variable, which is usually the SCC. FUND and PAGE, for example, use
expert guesses or the range of estimates provided by external studies to suggest
PDFs for over 100 uncertain variables, be they a scaling factor for the climate impact
at a calibration temperature or a polynomial degree that determines the convexity of the damage function. Monte Carlo simulations [6] are then performed to estimate a PDF
for the SCC. However, not all IAMs in this category are highly stochastic. Some, such
as DICE, are deterministic or present only a very simple sensitivity analysis. To
explore structural uncertainty, for example, Nordhaus (2017) varied three variables
in DICE 2016R: the damage parameter, productivity growth and equilibrium
temperature sensitivity. As with IAMs in the first category, research using IAMs in
the second category also frequently compares results generated by the different models.
[6] Monte Carlo simulations run a model thousands of times, randomly sampling each uncertain variable from its specified PDF. Across the simulations, one can thus build up a probability distribution for the model’s outcome value. Monte Carlo analysis also allows one to see which uncertain variables have the largest impact on results (sensitivity analysis).
2. Modelling complex nonlinear systems
In a linear system, if you double the input, you get twice the output. More generally,
if you feed A + B into the system, the output generated is the same as if you had fed A and B into the system individually and then summed their outputs. In nonlinear
systems, this principle is broken. Meanwhile, a dynamical system is one containing
variables that are time dependent.
Most nonlinear differential equations are not integrable. Such systems do not have
closed-form solutions, and one must instead use numerical methods to see how they
evolve. Nonlinear systems can have predictable steady states or periodic behaviour,
but they can also exhibit quasi-periodicity or complex behaviours that are extremely
difficult to predict. In addition, small inputs can translate into large outputs. This can
result in small measurement errors for initial conditions translating into large
differences in long-term prediction trajectories. Small uncertainties can thus be amplified over time, rendering predictions useless (Kleeman, 2011). Such systems
may also have multiple equilibria or attractors, which are regions in a nonlinear
dynamic system’s state space towards which the system eventually evolves and
orbits around, periodically or otherwise. This means that a small deviation has the
potential to suddenly shift the system into a new regime. In Edward Lorenz’s simple
atmospheric model, for example, small changes in starting conditions may cause the
system to jump between different attractors in the state space (Lorenz, 1963).
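The sensitivity is easy to reproduce. The sketch below integrates Lorenz’s (1963) three-variable system twice from initial conditions differing by one part in a billion, using a simple fixed-step fourth-order Runge-Kutta integrator and the classic parameter values (sigma = 10, rho = 28, beta = 8/3).

```python
import numpy as np

def lorenz(s, sigma=10.0, rho=28.0, beta=8.0 / 3.0):
    """Right-hand side of the Lorenz (1963) system."""
    x, y, z = s
    return np.array([sigma * (y - x), x * (rho - z) - y, x * y - beta * z])

def rk4_step(s, dt=0.01):
    """One fixed-step fourth-order Runge-Kutta step."""
    k1 = lorenz(s)
    k2 = lorenz(s + 0.5 * dt * k1)
    k3 = lorenz(s + 0.5 * dt * k2)
    k4 = lorenz(s + dt * k3)
    return s + dt * (k1 + 2 * k2 + 2 * k3 + k4) / 6.0

a = np.array([1.0, 1.0, 1.0])
b = a + np.array([1e-9, 0.0, 0.0])   # perturb one coordinate by one part in 1e9

for step in range(4001):
    if step % 1000 == 0:
        print(f"t = {step * 0.01:4.0f}  separation = {np.linalg.norm(a - b):.2e}")
    a, b = rk4_step(a), rk4_step(b)
```

By t = 40 the separation has grown from 1e-9 to the size of the attractor itself, so the two runs bear no relation to one another.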
Complex systems, such as the atmosphere or an economy, have many different parts
that interact. While the isolated individual components of such systems may be
predictable, interactions and feedback effects can make the system unpredictable. An
example of this is the double pendulum, a non-integrable system. While the
dynamics of each individual pendulum, in isolation, are easy to predict, when the
pendulums are connected the behaviour becomes chaotic (Shinbrot et al., 1992). To
precisely predict such systems, one would need to have perfect knowledge of all the
forces acting on every component, along with perfect measurement of starting
conditions for each component, and extensive computing power. Without these
capabilities, the system may exhibit behaviour that cannot be predicted from studies
of isolated components – a phenomenon called emergence. The double pendulum has
only two components, and already exhibits complexity. The atmosphere or an
economy contains countless parts, so the level of knowledge, measurement and
computer power required to precisely predict such systems is high indeed.
Furthermore, while the forces and dynamics around single pendulums are well
known, with naturally occurring complex systems it can be difficult to isolate
components and determine the equations governing them. Trying to infer such
information from emergent behaviour is extremely difficult (Wang et al., 2011).
The study of nonlinear dynamics is in its infancy, with mathematical tools still being
developed (De Gooijer and Hyndman, 2006; Vlad et al., 2010). This is reflected in the
performance of nonlinear models versus linear ones; namely, nonlinear models often do not produce better forecasts (Clements et al., 2004; De Gooijer and Hyndman,
2006). Understanding the behaviour of complex systems is considered to be one of
the key challenges in modern science (Cheng et al., 2015).
While statistics can be useful for exploring some microdynamics, time-series analysis,
for example, struggles to deal with interacting processes and non-stationary
behaviour, where means and variances evolve with time (Cheng et al., 2015). Such
behaviour is common in complex, dynamic non-linear systems. Although several
techniques that can deal with non-stationary processes, such as multi-layer
perceptron (MLP) and generalised autoregressive conditional heteroscedasticity
(GARCH), have been developed (Golestani and Gras, 2014), such methods typically
can only cope with certain types of non-stationary behaviour (Cheng et al., 2015). As
a result, an expert judgement may still be required to ascertain whether a given non-
stationary dynamic system could indeed be modelled using these and similar
techniques, without losing crucial properties and information that cannot be
captured by these methods. Non-stationary time series analysis thus still poses great
challenges. Furthermore, the data demands required to overcome such difficulties
are high, as models of complex systems commonly have many variables relative to
data points, which reduces statistical power.
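A small sketch of the problem: a stationary AR(1) model is fitted by least squares to the first half of a series whose mean drifts, then iterated forward over the second half. The data-generating process is invented for illustration; out-of-sample errors dwarf the noise level because the fitted model assumes a fixed long-run mean.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 400
drift = np.linspace(0.0, 3.0, n)            # slowly drifting mean (non-stationary)
y = drift + rng.normal(scale=0.5, size=n)   # observed series

# Fit a stationary AR(1), y[t] = c + phi * y[t-1] + noise, on the first half.
train = y[: n // 2]
X = np.column_stack([np.ones(len(train) - 1), train[:-1]])
c, phi = np.linalg.lstsq(X, train[1:], rcond=None)[0]

# Iterate the fitted model forward over the second half of the sample.
forecast = np.empty(n // 2)
level = y[n // 2 - 1]
for t in range(n // 2):
    level = c + phi * level
    forecast[t] = level

rmse = np.sqrt(np.mean((forecast - y[n // 2:]) ** 2))
print(f"fitted phi = {phi:.2f}; out-of-sample RMSE = {rmse:.2f} (noise sd is 0.50)")
```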
The problems of non-stationarity have been known for many years. Robert Solow, for
example, who developed the influential Solow-Swan growth model, argued that it is
“straining credulity” to believe that economic processes are stationary:
“As soon as time-series get long enough to offer hope of discriminating among complex
hypotheses, the likelihood that they remain stationary dwindles away, and the noise level gets
correspondingly high. Under these circumstances, a little cleverness and persistence can get
you almost any result you want. I think that is why so few econometricians have ever been
forced by the facts to abandon a firmly held belief.” (Solow, 1985).
Economic models’ current forecasting abilities, even for the near-future, reflect these
challenges. Researchers at the Federal Reserve, for example, examined US GDP
growth and inflation forecasts made between 1992 and 2004 and found that dynamic
stochastic general equilibrium (DSGE) models appear to forecast “very poorly”,
although they outperformed econometric models and expert guesses (Edge and
Gurkaynak, 2011). DSGE predictions of inflation for even one quarter ahead had an
R² of only 0.13 when tested on historic data [7]. For quarters further out, the R² fell to
nearly zero. Fildes and Stekler (2002) surveyed the literature evaluating UK and US
real GDP and inflation forecasts, and found that forecasts outperformed those
generated by naïve random walk models, but not significantly so. They also found
only mixed evidence for forecasts improving over the latter decades of the twentieth
century, as econometric techniques improved. Meanwhile, in May 2007 and 2008, in
the midst of the financial crisis, the OECD predicted GDP growth for the following
year in 40 different countries. These estimates were ultimately wrong by an average
of 2.6 percentage points – a very large error for GDP growth (Richardson et al., 2014).
[7] R² measures the proportion of variation in the realised data that a model can explain; values close to zero indicate poor performance, while values close to 1 imply that the model reproduces the data very well.
To add value, researchers must therefore remember that IAMs deal with nonlinear
systems and imperfect models, knowledge, and measurements, so, given the tools
currently available, one cannot make precise predictions even for the next decade, let
alone over the climate change timescale. This is a big conceptual leap for the
economics profession which, since Milton Friedman’s influential essay (Friedman,
1953), has generally focused on predictive power when evaluating models. Friedman
argued that
“Truly important and significant hypotheses will be found to have “assumptions” that are
wildly inaccurate descriptive representations of reality, and, in general, the more significant
the theory, the more unrealistic the assumptions” (Friedman, 1953).
He goes on to claim that the appropriateness of assumptions is determined not by
how realistic they are, but by how well the model, as a whole, makes predictions. As
models cannot include everything, they are inherently unrealistic. In fact, successful
models will be the most unrealistic, he asserts, as they should have distilled a
phenomenon down to its core drivers and eliminated more minor interactions.
Friedman’s arguments have had an enormous influence on the economics profession
and its interpretation of the scientific method (Mäki, 2009).
Moving away from this makes it more difficult to assess a model. Can an imperfect
model still have academic value or be useful to decision-makers? And, if so, how can
we tell which models are helpful, and which are useless or even misleading? To
approach this, let us consider what models are for.
3. Why do we build models?
Models do not perfectly reflect reality. The latter is too complex for mankind to
encapsulate in a series of rules. Even if such a model could, in theory, be created,
there is insufficient computing power to run it. So models are instead simplified
abstractions of reality that help us explore questions and discover new ones; they are
tools to help us learn. While Friedman argued that the purpose of an economic
model is to make predictions, many natural scientists, social scientists and
philosophers take a different view.
Solow, for example, advised economists to use models to organise thoughts, discover
unexpected links, develop plausible causal narratives, and to make “rough
quantitative judgements” (Solow, 1985). Another perspective is that models are an
effective way to explore the implications of a theory, to differentiate between
theories, and to compare and contrast them. When different models are used to
simulate the consequences of the same set of assumptions, examining the differences
between the results produced can help illustrate how modelling frameworks
influence the projected outcome. In addition to exploring theories, models can be
used to probe how different processes may influence and interact with one another
(Bharwani et al., 2005). Agent-based models (ABMs), for example, can help explore
the emergent behaviour of interacting processes and complex systems. They can also
be used to generate scenarios to explore a range of possible outcomes and to see if
counterintuitive outcomes could occur. Similarly, they can help one better
understand policy questions and problems (Morgan, 2017). Explicitly speculative
models and theories have even been put forward with the stated aim of prompting
discussion and highlighting areas about which there is a paucity of knowledge. The
Kuznets curve, which suggests a relationship between economic development and
economic inequality, is an example of this (Kuznets, 1955). By highlighting areas
where our understanding is deficient, models can thus help direct further research
and data gathering (Oreskes et al., 1994).
In the face of deep uncertainty, some argue that expert guesses are good alternatives
to formal modelling. Thinking qualitatively, however, can be vague and open to
inconsistencies. Expert guesses are filled with just as many assumptions as formal
models, only they are less transparently stated; in fact, they probably won’t be
declared at all. Expert guesses may also suffer from cognitive biases of which the
expert is unaware (Goodwin and Wright, 2010). Building a model forces one to state
all assumptions, to think through them as a collective package, and to identify and
eliminate logical contradictions in one’s theory.
Climate change is arguably the biggest challenge of our generation. Its estimated
effects could range from negligible to the complete collapse of ecosystems. Some
even argue that global warming may bring benefits. These varying consequences
demand different responses from individuals, firms, NGOs and governments. Fossil
fuels, for example, have historically been one of the biggest drivers of economic
development (Asafu-Adjaye et al., 2016). In this light, is it preferable to curb carbon
emissions and potentially reduce economic growth? If so, what are the pros and cons
of the different ways to do this, and, if future generations are likely to be wealthier
than people today, won’t they be better placed to cut consumption? These are the
questions that IAMs and damage studies address; it would be absurd and
irresponsible for academics, as well as decision-makers of all levels, to not consider
these issues. Macroeconomics, similarly, should not disappear despite the
profession’s frequent failure to make successful predictions.
So the question is not ‘should one model the impacts of climate change’, but rather
‘at what point do these models become misleading rather than helpful?’ The answer
is when results are oversold. Even Robert Pindyck, perhaps the most vocal critic of
IAMs of the second kind [8], acknowledges that the problem is not people trying to
model uncertain interactions between the economy and GHG emissions, but rather
that results are sometimes communicated with the same degree of confidence one
would expect from a scientific experiment or randomised controlled medical trial
(Pindyck, 2017). He calls this “the veneer of scientific legitimacy”, and takes
particular issue with precise quantitative conclusions, such as calculating the SCC, or
prescribing an optimal greenhouse gas mitigation strategy (Pindyck, 2017).
[8] Pindyck himself appears to have failed to distinguish between the two groups of IAMs, which undermines some of his arguments.
IAMs thus can have many purposes, even if they can’t confidently be used to make
precise, verifiable predictions. They can help identify key relationships between the
climate and economies, help us better understand current theories surrounding
growth and climate change, eliminate logical contradictions, help organise thoughts,
force one to identify and understand assumptions, highlight areas where
understanding is deficient, encourage debate and future research, and provide a
logical and consistent framework for developing scenarios. The questions one asks
and how results generated by IAMs are viewed, however, should differ from the approaches taken when dealing with models with high predictive power.
4. Frameworks for dealing with climate and economic
uncertainties in IAMs
Based on the discussion in the previous sections, we can conclude that the PDFs
generated by the second category of IAMs are in many cases flawed, but still useful
and interesting. They are an attempt to explore the propagation of uncertainty, can
indicate which uncertainties may have the greatest impact on results and so help
prioritise future research, and have prompted academic debate (Weitzman, 2009). To
the non-specialist eye, however, they could also be misleading, as they suggest that
uncertainties are well understood. One must appraise the PDFs underlying the
Monte Carlo simulations with a critical eye, which has been done in models like FUND and PAGE with varying degrees of success.
As explored in Prieg and Yumashev (2019), the choice of methods used to
derive economic damage functions and their associated PDFs is often not explained
in the model technical documentation and is sometimes puzzling. Moreover, the
studies that provide the data used for calibration are often very out-of-date; for
example, they are frequently from the 1990s. This reduces the data points available
for calibration and so also limits the calibration methods that can be applied. In
addition, scientific progress has been made over the past twenty years that will not
be incorporated. Many IAM PDFs are based on guesswork, not modelling or
observation. In FUND 3.9 (Anthoff and Tol, 2014), for example, agricultural damages
due to the rate of change of temperature are assumed to be proportional to the rate of
change in GMST raised to a power. The PDF of this uncertain power is transparently
stated to be an expert guess that is not derived from any damage studies. Similarly,
the PDFs in PAGE are simply assumed to be triangular. PDFs underlying
catastrophic climate events and their impacts are particularly spurious. The likely
consequent damages are essentially unknown, and the tails of these distributions
should potentially be fatter, which could have a large impact on the resulting CBA
(Weitzman, 2014, 2009). In short, the uncertainty surrounding the economic
consequences of climate change is, at least currently, deep.
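To illustrate why the assumed tail shape matters for CBA, the sketch below compares a triangular PDF, as assumed in PAGE, with a fat-tailed lognormal sharing the same mode; both distributions and all numbers are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(1)
N = 1_000_000

# Two candidate PDFs for a catastrophic-damage parameter (% of GDP lost),
# constructed so that both peak near 5% -- the shapes are illustrative.
triangular = rng.triangular(left=0.0, mode=5.0, right=15.0, size=N)
sigma = 0.8
lognormal = rng.lognormal(mean=np.log(5.0) + sigma**2, sigma=sigma, size=N)

for name, draws in [("triangular", triangular), ("fat-tailed lognormal", lognormal)]:
    print(f"{name:21s} mean = {draws.mean():5.1f}%   99th pct = {np.percentile(draws, 99):5.1f}%")
```

Although the two PDFs peak in the same place, the fat-tailed version roughly doubles the expected damage, and its 99th percentile is several times larger, so a CBA built on it changes accordingly.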
With the exception of catastrophic climate events, the PDFs underpinning climate
uncertainties are better understood than their economic counterparts. For instance,
the latest version of the PAGE model, PAGE-ICE, uses simulations from state-of-the-
art climate models and permafrost models to quantify the uncertainties in the
nonlinear Arctic feedbacks, namely carbon emissions from thawing land permafrost
and increased solar absorption due to the loss of the sea ice and land snow covers
(Yumashev et al., 2019). This is achieved by means of statistical emulators of the
complex physical models, which reproduce both the dynamics and uncertainties in
the Arctic feedbacks as simulated by the complex models.
Research into the environmental side of climate change naturally began long before
economists started thinking about the subsequent economic consequences. Scientists
started exploring the impact of gasses, such as carbon dioxide, on the Earth’s climate
in the latter half of the 19th century [9]. In the 1960s, the field gained widespread
attention with the publication of Pales and Keeling (1965), and Brown and Keeling
(1965), which prompted extensive research. In the 1970s and 80s, economists began
exploring the economic consequences of climate change [10], but, to this day, the area
has received much less research attention than the physical side of climate change. It
is, thus, not surprising that our understanding of climate uncertainties is more
developed. Yet there is nevertheless still much work to be done to confidently
understand even key climate uncertainties, such as the climate sensitivity (Freeman
et al., 2015). As we already discussed, there is also currently a lack of observational
data to define PDFs for many of the thousands of parameters that form large climate
models.
[9] For example, see Tyndall (1865) and Arrhenius and Holden (1897).
[10] For example, see Smith and Tirpak (1988).
In addition to being aware of the limitations of current IAM PDFs, one must also
remember that one cannot expect good predictions when dealing with complex,
nonlinear systems. Economists’ track-record for predicting the evolution of
economies over even the short-term is poor in both times of crises and relatively
steady-states. Our ability to anticipate the economic consequences of climate change
is likely to be, at best, equally poor. This casts doubt on our ability to use CBA to
recommend, for example, cost-optimal emissions trajectories.
To avoid giving a false sense of certainty when communicating the results of IAMs in
the second category to decision-makers and other non-specialists, it could be more
transparent to encourage non-probabilistic decision frameworks that were specifically
developed for dealing with deep uncertainty.
Non-probabilistic methods generally consider how robust decisions are to
assumptions about the future being wrong. Robustness is defined as how well a
decision, such as investment to facilitate adaptation to climate change, performs
across many different non-probabilistic scenarios, say, the different future climates
that may result. Performance can be evaluated, for example, using scenarios of the
different costs and benefits that could emerge from each climate scenario, and netting
these against the cost of the adaptation investment. This result could then be
compared to a specified threshold for decision failure. The policy that is most robust
to different possible futures might not be the ‘optimal’ policy in any given future.
Model users are, thus, forced to entertain the idea that decisions may involve trade-
offs, for example, between maximising growth in one possible future, versus making
decisions that perform satisfactorily across many different futures. The goal is
generally not to definitively rank policies, but to inform policy makers about the
uncertainties, vulnerabilities, and trade-offs. Rather than policy makers being given
the impression that there is an objectively ‘best’ policy, these methods encourage
them to consider their priorities. For example, what is the maximum cost the decision
maker is prepared to potentially endure? What gains would the decision maker be
happy to sacrifice to avoid potentially triggering this cost threshold? The
recommended decision may thus differ for different decision makers in line with
their values and goals. For example, when probability distributions are unknown,
some argue that a maximin approach should be followed, where the focus is on
optimising the worst case scenario. Homeowners buying home insurance and large
US defence spending during the Cold War are examples of this precautionary principle
in action (Woodward and Bishop, 1997).
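The sketch below applies two such non-probabilistic criteria, maximin and minimax regret, to three hypothetical adaptation policies evaluated against four climate scenarios; all payoff numbers are invented for illustration.

```python
import numpy as np

# Net benefit ($bn) of each policy under each non-probabilistic scenario.
# Columns: mild, moderate, severe, extreme futures (numbers invented).
payoffs = np.array([
    [ 5,  4, -2, -20],   # no adaptation
    [ 2,  6,  5,   1],   # moderate investment
    [-4,  0,  8,  10],   # heavy investment
])
policies = ["no adaptation", "moderate investment", "heavy investment"]

worst_case = payoffs.min(axis=1)            # maximin: best worst case
regret = payoffs.max(axis=0) - payoffs      # shortfall vs best policy per scenario
max_regret = regret.max(axis=1)             # minimax regret: smallest worst regret

for name, wc, mr in zip(policies, worst_case, max_regret):
    print(f"{name:20s} worst case = {wc:4d}   max regret = {mr:3d}")
print("maximin choice:       ", policies[int(np.argmax(worst_case))])
print("minimax-regret choice:", policies[int(np.argmin(max_regret))])
```

The maximin and minimax-regret choices need not coincide in general, and neither need be the ‘optimal’ policy in the future that actually materialises.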
Scenario analysis is a common starting point for exploring highly uncertain futures.
Scenarios must be plausible and internally consistent, although many may be
deemed unlikely. With a mind to the questions being addressed, uncertainties that
could have a large effect on outcomes are identified via a literature review and/or
participatory multi-stakeholder workshops, and used to inform the choice of
scenarios. A different “axis of analysis” spanning each key uncertainty, for example,
is sometimes created (Laurent et al., 2015). One should then attempt to select
scenarios so that the ranges of these axes are well explored. Scenarios are typically
not ascribed a probability, so unlikely but disastrous events are given the same
consideration as seemingly probable, but more minor outcomes. Note that there is no
claim, however, that the full range of possible outcomes has been represented. One is
exploring uncertainties, without attempting to quantify them. These ensembles of
scenarios are then used to stress test decisions, identify weaknesses, explore key
drivers and relationships, make precautionary investments, and develop contingency
plans. The emphasis is once again not to identify a ‘best’ solution, but instead to
encourage decision makers to understand and explore different trade-offs: What are
the objectives? What trade-offs are tolerable? Do any decisions perform acceptably in
all possible worlds? What decisions leave one locked-in and unable to manoeuvre
should the future not turn out as expected? By considering their values, decision
makers should come to their own idea about the most appropriate choice; this should
not be decided by the model or modeller, who exists merely to present alternatives
and their pros and cons.
As discussed in this chapter, scenario analysis is already a popular tool for exploring
climate change, since the IPCC uses different IAMs in the first category to generate a
range of future population, economies, technology and GHG emissions scenarios,
which are not assigned probabilities. IAMs in the second category also typically use
these scenarios as non-probabilistic exogenous inputs, before attempting to quantify
the uncertainty around their output variable in each of these scenarios. Scenario
analysis is also frequently used in other disciplines. The Bank of England, for
example, used scenarios to explore the possible impacts of Brexit (BoE, 2018). In
addition, businesses sometimes use scenario analysis to aid decision making under
deep uncertainty, as do defence ministries and armed forces (Bradfield et al., 2005;
Brown, 1968). Scenario analysis is sometimes used in combination with formal
decision-making techniques. Robust decision making (RDM), for example, explores
which decisions perform well across a variety of plausible future scenarios, and
highlights trade-offs between satisfactory outcomes in a wide variety of futures and
optimal performance (Lempert et al., 2003) [11]. A decision is proposed, and then a large
number of scenarios in which the decision succeeds or fails as per specified terms are
collected. Statistical techniques are then often used to identify which uncertainties or
assumptions seem key to success or failure. RDM has been used in a variety of
environmental economic decision contexts, including resource management and capital investment (Kalra et al., 2014).
[11] Sometimes probabilities are ascribed to different scenarios (Hall et al., 2012).
Non-scenario, non-probabilistic frameworks for decision making under deep
uncertainty have also been developed. Info-gap theory, for example, starts with a
‘best-guess’ of parameters and model structure, and then explores how wrong one
would have to be in one’s assumptions for a chosen decision to not meet a given
target. Decisions that can withstand the greatest error in uncertain parameters or
structure are then highlighted. Where multiple parameters are uncertain, they can be
assigned weights to indicate if some are more uncertain than others, or can all be
given equal weight if they are all equally uncertain. The methodology is frequently
used in ecological modelling to help determine, say, conservation strategies (Hayes et
al., 2013), or to help tackle environmental problems, such as flood risk (Hine and
Hall, 2010). The initial ‘best guess’ for a parameter or the model structure, however,
affects info-gap results.
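A minimal info-gap sketch under assumed forms: starting from a best-guess damage parameter, the code expands an uncertainty horizon alpha and reports the largest alpha for which each policy still meets a performance target at every parameter value within the horizon. The outcome model, target and all numbers are illustrative.

```python
import numpy as np

BEST_GUESS = 2.0    # best-guess damage parameter (illustrative)
TARGET = -150.0     # the decision fails if net benefit falls below this ($bn)

def net_benefit(damage_param, protection):
    """Toy outcome model: protection costs money but caps residual damages."""
    damages = 10.0 * damage_param**2
    return -protection - 4.0 * max(damages - protection, 0.0)

def robustness(protection, alphas=np.linspace(0.0, 3.0, 301)):
    """Largest horizon alpha such that the target is met for every damage
    parameter within +/- alpha of the best guess (the info-gap robustness)."""
    best = 0.0
    for alpha in alphas:
        lo, hi = BEST_GUESS - alpha, BEST_GUESS + alpha
        worst = min(net_benefit(p, protection) for p in np.linspace(lo, hi, 61))
        if worst < TARGET:
            break
        best = alpha
    return best

for protection in [20.0, 60.0, 120.0]:
    print(f"protection = {protection:5.0f}: robustness alpha = {robustness(protection):.2f}")
```

More protection buys a larger robustness horizon at a higher certain cost; how much horizon is worth buying is the decision-maker’s value judgement.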
While scenario analysis and non-probabilistic decision frameworks should be
encouraged since the uncertainties surrounding the PDFs in IAMs in the second
category are deep, academics clearly should continue trying to improve the
quantification of climate and economic uncertainties and understand their
propagation. The teams quantifying economic uncertainties could potentially learn
from the methods used to quantify environmental uncertainties. A good example is
the use of Gaussian Process emulators, together with the GLUE or MCMC techniques
supplemented by extensive observational data, to quantify the probability
distributions of the key uncertain parameters of complex hydrological and
atmospheric chemistry models (Yang et al., 2018). Such techniques are more rigorous
than the approaches, summarised in Prieg and Yumashev (2019), that are
commonly used to determine the PDFs associated with economic damage functions.
One should note, however, that economic data and estimates of economic damages are sparser than climate data. If researchers choose to use probability or expected utility frameworks, such as CBA, when communicating research results, it should be
made very clear that the degree of confidence in many of the model’s PDFs is
currently low, and so, given the current deep uncertainty, the output PDF is only
exploratory. The PDFs generated by IAMs in the second category provide a useful
range of plausible SCC values for each, say, RCP scenario for GHG emissions, but
one cannot be confident that the full range of possibilities has been captured, nor of
the relative likelihood of the different values. Such explicit statements are rarely provided in the literature, and would add great value even in journal articles, as the latter may be read by academics and non-academics from a range of specialities, i.e. not just IAM specialists who are aware that many PDFs are speculative.
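As a sketch of the emulator idea mentioned above, the code below fits a Gaussian Process to a handful of runs of a stand-in ‘expensive’ model and then uses the cheap emulator inside a Monte Carlo loop; the one-dimensional toy function replaces what would in practice be a complex physical simulation.

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF

def expensive_model(x):
    """Stand-in for a complex physical model that is costly to run."""
    return np.sin(2.0 * x) + 0.3 * x**2

# Run the expensive model only a handful of times (the training design)...
x_train = np.linspace(0.0, 3.0, 8).reshape(-1, 1)
y_train = expensive_model(x_train).ravel()

# ...and fit a Gaussian Process emulator to those runs.
gp = GaussianProcessRegressor(kernel=RBF(length_scale=1.0), normalize_y=True)
gp.fit(x_train, y_train)

# The cheap emulator now stands in for the model in a Monte Carlo analysis,
# and its predictive standard deviation tracks the emulation uncertainty.
rng = np.random.default_rng(7)
x_mc = rng.uniform(0.0, 3.0, size=(10_000, 1))
y_mc, y_sd = gp.predict(x_mc, return_std=True)

print(f"emulated mean output = {y_mc.mean():.3f} (direct model: {expensive_model(x_mc).mean():.3f})")
print(f"average emulator uncertainty = {y_sd.mean():.3f}")
```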
Conclusion
IAMs are already frequently used both in deterministic and probabilistic modelling
frameworks to assess impacts of climate change under different policy assumptions.
For example, large IAMs, which we assign to a first category of IAMs, are used to
develop scenarios for future population, economies, technology and GHG emissions.
While these IAMs are deterministic, which is mostly due to the lack of sufficient data
that would allow one to derive the necessary probability distributions for their
parameters, their results are not generally interpreted as precise predictions. Instead,
the IPCC, for example, encouraged different modelling teams, both in the IAM and
climate modelling domains, to produce results for the same GHG atmospheric
concentration trajectory, referred to as a scenario. Probabilities were not assigned to the different scenarios, and multi-model uncertainties were explored separately for
each scenario. Smaller IAMs, which we assign to a second category, typically use
these scenarios as exogenous inputs. Within each of the scenarios, however, these
IAMs then use a probabilistic framework (Monte Carlo simulations) to quantify
uncertainty surrounding various components of the climate systems, economy and
interactions between the two, with the aim of estimating the economic impacts from
climate change. This probabilistic approach is possible because of the relative
simplicity and small size of these models.
Models are tools to help us learn. By attempting to quantify uncertainty, IAMs in the
second category are helping organise thoughts, explore the propagation of
uncertainty, illustrate the implications of different modelling assumptions, prompt
discussion, and highlight areas about which there is a paucity of knowledge and so
help direct future research. There is a danger, however, that the speculative PDFs
used in these IAMs may encourage a false confidence in the model results and
current knowledge of the likely economic effects of climate change. The uncertainty
surrounding the latter is, at least currently, deep, which means we lack the basic
information needed to define the PDFs of many model parameters. In addition, one
cannot expect good predictions when dealing with complex, nonlinear systems. This
casts doubt on our ability to generate, for example, cost-optimal emissions
trajectories. To avoid giving a false sense of certainty when communicating the
results of IAMs in the second category, it could be more transparent to encourage the
use of non-probabilistic decision frameworks, such as robust scenario analysis and
info-gap theory. In short, IAMs in the second category already use RCP scenarios of
GHG emissions as inputs, and modellers should consider also taking such an
approach to these models’ outputs, as these models do provide a logical and
consistent framework for developing scenarios. Monte Carlo simulations could, for
example, be run, and the output statistics used to suggest plausible ranges for non-
probabilistic economic and social impact scenarios for each RCP. In addition to
encouraging non-probabilistic decision frameworks, modellers should, of course,
continue trying to improve the quantification of uncertainties surrounding all
components of the coupled climate-economy system, including the economic impacts
of climate change. Until confidence has greatly improved, however, it should be
made explicitly clear when communicating research findings that output PDFs are
merely exploratory. To further highlight uncertainty, research articles should also
continue to compare results generated by different models, as is common practice
in the climate modelling community.
References
ANTHOFF D., TOL R.S.J. (2014), The Climate Framework for Uncertainty, Negotiation and Distribution (FUND): Technical Description, Version 3.9. Forschungsstelle für Nachhaltige Entwicklung, Universität Hamburg, Hamburg.
ANTHOFF D., TOL R.S.J. (2013), The uncertainty about the social cost of carbon: A
decomposition analysis using fund. Clim. Change 117, 515–530.
ARRHENIUS S., HOLDEN E.S. (1897), On the influence of carbonic acid in the air
upon the temperature of the earth. Publ. Astron. Soc. Pac. 9, 14–24.
ASAFU-ADJAYE J., BYRNE D., ALVAREZ M. (2016), Economic growth, fossil fuel
and non-fossil consumption: A Pooled Mean Group analysis using proxies for
capital. Energy Econ. 60, 345–356.
BEVEN K.J. (2009), Environmental modelling: an uncertain future?: an introduction to
techniques for uncertainty estimation in environmental prediction. Routledge, London.
BHARWANI S., BITHELL M., DOWNING T.E., et al. (2005), Multi-agent modelling
of climate outlooks and food security on a community garden scheme in Limpopo,
South Africa. Philos. Trans. R. Soc. B Biol. Sci. 360, 2183–2194.
BoE (2018), EU withdrawal scenarios and monetary and financial stability: A
response to the House of Commons Treasury Committee. Bank of England. [WWW
Document]. URL https://www.bankofengland.co.uk/report/2018/eu-withdrawal-
scenarios-and-monetary-and-financial-stability
BRADFIELD R., WRIGHT G., BURT G. et al. (2005), The origins and evolution of
scenario techniques in long range business planning. Futures 37, 795–812.
BROWN C.W., KEELING C.D. (1965), The concentration of atmospheric carbon
dioxide in Antarctica. J. Geophys. Res. 70, 6077–6085.
BROWN S. (1968), Scenarios in systems analysis, in: Quade, E.S., Boucher, W.J. (Eds.),
Systems Analysis and Policy Planning: Applications in Defence. American Elsevier
Publishing Co, New York.
BURKE M., HSIANG S.M., MIGUEL E. (2015), Global non-linear effect of
temperature on economic production. Nature 527, 235.
CHENG C., SA-NGASOONGSONG A., BEYCA O. et al. (2015), Time series
forecasting for nonlinear and non-stationary processes: a review and comparative
study. IIE Trans. 47, 1053–1071.
CLEMENTS M.P., FRANSES P.H., SWANSON N.R. (2004), Forecasting economic
and financial time-series with non-linear models. Int. J. Forecast. 20, 169–183.
DE GOOIJER J.G., HYNDMAN R.J. (2006), 25 years of time series forecasting. Int. J.
Forecast. 22, 443–473.
EDGE R.M., GURKAYNAK R.S. (2011), How Useful are Estimated DSGE Model
Forecasts? SSRN Electron. J.
FILDES R., STEKLER H. (2002), “The state of macroeconomic forecasting”, J.
Macroecon. 24, 435–468.
FREEMAN M.C., WAGNER G., ZECKHAUSER R.J. (2015), “Climate sensitivity
uncertainty: when is good news bad?”, Philos. Trans. R. Soc. Math. Phys. Eng. Sci. 373,
20150092.
FRIEDMAN M. (1953), Essays in positive economics. University of Chicago Press,
Chicago.
GOLESTANI A., GRAS R. (2014), “Can we predict the unpredictable?”, Sci. Rep. 4,
6834.
GOODWIN P., WRIGHT G. (2010), “The limits of forecasting methods in anticipating
rare events”, Technol. Forecast. Soc. Change 77, 355–368.
HALL J.W., LEMPERT R.J., KELLER K. et al. (2012), “Robust Climate Policies Under
Uncertainty: A Comparison of Robust Decision Making and Info-Gap Methods: A
Comparison of Robust Decision Making and Info-Gap Methods”, Risk Anal. 32, 1657–
1672.
HAYES K.R., BARRY S.C., HOSACK G.R. et al. (2013), “Severe uncertainty and info-
gap decision theory”, Methods Ecol. Evol. 4, 601–611.
HINE D., HALL J.W. (2010), “Information gap analysis of flood model uncertainties
and regional frequency analysis”, Water Resour. Res. 46.
HOPE C. (2013), “Critical issues for the calculation of the social cost of CO2: why the
estimates from PAGE09 are higher than those from PAGE2002”, Clim. Change 117,
531–543.
HOPE C., NEWBERY D.M. (2006), Calculating The Social Cost Of Carbon (Cambridge
Working Papers in Economics). Faculty of Economics, University of Cambridge.
IPCC (2014), Fifth Assessment Report (AR5). URL https://www.ipcc.ch/report/ar5/
(accessed 2.16.18).
JOURNAL OF ECONOMIC PERSPECTIVES (2015), "Editorial Note: Correction to Richard S. Tol's 'The Economic Effects of Climate Change'", J. Econ. Perspect. 29, 217–220.
KALRA N., HALLEGATTE S., LEMPERT R. et al. (2014), Agreeing on Robust Decisions: New Processes for Decision Making under Deep Uncertainty, Policy Research Working Papers. The World Bank.
KLEEMAN R. (2011), “Information Theory and Dynamical System Predictability”,
Entropy 13, 612–649.
KUZNETS S. (1955), “Economic Growth and Income Inequality”, Am. Econ. Rev. 45,
1–28.
LAURENT K.L., FRIEDMAN K.B., KRANTZBERG G. et al. (2015), “Scenario
analysis: An integrative and effective method for bridging disciplines and achieving
a thriving Great Lakes-St. Lawrence River basin”, J. Gt. Lakes Res. 41, 12–19.
LEMPERT R.J., POPPER S.W., BANKES S.C. (2003), Shaping the next one hundred years:
new methods for quantitative, long-term policy analysis. RAND, Santa Monica, CA.
LORENZ E.N. (1963), “Deterministic Nonperiodic Flow”, J. Atmospheric Sci. 20, 130–
141.
MAKI U. (2009), The Methodology of Positive Economics: Reflections on the Milton Friedman Legacy. Cambridge University Press, Cambridge.
MORGAN M.G. (2017), Theory and Practice in Policy Analysis: Including Applications in
Science and Technology. Cambridge University Press.
NORDHAUS W., SZTORC P. (2013), DICE 2013R: Introduction and User's Manual, Second edition. URL http://www.econ.yale.edu/~nordhaus/homepage/homepage/documents/DICE_Manual_100413r1.pdf (accessed 10.31.18).
NORDHAUS W.D. (2017), “Revisiting the social cost of carbon”, Proc. Natl. Acad. Sci.
U. S. A. 114, 1518–1523.
ORESKES N., SHRADER-FRECHETTE K., BELITZ K. (1994), "Verification, validation, and confirmation of numerical models in the earth sciences", Science 263, 641–646.
PALES J.C., KEELING C.D. (1965), "The concentration of atmospheric carbon dioxide in Hawaii", J. Geophys. Res. 70, 6053–6076.
PINDYCK R.S. (2017), “The Use and Misuse of Models for Climate Policy”, Rev.
Environ. Econ. Policy 11, 100–114.
PIONTEK F., KALKUHL M., KRIEGLER E. et al. (2018), "Economic growth effects of alternative climate change impact channels in economic modelling", Environ. Resour. Econ., 1–29.
PRIEG L.F., YUMASHEV D. (2019), Improving estimates of the economic effects of climate change in integrated assessment models, in: Ninan, K. (Ed.), Environmental Assessments - Scenarios, Modelling and Policy. Edward Elgar Publishing, Cheltenham.
RICHARDSON P., DANG T.-T., PAIN N. et al. (2014), OECD Forecasts During and
After the Financial Crisis (OECD Economics Department Working Papers No. 1107).
SHINBROT T., GREBOGI C., WISDOM J. et al. (1992), “Chaos in a double
pendulum”, Am. J. Phys. 60, 491–499.
SMITH J.B., TIRPAK D.A. (1988), The potential effects of global climate change on the United States. Report to Congress. United States Environmental Protection Agency.
SOLOW R.M. (1985), “Economic History and Economics”, Am. Econ. Rev. 75, 328–331.
TYNDALL J. (1865), Heat considered as a mode of motion. D. Appleton & Company,
New York.
VAN VUUREN D.P., STEHFEST E., DEN ELZEN M.G.J. et al. (2011), "RCP2.6: exploring the possibility to keep global mean temperature increase below 2°C", Clim. Change 109, 95–116.
VLAD S., PASCU P., MORARIU N. (2010), "Chaos Models in Economics", J. Comput. 2, 79–83.
WANG W.-X., YANG R., LAI Y.-C. et al. (2011), "Predicting Catastrophes in Nonlinear Dynamical Systems by Compressive Sensing", Phys. Rev. Lett. 106, 154101.
WEITZMAN M.L. (2014), “Fat Tails and the Social Cost of Carbon”, Am. Econ. Rev.
104, 544–546.
WEITZMAN M.L. (2009), "On Modeling and Interpreting the Economics of Catastrophic Climate Change", Rev. Econ. Stat. 91, 1–19.
WISE M., CALVIN K., THOMSON A. et al. (2009), “Implications of limiting CO2
concentrations for land use and energy”, Science 324, 1183–1186.
WOODWARD R.T., BISHOP R.C. (1997), "How to decide when experts disagree: Uncertainty-based choice rules in environmental policy", Land Econ. 73, 492–507.
YANG J., JAKEMAN A., FANG G. et al. (2018), “Uncertainty analysis of a semi-
distributed hydrologic model based on a Gaussian Process emulator”, Environ. Model.
Softw. 101, 289–300.
YUMASHEV D., HOPE C., SCHAEFER K. et al. (2019), “Climate policy implications
of nonlinear decline of Arctic land permafrost and other cryosphere elements”,
Nature Communications, 10(1), 1900.