THE SCIENTIFIC UNDERSTATEMENT OF CLIMATE RISKS
BY DAVID SPRATT & IAN DUNLOP
EXCESSIVE CAUTION 4
UNDERESTIMATION OF RISKS 7
CLIMATE MODELS 10
TIPPING POINTS 11
CLIMATE SENSITIVITY 12
CARBON BUDGETS 15
ARCTIC SEA ICE 16
POLAR ICE MASS LOSS 17
SEA-LEVEL RISE 19
POLITICAL CONSENSUS 21
GOALS ABANDONED 22
A FAILURE OF IMAGINATION 23

AUTHORS
David Spratt is Research Director for Breakthrough and co-author of Climate Code Red: The case for emergency action. His recent reports include Recount: It’s time to “Do the math” again; Climate Reality Check; Antarctic Tipping Points for a Multi-metre Sea-level Rise; and Climate change, conflict and risk (with Ian Dunlop).
Ian Dunlop is a senior member of the Advisory
Board for Breakthrough. Ian was an international
oil, gas and coal industry executive, chairman of the
Australian Coal Association and chief executive of
the Australian Institute of Company Directors. From
1998-2000 he chaired the Australian Greenhouse
Office Experts Group on Emissions Trading. He is a
member of the Club of Rome.
Published by Breakthrough - National Centre for Climate Restoration
Melbourne, Australia | September 2017 breakthroughonline.org.au
Human-induced climate change is an existential risk to human civilisation: an adverse outcome
that would either annihilate intelligent life or permanently and drastically curtail its potential.
Special precautions that go well beyond conventional risk management practice are required if
the “fat tails” — the increased likelihood of very large impacts — are to be adequately dealt with.
The potential consequences of these lower-probability, but higher-impact, events would be
devastating for human societies.
The bulk of climate research has tended to underplay these risks, exhibiting a preference for
conservative projections and scholarly reticence, although increasing numbers of scientists have
spoken out in recent years on the dangers of such an approach.
Climate policymaking and the public narrative are significantly informed by the important work of
the Intergovernmental Panel on Climate Change (IPCC). However, IPCC reports also tend
toward reticence and caution, erring on the side of “least drama”, and downplaying more extreme
and more damaging outcomes. Whilst this has been understandable historically, given the
pressure exerted upon the IPCC by political and vested interests, it is now becoming
dangerously misleading, given the acceleration of climate impacts globally. What were
lower-probability, higher-impact, events are now becoming more likely.
This is a particular concern with potential climatic “tipping points” — passing critical thresholds
which result in step changes in the system — such as the polar ice sheets (and hence sea
levels), and permafrost and other carbon stores, where the impacts of global warming are
non-linear and difficult to model at present. Under-reporting on these issues contributes to the
“failure of imagination” that is occurring today in our understanding of, and response to, climate change.
If climate policymaking is to be soundly based, a reframing of scientific research within an
existential risk-management framework is now urgently required. This must be taken up not just
in the work of the IPCC, but also in the UN Framework Convention on Climate Change
negotiations if we are to address the real climate challenge.
Current processes will not deliver either the speed or the extent of change required.
WHAT LIES BENEATH 1
Three decades ago, when serious debate on human-induced climate change began at the global level, a great deal of
statesmanship was on display. There was a preparedness to recognise that this was an issue transcending nation states,
ideologies and political parties which had to be addressed proactively in the long-term interests of humanity as a whole,
even if the existential nature of the risk it posed was far less clear cut than it is today.
As global institutions were established to take up this challenge, such as the UN Framework Convention on Climate Change
(UNFCCC) at the Rio Earth Summit in 1992, and the extent of change this would demand of the fossil-fuel-dominated world
order became clearer, the forces of resistance began to mobilise. Today, as a consequence, and despite the diplomatic
triumph of the 2015 Paris Agreement, the debate around climate change policy has never been more dysfunctional,
indeed Orwellian.
In his book 1984, George Orwell describes a double-speak totalitarian state where most of the population accepts “the most
flagrant violations of reality, because they never fully grasped the enormity of what was demanded of them, and were not
sufficiently interested in public events to notice what was happening. By lack of understanding they remained sane.”
Orwell could have been writing about climate change and policymaking. International agreements talk of limiting global
warming to 1.5–2°C, but in reality they set the world on a path of 3–5°C. Goals are reaffirmed, only to be abandoned. Coal
is “clean”. Just 1°C of warming is already dangerous, but this cannot be said. The planetary future is hostage to myopic
national self-interest. Action is delayed on the assumption that as yet unproven technologies will save the day, decades
hence. The risks are existential, but it is “alarmist” to say so. A one-in-two chance of missing a goal is normalised as
acceptable.
Climate policymaking for years has been cognitively dissonant, “a flagrant violation of reality”. So it is unsurprising that there
is a lack of understanding amongst the public and elites of the full measure of the climate challenge. Yet most Australians
sense where we are heading: three-quarters of Australians see climate change as a catastrophic risk, and half see our way
of life ending within the next 100 years.
Politics and policymaking have norms: rules and practices, assumptions and boundaries, that constrain and shape them. In
recent years, the previous norms of statesmanship and long-term thinking have disappeared, replaced by an obsession with
short-term political and commercial advantage. Climate policymaking is no exception.
Since 1992, short-term economic interest has trumped environmental and future human needs. The world today emits 48%
more carbon dioxide (CO2) from the consumption of energy than it did 25 years ago, and the global economy has more than
doubled in size. The UNFCCC strives “to prevent dangerous anthropogenic interference with the climate system”, but every
year humanity’s ecological footprint becomes larger and less sustainable. Humanity now requires the biophysical capacity
of 1.7 planets annually as it rapidly chews up natural capital.
A fast, emergency-scale transition to a post-fossil fuel world is absolutely necessary to address climate change. But this is
excluded from consideration by policymakers because it is considered to be too disruptive. The orthodoxy is that there is
1 CommunicateResearch 2017, ‘Global Challenges Foundation global risks survey’, ComRes, 24 May 2017.
2 Randle, MJ & Eckersley, R 2015, ‘Public perceptions of future threats to humanity and different societal responses: a cross-national study’, Futures, vol. 72, pp. 4-16.
time for an orderly economic transition within the current short-termist political paradigm. Discussion of what would be safe,
less warming than we presently experience, is non-existent. And so we have a policy failure of epic proportions.
Policymakers, in their magical thinking, imagine a mitigation path of gradual change, to be constructed over many decades
in a growing, prosperous world. The world not imagined is the one that now exists: of looming financial instability; of a global
crisis of political legitimacy; of a sustainability crisis that extends far beyond climate change to include all the fundamentals
of human existence and most significant planetary boundaries (soils, potable water, oceans, the atmosphere, biodiversity,
and so on); and of severe global energy sector dislocation.
In anticipation of the upheaval that climate change would impose upon the global order, the Intergovernmental Panel on
Climate Change (IPCC) was established by the UN in 1988, charged with regularly assessing the global consensus on
climate science as a basis for policymaking. The IPCC Assessment Reports (ARs), produced every 5–6 years, play a large
part in the public framing of the climate narrative: new reports are a global media event. AR5 was produced in 2013–14,
with AR6 due in 2022. The IPCC has done critical, indispensable work of the highest standard in pulling together a periodic
consensus of what must be the most exhaustive scientific investigation in world history. It does not carry out its own
research, but reviews and collates peer-reviewed material from across the spectrum of this incredibly complex area,
identifying key issues and trends for policymaker consideration.
However, the IPCC process suffers from all the dangers of consensus-building in such a wide-ranging and complex arena.
For example, IPCC reports, of necessity, do not always contain the latest available information. Consensus-building can
lead to “least drama”, lowest-common-denominator outcomes which overlook critical issues. This is particularly the case
with the “fat-tails” of probability distributions, that is, the high-impact but relatively low-probability events where scientific
knowledge is more limited. Vested interest pressure is acute in all directions; climate denialists accuse the IPCC of
alarmism, whereas climate action proponents consider the IPCC to be far too conservative. To cap it all, the IPCC
conclusions are subject to intense political oversight before being released, which historically has had the effect of
substantially watering-down sound scientific findings.
These limitations are understandable, and arguably were not of overriding importance in the early period of the IPCC.
However, as time has progressed, it is now clear that the risks posed by climate change are far greater than previously
anticipated. We have moved out of the twilight period of much talk but relatively limited climate impacts. Climate change is
now turning nasty, as we have witnessed in 2017 in the USA, South Asia, the Middle East and Europe, with record-breaking
heatwaves and wildfires, more intense flooding and more damaging hurricanes.
The distinction between climate science and risk is now the critical issue, for the two are not the same. Scientific reticence
— a reluctance to spell out the full risk implications of climate science in the absence of perfect information — has become
a major problem. Whilst this is understandable, particularly when scientists are continually criticised by denialists and
political apparatchiks for speaking out, it is extremely dangerous given the “fat tail” risks of climate change. Waiting for
perfect information, as we are continually urged to do by political and economic elites, means it will be too late to act.
Irreversible, adverse climate change on the global scale now occurring is an existential risk to human civilisation. Many of
the world’s top climate scientists quoted in this report well understand these implications — James Hansen, Michael E.
Mann, John Schellnhuber, Kevin Anderson, Eric Rignot, Naomi Oreskes, Kevin Trenberth, Michael Oppenheimer, Stefan
Rahmstorf and others — and are forthright about their findings, where we are heading, and the limitations of IPCC reports.
3 Dunlop, I & Spratt, D 2017, Disaster Alley: Climate change, conflict and risk, Breakthrough National Centre for Climate Restoration, Melbourne.
This report seeks to alert the wider community and leaders to these limitations and urges change to the IPCC approach,
and to the wider UNFCCC negotiations. It is clear that existing processes will not deliver the transformation to a carbon
negative world in the limited time now available.
We urgently require a reframing of scientific research within an existential risk-management framework. This requires
special precautions that go well beyond conventional risk management. Like an iceberg, there is great danger in “what lies beneath”.

EXCESSIVE CAUTION

A 2013 study by Naomi Oreskes and fellow researchers examined a number of past predictions made by climate scientists,
and found they have been “conservative in their projections of the impacts of climate change” and that “at least some of the
key attributes of global warming from increased atmospheric greenhouse gases have been under-predicted, particularly in
IPCC assessments of the physical science”. They concluded that climate scientists are not biased toward alarmism but
rather the reverse: they err “on the side of least drama [ESLD], whose causes may include adherence to the scientific
norms of restraint, objectivity, skepticism, rationality, dispassion, and moderation”. ESLD may cause scientists “to
underpredict or downplay future climate changes”.
This tallies with the views of economist Prof. Ross Garnaut, who in 2011 reflected on his experience in presenting two
climate reports to the Australian Government. Garnaut questioned whether climate research had a conservative “systematic
bias” due to “scholarly reticence”. He pointed to a pattern across diverse intellectual fields of research predictions being
“not too far away from the mainstream” expectations, and observed that in the climate field this “has been associated
with understatement of the risks”.
As far back as 2007, then NASA climate science chief Prof. James Hansen suggested that scientific reticence hinders
communication with the public about dangers of global warming and potentially large sea-level rises. More recently he
wrote that: “the affliction is widespread and severe. Unless recognized, it may severely diminish our chances of averting
dangerous climate change”.
Ten years after his 2006 climate report to the UK government, Sir Nicholas Stern reflected that: “science is telling us that
impacts of global warming – like ice sheet and glacier melting – are now happening much more quickly than we
anticipated”. In 2013 he said that "Looking back, I underestimated the risks… Some of the effects are coming through more
quickly than we thought then.”
A recent study of climate scientists found "a community which still identified strongly with an idealised picture of scientific
rationality, in which the job of scientists is to get on with their research quietly and dispassionately". The study said most
4 Brysse, K, Oreskes, N, O’Reilly, J & Oppenheimer, M 2013, ‘Climate change prediction: Erring on the side of least drama?’, Global Environmental Change, vol. 23, no. 1, pp. 327-337.
5 Garnaut, R 2011, Garnaut Climate Change Review Update, Canberra, pp. 53-55.
6 Hansen, J 2007, ‘Scientific reticence and sea level rise’, Environmental Research Letters, vol. 2, no. 2, 024002.
7 McKee, R 2016, ‘Nicholas Stern: cost of global warming “is worse than I feared”’, The Guardian, 6 November 2016.
8 Stewart, H & Elliott, L 2013, ‘Nicholas Stern: “I got it wrong on climate change – it's far, far worse”’, The Guardian, 27 January 2013.
climate scientists are resistant to participation in public/policy engagement, leaving this task to a minority who are attacked
by the media and even by their own colleagues.
Kevin Trenberth, head of climate analysis at the US National Center for Atmospheric Research and a lead author of key
sections of the 2001 and 2007 IPCC reports, says: "We're underestimating the fact that climate change is rearing its head…
and we're underestimating the role of humans, and this means we're underestimating what it means for the future and what
we should be planning for."
Prof. Michael E. Mann of Pennsylvania State University says the IPCC’s 2012 report on climate extremes missed an
opportunity to provide politicians with a clear picture of the extent of the climate crisis: "Many scientists felt that report erred
by underplaying the degree of confidence in the linkage between climate change and certain types of severe weather,
including heat wave severity, heavy precipitation and drought, and hurricane intensity.”
Prof. Kevin Anderson of the University of Manchester says there is "an endemic bias prevalent amongst many of those
building emission scenarios to underplay the scale of the 2°C challenge. In several respects, the modelling community is
actually self-censoring its research (focus) to conform to the dominant political and economic paradigm… ".
A good example is the 1.5°C target agreed at the December 2015 Paris climate conference. IPCC assessment
reports until that time (and in conformity with the dominant political paradigm) had not devoted any significant attention to
1.5°C emission-reduction scenarios, and the Paris delegates had to request the IPCC to do so as a matter of urgency. This
is a clear case of politics driving the science research agenda. Research needs money, and too often money is allocated
according to the political priorities of the day.
Anderson says it is incumbent on the scientific community to communicate research clearly and candidly to those delivering
on the climate goals established by civil society, and "to draw attention to inconsistencies, misunderstandings and
deliberate abuse of the scientific research. It is not our job to be politically expedient with our analysis or to curry favour with
our funders. Whether our conclusions are liked or not is irrelevant."
Much has been written about the inadequacy of IPCC processes, and the politicisation of decision-making.
Scientists say one reason the IPCC's work is too conservative is that unwieldy processes mean reports do not take the most
recent research into account. The cutoff point for science to be considered in a report is so far in advance of publication that
the reports are out of date upon release. This is a crucial failure in a field of research that is rapidly changing. Inez Fung at
9 Hoggett, P & Randall, R 2016, ‘Socially constructed silence? Protecting policymakers from the unthinkable’, Transformation, 6 June 2016.
10 Scherer, G 2012a, ‘How the IPCC underestimated climate change’, Scientific American, 6 December 2012.
11 Scherer, G 2012b, ‘Climate science predictions prove too conservative’, Scientific American, 6 December 2012.
12 Anderson, K 2016, ‘Going beyond ‘dangerous’ climate change’, LSE presentation, 4 February 2016.
13 Anderson, K 2015, ‘Duality in climate science’, Nature Geoscience, vol. 8, pp. 898–900.
the Berkeley Institute of the Environment, California says that for her research to be considered in the 2007 IPCC report,
she had to complete it by 2004. This is a typical experience that she identifies as "an awful lag in the IPCC process".
IPCC reports are compiled by working groups of scientists within guidelines that urge the building of
consensus conclusions from evidence presented, though that evidence itself may be diverse and sometimes contradictory
in nature. The general result may be described as “middle of the road” reporting, in which propositions supported by the
greater quantity of research papers presented win out against propositions that might be outliers in terms of quantity of
papers presented, though the latter may be no less scientifically significant.
The higher-impact possibilities may have less research available for consideration, but there are good risk-management
reasons for giving such possibilities more prominence, even if the event probability is relatively low (see Underestimation of Risks).
As one example, the projected sea-level rise in the 2007 assessment report was well below the subsequent observations.
This occurred because scientists compiling the report could not agree on how much would be added to sea-level rise by
melting polar ice sheets, and so left out the data altogether to reach “consensus”. Science historian Naomi Oreskes calls
this "consensus by omission".
This is the consensus problem at the scientific level, but there is a second problem at the political level. Whilst the full-length
Assessment Reports are compiled by scientists, the shorter and more widely reported Summaries for Policymakers
(SPMs) require consensus from diplomats in “a painstaking, line-by-line revision by [political] representatives from more than
100 world governments — all of whom must approve the final summary document”.
As early as the IPCC's first report in 1990, the US, Saudi and Russian delegations worked at “watering down the sense of the
alarm in the wording, beefing up the aura of uncertainty”. Prof. Martin Parry of the UK Met Office, co-chairman of an IPCC
working group at the time, has exposed the arguments between scientists and political officials over the 2007 IPCC SPM:
“Governments don't like numbers, so some numbers were brushed out of it”.
In 2014, The Guardian reported increasing evidence that “the policy summaries on climate impacts and mitigation by the
IPCC were significantly ‘diluted’ under political pressure from some of the world's biggest greenhouse gas emitters,
including Saudi Arabia, China, Brazil and the United States”.
One of the 2014 report’s more powerful sections was deleted during last-minute negotiations over the text. The section tried
to specify other measures that would indicate whether we are entering a danger zone of profound climate impact, and just
how dramatic emissions cuts will have to be in order to avoid crossing that threshold. Prof. Michael Oppenheimer, an
eminent climate scientist at Princeton who was also part of the core writing team, suggests that politics got in the way.
14 Barras, C 2007, ‘Rocketing CO2 prompts criticisms of IPCC’, New Scientist, 24 October 2007.
15 Scherer 2012a, op cit.
17 Leggett, J 1999, The Carbon War, Routledge, New York.
18 Adam, D 2007, ‘How climate change will affect the world’, The Guardian, 20 September 2007.
19 Ahmed, N 2014, ‘IPCC reports 'diluted' under 'political pressure' to protect fossil fuel interests’, The Guardian, 15 May 2014.
20 Leggett, J 2014, ‘Why two crucial pages were left out of the latest UN climate report’, Jeremy Leggett blog, 4 November 2014.
UNDERESTIMATION OF RISKS
IPCC reports have underplayed high-end possibilities and failed to assess risks in a balanced manner. The failure to fully
account for potential future changes in the permafrost layer and other carbon-cycle feedbacks is just one example.
Dr Barrie Pittock, a former leader of the Climate Impact Group in CSIRO, wrote in 2006 that: "until now many scientists may
have consciously or unconsciously downplayed the more extreme possibilities at the high end of the uncertainty range, in
an attempt to appear moderate and ‘responsible’ (that is, to avoid scaring people). However, true responsibility is to provide
evidence of what must be avoided: to define, quantify, and warn against possible dangerous or unacceptable outcomes."
The situation has not improved. Sir Nicholas Stern said of the IPCC’s Fifth Assessment Report: “Essentially it reported on a
body of literature that had systematically and grossly underestimated the risks [and costs] of unmanaged climate change.”
Prof. Ross Garnaut has also pointed to the "understatement of the risks”. We seem to be playing scientific catch-up, as
reality is consistently on the most pessimistic boundary of previous projections. The Australian Climate Council reported in
2015: "Changes in the climate system are occurring more rapidly than previously projected, with larger and more damaging
impacts now observed at lower temperatures than previously estimated." Such a situation is not a satisfactory basis on
which to plan our future.
Former senior fossil fuel industry executive and government advisor, Ian Dunlop, notes that: "dangerous impacts from the
underlying (warming) trend have also manifested far faster and more extensively than global leaders and negotiators are
prepared to recognise".
Researchers say it is important to carry out analyses “to identify what risky outcomes are possible — cannot be ruled out —
starting with the biggest ones. In such analyses, it is useful to distinguish between two questions: ‘What is most likely to
happen?’ and ‘How bad could things get?’” In looking at how to reframe climate change assessments around risk, they argue it is necessary to:

… deal adequately with low-probability, high-consequence outcomes, which can dominate calculations of total risk,
and are thus worthy of special attention. Without such efforts, we court the kinds of ‘failures of imagination’ that can
prove so costly across risk domains. Traditional climate assessments have focused primarily on areas where the
science is mature and uncertainties well characterized. For example, in the IPCC lexicon, future outcomes are
considered “unlikely” if they lie outside the central 67% of the probability distribution. For many types of risk
assessment, however, a 33% chance of occurrence would be very high; a 1% or 0.1% chance (or even lower
probabilities) would be more typical thresholds.

They emphasise that “the envelope of possibilities”, that is, the full range of possibilities for which one must be
prepared, is often more important than the most likely future outcome, especially when the range of outcomes
includes those that are particularly severe. They conclude that the “application of scientific rather than risk-based
norms in communicating climate change uncertainty has also made it easier for policymakers and other actors to
downplay relevant future climate risks”.
21 Pittock, AB 2006, ‘Are scientists underestimating climate change?’, EOS, vol. 87, no. 34, pp. 340-41.
22 Stern, N 2016, ‘Economics: Current climate models are grossly misleading’, Nature, vol. 530, pp. 407-409.
23 Steffen, W, Hughes, L & Pearce, A 2015, Climate Council, Sydney.
24 Dunlop, I 2016, Foreword to Spratt, D 2016, Climate Reality Check, Breakthrough, Melbourne.
25 Weaver, C, Moss, R, Ebi, K, Gleick, P, Stern, P, Tebaldi, C, Wilson, R & Arvai, J 2017, ‘Reframing climate change assessments around risk: recommendations for the US National Climate Assessment’, Environmental Research Letters, vol. 12, no. 8, 080201.
A prudent risk-management approach means a tough and objective look at the real risks to which we are exposed,
especially those high-end events whose consequences may be damaging beyond quantification, and which human civilisation
as we know it would be lucky to survive. It is important to understand the potential of, and plan for, the worst that can
happen, and be pleasantly surprised if it doesn’t. Focusing on "middle of the road" outcomes, and ignoring the high-end
possibilities, may result in an unexpected catastrophic event that we could and should have seen coming.
Integral to this approach is the issue of “fat tail” risks in which the likelihood of very large impacts is greater than we would
expect under typical statistical assumptions. A normal distribution, with the appearance of a bell curve, is symmetric in
probabilities of low outcomes (left of curve) and high outcomes (right of curve) as per Figure 1(a). But, as Prof. Michael E.
Mann explains, “global warming instead displays what we call a ‘heavy-tailed’ or ‘fat-tailed’ distribution. There is more area
under the far right extreme of the curve than we would expect for a normal distribution, a greater likelihood of warming that
is well in excess of the average amount of warming predicted by climate models”.
Figure 1: Normal probability distribution (left) and an estimate of the likelihood of warming due to a doubling of greenhouse gas concentrations, from Wagner & Weitzman, Climate Shock (right).

In Climate Shock, economists Gernot Wagner and Martin Weitzman
explore the implications of this fat-tail distribution for climate policy, and “why we face an existential threat in human-caused
climate change”. Mann explains:
Let us consider...the prospects for warming well in excess of what we might term “dangerous” (typically considered
to be at least 2°C warming of the planet). How likely, for example, are we to experience a catastrophic 6°C
warming of the globe, if we allow greenhouse gas concentrations to reach double their pre-industrial levels
(something we’re on course to do by the middle of this century given business-as-usual burning of fossil fuels)?
Well, the mean or average warming that is predicted by models in that scenario is about 3°C, and the standard
deviation about 1.5°C. So the positive tail, defined as the +2 sigma limit, is about 6°C of warming. As shown by
Wagner & Weitzman [Figure 1(b) above], the likelihood of exceeding that amount of warming isn’t 2% as we would
expect for a bell-curve distribution. It’s closer to 10%!
In fact, it’s actually even worse than that when we consider the associated risk. Risk is defined as the product of
the likelihood and consequence of an outcome. We just saw that the likelihood of warming is described by a
heavy-tailed distribution, with a higher likelihood of far-greater-than-average amounts of warming than we would
expect given typical statistical assumptions. This is further compounded by the fact that the damages caused by
climate change — i.e. the consequence — also increase dramatically with warming. That further increases the risk.
27 Mann, M 2016, ‘The ‘fat tail’ of climate change risk’, Huffington Post, 11 September 2016.
With additional warming comes the increased likelihood that we exceed certain “tipping points”, like the melting of
large parts of the Greenland and Antarctic ice sheet and the associated massive rise in sea level that would
produce… Uncertainty is not our friend when it comes to the prospects for dangerous climate change.
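Mann's arithmetic above can be checked with a few lines of code. This sketch, using only the Python standard library, compares the chance of exceeding 6°C of warming under a normal distribution (mean 3°C, standard deviation 1.5°C) with a fat-tailed alternative. The lognormal stand-in and its spread parameter (0.55) are illustrative assumptions made here, not the calibration used by Wagner and Weitzman; the point is only that a heavier right tail can lift the exceedance probability from roughly 2% toward 10%.

```python
from math import erf, log, sqrt

def normal_tail(x, mu, sigma):
    """P(X > x) for a normal distribution with mean mu and std dev sigma."""
    z = (x - mu) / sigma
    return 0.5 * (1.0 - erf(z / sqrt(2.0)))

def lognormal_tail(x, median, sigma_log):
    """P(X > x) for a lognormal with the given median and log-scale sigma."""
    z = (log(x) - log(median)) / sigma_log
    return 0.5 * (1.0 - erf(z / sqrt(2.0)))

# Mann's example: mean warming ~3 C, sd ~1.5 C, so +2 sigma = 6 C.
p_normal = normal_tail(6.0, 3.0, 1.5)    # about 0.023, i.e. roughly 2%

# A fat-tailed stand-in: lognormal with median 3 C; sigma_log = 0.55 is
# an illustrative choice of ours, giving a tail near 10%.
p_fat = lognormal_tail(6.0, 3.0, 0.55)   # roughly 0.10
```

The same 6°C threshold, moved from a thin-tailed to a fat-tailed distribution, becomes several times more likely, which is why the shape of the tail, not the mean, dominates the risk calculation.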
IPCC reports have not given attention to fat-tail risk analysis, in part because the reports are compiled using a consensus
method, as discussed above. Prof. Stefan Rahmstorf of Potsdam University says that: “The magnitude of the fat tail risks of
global warming is not widely appreciated and must be discussed more. For over two decades I have argued that the risk of
a collapse of the Atlantic meridional overturning circulation in this century is perhaps five per cent or so, but that this is far
too great a risk to take, given what is at stake. Nobody would board an aircraft with a five per cent risk of crashing.” He adds
that: "Defeatism and doomerism is not the same as an accurate, sincere and sober discussion of worst-case risks. We don’t
need the former, we do need the latter.”
It is now clear that climate change is an existential risk to human civilisation: that is, an adverse outcome that would either
annihilate intelligent life or permanently and drastically curtail its potential. Temperature rises that are now in prospect,
even after the Paris Agreement, are in the range of 3–5°C. The Paris Agreement voluntary emission reduction
commitments, if implemented, would result in the planet warming by 3°C, without taking into account “long-term”
carbon-cycle feedbacks. With a higher climate sensitivity figure of 4.5°C, for example, which would account for such
feedbacks, the Paris path would lead to around 5°C of warming, according to an MIT study. A study by Schroder
Investment Management published in June 2017 found — after taking into account indicators across a wide range of the
political, financial, energy and regulatory sectors — the average temperature increase implied across all sectors was 4.1°C.
Warming of 4°C or more could reduce the global human population by 80% or 90%, and the World Bank reports “there is
no certainty that adaptation to a 4°C world is possible”. A study by two US national security think tanks concluded that 3°C
of warming and a 0.5 metre sea-level rise would likely lead to “outright chaos”. A recent study by the European
Commission’s Joint Research Centre found that if global temperatures rise 4°C, then extreme heatwaves with “apparent
temperatures” peaking at over 55°C will begin to regularly affect many densely populated parts of the world. At 55°C or so,
much activity in the modern industrial world would have to stop. (“Apparent temperatures” refers to the Heat Index, which
quantifies the combined effect of heat and humidity to provide people with a means of avoiding dangerous conditions.)
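For readers unfamiliar with the Heat Index mentioned above, the sketch below computes an apparent temperature from air temperature and relative humidity using the Rothfusz regression employed by the US National Weather Service. The coefficients are reproduced from the standard NWS formulation and should be treated as indicative rather than authoritative; the example simply shows how humidity pushes the apparent temperature well past the air temperature itself.

```python
def heat_index_f(temp_f, rel_humidity):
    """Apparent temperature (deg F) via the NWS Rothfusz regression.

    Valid roughly for temp_f >= 80 F and rel_humidity >= 40%; outside
    that range the NWS applies simpler formulas and adjustments.
    """
    t, rh = temp_f, rel_humidity
    return (-42.379 + 2.04901523 * t + 10.14333127 * rh
            - 0.22475541 * t * rh - 6.83783e-3 * t * t
            - 5.481717e-2 * rh * rh + 1.22874e-3 * t * t * rh
            + 8.5282e-4 * t * rh * rh - 1.99e-6 * t * t * rh * rh)

def f_to_c(f):
    """Convert Fahrenheit to Celsius."""
    return (f - 32.0) * 5.0 / 9.0

# 40 C (104 F) air temperature at 60% relative humidity: the apparent
# temperature climbs far above the air temperature, into the range the
# Joint Research Centre study warns about.
hi_c = f_to_c(heat_index_f(104.0, 60.0))
```

Under these illustrative inputs the apparent temperature lands around the 55°C threshold cited in the text, which is why combined heat-and-humidity extremes, rather than dry-bulb temperature alone, define the danger zone.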
30 Rahmstorf, S, pers. comm., 8 August 2017.
31 Dunlop and Spratt 2017, op cit.
32 Reilly, J, Paltsev, S, Monier, E, Chen, H, Sokolov, A, Huang, J, Ejaz, Q, Scott, J, Morris, J & Schlosser, A 2015, Energy
, MIT Program on the Science and Policy of Global Change, Cambridge MA.
33 Schroders Investment Management 2017, Climate
Schroders Investment Management, London,
34 Anderson, K 2011, ‘Going beyond dangerous climate change: Exploring the void between rhetoric and reality in reducing carbon
emissions’, LSE presentation, 11 July 2011,
<http://www.slideshare.net/DFID/professor-kevin-anderson-climate-change-going-beyond-dangerous>; Fyall, J 2009, ‘Warming will wipe out billions’, The
, 29 November 2009,
35 World Bank 2012, Turn
, World Bank, New York.
36 Campbell, K, Gulledge, J, McNeill, JR, Podesta, J, Ogden, P, Fuerth, L, Woolsey, J, Lennon, A, Smith, J, Weitz, R & Mix, D 2007, The
, Center for Strategic and International Studies & Center for a New American Security, Washington.
37 Ayre, J 2017, ‘Extreme heatwaves with ‘apparent temperatures’ as high as 55° celsius to regularly affect much of world’,
11 August 2017,
WHAT LIES BENEATH 9
The 2007 report on climate change and national security by the US Center for Strategic and International Studies and the
Center for a New American Security recognised that: “Recent observations indicate that projections from climate models
have been too conservative; the effects of climate change are unfolding faster and more dramatically than expected” and
that “multiple lines of evidence” support the proposition that the 2007 IPCC report’s “projections of both warming and
attendant impacts are systematically biased low”. For instance:
the models used to project future warming either omit or do not account for uncertainty in potentially important
positive feedbacks that could amplify warming (e.g., release of greenhouse gases from thawing permafrost,
reduced ocean and terrestrial CO2 removal from the atmosphere), and there is some evidence that such feedbacks
may already be occurring in response to the present warming trend. Hence, climate models may underestimate the
degree of warming from a given amount of greenhouse gases emitted to the atmosphere by human activities
alone. Additionally, recent observations of climate system responses to warming (e.g., changes in global ice cover,
sea-level rise, tropical storm activity) suggest that IPCC models underestimate the responsiveness of some
aspects of the climate system to a given amount of warming.
There is a consistent pattern in the IPCC of presenting detailed, quantified (numerical) modelling results, but then briefly
noting more severe possibilities — such as feedbacks that the models do not account for — in a descriptive, non-quantified
form. Sea levels, Arctic sea ice and some carbon-cycle feedbacks are three examples. Because policymakers and the
media are often drawn to headline numbers, this approach results in less attention being given to the most devastating,
high-end, non-linear and difficult-to-quantify outcomes.
Consensus around numerical results can result in an understatement of the risks. Oppenheimer et al. point to the problem:
The emphasis on consensus in IPCC reports has put the spotlight on expected outcomes, which then become
anchored via numerical estimates in the minds of policymakers… it is now equally important that policymakers
understand the more extreme possibilities that consensus may exclude or downplay… given the anchoring that
inevitably occurs around numerical values, the basis for quantitative uncertainty estimates provided must be
broadened to give observational, paleoclimatic, or theoretical evidence of poorly understood phenomena comparable
weight with evidence from numerical modeling… One possible improvement would be for the IPCC to fully include
judgments from expert elicitations.
Glaciologist Prof. Eric Rignot says that “One of the problems of IPCC is the strong desire to rely on physical models.” He continues:
For instance, in terms of sea-level rise projection, the IPCC tends to downplay the importance of semi-empirical
models. In the case of Antarctica, it may be another ten years before fully-coupled ice sheet–ocean–sea
ice–atmosphere models get the southern hemisphere atmospheric circulation right, the Southern Ocean right, and
the ice sheet right using physical models, with the full physics, at a high spatial resolution. In the meantime, it is
essential to move forward our scientific understanding and inform the public and policy makers based on
observations, basic physics, simpler models, well before the full-fledged physical models eventually get there.
38 Campbell et al., op cit.
39 Oppenheimer, M, O’Neill, B, Webster, M & Agrawala, S 2007, ‘The Limits of Consensus’, Science,
vol. 317, pp. 1505-1506.
40 Rignot, E, pers. comm., 8 August 2017.
It is important to understand the distinction between full climate models and the semi-empirical approach, because IPCC
reports appear to privilege the former at the expense of the latter. Sea-level rise projections are a good example of this.
●Fully coupled GCMs (global climate models or general circulation models) are mathematical representations
of the Earth’s climate system, based on the laws of physics and chemistry. Run on computers, they simulate the
interactions of the important drivers of climate, including atmosphere–oceans–land surface–ice interactions, to
solve the full equations for mass and energy transfer and radiant exchange. Models are tested in the first instance
by hindcasting: how well, once loaded with the observed climate conditions (parameters) at a time in the past, do
they reproduce what has happened since that point? They are limited by the capacity of modellers to understand
the physical processes involved, so as to be able to represent them in quantitative terms. For example, ice sheet
dynamics are poorly reproduced, and therefore key processes that control the response of ice flow to a warming
climate are not included in current ice sheet models. GCMs are being improved over time, and new higher-capacity
computers allow models of finer resolution to be developed.
●A semi-empirical model is a simpler, physically plausible model of reduced complexity that exploits statistical
relationships. It combines current observations with some basic physical relationships observed from past climates,
and theoretical considerations relating variables through fundamental principles, to project future climate
conditions. For example, semi-empirical models “can provide a pragmatic alternative to estimate the sea-level
response”. Observing past rates of sea-level change from the climate record when the forcing (energy imbalance
in the system) was similar to today, gives insights into how quickly sea levels may rise in the next period. Thus a
semi-empirical approach to projecting future sea-level rise may relate the global sea-level rise to global mean
surface temperature. This approach was used by Rahmstorf in 2007 to project a 0.5–1.4 metre sea-level rise by
2100, compared with the 0.18–0.59 metre figure given by the IPCC’s 2007 report, which was based on GCMs.
Semi-empirical models rely on observations from climate history (paleoclimatology) to establish relationships between
variables. In privileging GCMs over semi-empirical models, the IPCC downplays insights from paleoclimate research.
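The semi-empirical approach described above can be sketched in a few lines. The rate coefficient of about 3.4 mm per year per °C of warming above the pre-industrial baseline is the value Rahmstorf (2007) fitted to the observational record; the century-long warming path used below is an illustrative assumption, not a scenario from the paper.

```python
# A minimal sketch of a semi-empirical sea-level model in the spirit of
# Rahmstorf (2007): the rate of rise is taken as proportional to warming above
# pre-industrial, dH/dt = a * (T(t) - T0). The constant a ~= 3.4 mm/yr per
# degree C is the paper's fitted value; the warming path is our assumption.

A_MM_PER_YR_PER_DEGC = 3.4

def sea_level_rise_mm(warming_path_degc):
    """Integrate dH/dt = a*T over a yearly series of warming above baseline."""
    return sum(A_MM_PER_YR_PER_DEGC * t for t in warming_path_degc)

# Assumed path: warming above pre-industrial grows linearly from 1.0C in 2000
# to 3.0C in 2100 (101 yearly steps).
years = range(2000, 2101)
path = [1.0 + 2.0 * (y - 2000) / 100 for y in years]
print(round(sea_level_rise_mm(path)))  # total rise over the century, in mm
```

This assumed path yields roughly 0.7 metres by 2100, within the 0.5–1.4 metre range noted above; steeper warming paths push the result toward the top of that range.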
A tipping point may be understood as the passing of a critical threshold in an Earth–climate system component — such as
major ocean and atmospheric circulation patterns, the polar ice sheets, and the terrestrial and ocean carbon stores — which
produces a step change in the system. In some cases, passing one threshold will trigger further threshold events, for
example where substantial greenhouse gas releases from permafrost carbon stores increase warming, releasing even more
permafrost carbon in a positive feedback, but also pushing other systems, such as polar ice sheets, past a threshold point.
Progress toward a tipping point is often driven by positive feedbacks, in which a change in a component leads to other
changes that eventually “feed back” onto the original component to amplify the change. A classic case in global warming is
41 Rahmstorf, S 2007, ‘A semi-empirical approach to projecting future sea-level rise’, Science, vol. 315, pp. 368-370.
the ice–albedo feedback, where decreases in the ice cover area change surface reflectivity, trapping more heat and
producing further ice loss.
In a period of rapid warming, most major tipping points once crossed are irreversible in human time frames, principally due
to the longevity of atmospheric CO2 (a thousand years). It is crucial that we understand as much as possible about
near-term tipping points for this reason.
Large-scale human interventions in slow-moving earth system tipping points might allow a tipping point to be reversed; for
example, by a large-scale atmospheric CO2 drawdown program, or solar radiation management.
The scientific literature on tipping points is relatively recent. Our knowledge is limited because a system-level understanding
of critical processes and feedbacks is still lacking in key Earth climate components, such as the polar regions, and “no
serious efforts have been made so far to identify and qualify the interactions between various tipping points”.
Climate models are not yet good at dealing with tipping points. This is partly due to the nature of tipping points, where a
particular and complex confluence of factors abruptly change a climate system characteristic and drive it to a different state.
To model this, all the contributing factors and their forces have to be well identified, as well as their particular interactions, plus
the interactions between tipping points. Researchers say that “complex, nonlinear systems typically shift between
alternative states in an abrupt, rather than a smooth manner, which is a challenge that climate models have not yet been
able to adequately meet”.
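The abrupt character of such shifts can be illustrated with a generic bistable toy model, which is not a calibrated climate model: a single variable with two stable states is pushed by a slowly increasing forcing until the lower state ceases to exist.

```python
# Toy bistable system illustrating abrupt state shifts: dx/dt = x - x**3 + f,
# with f a slowly ramped external forcing. This is a generic cusp-type model,
# not a calibrated climate model. While f < fc ~= 0.385 the system has two
# stable states; once f crosses fc, the lower state vanishes and x jumps
# abruptly to the upper branch rather than drifting smoothly.

def simulate(f_max=0.6, steps=60000, dt=0.01):
    x = -1.0                      # start in the lower stable state
    trajectory = []
    for i in range(steps):
        f = f_max * i / steps     # slow linear ramp of the forcing
        x += dt * (x - x**3 + f)  # explicit Euler step
        trajectory.append((f, x))
    return trajectory

traj = simulate()
# x tracks the lower branch for most of the ramp, then tips within a narrow
# band of forcing. Find the forcing at which x first crosses zero:
f_tip = next(f for f, x in traj if x > 0)
print(round(f_tip, 2))
```

The printed forcing sits slightly above the analytic threshold of 2/(3√3) ≈ 0.385, because a slowly ramped system lingers briefly before escaping; the jump itself, once under way, occupies only a sliver of the forcing range.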
The IPCC has made no projections regarding tipping-point thresholds, nor emphasised the importance of building robust
risk-management assessments of them in the absence of quantitative data.
The question of climate sensitivity is a vexed one. Climate sensitivity is the amount by which the global average
temperature will rise due to a doubling of the atmospheric greenhouse gas level, at equilibrium. (Equilibrium refers to the
state of a system when all the perturbations have been resolved and the system is in balance.)
IPCC reports have focused on what is often called Equilibrium Climate Sensitivity (ECS). The 2007 IPCC report gives a
best estimate of climate sensitivity of 3°C and says it "is likely to be in the range 2°C to 4.5°C". The 2014 report says: "no
best estimate for equilibrium climate sensitivity can now be given because of a lack of agreement on values across
assessed lines of evidence and studies" and only gives a range of 1.5°C to 4.5°C. This was a backward step.
What the IPCC reports fail to make clear is that the ECS measure omits key "long-term" carbon-cycle feedbacks that a
significant rise in the planet's temperature will trigger, such as the permafrost feedback and other changes in the terrestrial
carbon cycle, or a decrease in the ocean's carbon-sink efficiency.
44 Solomon, S, Plattner, GK, Knutti, R & Friedlingstein, P 2008, ’Irreversible climate change due to carbon dioxide emissions’, Proceedings
vol. 106, pp. 1704–1709.
45 Schellnhuber, J 2009, ‘Tipping elements in the Earth system’, Proceedings
vol. 106, no. 6, pp.
46 Duarte, C, Lenton, T, Wadhams, P & Wassmann, P 2012, ‘Abrupt climate change in the Arctic’, Nature, vol. 2, pp.
Climate sensitivity which includes these feedbacks — known as Earth System Sensitivity (ESS) — appears not to be
acknowledged in the 2014 IPCC reports at all. Yet there is a wide range of literature which suggests an ESS of 4–6°C.
It is conventionally considered that these "long-term" feedbacks –– such as changes in the polar carbon stores and the
polar ice sheets –– operate on millennial timescales. Yet the rate at which human activity is changing the Earth’s energy
balance is without precedent in the last 66 million years and about ten times faster than during the Paleocene–Eocene
Thermal Maximum, a period with one of the largest extinction events on record. The rate of change in energy forcing is now
so great that these “long-term” feedbacks have already begun to operate within short time frames. The IPCC is not
forthcoming on this issue. Instead it sidesteps with statements (from 2007) such as this: "Models used to date do not
include uncertainties in climate–carbon cycle feedback... because a basis in published literature is lacking... Climate–carbon
cycle coupling is expected to add CO2 to the atmosphere as the climate system warms, but the magnitude of this feedback
is uncertain". This is the type of indefinite language that politicians and the media are likely to gloss over in favour of a simple headline figure.
It should be noted that carbon budgets — the amount of carbon that could be emitted before a temperature target is
exceeded — are generally based on a climate sensitivity mid-range value around 3°C. Yet this figure may be too low.
Fasullo and Trenberth found that the climate models that most accurately capture observed relative humidity in the tropics
and subtropics and associated clouds were among those with a higher sensitivity of around 4°C. Sherwood et al. also found
a sensitivity figure of greater than 3°C. And Zhai et al. found that seven models that are consistent with the observed
seasonal variation of low-altitude marine clouds yield an ensemble-mean sensitivity of 3.9°C.
In research published in late 2016, Friedrich et al. show that climate models may be underestimating climate sensitivity
because it is not uniform across different circumstances, but in fact higher in warmer, interglacial periods (such as the
present) and lower in colder, glacial periods. Based on a study of glacial cycles and temperatures over the last 800,000
years, the authors conclude that in warmer periods climate sensitivity averages around 4.88°C. The higher figure would
mean warming for 450 parts per million of atmospheric CO2 (a figure which, on current trends, we will reach within 25 years)
would be around 3°C, rather than the 2°C bandied around in policy-making circles. Professor Michael Mann, of Penn State
University, says the paper appears "sound and the conclusions quite defensible".
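The arithmetic behind the 450 ppm figures can be checked with the standard logarithmic relationship between CO2 concentration and equilibrium warming; this is a simplification that ignores non-CO2 forcings and the lag before equilibrium is reached.

```python
# Back-of-envelope check: equilibrium warming scales with the logarithm of the
# CO2 concentration relative to pre-industrial (~280 ppm), multiplied by the
# climate sensitivity per doubling. Non-CO2 forcings and transient effects are
# ignored in this simplification.
from math import log

PREINDUSTRIAL_PPM = 280.0

def equilibrium_warming(co2_ppm: float, sensitivity_per_doubling: float) -> float:
    """Equilibrium warming in degrees C for a given CO2 level and sensitivity."""
    return sensitivity_per_doubling * log(co2_ppm / PREINDUSTRIAL_PPM) / log(2)

# At 450 ppm, the mid-range sensitivity of 3C gives about 2C of warming,
# while Friedrich et al.'s warm-state average of 4.88C gives over 3C:
print(round(equilibrium_warming(450, 3.0), 1))   # ~2.1
print(round(equilibrium_warming(450, 4.88), 1))  # ~3.3
```

The same one-liner shows why the choice of sensitivity dominates carbon-budget estimates: the concentration term is fixed by the emissions path, so the projected warming moves in direct proportion to the sensitivity assumed.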
47 The Geological Society 2013, An
Society, London, December 2013,
Final.pdf>; Hansen, J, Sato, M, Russell, G & Kharecha, P 2013, ’Climate sensitivity, sea level and atmospheric carbon dioxide’,
A, vol. 371, no. 2001, 20120294.
48 Fasullo, J & Trenberth, K 2012, ’A less cloudy future: the role of subtropical subsidence in climate sensitivity’, Science,
vol. 338, no. 6108,
pp. 792-794; Sherwood, S, Bony, S & Dufresne, JL 2014, ’Spread in model climate sensitivity traced to atmospheric convective mixing’,
vol. 505, pp. 37-42; Zhai, C, Jiang, J & Su, H 2015, ’Long-term cloud change imprinted in seasonal cloud variation: More evidence
of high climate sensitivity’, Geophysical
, vol. 42, no. 20, pp. 8729-8737.
49 Friedrich, T, Timmermann, A, Timm, OE & Ganopolski, A 2016, ‘Nonlinear climate sensitivity and its implications for future greenhouse
, vol. 2, no. 11, e1501923.
50 Johnston, I 2016, ‘Climate change may be escalating so fast it could be 'game over', scientists warn’, Independent
, 9 November 2016,
Related to the issue of climate sensitivity is the question of the stability of permafrost (frozen carbon stores on land and
under seabed). Scientists estimate that the world’s permafrost holds 1.5 trillion tons of frozen carbon, more than twice the
amount of carbon in the atmosphere. The Arctic is warming faster than anywhere else on earth, and researchers are seeing
soil temperatures climb rapidly. Some permafrost degradation is already occurring. Large-scale tundra wildfires in 2012
added to the concern, as have localised methane outbursts.
The 2007 IPCC assessment on permafrost did not venture beyond saying: "Changes in snow, ice and frozen ground have
with high confidence increased the number and size of glacial lakes, increased ground instability in mountain and other
permafrost regions and led to changes in some Arctic