Article

Evaluating the Performance of Past Climate Model Projections

Authors:
  • NASA Goddard Institute for Space Studies

Abstract and Figures

Plain Language Summary

Climate models provide an important way to understand future changes in the Earth's climate. In this paper we undertake a thorough evaluation of the performance of various climate models published between the early 1970s and the late 2000s. Specifically, we look at how well models project global warming in the years after they were published by comparing them to observed temperature changes. Model projections rely on two things to accurately match observations: accurate modeling of climate physics and accurate assumptions around future emissions of CO2 and other factors affecting the climate. The best physics‐based model will still be inaccurate if it is driven by future changes in emissions that differ from reality. To account for this, we look at how the relationship between temperature and atmospheric CO2 (and other climate drivers) differs between models and observations. We find that climate models published over the past five decades were generally quite accurate in predicting global warming in the years after publication, particularly when accounting for differences between modeled and actual changes in atmospheric CO2 and other climate drivers. This research should help resolve public confusion around the performance of past climate modeling efforts and increases our confidence that models are accurately projecting global warming.
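The forcing-adjusted comparison the summary describes can be sketched numerically. Below is a minimal Python illustration of comparing warming per unit forcing (an "implied transient climate response"); the function name, the `f_2x` default, and all trend values are assumptions for illustration, not the paper's data.

```python
import numpy as np

def implied_tcr(temps, forcings, years, f_2x=3.7):
    """Implied transient climate response: the warming trend per unit
    forcing trend, scaled by the forcing from doubled CO2 (f_2x, W/m^2)."""
    t_trend = np.polyfit(years, temps, 1)[0]      # K per year
    f_trend = np.polyfit(years, forcings, 1)[0]   # W/m^2 per year
    return f_2x * t_trend / f_trend               # K per CO2 doubling

# Illustrative series: 0.02 K/yr warming under 0.04 W/m^2/yr forcing growth
years = np.arange(1970, 2018)
obs_t = 0.02 * (years - 1970)
obs_f = 0.04 * (years - 1970)
print(round(implied_tcr(obs_t, obs_f, years), 2))  # -> 1.85
```

Comparing this quantity between a model and observations removes the effect of wrong emissions assumptions, isolating the physics.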
... Moreover, following the approach of Hargreaves (2010), Hausfather et al. (2020) calculate a skill score for each model for both temperature versus time and radiative forcing. The skill score is based on the root-mean-square errors of the model projection trend versus observations compared to a zero-change null-hypothesis projection. ...
... The results of Hausfather et al. (2020) demonstrate that even when the temperature versus time yields inaccurate results, the corresponding temperature versus radiative forcing results may still be somewhat acceptable. For example, while Nordhaus (1977) and Schneider and Thompson (1981) projected more warming than observed, their radiative forcings are not unacceptable. ...
... Although the methods used by Hausfather et al. (2020) for their evaluations may be deemed useful by climatologists, any well-trained statistician might feel less comfortable with such methods and may demand more sophisticated and probabilistically sound methods that properly quantify uncertainties. For instance, a high skill score indicates close agreement between observed and projected temperatures for the small time frame considered, but for longer time periods the differences between observed and projected temperatures may turn out to be increasing. ...
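The skill score these excerpts describe — root-mean-square error of the projection against observations, relative to a zero-change null projection — can be sketched as follows. The anomaly values are invented for illustration, and the exact normalization in Hargreaves (2010) may differ in detail.

```python
import numpy as np

def skill_score(projected, observed, baseline=None):
    """Skill relative to a reference forecast: 1 - RMSE(projection) /
    RMSE(null). A score of 1 is a perfect projection; 0 means no better
    than the zero-change null hypothesis."""
    if baseline is None:
        baseline = np.zeros_like(observed)  # zero-change null projection
    rmse = lambda a, b: np.sqrt(np.mean((np.asarray(a) - np.asarray(b)) ** 2))
    return 1.0 - rmse(projected, observed) / rmse(baseline, observed)

# Illustrative anomalies (K): a projection that tracks observations closely
obs  = [0.10, 0.18, 0.25, 0.33, 0.41]
proj = [0.12, 0.16, 0.27, 0.30, 0.43]
print(round(skill_score(proj, obs), 2))  # -> 0.92
```

The same score can be computed for temperature versus time and for temperature versus radiative forcing, which is how a model with wrong forcing assumptions can still score well on the latter.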
Thesis
Full-text available
Inverse problems, where in a broad sense the task is to learn about some unknown function from a noisy response, the function usually being represented as the argument of some known functional form, have received wide attention across the scientific disciplines. However, apart from the class of traditional inverse problems, there exists another class that qualifies as a more authentic class of inverse problems but has unfortunately not received as much attention. In a nutshell, this other class can be described as the problem of predicting the covariates corresponding to given responses and the rest of the data. Since the model is built for the responses conditional on the covariates, the inverse nature of the prediction problem is evident. Our motivating example in this regard arises in palaeoclimate reconstruction, where the model is built for the multivariate species composition conditional on climate; however, it is of interest to predict past climate given the modern species and climate data and the fossil species data. In the Bayesian context, it is natural to consider a prior for covariate prediction. In this thesis, we bring to attention such a class of inverse problems, which we refer to as ‘inverse regression problems’ to distinguish them from the traditional inverse problems, which are typically special cases of the former, as we point out. Development of the Bayesian inverse regression setup is the goal of this thesis. We particularly focus on Bayesian model adequacy tests and Bayesian model and variable selection in the inverse context, proposing new approaches and illuminating their asymptotic properties. Towards Bayesian model adequacy, we adopt and extend the inverse reference distribution approach of Bhattacharya (2013), proving its convergence properties.
Along the way, out of necessity, we develop asymptotic theories for Bayesian covariate consistency and posterior convergence theories for unknown functions modeled by suitable stochastic processes embedded in normal, double-exponential, binary and Poisson distributions, including rates of convergence and misspecification. In the realm of inverse model and variable selection, we first develop an asymptotic theory for Bayes factors in the general setup, and then introduce pseudo-Bayes factors for model selection, showing that the asymptotic properties of the two approaches agree, while the latter is more useful from several theoretical and computational perspectives. Along with the inverse regression setup we also develop the forward regression context, where the aim is to predict new responses given known covariate values, and illustrate the suitability, differences and advantages of the approaches with various theoretical examples and simulation experiments. We further propose and develop a novel Bayesian multiple testing procedure for model and variable selection in the inverse regression setup, also exploring its elegant asymptotic properties. Our simulation studies demonstrate that this approach outperforms Bayes and pseudo-Bayes factors with respect to inverse model and variable selection. As an interesting application encompassing most of our developments, we attempt to evaluate whether the future world is likely to experience the alarming global warming projections that have perturbed scientists and policymakers the world over. Showing that the question falls within the purview of inverse regression problems, we propose a novel nonparametric model for climate dynamics based on Gaussian processes and exploit our inverse regression methodologies to conclude that there is no real threat to the world as far as global warming is concerned.
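The inverse-prediction setup the thesis describes can be illustrated with a toy Bayesian calculation: the model is built for the response given the covariate, and a prior plus grid evaluation yields the covariate posterior for a new response. Everything below (the linear forward model, the flat prior, all values) is a hypothetical sketch, not the thesis's methodology.

```python
import numpy as np

# Forward model: y = a + b*x + Gaussian noise (all values hypothetical).
a, b, sigma = 2.0, 0.5, 0.3

x_grid = np.linspace(-10, 10, 2001)          # grid over candidate covariates
prior = np.ones_like(x_grid)                 # flat prior on x
prior /= prior.sum()

y_new = 4.0                                  # newly observed response
# Likelihood of y_new under the forward model at each candidate x
log_lik = -0.5 * ((y_new - (a + b * x_grid)) / sigma) ** 2
post = prior * np.exp(log_lik - log_lik.max())
post /= post.sum()

x_mean = np.sum(x_grid * post)               # posterior mean of the covariate
print(round(x_mean, 2))                      # close to (4.0 - 2.0) / 0.5 = 4.0
```

The "inverse" character is visible in that the likelihood is evaluated as a function of the covariate while the response is held fixed.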
... 6 Note that Katzav gives only a partial picture of how climate models are evidentially supported. One could construe his account as a sort of strawman of climate model evaluation by surveying, e.g., the predictive successes of climate models (Hausfather et al. 2020), support for climate models in paleoclimate assessments (Schmidt et al. 2012; Weart 2020), and process-level assessments of climate models (Lloyd, Bukovsky, and Mearns 2021). ...
... This background knowledge includes not only facts about model components such as aerosols' chemistry representations, but also about past model performance. The importance of expert background knowledge in understanding climate model evaluations has been noted elsewhere in the philosophical literature (e.g., Winsberg (2018a); Jebeile and Crucifix (2020)) and is also evident in some of the other examples of model error diagnosis discussed above. The fact that Pitari et al. evaluate only four models, in contrast to the 30+ that were evaluated in AMIP or the 100+ models that may be evaluated in the ongoing sixth phase of the Coupled Model Intercomparison Project (Hausfather et al. 2020), may be one key reason why it was possible to give such an in-depth look at the features and performance capabilities of each model. While this is outside the scope of my argument here, I should note that studies like Pitari et al. (2014), in which a small number of models are compared to one another at a high level of detail, may overturn some of the conclusions reached by Touzé-Peiffer et al. (2020) in their analysis of how CMIP influences climate science research. ...
Thesis
Full-text available
I give a philosophical analysis of model intercomparison practices in climate modeling. Model intercomparisons serve as part of the basis for regular scientific assessments such as the Intergovernmental Panel on Climate Change reports. On my analysis, climate modeling should be viewed as part of a collaborative research endeavor in which model similarities matter far more than model differences. I show both that climate model intercomparisons are scientifically fruitful (along multiple dimensions) and that many philosophical critiques of climate models are deeply misguided. After providing both a general overview of the dissertation and a history of climate model intercomparisons, I offer a justification for the scientific significance of model agreement. The gist is that when vastly complex models from different modeling institutions agree on some result, this agreement must be understood in conjunction with the fact that the models are all part of the same model family or model-type, and that the models themselves, their components, and their outputs all have both empirical and theoretical sources of evidential support. Then, I give a descriptive account of how scientists diagnose model errors, challenges facing error diagnosis, and prospects for more proactively diagnosing model errors in future model intercomparisons. Finally, I examine the senses in which climate models are competing with one another. I argue that competition between models yields scientific benefits in multiple directions: models found to be better at a given task can provide more reliable climate projections (for that task); moreover, models which are found to be worse can indirectly help, via constraining estimates of different climate variables and by providing data with which to test other climate models. 
Ultimately, I show how model intercomparison projects are both richly informative and ripe for philosophical analysis, and that skeptical claims philosophers have made about climate models must be thoroughly revised or else given up.
... The scientific community has long reached a consensus on how greenhouse gases affect the climate system (Cook et al. 2013). As early as five decades ago, on the basis of carbon dioxide (CO2) emissions and first physical principles that regulate Earth's energy budget, climate models accurately predicted the level of global warming we see today (Hausfather et al. 2019). In contrast, it is more difficult to ...
Article
Full-text available
The unfolding climate crisis is in many respects a human issue, one caused by anthropogenic emissions of CO2 to the atmosphere, and that can only be solved through a concerted effort across all sectors of society. In this prospective synthesis, I explain how expanding the scope of biogeochemical research would lead to a more rigorous and impactful climate change mitigation and adaptation agenda. Focusing on biogeochemistry as an area of interdisciplinary convergence, I review theories and empirical studies in the environmental and social sciences, to distill five principles and three phases of implementation for sustainable carbon capture projects. I discuss how land conservation, management, and restoration might be coordinated to prepare for climate change and to achieve multiple social and ecological benefits, including enhanced carbon drawdown and permanence on land. On the conservation front, the abundance of threatened plant and animal species spatially correlates with the distribution of carbon- and water-rich habitats within and across key regions, which can be prioritized for biodiversity protection with major climatic benefits. On the management front, long-term records of socioecological change warrant a revision of current models for sustainable forestry and agriculture in favor of decentralized system-specific prescriptions, including interventions where organic or inorganic carbon capture may improve wood and food production under future climate scenarios. On the restoration front, experiments in degraded landscapes reveal mechanisms of carbon stabilization, such as iron-coordination of organic complexes, which amplify the benefits of ecological succession and lead to carbon accumulation beyond thresholds predicted for undisturbed ecosystems. These examples illustrate the potential for innovation at the intersection of basic and applied biogeoscience, which could accelerate atmospheric carbon capture through new discoveries and collective action.
... Future model projections make a vital contribution to the physical science basis for coping with climate change impacts and risks (IPCC 2014, 2021, 2022a; Hausfather et al. 2020). Projected changes in mean climate and extreme events have been widely addressed for the near term, mid-term, and long term and under different global warming targets using model simulations, particularly from the continuing Coupled Model Intercomparison Project (CMIP) (Meehl et al. 2000; Taylor et al. 2012; Eyring et al. 2016; Watts et al. 2019). ...
Article
Full-text available
Future climate projections provide vital information for preventing and reducing disaster risks induced by global warming. However, little attention has been paid to climate change projections oriented towards carbon neutrality. In this study, we address projected changes in daily maximum (Tmax) and minimum (Tmin) temperatures as well as diurnal temperature range (DTR) over East Asia for the carbon neutrality period of 2050–2060 under the newly available SSP1-1.9 pathway of sustainable development, using CMIP6 model simulations. CMIP6 multi-model ensemble results show that Tmax and Tmin will significantly increase, with varying magnitudes, during the carbon neutrality period of 2050–2060 under SSP1-1.9 over the whole of East Asia, while the DTR will change both upward and downward. Projected Tmax, Tmin, and DTR changes all exhibit new spatial patterns during 2050–2060 under SSP1-1.9 compared with those over the same period under SSP2-4.5 and SSP5-8.5. Compared to 1995–2014, projected Tmax and Tmin averaged over East Asia during 2050–2060 will warm significantly, by 1.43 °C and 1.40 °C under SSP1-1.9, while the warming magnitudes are 1.93 °C and 2.04 °C under SSP2-4.5, and 2.67 °C and 2.85 °C under SSP5-8.5. Research on carbon neutrality-oriented climate change projections needs to be strengthened for jointly achieving a net-zero future.
... This presents a unique opportunity to address the fundamental question of whether the surface temperature changes projected by CMIPs are consistent with those observed in the last two decades. Although older future climate projections from the IPCC 1st, 2nd and 3rd Assessment Reports were evaluated against observations in past studies (e.g., [19][20][21][22]), more recent studies have compared newer CMIPs with observations only for the past (historical) period and not for the future projections (e.g., [23][24][25][26][27][28][29][30][31]). Lewandowsky et al. [32] analyzed a well-known alleged divergence between model projections and observations (the warming pause or hiatus) by comparing CMIP5 RCP8.5 projected temperatures with observations up to 2016, and showed that there is no robust statistical evidence for the so-called divergence between projected temperatures and observations. ...
Article
Full-text available
Despite the dire conclusions of the Intergovernmental Panel on Climate Change (IPCC) Assessment Reports in terms of global warming and its impacts on Earth’s climate, ecosystems and human society, a skepticism claiming that the projected global warming is alarmist or, at least, overestimated still persists. Given the years that have passed since the future climate projections that served as the basis for the IPCC 4th, 5th and 6th Assessment Reports were released, it is now possible to answer the fundamental question of whether the projected global warming has been over- or underestimated. This study presents a comparison between CMIP3, CMIP5 and CMIP6 future temperature projections and observations. The results show that all the CMIPs and future climate scenarios analyzed here project a global warming slightly lower than the observed one. The observed warming is closer to the upper level of the projected ones, revealing that the CMIP future climate scenarios with higher GHG emissions appear to be the most realistic. These results show that CMIP future warming projections have been slightly conservative up to 2020, which could suggest a similar cold bias in their warming projections up to the end of the current century. However, given the short future periods analyzed here, inferences about warming at longer timescales cannot be made with confidence, since the models’ internal variability can play a relevant role on timescales of 20 years and less.
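In its simplest form, the projection-versus-observation comparison this study performs reduces to comparing linear warming trends over the overlap period. The sketch below uses invented anomaly series, not CMIP or observational data.

```python
import numpy as np

def decadal_trend(years, temps):
    """Least-squares linear trend in K per decade."""
    return 10.0 * np.polyfit(years, temps, 1)[0]

# Hypothetical annual global-mean anomalies (K), for illustration only
years = np.arange(2005, 2021)
projected = 0.020 * (years - 2005) + 0.60   # model: 0.20 K/decade
observed  = 0.023 * (years - 2005) + 0.58   # obs:   0.23 K/decade

bias = decadal_trend(years, projected) - decadal_trend(years, observed)
print(round(bias, 3))  # negative -> projection warms more slowly than observed
```

A negative trend bias, as in this toy case, corresponds to the "slightly conservative" projections the abstract reports.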
... Instead, in generating and publishing data, our climate modeller has several objectives. First, while being interested in how their model performs relative to observations and other models (e.g., [87] and [88]), they likely care more about the model's performance in purely meteorological-process terms than about energy-relevant surface climate variables (e.g., they may focus on wind speed at 10 m, as at this level there are surface observations to validate models against, rather than estimating wind-power capacity factors at turbine hub-height). Second, though some surface climate variables (or "Essential Climate Variables") are extensively evaluated in climate models, that evaluation tends to focus on static statistical properties (e.g., mean, variance, and probability distributions) of meteorological variables in isolation rather than on time series of, and co-variability between, energy-system-relevant variables (e.g., in a kalte Dunkelflaute, a cold, dark, windless spell [89,90]). ...
Preprint
Full-text available
Energy system models underpin decisions by energy system planners and operators. Energy system modelling faces a transformation: accounting for changing meteorological conditions imposed by climate change. To enable that transformation, a community of practice in energy-climate modelling has started to form that aims to better integrate energy system models with weather and climate models. Here, we evaluate the disconnects between the energy system and climate modelling communities, then lay out a research agenda to bridge those disconnects. In the near-term, we propose interdisciplinary activities for expediting uptake of future climate data in energy system modelling. In the long-term, we propose a transdisciplinary approach to enable development of (1) energy-system-tailored climate datasets for historical and future meteorological conditions and (2) energy system models that can effectively leverage those datasets. This agenda increases the odds of meeting ambitious climate mitigation goals by systematically capturing and mitigating climate risk in energy sector decision making.
Thesis
Since the industrial revolution, humans have increased the greenhouse gas concentration in the atmosphere by burning fossil fuels. The resulting global warming has far-reaching consequences for the climate, among them more frequent and more intense weather extremes. Because of their serious impacts on society, it is of general interest to understand how anthropogenic climate change influences these weather extremes. In this cumulative dissertation I first analyze two complex weather events that affect food production in Europe: frost days after the onset of apple bloom, and wet early-summer periods following warm winters. In a third study I investigate how dynamical climate changes in the Northern Hemisphere midlatitudes contribute to more persistent summer weather. Finally, I turn to tropical storms in the North Atlantic and how they are influenced by global warming. A central methodological challenge in this field of research is that weather extremes are by definition rare and that, owing to strong internal climate variability, it is difficult to quantify the changes that are attributable to anthropogenic climate change. In this work I pursue two contrasting approaches to this challenge: (1) I use large ensembles of climate simulations to smooth out the effect of internal climate variability and thereby identify the forced changes in apple-bloom frost and in persistence. (2) Using observation-based methods, I quantify the influence of internal climate variability on tropical cyclones in order to assess to what extent the observed increase in Atlantic tropical cyclone activity can be attributed to internal climate variability or to forced changes.
Preprint
Full-text available
Here we briefly reflect on the philosophical foundations that ground the quest towards ever-detailed models and identify four practical dangers derived from this pursuit: explosion of the model's uncertainty space, model black-boxing, computational exhaustion and model attachment. We argue that the growth of a mathematical model should be carefully and continuously pondered lest models become extraneous constructs chasing the Cartesian dream.
Chapter
The phenomenon of climate change was already known in the 19th century, when the greenhouse effect and environmental changes in Earth's history were discovered. The first climate measurements also date back that far. Standardized measurements of atmospheric CO2 concentrations have existed since the mid-20th century, and satellite measurements of weather data since 1979. Around the same time, the first climate models were developed. The resulting recognition of a global warming probably caused by humans led to the founding of the Intergovernmental Panel on Climate Change (IPCC).
Article
Concealed deep beneath the oceans is a carbon conveyor belt, propelled by plate tectonics. Our understanding of its modern functioning is underpinned by direct observations, but its variability through time has been poorly quantified. Here we reconstruct oceanic plate carbon reservoirs and track the fate of subducted carbon using thermodynamic modelling. In the Mesozoic era, 250 to 66 million years ago, plate tectonic processes had a pivotal role in driving climate change. Triassic–Jurassic period cooling correlates with a reduction in solid Earth outgassing, whereas Cretaceous period greenhouse conditions can be linked to a doubling in outgassing, driven by high-speed plate tectonics. The associated ‘carbon subduction superflux’ into the subcontinental mantle may have sparked North American diamond formation. In the Cenozoic era, continental collisions slowed seafloor spreading, reducing tectonically driven outgassing, while deep-sea carbonate sediments emerged as the Earth’s largest carbon sink. Subduction and devolatilization of this reservoir beneath volcanic arcs led to a Cenozoic increase in carbon outgassing, surpassing mid-ocean ridges as the dominant source of carbon emissions 20 million years ago. An increase in solid Earth carbon emissions during Cenozoic cooling requires an increase in continental silicate weathering flux to draw down atmospheric carbon dioxide, challenging previous views and providing boundary conditions for future carbon cycle models.
Article
Full-text available
Earthquake ruptures dynamically activate coseismic off‐fault damage around fault cores. Systematic field observation efforts have shown the distribution of off‐fault damage around main faults, while numerical modeling using elastic‐plastic off‐fault material models has demonstrated the evolution of coseismic off‐fault damage during earthquake ruptures. Laboratory‐scale microearthquake experiments have pointed out the enhanced high‐frequency radiation due to the coseismic off‐fault damage. However, the detailed off‐fault fracturing mechanisms, subsequent radiation, and its contribution to the overall energy budget remain to be fully understood because of limitations of current observational techniques and model formulations. Here, we constructed a new physics‐based dynamic earthquake rupture modeling framework, based on the combined finite‐discrete element method, to investigate the fundamental mechanisms of coseismic off‐fault damage and its effect on the rupture dynamics, the radiation and the overall energy budget. We conducted a 2‐D systematic case study with depth and showed the mechanisms of dynamic activation of the coseismic off‐fault damage. We found a decrease in rupture velocity and enhanced high‐frequency radiation in the near field due to the coseismic off‐fault damage. We then evaluated the overall energy budget, which shows a significant contribution of the coseismic off‐fault damage even at depth, where the damage zone width becomes narrower. The present numerical framework for dynamic earthquake rupture modeling thus provides new insights into earthquake rupture dynamics with coseismic off‐fault damage.
Article
Full-text available
The Community Earth System Model Version 2 (CESM2) has an equilibrium climate sensitivity (ECS) of 5.3 K. ECS is an emergent property of both climate feedbacks and aerosol forcing. The increase in ECS over the previous version (CESM1) is the result of cloud feedbacks. Interim versions of CESM2 had a land model that damped ECS. Part of the ECS change results from evolving the model configuration to reproduce the long‐term trend of global and regional surface temperature over the twentieth century in response to climate forcings. Changes made to reduce sensitivity to aerosols also impacted cloud feedbacks, which significantly influence ECS. CESM2 simulations compare very well to observations of present climate. It is critical to understand whether the high ECS, outside the best estimate range of 1.5–4.5 K, is plausible.
Article
Full-text available
We outline a new and improved uncertainty analysis for the Goddard Institute for Space Studies Surface Temperature product version 4 (GISTEMP v4). Historical spatial variations in surface temperature anomalies are derived from historical weather station data and ocean data from ships, buoys, and other sensors. Uncertainties arise from measurement uncertainty, changes in spatial coverage of the station record, and systematic biases due to technology shifts and land cover changes. Previously published uncertainty estimates for GISTEMP included only the effect of incomplete station coverage. Here, we update this term using currently available spatial distributions of source data, state‐of‐the‐art reanalyses, and incorporate independently derived estimates for ocean data processing, station homogenization, and other structural biases. The resulting 95% uncertainties are near 0.05 °C in the global annual mean for the last 50 years and increase going back further in time reaching 0.15 °C in 1880. In addition, we quantify the benefits and inherent uncertainty due to the GISTEMP interpolation and averaging method. We use the total uncertainties to estimate the probability for each record year in the GISTEMP to actually be the true record year (to that date) and conclude with 87% likelihood that 2016 was indeed the hottest year of the instrumental period (so far).
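The record-year probability described above can be approximated by Monte Carlo: perturb each annual anomaly by its uncertainty and count how often the nominal record year stays the warmest. The sketch below assumes independent Gaussian uncertainties; the anomalies and the uncertainty value are illustrative, not GISTEMP data.

```python
import numpy as np

rng = np.random.default_rng(42)
# Hypothetical annual anomalies (K) for a handful of candidate record years
anoms = np.array([0.87, 0.92, 1.01, 0.85, 0.90])
sigma = 0.025   # 1-sigma uncertainty, i.e. ~0.05 K at the 95% level

# Draw perturbed realizations of every year and see which one comes out on top
draws = rng.normal(anoms, sigma, size=(100_000, anoms.size))
p_record = np.mean(draws.argmax(axis=1) == anoms.argmax())
print(round(p_record, 2))
```

With a clear margin over the runner-up, the nominal record year wins in nearly all realizations; shrinking the margin toward the uncertainty level pulls the probability toward values like the 87% reported for 2016.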
Article
Full-text available
Plain Language Summary Tsunami hazard assessment is routinely based on assessing the impacts of long‐period waves generated by vertical seafloor motions reaching the coast tens of minutes after the earthquake in typical subduction‐zone environments. This view is inadequate for assessing hazard associated with strike‐slip earthquakes such as the magnitude 7.5 2018 Palu earthquake, which resulted in tsunami effects much larger than would normally be associated with horizontal fault motion. From an extraordinary collection of 38 amateur and closed circuit television videos we estimated tsunami arrival times, amplitudes, and wave periods at different locations around Palu Bay, where the most damaging waves were reported. We found that the Palu tsunamis devastated widely separated coastal areas within a few minutes after the mainshock and included unusually short wave periods, which cannot be explained by the earthquake fault slip alone. Post‐tsunami surveys show changes in the coastline, and this combined with video footage provides potential locations of submarine landslides as tsunami sources that would match the arrival times of the waves. Our results emphasize the importance of estimating tsunami hazards along coastlines bordering strike‐slip fault systems and have broad implications for considering shorter‐period nearly instantaneous tsunamis in hazard mitigation and tsunami early warning systems.
Article
Full-text available
Global warming in response to external radiative forcing is determined by the feedback of the climate system. Recent studies have suggested that simple mathematical models incorporating a radiative response related to upper- and deep-ocean disequilibrium (ocean heat uptake efficacy), inhomogeneous patterns of surface warming and radiative feedbacks (pattern effect), or an explicit dependence of the strength of radiative feedbacks on surface temperature change (feedback temperature dependence) may explain the climate response in atmosphere-ocean coupled general circulation models (AOGCMs) or can be useful for interpreting the instrumental record. We analyze a two-layer model with an ocean heat transport efficacy; a two-region model with region-specific heat capacities and radiative responses; a one-layer model with a temperature-dependent feedback; and a model which combines elements of the two-layer/region models and the state-dependent feedback parameter. We show that, from the perspective of the globally averaged surface temperature and radiative imbalance, the two-region and two-layer models are equivalent. State-dependence of the feedback parameter introduces a nonlinearity in the system which makes the adjustment timescales forcing-dependent. Neither the linear two-region/layer models nor the state-dependent feedback model adequately describes the behavior of complex climate models. The model which combines elements of both can adequately describe the response of more comprehensive models but may require more experimental input than is available from single forcing realizations.
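The basic two-layer energy balance model discussed above can be illustrated with a short numerical integration under an abrupt forcing. This sketch omits the efficacy and state-dependence refinements the paper analyzes, and the parameter values are illustrative, not fitted to any AOGCM.

```python
# Two-layer energy balance model: an upper (mixed-layer) and a deep-ocean
# temperature, coupled by a heat exchange term. All parameters illustrative.
C_u, C_d = 8.0, 100.0   # heat capacities (W yr m^-2 K^-1)
lam      = 1.3          # feedback parameter (W m^-2 K^-1)
gamma    = 0.7          # deep-ocean heat exchange coefficient (W m^-2 K^-1)
F        = 3.7          # abrupt forcing, roughly 2xCO2 (W m^-2)

dt, n_years = 0.1, 500
T_u = T_d = 0.0
for _ in range(int(n_years / dt)):
    # Compute both tendencies before updating either temperature
    dT_u = (F - lam * T_u - gamma * (T_u - T_d)) / C_u
    dT_d = gamma * (T_u - T_d) / C_d
    T_u += dt * dT_u
    T_d += dt * dT_d

print(round(T_u, 2))  # slowly approaches the equilibrium F / lam ~ 2.85 K
```

The fast adjustment of the upper layer followed by the slow, deep-ocean-paced approach to equilibrium is exactly the two-timescale behavior that makes such models useful for interpreting AOGCM output.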
Article
Full-text available
A new release of the Max Planck Institute for Meteorology Earth System Model version 1.2 (MPI-ESM1.2) is presented. The development focused on correcting errors in and improving the physical processes representation, as well as improving the computational performance, versatility, and overall user friendliness. In addition to new radiation and aerosol parameterizations of the atmosphere, several relatively large, but partly compensating, coding errors in the model's cloud, convection, and turbulence parameterizations were corrected. The representation of land processes was refined by introducing a multilayer soil hydrology scheme, extending the land biogeochemistry to include the nitrogen cycle, replacing the soil and litter decomposition model and improving the representation of wildfires. The ocean biogeochemistry now represents cyanobacteria prognostically in order to capture the response of nitrogen fixation to changing climate conditions and further includes improved detritus settling and numerous other refinements. As something new, in addition to limiting drift and minimizing certain biases, the instrumental record warming was explicitly taken into account during the tuning process. To this end, a very high climate sensitivity of around 7 K caused by low-level clouds in the tropics as found in an intermediate model version was addressed, as it was not deemed possible to match observed warming otherwise. As a result, the model has a climate sensitivity to a doubling of CO2 over preindustrial conditions of 2.77 K, maintaining the previously identified highly nonlinear global mean response to increasing CO2 forcing, which nonetheless can be represented by a simple two-layer model.
Article
Full-text available
Estimating the equilibrium climate sensitivity (ECS; the equilibrium warming in response to a doubling of CO2) from observations is one of the big problems in climate science. Using observations of interannual climate variations covering the period 2000 to 2017 and a model-derived relationship between interannual variations and forced climate change, we estimate that ECS is likely 2.4–4.6 K (17–83% confidence interval), with a mode and median value of 2.9 and 3.3 K, respectively. This analysis provides no support for low values of ECS (below 2 K) suggested by other analyses. The main uncertainty in our estimate is not observational uncertainty but rather uncertainty in converting observations of short term, mainly unforced climate variability to an estimate of the response of the climate system to long-term forced warming.
Article
Full-text available
Model calibration (or tuning) is a necessary part of developing and testing coupled ocean–atmosphere climate models regardless of their main scientific purpose. There is an increasing recognition that this process needs to become more transparent for both users of climate model output and other developers. Knowing how and why climate models are tuned and which targets are used is essential to avoiding possible misattributions of skillful predictions to data accommodation and vice versa. This paper describes the approach and practice of model tuning for the six major US climate modeling centers. While details differ among groups in terms of scientific missions, tuning targets, and tunable parameters, there is a core commonality of approaches. However, practices differ significantly on some key aspects, in particular, in the use of initialized forecast analyses as a tool, the explicit use of the historical transient record, and the use of the present-day radiative imbalance vs. the implied balance in the preindustrial era as a target.
Article
Earth system models are complex and represent a large number of processes, resulting in a persistent spread across climate projections for a given future scenario. Owing to different model performances against observations and the lack of independence among models, there is now evidence that giving equal weight to each available model projection is suboptimal. This Perspective discusses newly developed tools that facilitate a more rapid and comprehensive evaluation of model simulations with observations, process-based emergent constraints that are a promising way to focus evaluation on the observations most relevant to climate projections, and advanced methods for model weighting. These approaches are needed to distil the most credible information on regional climate changes, impacts, and risks for stakeholders and policy-makers.
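One common form of the model weighting discussed above down-weights models exponentially in their distance from observations. The sketch below assumes a Gaussian weighting kernel with a shape parameter `sigma_d`; the kernel, the distances, and the projections are invented for illustration and are not any published scheme's values.

```python
import numpy as np

def performance_weights(distances, sigma_d=0.5):
    """Performance-based weights: models far from observations (large
    distance D_i) get exponentially smaller weight, normalized to sum to 1."""
    w = np.exp(-(np.asarray(distances) / sigma_d) ** 2)
    return w / w.sum()

d = [0.1, 0.4, 0.9]   # model-observation distances in some error metric
w = performance_weights(d)
print(np.round(w, 2))  # best-performing model dominates

# A weighted projection then averages each model's change with these weights
proj = np.array([2.1, 2.8, 3.9])   # hypothetical warming projections (K)
print(round(float(w @ proj), 2))
```

Schemes used in practice additionally penalize model interdependence, so that near-duplicate models do not count twice; that term is omitted here for brevity.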