Article

Satellite bulk tropospheric temperatures as a metric for climate sensitivity

Authors: John R. Christy, Richard T. McNider

Abstract

We identify and remove the main natural perturbations (e.g. volcanic activity, ENSOs) from the global mean lower tropospheric temperatures (TLT) over January 1979–June 2017 to estimate the underlying, potentially human-forced trend. The unaltered value is +0.155 K dec⁻¹ while the adjusted trend is +0.096 K dec⁻¹, related primarily to the removal of volcanic cooling in the early part of the record. This is essentially the same value we determined in 1994 (+0.09 K dec⁻¹, Christy and McNider, 1994) using only 15 years of data. If the warming rate of +0.096 K dec⁻¹ represents the net TLT response to increasing greenhouse radiative forcings, this implies that the TLT tropospheric transient climate response (ΔTLT at the time CO2 doubles) is +1.10 ± 0.26 K, which is about half of the average of the IPCC AR5 climate models of 2.31 ± 0.20 K. Assuming that the net remaining unknown internal and external natural forcing over this period is near zero, the mismatch since 1979 between observations and CMIP-5 model values suggests that excessive sensitivity to enhanced radiative forcing in the models can be appreciable. The tropical region is mainly responsible for this discrepancy, suggesting that the processes which are the likely sources of the extra sensitivity are (a) the parameterized hydrology of the deep atmosphere, (b) the parameterized heat-partitioning at the ocean-atmosphere interface, and/or (c) unknown natural variations.
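
As a rough, back-of-envelope illustration of how a decadal trend maps onto a transient climate response, the sketch below scales the adjusted TLT trend quoted above by an assumed rate of greenhouse-gas forcing increase; the forcing-trend value (0.32 W m⁻² per decade) and the variable names are illustrative assumptions, not numbers taken from the paper's own regression.

```python
# Hypothetical back-of-envelope check (not the paper's exact method): scale the
# adjusted TLT trend by the ratio of the CO2-doubling forcing to an assumed
# rate of increase of greenhouse forcing over the satellite era.

F_2XCO2 = 3.71          # W m-2, canonical forcing for doubled CO2
tlt_trend = 0.096       # K per decade, adjusted TLT trend from the abstract
forcing_trend = 0.32    # W m-2 per decade, illustrative assumption

tcr_tlt = tlt_trend / forcing_trend * F_2XCO2
print(f"Implied tropospheric transient response ~ {tcr_tlt:.2f} K")   # ~1.1 K
```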


... As we did with Christy and McNider (2017) and Lewis and Curry (2018) in D20, we sampled Lewis (2022) using the technique of inverse transform sampling. We ran the FUND model for 10,000 Monte Carlo simulations under the five separate economic growth scenarios assumed by the U.S. Interagency Working Group (US Interagency Working Group 2013). ...
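
The excerpt above mentions inverse transform sampling of a published climate-sensitivity distribution for the Monte Carlo runs; the sketch below shows the generic technique (tabulated density to CDF to uniform draws), with a toy Gaussian density standing in for the actual Lewis (2022) distribution, which is not reproduced here.

```python
import numpy as np

def inverse_transform_sample(x_grid, pdf, n, rng=None):
    """Draw n samples from a tabulated density via inverse transform sampling."""
    rng = np.random.default_rng() if rng is None else rng
    cdf = np.cumsum(pdf)
    cdf /= cdf[-1]                       # normalize to a proper CDF
    u = rng.uniform(size=n)              # uniform draws on [0, 1)
    return np.interp(u, cdf, x_grid)     # invert the CDF by interpolation

# Toy ECS density centered near 1.5 K -- purely illustrative, not a published curve
x = np.linspace(0.1, 6.0, 600)
pdf = np.exp(-0.5 * ((x - 1.5) / 0.4) ** 2)
ecs_draws = inverse_transform_sample(x, pdf, 10_000)
```
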
... (2022) ECS distribution to be an appropriate choice, recommends not adjusting the CO2 fertilization parameter, and recommends use of a 5% discount rate as being in the middle of the range in guidance of the US Office of Management and Budget, and also close to that used in DICE based on empirical considerations. Figure 2 shows the 2050 SCC best estimates using a 5% discount rate, no CO2 fertilization adjustment, and ECS distributions from, respectively, Roe and Baker (2007), Lewis (2022), Lewis and Curry (2018), and Christy and McNider (2017). The Lewis and Curry (2018) estimate is included both for comparison and also because it is close to the ECS distribution obtained by Lewis (2022) using only post-1869 temperature data. ...
... As of 2050 under M22's preferred configuration, the SCC is $3.39 and has a 33.4 percent chance of being negative. Using the Christy and McNider (2017) ECS estimate, the mean SCC value is slightly negative. Either way, these values imply that the SCC is close enough to zero through 2050 as to make it difficult to justify even moderate climate mitigation policies, much less those needed for Paris compliance or achievement of Net Zero by 2050. ...
Article
Full-text available
Meyer (Environ Econ Policy Stud, 2022) questions a number of assumptions behind the social cost of carbon (SCC) calculations in Dayaratna et al. (Environ Econ Policy Stud 22:433–448, 2020), especially the CO2 fertilization benefit and the climate sensitivity estimate. He recommends against increasing the CO2 effect and suggests applying a recent climate sensitivity estimate in Lewis, Clim Dyn (2022), but did not calculate the resulting SCC distribution. Herein we critically assess his recommendations and compute the SCC distribution they imply. It has a median SCC value in 2050 of $3.39 and implies a 33.4 percent probability of the optimal carbon tax being negative. While a bit higher than the results in Dayaratna et al. (Environ Econ Policy Stud 22:433–448, 2020), they are not materially different for the purposes of setting optimal climate policy.
... Christy and McNider (2017) used satellite bulk atmospheric temperature data from 1979-2016 and estimated a TCR of 1.1 ± 0.26 °C which is similar to the Lewis and Curry (2018) estimate of 1.2 °C (5-95% 0.9-1.7 °C). Using the estimated ECS/TCR ratio of 1.3 in Lewis and Curry (2018) implies a corresponding ECS mode of 1.4 °C in Christy and McNider (2017). ...
... It is also noteworthy that once the ECS distribution is changed to that in Lewis and Curry (2018), the SCC estimates are, whether positive or negative, very small. As a public policy matter, after downscaling these estimates by the marginal cost of public funds (Sandmo 1975), the model's implication would be that the optimal ... [table legend: Lewis and Curry (2018) ECS distribution (LC18); LC18 with a 15% increase in the CO2 fertilization parameter (LC18 + 15%); LC18 with a 30% increase in the CO2 fertilization parameter (LC18 + 30%)] ... In Table 3 we present the results for the 3% discount rate case using the ECS estimate derived from Christy and McNider (2017). It is notable that this ECS estimate is based on a different temperature data set than Lewis and Curry (2018), and focuses only on the last 40 years and on the lower troposphere, where models project a somewhat stronger warming response than at the surface. ...
... Specifically, the IPCC's preferred estimate of aerosol forcing (cooling) has declined over time, which leads to a lower preferred ECS estimate in empirical energy balance models. The methodology of Christy and McNider (2017) provides an independent and model-free check on this approach. Also, while climate models with high ECS values can be made to fit the surface warming trend, they have shown demonstrably excess warming elsewhere, especially in the troposphere over the tropics (Fu et al. 2011; McKitrick and Christy 2018). ...
Article
Full-text available
We explore the implications of recent empirical findings about CO2 fertilization and climate sensitivity on the social cost of carbon (SCC) in the FUND model. New compilations of satellite and experimental evidence suggest larger agricultural productivity gains due to CO2 growth are being experienced than are reflected in FUND parameterization. We also discuss recent studies applying empirical constraints to the probability distribution of equilibrium climate sensitivity and we argue that previous Monte Carlo analyses in IAMs have not adequately reflected the findings of this literature. Updating the distributions of these parameters under varying discount rates is influential on SCC estimates. The lower bound of the social cost of carbon is likely negative and the upper bound is much lower than previously claimed, at least through the mid-twenty-first century. Also the choice of discount rate becomes much less important under the updated parameter distributions.
... Source: (Christy, J.R. & McNider, R.T., 2017). The conflation of the natural sciences with the social sciences, routine in the IPCC models, far from bringing improvements, ends up ruining both: the scientific method is perverted and society is ruined. For natural science cannot prescribe what ought to be done at the level of society. ...
... (Christy, J.R. & McNider R. T, 2017) ...
Article
According to the latest estimates, the age of the Earth is put at 4.543 billion years. In all that time, the only constant has been, paradoxically, change. Initially an immense ball of fire, our planet has been shaped by natural catastrophes of every kind: meteorites, earthquakes, floods, hurricanes, and the most abrupt swings in temperature have been the direct cause of everything we see around us today. Anatomically modern humans have witnessed only the last 200,000 years, and our success as a species has been due precisely to our great capacity to adapt to the hostile environment that the Earth and its changing climate represented for us.
... Despite model-satellite agreement on the overall pattern of atmospheric temperature change, there is still considerable debate regarding the significance of model-versus-observed differences in the rate of tropical tropospheric warming and whether these differences can be attributed to deficiencies in GCMs. The average simulated temperature change in phases 3, 5, and 6 of the Coupled Model Intercomparison Project (CMIP) exhibits greater tropospheric warming than observations over the satellite era (1979 to present), particularly in the tropics (8-13). ...
... This discontinuity spuriously inflates CESM2 tropical TMT trends by 0.04 K·decade⁻¹, though the magnitude of this effect remains to be quantified in other GCMs. In addition to model forcing biases and internal variability, GCM response errors and residual satellite biases are also likely to contribute to model-versus-satellite differences in tropospheric warming (9, 49-51). ...
Article
Full-text available
Climate-model simulations exhibit approximately two times more tropical tropospheric warming than satellite observations since 1979. The causes of this difference are not fully understood and are poorly quantified. Here, we apply machine learning to relate the patterns of surface-temperature change to the forced and unforced components of tropical tropospheric warming. This approach allows us to disentangle the forced and unforced change in the model-simulated temperature of the midtroposphere (TMT). In applying the climate-model-trained machine-learning framework to observations, we estimate that external forcing has produced a tropical TMT trend of 0.25 ± 0.08 K⋅decade⁻¹ between 1979 and 2014, but internal variability has offset this warming by 0.07 ± 0.07 K⋅decade⁻¹. Using the Community Earth System Model version 2 (CESM2) large ensemble, we also find that a discontinuity in the variability of prescribed biomass-burning aerosol emissions artificially enhances simulated tropical TMT change by 0.04 K⋅decade⁻¹. The magnitude of this aerosol-forcing bias will vary across climate models, but since the latest generation of climate models all use the same emissions dataset, the bias may systematically enhance climate-model trends over the satellite era. Our results indicate that internal variability and forcing uncertainties largely explain differences in satellite-versus-model warming and are important considerations when evaluating climate models.
... The values of the two unknown parameters (De Veaux et al., 2005, p. ...) ... In this section, an empirical mathematical model for the secular GMT trend dT/dy was derived and validated using previous results (Christy & McNider, 2017; Swanson et al., 2009; Wu et al., 2011). In the next section, this model is used to determine the climate sensitivity λ. ...
... The empirical model for the secular GMT trend derived above could also be verified by using the tropospheric temperature trend analysis result reported by Christy and McNider (2017): "We identify and remove the main natural perturbations (e.g. volcanic activity, El Niño Southern Oscillations (ENSOs)) from the global mean lower tropospheric temperatures (TLT) over January 1979-June 2017 to estimate the underlying, potentially human-forced trend. ...
Article
Full-text available
Previous studies have reported that human influences are required to explain the observed global warming using a linear model (LM) ΔT = λ_LM ΔF that relates change in solar forcing ΔF to change in global mean temperature (GMT) ΔT. This model has the shortcoming of assuming a given ΔF causes the same ΔT irrespective of the value of the initial global warming rate (dT/dy)_i. Analysis of the GMT data showed that this warming rate has been increasing linearly since steady state ((dT/dy)_o = 0 for year y_o = 1864.5) as given by dT/dy = (dT/dy)_i + a_T(y − y_i), where a_T = 7.1954 × 10⁻⁵ °C/year² is the secular GMT acceleration and y − y_i is the number of years of the change. The secular solar forcing due to 18% of the 11 yr solar cycle forcing of 0.19 W/m² (0.08% of Total Solar Irradiance) was expressed as ΔF = 0.18 × 0.19(y − y_i)/11. Defining the climate sensitivity as λ = Δ(dT/dy)/ΔF = a_T/(0.18 × 0.19/11) = 0.023143 °C/year per W/m² removed the shortcoming of the LM, and integration of the model for dT/dy above and then simplifying gave a secular GMT-solar forcing model given by ΔT = (dT/dy)_i(y − y_i) + (λ²/(2a_T))(ΔF)² that explained all of the observed global warming and increase in atmospheric CO2, sea level and ocean heat content. Therefore, for this nonlinear empirical model, invoking human influences to explain climate change was not required. The annual GMT model predicts a pause in global warming until 2040.
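
The abstract above states the model in closed form; the short sketch below simply evaluates those formulas as written, with the stated constants and placeholder inputs for the reference year y_i and the initial warming rate (dT/dy)_i.

```python
# Minimal evaluation of the secular model quoted above; year_i and dTdy_i are
# placeholder inputs, not values estimated here.

A_T = 7.1954e-5                  # °C / year^2, secular GMT acceleration
K_F = 0.18 * 0.19 / 11.0         # W m-2 / year, secular solar forcing rate
LAM = A_T / K_F                  # °C/year per W m-2 (= 0.023143)

def delta_T(year, year_i, dTdy_i):
    """ΔT = (dT/dy)_i (y - y_i) + (λ² / 2a_T) ΔF², with ΔF = K_F (y - y_i)."""
    dy = year - year_i
    dF = K_F * dy
    return dTdy_i * dy + (LAM ** 2 / (2.0 * A_T)) * dF ** 2

# Example: warming accumulated since the stated steady-state year 1864.5
print(delta_T(2020.0, 1864.5, 0.0))   # ~0.87 °C with these placeholder inputs
```
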
... Several explanations, not mutually exclusive, have been proposed to explain these model-observation differences. One explanation is that GCMs are too sensitive to increases in the atmospheric concentration of greenhouse gases (9-11). Natural climate variability and systematic model-forcing errors are also possible explanations. ...
... There is a tendency for models with larger ECS values to have larger tropical TMT trends (Fig. 4). The correlation coefficient between each model's ensemble average tropical TMT trend and ECS is 0.59 (SI Appendix, Fig. S5B), consistent with suggestions that exaggerated model sensitivity to greenhouse gases is contributing to model-observational differences in the rate of tropical tropospheric warming (11). Indeed, models with larger climate sensitivity values tend to have fewer simulations in accord with satellite observations (Fig. 4). ...
Article
Full-text available
Significance Climate models have, on average, simulated substantially more tropical tropospheric warming than satellite data, with few simulations matching observations. It has been suggested that this discrepancy arises because climate models are overly sensitive to greenhouse gas increases. Tropical tropospheric temperature trends from a large ensemble of simulations performed with a single climate model span a wide range that is solely due to natural climate variability. A subset of these simulations have warming rates in accord with satellite observations. Simulations with diminished tropical tropospheric warming due to climate variability exhibit characteristic patterns of surface warming that are similar to the observed record. Our results indicate that multidecadal variability can explain current model–observational differences in the rate of tropical tropospheric warming.
... Long-term warming of the troposphere is consistent with our understanding of greenhouse warming. Other factors, such as volcanic eruptions, decadal variability, and solar activity also modulate the long-term warming trend (Christy and McNider 2017; Po-Chedley et al. 2022). Interannual variations in global LTT are dominated by the El Niño-Southern Oscillation, which has largely been in a La Niña state since August 2020 (see section 4b and Sidebar 3.1 for details; Figs. ...
Article
Full-text available
Rapid warming due to human-caused climate change is reshaping the Arctic, enhanced by physical processes that cause the Arctic to warm more quickly than the global average, collectively called Arctic amplification. Observations over the past 40+ years show a transition to a wetter Arctic, with seasonal shifts and widespread disturbances influencing the flora, fauna, physical systems, and peoples of the Arctic.
... Much of this might be explained by an additional contribution from anthropogenic forcings [1,2,17,106,107,111]. Although, if so, it probably would involve a much lower climate sensitivity to greenhouse gases than the CMIP6 models imply, as several studies have suggested, e.g., [17,18,31,33,34,138-148]. There may also be additional non-climatic biases remaining in the data [5,7,10,76]. ...
Article
Full-text available
A statistical analysis was applied to Northern Hemisphere land surface temperatures (1850–2018) to try to identify the main drivers of the observed warming since the mid-19th century. Two different temperature estimates were considered—a rural and urban blend (that matches almost exactly with most current estimates) and a rural-only estimate. The rural and urban blend indicates a long-term warming of 0.89 °C/century since 1850, while the rural-only indicates 0.55 °C/century. This contradicts a common assumption that current thermometer-based global temperature indices are relatively unaffected by urban warming biases. Three main climatic drivers were considered, following the approaches adopted by the Intergovernmental Panel on Climate Change (IPCC)’s recent 6th Assessment Report (AR6): two natural forcings (solar and volcanic) and the composite “all anthropogenic forcings combined” time series recommended by IPCC AR6. The volcanic time series was that recommended by IPCC AR6. Two alternative solar forcing datasets were contrasted. One was the Total Solar Irradiance (TSI) time series that was recommended by IPCC AR6. The other TSI time series was apparently overlooked by IPCC AR6. It was found that altering the temperature estimate and/or the choice of solar forcing dataset resulted in very different conclusions as to the primary drivers of the observed warming. Our analysis focused on the Northern Hemispheric land component of global surface temperatures since this is the most data-rich component. It reveals that important challenges remain for the broader detection and attribution problem of global warming: (1) urbanization bias remains a substantial problem for the global land temperature data; (2) it is still unclear which (if any) of the many TSI time series in the literature are accurate estimates of past TSI; (3) the scientific community is not yet in a position to confidently establish whether the warming since 1850 is mostly human-caused, mostly natural, or some combination. Suggestions for how these scientific challenges might be resolved are offered.
... °C, respectively. Actually, using the UAH-MSU-lt record from 1979 to 2017, Christy and McNider [39] estimated that the (lower tropospheric) transient climate response (TCR) should be equal to 1.10 ± 0.26 °C, which agrees well with the 0.81-1.22 °C range. ...
Article
Full-text available
Global climate models (GCMs) from the sixth phase of the Coupled Model Intercomparison Project (CMIP6) have been employed to simulate twenty-first-century temperatures for the risk assessment of future climate change. However, their transient climate response (TCR) ranges from 1.2 to 2.8 °C, whereas their equilibrium climate sensitivity (ECS) ranges from 1.8 to 5.7 °C, leading to large variations in the climatic impact of an anthropogenic increase in atmospheric CO2 levels. Moreover, there is growing evidence that many GCMs are running "too hot" and are hence unreliable for directing policies for future climate changes. Here, I rank 41 CMIP6 GCMs according to how successfully they hindcast the global surface warming between 1980 and 2021 using both their published ECS and TCR estimates. The sub-ensemble of GCMs with the best performance appears to be composed of the models with ECS ranging between 1.8 and 3.0 °C (which confirms previous studies) and TCR ranging between 1.2 and 1.8 °C. This GCM sub-ensemble is made up of a total of 17 models. Depending on the emission scenarios, these GCMs predict a 2045-2055 warming of 1.5-2.5 °C compared to the pre-industrial era (1850-1900). As a result, the global aggregated impact and risk estimates seem to be moderate, which implies that any negative effects of future climate change may be adequately addressed by adaptation programs. However, there are also doubts regarding the actual magnitude of global warming, which might be exaggerated because of urban heat contamination and other local non-climatic biases. A final section is dedicated to highlighting the divergences observed between the global surface temperature records and a number of alternative temperature reconstructions from lower troposphere satellite measurements, tree-ring-width chronologies, and surface temperature records based on rural stations alone. If the global warming reported by the climate records is overestimated, the real ECS and TCR may be significantly lower than what is produced by the CMIP6 GCMs, as some independent studies have already suggested, which would invalidate all of the CMIP6 GCMs.
... Above-average lower tropospheric temperatures are consistent with long-term greenhouse gas warming and less-pronounced volcanic cooling over the past 3 decades (relative to significant cooling from the eruptions of Agung, El Chichón, and Pinatubo in 1963, 1982, and 1991, respectively; e.g., Santer et al. 2014;Christy and McNider 2017). Recent warmth is recorded by in-situ radiosonde (balloon-borne), microwave (satellite), and reanalysis datasets ( Fig. 2.8). ...
Article
Editors note: For easy download the posted pdf of the State of the Climate in 2020 is a very low-resolution file. A high-resolution copy of the report is available by clicking here . Please be patient as it may take a few minutes for the high-resolution file to download.
... Above-average lower tropospheric temperatures are consistent with long-term greenhouse gas warming and less-pronounced volcanic cooling over the past 3 decades (relative to significant cooling from the eruptions of Agung, El Chichón, and Pinatubo in 1963, 1982, and 1991, respectively; e.g., Santer et al. 2014;Christy and McNider 2017). Recent warmth is recorded by in-situ radiosonde (balloon-borne), microwave (satellite), and reanalysis datasets ( Fig. 2.8). ...
... Numerous studies have pointed to a tendency across climate models to project too much contemporary warming in the tropical troposphere (Bengtsson & Hodges, 2009; Douglass et al., 2007; Fu et al., 2011; Karl et al., 2006; McKitrick et al., 2010; McKitrick & Vogelsang, 2014; Po-Chedley & Fu, 2012; Thorne et al., 2011), with additional evidence pointing to a global tropospheric bias as well (Christy & McNider, 2017). Here we present an updated comparison using the first 38 models made available in the newly released sixth-generation Coupled Model Intercomparison Project (CMIP6) archive, comparing model reconstructions of historical layer-averaged lower-troposphere (LT) and midtroposphere (MT) temperature series against observational analogs from satellites, balloon-borne radiosondes, and reanalysis products. ...
Article
Full-text available
Plain Language Summary It has long been known that previous generations of climate models exhibit excessive warming rates in the tropical troposphere. With the release of the CMIP6 (Coupled Model Intercomparison Project Version 6) climate model archive we can now update the comparison. We examined historical (hindcast) runs from 38 CMIP6 models in which the models were run using historically observed forcings. We focus on the 1979–2014 interval, the maximum for which all models and observational data are available and for which the models were run with historical forcings. What was previously a tropical bias is now global. All model runs warmed faster than observations in the lower troposphere and midtroposphere, in the tropics, and globally. On average, and in most individual cases, the trend difference is significant. Warming trends in models tend to rise with the model Equilibrium Climate Sensitivity (ECS), and we present evidence that the distribution of ECS values across the models is unrealistically high.
... Therefore, rather than considering just one value for the climate sensitivity, for the rest of our analysis, we will consider a range of six different values for TCR: 0.5, 1.0, 1.5, 2.0, 2.5, and 3.0 °C. This covers the IPCC's current "likely" range of 1.0-2.5 °C, but also considers a lower value of 0.5 °C, recognizing that several recent studies have argued that the TCR could be less than 1.0 °C, e.g., refs. [152,207,208,215,216,219,221,227], as well as a higher value of 3.0 °C. Similarly, we consider a range of six different values for ECS (1, 2, 3, 4, 5, and 6 °C), which encompasses the IPCC's current "likely" range of 1.5-4.5 °C, but also considers the possibility that the ECS might be lower than 1.5 °C, e.g., refs. ...
Article
Full-text available
In order to assess the merits of national climate change mitigation policies, it is important to have a reasonable benchmark for how much human-caused global warming would occur over the coming century with "Business-As-Usual" (BAU) conditions. However, currently, policymakers are limited to making assessments by comparing the Global Climate Model (GCM) projections of future climate change under various different "scenarios", none of which are explicitly defined as BAU. Moreover, all of these estimates are ab initio computer model projections, and policymakers do not currently have equivalent empirically derived estimates for comparison. Therefore, estimates of the total future human-caused global warming from the three main greenhouse gases of concern (CO2, CH4, and N2O) up to 2100 are here derived for BAU conditions. A semi-empirical approach is used that allows direct comparisons between GCM-based estimates and empirically derived estimates. If the climate sensitivity to greenhouse gases implies a Transient Climate Response (TCR) of ≥ 2.5 °C or an Equilibrium Climate Sensitivity (ECS) of ≥ 5.0 °C then the 2015 Paris Agreement's target of keeping human-caused global warming below 2.0 °C will have been broken by the middle of the century under BAU. However, for a TCR < 1.5 °C or ECS < 2.0 °C, the target would not be broken under BAU until the 22nd century or later. Therefore, the current Intergovernmental Panel on Climate Change (IPCC) "likely" range estimates for TCR of 1.0 to 2.5 °C and ECS of 1.5 to 4.5 °C have not yet established if human-caused global warming is a 21st century problem.
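
For readers who want the arithmetic behind such TCR thresholds, the sketch below applies the standard transient scaling ΔT ≈ TCR × ΔF / F_2xCO2 over the six TCR values listed in the excerpt above; the forcing increment of 3.0 W m⁻² by 2100 is an illustrative assumption, not the BAU trajectory derived in the paper.

```python
# Hedged illustration of the TCR scaling underlying semi-empirical projections.
# The 3.0 W m-2 forcing increment to 2100 is an assumed, illustrative number.

F_2XCO2 = 3.71  # W m-2, forcing for doubled CO2

def transient_warming(tcr, delta_forcing):
    """Transient warming implied by a forcing increment, given a TCR (°C)."""
    return tcr * delta_forcing / F_2XCO2

for tcr in (0.5, 1.0, 1.5, 2.0, 2.5, 3.0):
    print(f"TCR = {tcr:.1f} °C  ->  ΔT(2100) ≈ {transient_warming(tcr, 3.0):.2f} °C")
```
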
Article
Full-text available
State of land and ocean surface winds in 2022.
Article
Dayaratna et al. (Environ Econ Policy Stud 22:433–448, 2020) proposes several improvements to economic- and climate-simulating IAMs and to their input variables. They show that an empirically based ECS estimate, agricultural damage estimates from recent literature, and discount rates as low as 2.5%, when used within the FUND model, yield a negative social cost of carbon through mid-twenty-first century. Five of their propositions, that "fat-tails" of the ECS distribution are improperly simulated by IAMs, FUND does not simulate enough benefits for agriculture, the ECS is near 1.5 °C, DICE and PAGE overestimate the SCC, and the marginal cost of public funds for carbon taxes has a value of at least a few dollars, are more closely analyzed. Estimates of ECS from a recent paper and a 5% discount rate applied to FUND result in an SCC estimated to lie within the range of about −$1 to $2.50 per metric ton CO2 for the year 2020.
Preprint
Full-text available
The equilibrium climate sensitivity (ECS) of the CMIP6 global circulation models (GCMs) varies from 1.83 °C to 5.67 °C. Herein, 38 GCMs are grouped into three ECS classes (low, 1.80-3.00 °C; medium, 3.01-4.50 °C; high, 4.51-6.00 °C) and compared against the ERA5-T2m records from 1980-1990 to 2011-2021. We found that all models with ECS > 3.0 °C overestimate the observed global surface warming and that spatial t-statistics rejects the data-model agreement over 60% (using low-ECS GCMs) to 81% (using high-ECS GCMs) of the Earth's surface. Thus, the high and medium-ECS GCMs are unfit for prediction purposes. The low-ECS GCMs are not fully satisfactory yet, but they are found unalarming because by 2050 they predict a moderate warming (ΔT_preindustrial→2050 ≲ 2 °C).
Article
Full-text available
Plain Language Summary The last‐generation Coupled Model Intercomparison Projects (CMIP6) global circulation models (GCMs) are used by scientists and policymakers to interpret past and future climatic changes and to determine appropriate (adaptation or mitigation) policies to optimally address scenario‐related climate‐change hazards. However, these models are affected by large uncertainties. For example, their equilibrium climate sensitivity (ECS) varies from 1.83°C to 5.67°C, which makes their 21st‐century predicted warming levels very uncertain. This issue is here addressed by testing the GCMs' global and local performance in predicting the 1980–2021 warming rates against the ERA5‐T2m records and by grouping them into three equilibrium climate sensitivity (ECS) classes (low‐ECS, 1.80–3.00°C; medium‐ECS, 3.01–4.50°C; high‐ECS, 4.51–6.00°C). We found that: (a) all models with ECS > 3.0°C overestimate the observed global surface warming; (b) Student t‐tests show model failure over 60% (low‐ECS) to 81% (high‐ECS) of the Earth's surface. Thus, the high and medium‐ECS GCMs do not appear to be consistent with the observations and should not be used for implementing policies based on their scenario forecasts. The low‐ECS GCMs perform better, although not optimally; however, they are also found unalarming because for the next decades they predict moderate warming: ΔTpreindustrial→2050 ≲ 2°C.
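
A schematic of the ECS-class screening described above, with placeholder model names, ECS values, and trends; the real study works with spatial fields and Student t-tests, not the toy global numbers used here.

```python
import numpy as np

# Placeholder inputs: (ECS in °C, 1980-2021 global trend in °C/decade).
# These are illustrative values, not the study's data.
models = {
    "ModelA": (2.5, 0.18),
    "ModelB": (3.8, 0.26),
    "ModelC": (5.2, 0.34),
    "ModelD": (2.9, 0.20),
}
observed_trend = 0.19  # °C/decade, placeholder reference value

classes = {"low (1.80-3.00)": [], "medium (3.01-4.50)": [], "high (4.51-6.00)": []}
for name, (ecs, trend) in models.items():
    key = ("low (1.80-3.00)" if ecs <= 3.0
           else "medium (3.01-4.50)" if ecs <= 4.5
           else "high (4.51-6.00)")
    classes[key].append(trend)

for key, trends in classes.items():
    bias = np.mean(trends) - observed_trend if trends else float("nan")
    print(f"{key} ECS class: mean warming bias vs observations = {bias:+.2f} °C/decade")
```
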
Article
Multiple records of global temperature contain periods of decadal length with flat or declining temperature trend, often termed a ‘hiatus’. Towards assessing the physical reality of two such periods (1940–1972 and 1998–2014), lightning data are examined. Lightning activity is of particular interest because on many different time scales it has been shown to be non-linearly dependent on temperature. During the earlier hiatus, declining trends in regional thunder days have been documented. During the more recent hiatus, lightning observations from the Lightning Imaging Sensor in space show no trend in flash rate. Surface-based, radiosonde-based and satellite-based estimates of global temperature have all been examined to support the veracity of the hiatus in global warming over the time interval of the satellite-based lightning record. Future measurements are needed to capture the total global lightning activity on a continuous basis.
Article
Full-text available
Editor’s note: For easy download the posted pdf of the State of the Climate for 2018 is a low-resolution file. A high-resolution copy of the report is available by clicking here. Please be patient as it may take a few minutes for the high-resolution file to download.
Article
Overall climate sensitivity to CO2 doubling in a general circulation model results from a complex system of parameterizations in combination with the underlying model structure. We refer to this as the model's major hypothesis, and we assume it to be testable. We explain four criteria that a valid test should meet: measurability, specificity, independence, and uniqueness. We argue that temperature change in the tropical 200- to 300-hPa layer meets these criteria. Comparing modeled to observed trends over the past 60 years using a persistence-robust variance estimator shows that all models warm more rapidly than observations and in the majority of individual cases the discrepancy is statistically significant. We argue that this provides informative evidence against the major hypothesis in most current climate models.
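
The comparison above relies on a persistence-robust variance estimator; as a simplified stand-in, the sketch below fits a trend and computes Newey-West (HAC) standard errors, which is autocorrelation-robust but is not the Vogelsang-Franses estimator actually used in that work.

```python
import numpy as np

def trend_with_hac_se(y, maxlags=None):
    """OLS trend of a 1-D series with a Newey-West (Bartlett kernel) standard error.

    A simplified, autocorrelation-robust stand-in for the persistence-robust
    estimator cited above; the trend is returned in units per time step.
    """
    y = np.asarray(y, float)
    n = len(y)
    t = np.arange(n, dtype=float)
    X = np.column_stack([np.ones(n), t])
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    resid = y - X @ beta
    L = maxlags if maxlags is not None else int(np.floor(4 * (n / 100) ** (2 / 9)))
    Xe = X * resid[:, None]
    S = Xe.T @ Xe                                   # lag-0 term
    for lag in range(1, L + 1):
        w = 1.0 - lag / (L + 1.0)                   # Bartlett kernel weight
        G = Xe[lag:].T @ Xe[:-lag]
        S += w * (G + G.T)
    XtX_inv = np.linalg.inv(X.T @ X)
    cov = XtX_inv @ S @ XtX_inv
    return beta[1], float(np.sqrt(cov[1, 1]))       # slope and its HAC s.e.
```
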
Article
Full-text available
Temperature sounding microwave radiometers flown on polar-orbiting weather satellites provide a long-term, global-scale record of upper-atmosphere temperatures, beginning in late 1978 and continuing to the present. The focus of this paper is a lower-tropospheric temperature product constructed using measurements made by the Microwave Sounding Unit channel 2 and the Advanced Microwave Sounding Unit channel 5. The temperature weighting functions for these channels peak in the middle to upper troposphere. By using a weighted average of measurements made at different Earth incidence angles, the effective weighting function can be lowered so that it peaks in the lower troposphere. Previous versions of this dataset used general circulation model output to remove the effects of drifting local measurement time on the measured temperatures. This paper presents a method to optimize these adjustments using information from the satellite measurements themselves. The new method finds a global-mean land diurnal cycle that peaks later in the afternoon, leading to improved agreement between measurements made by co-orbiting satellites. The changes result in global-scale warming [global trend (70°S-80°N, 1979-2016) = 0.174°C decade⁻¹], ~30% larger than our previous version of the dataset [global trend (70°S-80°N, 1979-2016) = 0.134°C decade⁻¹]. This change is primarily due to the changes in the adjustment for drifting local measurement time. The new dataset shows more warming than most similar datasets constructed from satellites or radiosonde data. However, comparisons with total column water vapor over the oceans suggest that the new dataset may not show enough warming in the tropics.
Article
Full-text available
The change of global-mean precipitation under global warming and interannual variability is predominantly controlled by the change of atmospheric longwave radiative cooling. Here we show that tightening of the ascending branch of the Hadley Circulation coupled with a decrease in tropical high cloud fraction is key in modulating precipitation response to surface warming. The magnitude of high cloud shrinkage is a primary contributor to the intermodel spread in the changes of tropical-mean outgoing longwave radiation (OLR) and global-mean precipitation per unit surface warming (dP/dT_s) for both interannual variability and global warming. Compared to observations, most Coupled Model Inter-comparison Project Phase 5 models underestimate the rates of interannual tropical-mean dOLR/dT_s and global-mean dP/dT_s, consistent with the muted tropical high cloud shrinkage. We find that the five models that agree with the observation-based interannual dP/dT_s all predict dP/dT_s under global warming higher than the ensemble mean dP/dT_s from the ∼20 models analysed in this study.
Chapter
Full-text available
This chapter provides an overview of the factors that will govern the rise in global mean surface temperature (GMST) over the rest of this century. We evaluate GMST using two approaches: analysis of archived output from atmospheric, oceanic general circulation models (GCMs) and calculations conducted using a computational framework developed by our group, termed the Empirical Model of Global Climate (EM-GC). Comparison of the observed rise in GMST over the past 32 years with GCM output reveals these models tend to warm too quickly, on average by about a factor of two. Most GCMs likely represent climate feedback in a manner that amplifies the radiative forcing of climate due to greenhouse gases (GHGs) too strongly. The GCM-based forecast of GMST over the rest of the century predicts neither the target (1.5 °C) nor upper limit (2.0 °C warming) of the Paris Climate Agreement will be achieved if GHGs follow the trajectories of either the Representative Concentration Pathway (RCP) 4.5 or 8.5 scenarios. Conversely, forecasts of GMST conducted in the EM-GC framework indicate that if GHGs follow the RCP 4.5 trajectory, there is a reasonably good probability (~75 %) the Paris target of 1.5 °C warming will be achieved, and an excellent probability (>95 %) global warming will remain below 2.0 °C. Uncertainty in the EM-GC forecast of GMST is primarily caused by the ability to simulate past climate for various combinations of parameters that represent climate feedback and radiative forcing due to aerosols, which provide disparate projections of future warming.
Article
Full-text available
We present an updated version of the radiosonde dataset homogenized by Iterative Universal Kriging (IUKv2), now extended through February 2013, following the method used in the original version (Sherwood et al. 2008, Robust tropospheric warming revealed by iteratively homogenized radiosonde data, J. Clim. 21, 5336–52). This method, in effect, performs a multiple linear regression of the data onto a structural model that includes both natural variability, trends, and time-changing instrument biases, thereby avoiding estimation biases inherent in traditional homogenization methods. One modification now enables homogenized winds to be provided for the first time. This, and several other small modifications made to the original method sometimes affect results at individual stations, but do not strongly affect broad-scale temperature trends. Temperature trends in the updated data show three noteworthy features. First, tropical warming is equally strong over both the 1959–2012 and 1979–2012 periods, increasing smoothly and almost moist-adiabatically from the surface (where it is roughly 0.14 K/decade) to 300 hPa (where it is about 0.25 K/decade over both periods), a pattern very close to that in climate model predictions. This contradicts suggestions that atmospheric warming has slowed in recent decades or that it has not kept up with that at the surface. Second, as shown in previous studies, tropospheric warming does not reach quite as high in the tropics and subtropics as predicted in typical models. Third, cooling has slackened in the stratosphere such that linear trends since 1979 are about half as strong as reported earlier for shorter periods. Wind trends over the period 1979–2012 confirm a strengthening, lifting and poleward shift of both subtropical westerly jets; the Northern one shows more displacement and the southern more intensification, but these details appear sensitive to the time period analysed. There is also a trend toward more easterly winds in the middle and upper troposphere of the deep tropics.
Article
Full-text available
Comparisons of trends across climatic data sets are complicated by the presence of serial correlation and possible step-changes in the mean. We build on heteroskedasticity and autocorrelation robust methods, specifically the Vogelsang–Franses (VF) nonparametric testing approach, to allow for a step-change in the mean (level shift) at a known or unknown date. The VF method provides a powerful multivariate trend estimator robust to unknown serial correlation up to but not including unit roots. We show that the critical values change when the level shift occurs at a known or unknown date. We derive an asymptotic approximation that can be used to simulate critical values, and we outline a simple bootstrap procedure that generates valid critical values and p-values. Our application builds on the literature comparing simulated and observed trends in the tropical lower troposphere and mid-troposphere since 1958. The method identifies a shift in observations around 1977, coinciding with the Pacific Climate Shift. Allowing for a level shift causes apparently significant observed trends to become statistically insignificant. Model overestimation of warming is significant whether or not we account for a level shift, although null rejections are much stronger when the level shift is included. © 2014 The Authors. Environmetrics published by John Wiley & Sons, Ltd.
Article
Full-text available
Despite continued growth in atmospheric levels of greenhouse gases, global mean surface and tropospheric temperatures have shown slower warming since 1998 than previously. Possible explanations for the slow-down include internal climate variability, external cooling influences and observational errors. Several recent modelling studies have examined the contribution of early twenty-first-century volcanic eruptions to the muted surface warming. Here we present a detailed analysis of the impact of recent volcanic forcing on tropospheric temperature, based on observations as well as climate model simulations. We identify statistically significant correlations between observations of stratospheric aerosol optical depth and satellite-based estimates of both tropospheric temperature and short-wave fluxes at the top of the atmosphere. We show that climate model simulations without the effects of early twenty-first-century volcanic eruptions overestimate the tropospheric warming observed since 1998. In two simulations with more realistic volcanic influences following the 1991 Pinatubo eruption, differences between simulated and observed tropospheric temperature trends over the period 1998 to 2012 are up to 15% smaller, with large uncertainties in the magnitude of the effect. To reduce these uncertainties, better observations of eruption-specific properties of volcanic aerosols are needed, as well as improved representation of these eruption-specific properties in climate model simulations.
Article
Full-text available
Recent studies have estimated the magnitude of climate feedback based on the correlation between time variations in outgoing radiation flux and sea surface temperature (SST). This study investigates the influence of the natural non-feedback variation (noise) of the flux occurring independently of SST on the determination of climate feedback. The observed global monthly radiation flux is used from the Clouds and the Earth's Radiant Energy System (CERES) for the period 2000-2008. In the observations, the time lag correlation of radiation and SST shows a distorted curve with low statistical significance for shortwave radiation while a significant maximum at zero lag for longwave radiation over the tropics. This observational feature is explained by simulations with an idealized energy balance model where we see that the non-feedback variation plays the most significant role in distorting the curve in the lagged correlation graph, thus obscuring the exact value of climate feedback. We also demonstrate that the climate feedback from the tropical longwave radiation in the CERES data is not significantly affected by the noise. We further estimate the standard deviation of radiative forcings (mainly from the noise) relative to that of the non-radiative forcings, i.e., the noise level from the observations and atmosphere-ocean coupled climate model simulations in the framework of the simple model. The estimated noise levels in both CERES (>13 %) and climate models (11-28 %) are found to be far above the critical level (~5 %) that begins to misrepresent climate feedback.
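
A minimal sketch of the flux-SST lag-correlation diagnostic discussed above, for user-supplied deseasonalized monthly anomaly series; it does not reproduce the CERES analysis or the idealized energy-balance-model experiments.

```python
import numpy as np

def lagged_correlation(flux, sst, max_lag=12):
    """Correlation of TOA flux with SST at leads and lags (in months).

    Positive lag means the flux series lags SST (flux responding to SST);
    negative lag means the flux leads SST (flux forcing SST).
    """
    flux = np.asarray(flux, float) - np.mean(flux)
    sst = np.asarray(sst, float) - np.mean(sst)
    out = {}
    for lag in range(-max_lag, max_lag + 1):
        if lag >= 0:
            a, b = flux[lag:], sst[:len(sst) - lag]
        else:
            a, b = flux[:lag], sst[-lag:]
        out[lag] = float(np.corrcoef(a, b)[0, 1])
    return out
```
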
Data
Full-text available
Using NASA's A-Train satellite measurements, we evaluate the accuracy of cloud water content (CWC) and water vapor mixing ratio (H2O) outputs from 19 climate models submitted to Phase 5 of the Coupled Model Intercomparison Project (CMIP5), and assess improvements relative to their counterparts for the earlier CMIP3. We find more than half of the models show improvements from CMIP3 to CMIP5 in simulating column-integrated cloud amount, while changes in water vapor simulation are insignificant. For the 19 CMIP5 models, the model spreads and their differences from the observations are larger in the upper troposphere (UT) than in the lower or middle troposphere (L/MT). The modeled mean CWCs over tropical oceans range from 33% to 15× of the observations in the UT and 40% to 2× of the observations in the L/MT. For modeled H2O, the mean values over tropical oceans range from ~1% to 2× of the observations in the UT and within 10% of the observations in the L/MT. The spatial distributions of clouds at 215 hPa are relatively well-correlated with observations, noticeably better than those for the L/MT clouds. Although both water vapor and clouds are better simulated in the L/MT than in the UT, there is no apparent correlation between the model biases in clouds and water vapor. Numerical scores are used to compare different model performances in regards to spatial mean, variance and distribution of CWC and H2O over tropical oceans. Model performances at each pressure level are ranked according to the average of all the relevant scores for that level. Citation: Jiang, J. H., et al. (2012), Evaluation of cloud and water vapor simulations in CMIP5 climate models using NASA "A-Train" satellite observations, J. Geophys. Res., 117, D14105, doi:10.1029/2011JD017237.
Article
Full-text available
Measurements made by microwave sounding instruments provide a multidecadal record of atmospheric temperature change. Measurements began in late 1978 with the launch of the first Microwave Sounding Unit (MSU) and continue to the present. In 1998, the first of the follow-on series of instruments—the Advanced Microwave Sounding Units (AMSUs)—was launched. To continue the atmospheric temperature record past 2004, when measurements from the last MSU instrument degraded in quality, AMSU and MSU measure-ments must be intercalibrated and combined to extend the atmospheric temperature data records. Cali-bration methods are described for three MSU–AMSU channels that measure the temperature of thick layers of the atmosphere centered in the middle troposphere, near the tropopause, and in the lower stratosphere. Some features of the resulting datasets are briefly summarized.
Article
Full-text available
In addition to the well-known warming of ~0.5 °C since the middle of the nineteenth century, global-mean surface temperature records [1-4] display substantial variability on timescales of a century or less. Accurate prediction of future temperature change requires an understanding of the causes of this variability; possibilities include external factors, such as increasing greenhouse-gas concentrations [5-7] and anthropogenic sulphate aerosols [8-10], and internal factors, both predictable (such as El Niño [11]) and unpredictable (noise [12,13]). Here we apply singular spectrum analysis [14-20] to four global-mean temperature records [1-4], and identify a temperature oscillation with a period of 65-70 years. Singular spectrum analysis of the surface temperature records for 11 geographical regions shows that the 65-70-year oscillation is the statistical result of 50-88-year oscillations for the North Atlantic Ocean and its bounding Northern Hemisphere continents. These oscillations have obscured the greenhouse warming signal in the North Atlantic and North America. Comparison with previous observations and model simulations suggests that the oscillation arises from predictable internal variability of the ocean-atmosphere system.
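
A bare-bones sketch of the singular-spectrum-analysis step (embedding followed by an SVD of the trajectory matrix); the window length and input series are left to the user, and this is not the specific SSA implementation used in the cited study.

```python
import numpy as np

def ssa_eigenspectrum(series, window):
    """Fraction of variance captured by each SSA mode of a 1-D series.

    Pairs of near-equal values typically flag an oscillatory component,
    such as the multidecadal signal discussed above.
    """
    x = np.asarray(series, float)
    x = x - x.mean()
    n = len(x)
    k = n - window + 1
    traj = np.column_stack([x[i:i + window] for i in range(k)])  # window x k trajectory matrix
    s = np.linalg.svd(traj, compute_uv=False)                    # singular values
    var = s ** 2
    return var / var.sum()
```
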
Article
Full-text available
This paper documents various unresolved issues in using surface temperature trends as a metric for assessing global and regional climate change. A series of examples ranging from errors caused by temperature measurements at a monitoring station to the undocumented biases in the regionally and globally averaged time series are provided. The issues are poorly understood or documented and relate to micrometeorological impacts due to warm bias in nighttime minimum temperatures, poor siting of the instrumentation, effect of winds as well as surface atmospheric water vapor content on temperature trends, the quantification of uncertainties in the homogenization of surface temperature data, and the influence of land use/land cover (LULC) change on surface temperature trends. Because of the issues presented in this paper related to the analysis of multidecadal surface temperature we recommend that greater, more complete documentation and quantification of these issues be required for all observation stations that are intended to be used in such assessments. This is necessary for confidence in the actual observations of surface temperature variability and long-term trends. Citation: Pielke, R. A., Sr., et al. (2007), Unresolved issues with the assessment of multidecadal global land surface temperature trends, J. Geophys. Res., 112, D24S08, doi:10.1029/2006JD008229.
Article
Full-text available
One of the most significant signals in the thermometer-observed temperature record since 1900 is the decrease in the diurnal temperature range over land, largely due to rising of the minimum temperatures. Generally, climate models have not well replicated this change in diurnal temperature range. Thus, the cause for night-time warming in the observed temperatures has been attributed to a variety of external causes. We take an alternative approach to examine the role that the internal dynamics of the stable nocturnal boundary layer (SNBL) may play in affecting the response and sensitivity of minimum temperatures to added downward longwave forcing. As indicated by previous nonlinear analyses of a truncated two-layer equation system, the SNBL can be very sensitive to changes in greenhouse gas forcing, surface roughness, heat capacity, and wind speed. A new single-column model growing out of these nonlinear studies is used to examine the SNBL. Specifically, budget analyses of the model are provided that evaluate the response of the boundary layer to forcing and sensitivity to mixing formulations. Based on these model analyses, it is likely that part of the observed long-term increase in minimum temperature is reflecting a redistribution of heat by changes in turbulence and not by an accumulation of heat in the boundary layer. Because of the sensitivity of the shelter level temperature to parameters and forcing, especially to uncertain turbulence parameterization in the SNBL, there should be caution about the use of minimum temperatures as a diagnostic global warming metric in either observations or models.
Article
Full-text available
A new data set containing large-scale regional mean upper air temperatures based on adjusted global radiosonde data is now available up to the present. Starting with data from 85 of the 87 stations adjusted for homogeneity by Lanzante, Klein and Seidel, we extend the data beyond 1997 where available, using a first differencing method combined with guidance from station metadata. The data set consists of temperature anomaly time series for the globe, the hemispheres, tropics (30°N–30°S) and extratropics. Data provided include annual time series for 13 pressure levels from the surface to 30 mbar and seasonal time series for three broader layers (850–300, 300–100 and 100–50 mbar). The additional years of data increase trends to more than 0.1 K/decade for the global and tropical midtroposphere for 1979–2004. Trends in the stratosphere are approximately −0.5 to −0.9 K/decade and are more negative in the tropics than for the globe. Differences between trends at the surface and in the troposphere are generally reduced in the new time series as compared to raw data and are near zero in the global mean for 1979–2004. We estimate the uncertainty in global mean trends from 1979 to 2004 introduced by the use of first difference processing after 1995 at less than 0.02–0.04 K/decade in the troposphere and up to 0.15 K/decade in the stratosphere at individual pressure levels. Our reliance on metadata, which is often incomplete or unclear, adds further, unquantified uncertainty that could be comparable to the uncertainty from the FD processing. Because the first differencing method cannot be used for individual stations, we also provide updated station time series that are unadjusted after 1997. The Radiosonde Atmospheric Temperature Products for Assessing Climate (RATPAC) data set will be archived and updated at NOAA's National Climatic Data Center as part of its climate monitoring program.
Article
Full-text available
To estimate climate sensitivity from observations, Lindzen and Choi [2009] used the deseasonalized fluctuations in sea surface temperatures (SSTs) and the concurrent responses in the top-of-atmosphere outgoing radiation from the ERBE satellite instrument. Distinct periods of warming and cooling in the SST were used to evaluate feedbacks. This work was subject to significant criticism by Trenberth et al. [2009], much of which was appropriate. The present paper is an expansion of the earlier paper in which the various criticisms are addressed and corrected. In this paper we supplement the ERBE data for 1985–1999 with data from CERES for 2000–2008. Our present analysis accounts for the 36-day precession period of the ERBE satellite in a more appropriate manner than the earlier paper, which simply used what may have been undue smoothing. The present analysis also distinguishes noise in the outgoing radiation, as well as radiation changes that are forcing SST changes, from those radiation changes that constitute feedbacks to changes in SST. Finally, a more reasonable approach to the zero-feedback flux is taken here. We argue that feedbacks are largely concentrated in the tropics and extend the effect of these feedbacks to the global climate. We again find that the outgoing radiation resulting from SST fluctuations exceeds the zero-feedback response, implying negative feedback.
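The feedback diagnosis sketched in this abstract amounts to regressing top-of-atmosphere radiation anomalies on SST anomalies and comparing the slope with the zero-feedback (Planck) response. The toy calculation below illustrates that logic only; the synthetic data, the ~3.3 W m⁻² K⁻¹ Planck value, and the 3.7 W m⁻² doubling forcing are standard illustrative numbers, not values from the paper.

```python
# Rough sketch of the regression idea behind zero-dimensional feedback
# diagnosis from SST and TOA flux anomalies.  The synthetic data, the chosen
# Planck (zero-feedback) value of ~3.3 W m-2 K-1, and the 3.7 W m-2 CO2
# doubling forcing are textbook numbers, not values taken from the paper.
import numpy as np

rng = np.random.default_rng(1)
n_months = 240
true_lambda = 4.5                          # assumed total feedback (W m-2 K-1)
sst = np.cumsum(rng.normal(0, 0.03, n_months))                # red-noise SST anomalies (K)
toa_out = true_lambda * sst + rng.normal(0, 0.8, n_months)    # outgoing flux response

# Slope of outgoing TOA flux vs SST anomaly = diagnosed feedback parameter.
lam_hat = np.polyfit(sst, toa_out, 1)[0]

planck = 3.3                               # zero-feedback response (W m-2 K-1)
f2x = 3.7                                  # forcing for doubled CO2 (W m-2)
print(f"diagnosed lambda = {lam_hat:.2f} W m-2 K-1")
print(f"non-Planck feedbacks are net {'negative' if lam_hat > planck else 'positive'}")
print(f"implied equilibrium sensitivity = {f2x / lam_hat:.2f} K per CO2 doubling")
```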
Article
Full-text available
The Pacific Decadal Oscillation (PDO) has been described by some as a long-lived El Niño-like pattern of Pacific climate variability, and by others as a blend of two sometimes independent modes having distinct spatial and temporal characteristics of North Pacific sea surface temperature (SST) variability. A growing body of evidence highlights a strong tendency for PDO impacts in the Southern Hemisphere, with important surface climate anomalies over the mid-latitude South Pacific Ocean, Australia and South America. Several independent studies find evidence for just two full PDO cycles in the past century: “cool” PDO regimes prevailed from 1890–1924 and again from 1947–1976, while “warm” PDO regimes dominated from 1925–1946 and from 1977 through (at least) the mid-1990s. Interdecadal changes in Pacific climate have widespread impacts on natural systems, including water resources in the Americas and many marine fisheries in the North Pacific. Tree-ring and Pacific coral based climate reconstructions suggest that PDO variations, at a range of varying time scales, can be traced back to at least 1600, although there are important differences between different proxy reconstructions. While 20th Century PDO fluctuations were most energetic in two general periodicities, one from 15-to-25 years and the other from 50-to-70 years, the mechanisms causing PDO variability remain unclear. To date, there is little in the way of observational evidence to support a mid-latitude coupled air-sea interaction for PDO, though there are several well-understood mechanisms that promote multi-year persistence in North Pacific upper ocean temperature anomalies.
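A PDO-type index is conventionally defined as the leading principal component of North Pacific SST anomalies (poleward of about 20°N) after removing the global-mean anomaly. The sketch below shows that construction on synthetic data; the grid, noise model, and variance structure are assumptions, and real applications would use an SST analysis such as ERSST or HadISST.

```python
# Sketch of how a PDO-like index is commonly defined: the leading principal
# component of monthly North Pacific SST anomalies after removing the
# global-mean SST anomaly.  The gridded data here are synthetic stand-ins.
import numpy as np

rng = np.random.default_rng(2)
n_months, n_grid = 600, 400                    # 50 years, toy North Pacific grid
# Build synthetic anomalies: one coherent decadal pattern plus weather noise.
pattern = rng.normal(0, 1, n_grid)
decadal = np.sin(2 * np.pi * np.arange(n_months) / 240)   # ~20-year cycle
ssta = np.outer(decadal, pattern) + rng.normal(0, 1.0, (n_months, n_grid))

# Remove the (toy) global-mean anomaly each month, as in the PDO definition.
ssta -= ssta.mean(axis=1, keepdims=True)

# Leading EOF/PC via SVD of the (time x space) anomaly matrix.
u, s, vt = np.linalg.svd(ssta - ssta.mean(axis=0), full_matrices=False)
pdo_index = u[:, 0] * s[0]
pdo_index /= pdo_index.std()                   # standardize the index
explained = s[0]**2 / np.sum(s**2)
print(f"leading mode explains {100 * explained:.1f}% of variance")
```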
Article
Full-text available
This paper documents various unresolved issues in using surface temperature trends as a metric for assessing global and regional climate change. A series of examples is provided, ranging from errors caused by temperature measurements at a monitoring station to undocumented biases in the regionally and globally averaged time series. The issues are poorly understood or documented and relate to micrometeorological impacts due to warm bias in nighttime minimum temperatures, poor siting of the instrumentation, the effect of winds as well as surface atmospheric water vapor content on temperature trends, the quantification of uncertainties in the homogenization of surface temperature data, and the influence of land use/land cover (LULC) change on surface temperature trends. Because of the issues presented in this paper related to the analysis of multidecadal surface temperature records, we recommend that greater, more complete documentation and quantification of these issues be required for all observation stations that are intended to be used in such assessments. This is necessary for confidence in the actual observations of surface temperature variability and long-term trends.
Article
Full-text available
Updated tropical lower tropospheric temperature datasets covering the period 1979–2009 are presented and assessed for accuracy based upon recent publications and several analyses conducted here. We conclude that the lower tropospheric temperature (TLT) trend over these 31 years is +0.09 ± 0.03 °C decade⁻¹. Given that the surface temperature (Tsfc) trends from three different groups agree extremely closely among themselves (~ +0.12 °C decade⁻¹) this indicates that the “scaling ratio” (SR, or ratio of atmospheric trend to surface trend: TLT/Tsfc) of the observations is ~0.8 ± 0.3. This is significantly different from the average SR calculated from the IPCC AR4 model simulations which is ~1.4. This result indicates the majority of AR4 simulations tend to portray significantly greater warming in the troposphere relative to the surface than is found in observations. The SR, as an internal, normalized metric of model behavior, largely avoids the confounding influence of short-term fluctuations such as El Niños which make direct comparison of trend magnitudes less confident, even over multi-decadal periods.
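The scaling ratio quoted above is a simple quotient of trends; a short worked example, with first-order error propagation, reproduces the ~0.8 ± 0.3 figure. The ±0.03 attached to the surface trend here is an assumed illustrative uncertainty, since only the TLT uncertainty is quoted in the abstract.

```python
# Worked example of the "scaling ratio" (SR) metric described above:
# SR = tropospheric trend / surface trend, with simple propagation of the
# trend uncertainties.  The +/-0.03 attached to the surface trend is an
# assumed illustrative value; only the TLT uncertainty is quoted above.
import math

tlt_trend, tlt_err = 0.09, 0.03      # C/decade (observed TLT, 1979-2009)
sfc_trend, sfc_err = 0.12, 0.03      # C/decade (surface; error assumed)

sr = tlt_trend / sfc_trend
# First-order (relative) error propagation for a ratio of independent values.
sr_err = sr * math.sqrt((tlt_err / tlt_trend) ** 2 + (sfc_err / sfc_trend) ** 2)
print(f"observed SR = {sr:.2f} +/- {sr_err:.2f}  (models average ~1.4)")
```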
Article
Version 6 of the UAH MSU/AMSU global satellite temperature dataset represents an extensive revision of the procedures employed in previous versions of the UAH datasets. The two most significant results from an end-user perspective are (1) a decrease in the global-average lower tropospheric temperature (LT) trend from +0.14°C decade⁻¹ to +0.11°C decade⁻¹ (Jan. 1979 through Dec. 2015); and (2) the geographic distribution of the LT trends, including higher spatial resolution, owing to a new method for computing LT. We describe the major changes in processing strategy, including a new method for monthly gridpoint averaging which uses all of the footprint data yet eliminates the need for limb correction; a new multi-channel (rather than multi-angle) method for computing the lower tropospheric (LT) temperature product which requires an additional tropopause (TP) channel to be used; and a new empirical method for diurnal drift correction. We show results for LT, the midtroposphere (MT, from MSU2/AMSU5), and lower stratosphere (LS, from MSU4/AMSU9). A 0.03°C decade⁻¹ reduction in the global LT trend from the Version 5.6 product is partly due to lesser sensitivity of the new LT to land surface skin temperature (est. 0.01°C decade⁻¹), with the remainder of the reduction (0.02°C decade⁻¹) due to the new diurnal drift adjustment, the more robust method of LT calculation, and other changes in processing procedures.
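The multi-channel LT construction described above is a fixed linear combination of the MT, TP and LS channels. The sketch below uses coefficients of roughly 1.538, −0.548 and 0.010, which is a recollection of the published Version 6 weights; treat them as illustrative and verify against the paper before any quantitative use.

```python
# Sketch of the multi-channel construction of the lower-tropospheric (LT)
# product described above: a linear combination of the mid-troposphere (MT),
# tropopause (TP) and lower-stratosphere (LS) channels chosen to suppress the
# stratospheric contribution.  The coefficients below are approximately those
# recalled for Version 6 (LT ~ 1.538 MT - 0.548 TP + 0.010 LS); treat them as
# illustrative and check the published description before relying on them.
import numpy as np

A_MT, A_TP, A_LS = 1.538, -0.548, 0.010

def lt_from_channels(mt, tp, ls):
    """Combine gridded or monthly MT, TP and LS anomalies into an LT anomaly."""
    mt, tp, ls = map(np.asarray, (mt, tp, ls))
    return A_MT * mt + A_TP * tp + A_LS * ls

# Toy monthly anomalies (K): warming troposphere, cooling stratosphere.
mt = np.array([0.10, 0.12, 0.15])
tp = np.array([0.02, 0.03, 0.01])
ls = np.array([-0.30, -0.28, -0.35])
print(np.round(lt_from_channels(mt, tp, ls), 3))
```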
Article
From neutral to new: Many of the particles in the troposphere are formed in situ, but what fraction of all tropospheric particles do they constitute and how exactly are they made? Bianchi et al. report results from a high-altitude research station. Roughly half of the particles were newly formed by the condensation of highly oxygenated multifunctional compounds. A combination of laboratory results, field measurements, and model calculations revealed that neutral nucleation is more than 10 times faster than ion-induced nucleation, that particle growth rates are size-dependent, and that new particle formation occurs during a limited time window. Science, this issue p. 1109
Article
Globally averaged surface air temperatures in some decades show rapid increases (accelerated warming decades), and in other decades there is no warming trend (hiatus decades). A previous study showed that the net energy imbalance at the top of the atmosphere of about 1 W m⁻² is associated with greater increases of deep ocean heat content below 750 m during the hiatus decades, while there is little globally averaged surface temperature increase or warming in the upper ocean layers. Here the authors examine processes involved with accelerated warming decades and address the relative roles of external forcing from increasing greenhouse gases and internally generated decadal climate variability associated with the Interdecadal Pacific Oscillation (IPO). Model results from the Community Climate System Model, version 4 (CCSM4), show that accelerated warming decades are characterized by rapid warming of globally averaged surface air temperature, greater increases of heat content in the upper ocean layers, and less heat content increase in the deep ocean, opposite to the hiatus decades. In addition to contributions from processes potentially linked to Antarctic Bottom Water (AABW) formation and the Atlantic meridional overturning circulation (AMOC), the positive phase of the IPO, adding to the response to external forcing, is usually associated with accelerated warming decades. Conversely, hiatus decades typically occur with the negative phase of the IPO, when warming from the external forcing is overwhelmed by internally generated cooling in the tropical Pacific. Internally generated hiatus periods of up to 15 years with zero global warming trend are present in the future climate simulations. This suggests that there is a chance that the current observed hiatus could extend for several more years.
Article
This article describes progress in the homogenization of global radiosonde temperatures with updated versions of the Radiosonde Observation Correction Using Reanalyses (RAOBCORE) and Radiosonde Innovation Composite Homogenization (RICH) software packages. These are automated methods to homogenize the global radiosonde temperature dataset back to 1958. The break dates are determined from analysis of time series of differences between radiosonde temperatures (obs) and background forecasts (bg) of climate data assimilation systems used for the 40-yr European Centre for Medium-Range Weather Forecasts (ECMWF) Re-Analysis (ERA-40) and the ongoing interim ECMWF Re-Analysis (ERA-Interim). RAOBCORE uses the obs−bg time series also for estimating the break sizes. RICH determines the break sizes either by comparing the observations of a tested time series with observations of neighboring radiosonde time series (RICH-obs) or by comparing their background departures (RICH-τ). Consequently RAOBCORE results may be influenced by inhomogeneities in the bg, whereas break size estimation with RICH-obs is independent of the bg. The adjustment quality of RICH-obs, on the other hand, may suffer from large interpolation errors at remote stations. RICH-τ is a compromise that substantially reduces interpolation errors at the cost of slight dependence on the bg. Adjustment uncertainty is estimated by comparing the three methods and also by varying parameters in RICH. The adjusted radiosonde time series are compared with recent temperature datasets based on (Advanced) Microwave Sounding Unit [(A)MSU] radiances. The overall spatiotemporal consistency of the homogenized dataset has improved compared to earlier versions, particularly in the presatellite era. Vertical profiles of temperature trends are more consistent with satellite data as well.
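Break detection in these homogenization schemes operates on obs-minus-background (obs−bg) difference series. The sketch below applies a generic standard normal homogeneity test (SNHT) statistic to a synthetic difference series with an inserted step; it illustrates the principle only and is not the RAOBCORE or RICH implementation.

```python
# Generic sketch of breakpoint detection on an obs-minus-background (obs-bg)
# difference series, the kind of test underlying automated radiosonde
# homogenization.  This is a plain SNHT-style statistic on synthetic data,
# not the actual RAOBCORE/RICH code.
import numpy as np

def snht_statistic(x):
    """Return the SNHT test statistic T(k) for each candidate break position k."""
    x = (x - x.mean()) / x.std()
    n = len(x)
    t = np.zeros(n)
    for k in range(1, n):
        z1 = x[:k].mean()
        z2 = x[k:].mean()
        t[k] = k * z1**2 + (n - k) * z2**2
    return t

rng = np.random.default_rng(3)
n = 240                                   # 20 years of monthly obs-bg differences
series = rng.normal(0, 0.3, n)
series[150:] += 0.5                       # instrument change: +0.5 K step at month 150
t_stat = snht_statistic(series)
k_break = int(np.argmax(t_stat))
print(f"most likely break at index {k_break}, T = {t_stat[k_break]:.1f}")
print(f"estimated break size = {series[k_break:].mean() - series[:k_break].mean():+.2f} K")
```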
Article
Power spectra of global surface temperature (GST) records reveal major periodicities at about 9.1, 10–11, 19–22 and 59–62 years. The Coupled Model Intercomparison Project 5 (CMIP5) general circulation models (GCMs), to be used in the IPCC (2013), are analyzed and found not able to reconstruct this variability. From 2000 to 2013.5 a GST plateau is observed while the GCMs predicted a warming rate of about 2 K/century. In contrast, the hypothesis that the climate is regulated by specific natural oscillations more accurately fits the GST records at multiple time scales. The climate sensitivity to CO2 doubling should be reduced by half, e.g. from the IPCC-2007 2.0–4.5 K range to 1.0–2.3 K with a 1.5 °C median. Also modern paleoclimatic temperature reconstructions yield the same conclusion. The observed natural oscillations could be driven by astronomical forcings. Herein I propose a semi-empirical climate model made of six specific astronomical oscillations as constructors of the natural climate variability spanning from the decadal to the millennial scales plus a 50% attenuated radiative warming component deduced from the GCM mean simulation as a measure of the anthropogenic and volcano contributions to climatic changes. The semi-empirical model reconstructs the 1850–2013 GST patterns significantly better than any CMIP5 GCM simulation. The model projects a possible 2000–2100 average warming ranging from about 0.3 °C to 1.8 °C, which is significantly below the original CMIP5 GCM ensemble mean range (1 K to 4 K).
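The semi-empirical construction described above is, structurally, a harmonic regression: sinusoids at fixed periods plus a scaled smooth forced component, fitted by least squares. The sketch below fits only the four period bands quoted in the abstract to synthetic data; the full model uses six astronomical oscillations and the 50%-attenuated GCM mean, so this is a structural illustration, not a reproduction.

```python
# Sketch of the harmonic-regression idea behind a semi-empirical oscillation
# model: fit sinusoids at fixed periods plus a scaled smooth "forced" component
# to a temperature series by ordinary least squares.  The periods used here are
# the four bands quoted in the abstract (9.1, ~10.5, ~20, ~60 yr); the data are
# synthetic, so this only illustrates the structure of such a fit.
import numpy as np

years = np.arange(1850, 2014) + 0.5
periods = np.array([9.1, 10.5, 20.0, 60.0])

# Synthetic "observed" series: a 60-yr oscillation plus a slow rise plus noise.
rng = np.random.default_rng(4)
forced = 0.005 * (years - 1850)                      # smooth warming component
obs = (0.1 * np.cos(2 * np.pi * (years - 2000) / 60.0)
       + forced
       + rng.normal(0, 0.08, years.size))

# Design matrix: cosine and sine at each fixed period, the forced shape, and a
# constant.  Amplitudes, phases and the attenuation factor come out of the fit.
cols = [np.cos(2 * np.pi * years / p) for p in periods]
cols += [np.sin(2 * np.pi * years / p) for p in periods]
cols += [forced, np.ones_like(years)]
X = np.column_stack(cols)
beta, *_ = np.linalg.lstsq(X, obs, rcond=None)

fit = X @ beta
print(f"attenuation factor on the forced component: {beta[-2]:.2f}")
print(f"residual std dev: {np.std(obs - fit):.3f} K")
```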
Article
Surface temperatures have been observed in East Africa for more than 100 yr, but heretofore have not been subject to a rigorous climate analysis. To pursue this goal monthly averages of maximum (TMax), minimum (TMin), and mean (TMean) temperatures were obtained for Kenya and Tanzania from several sources. After the data were organized into time series for specific sites (60 in Kenya and 58 in Tanzania), the series were adjusted for break points and merged into individual gridcell squares of 1.25°, 2.5°, and 5.0°. Results for the most data-rich 5° cell, which includes Nairobi, Mount Kilimanjaro, and Mount Kenya, indicate that since 1905, and even recently, the trend of TMax is not significantly different from zero. However, TMin results suggest an accelerating temperature rise. Uncertainty estimates indicate that the trend of the difference time series (TMax − TMin) is significantly less than zero for 1946–2004, the period with the highest density of observations. This trend difference continues in the most recent period (1979–2004), in contrast with findings in recent periods for global datasets, which generally have sparse coverage of East Africa. The differences between TMax and TMin trends, especially recently, may reflect a response to complex changes in the boundary layer dynamics; TMax represents the significantly greater daytime vertical connection to the deep atmosphere, whereas TMin often represents only a shallow layer whose temperature is more dependent on the turbulent state than on the temperature aloft. Because the turbulent state in the stable boundary layer is highly dependent on local land use and perhaps locally produced aerosols, the significant human development of the surface may be responsible for the rising TMin while having little impact on TMax in East Africa. This indicates that time series of TMax and TMin should become separate variables in the study of long-term changes.
Article
The impact of time-varying radiative forcing on the diagnosis of radiative feedback from satellite observations of the Earth is explored. Phase space plots of variations in global average temperature versus radiative flux reveal linear striations and spiral patterns in both satellite measurements and in output from coupled climate models. A simple forcing-feedback model is used to demonstrate that the linear striations represent radiative feedback upon nonradiatively forced temperature variations, while the spiral patterns are the result of time-varying radiative forcing generated internal to the climate system. Only in the idealized special case of instantaneous and then constant radiative forcing, a situation that probably never occurs either naturally or anthropogenically, can feedback be observed in the presence of unknown radiative forcing. This is true whether the unknown radiative forcing is generated internal or external to the climate system. In the general case, a mixture of both unknown radiative and nonradiative forcings can be expected, and the challenge for feedback diagnosis is to extract the signal of feedback upon nonradiatively forced temperature change in the presence of the noise generated by unknown time-varying radiative forcing. These results underscore the need for more accurate methods of diagnosing feedback from satellite data and for quantitatively relating those feedbacks to long-term climate sensitivity.
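The simple forcing-feedback model referred to above can be written as C dT/dt = F(t) + S(t) − λT, with the satellite-visible radiative anomaly being F − λT. The sketch below integrates that equation with illustrative parameters (not those of the paper) and shows how a regression of flux on temperature recovers the feedback only when time-varying radiative forcing is absent.

```python
# Minimal sketch of a simple forcing-feedback (mixed-layer) model of the kind
# discussed above: C dT/dt = F(t) + S(t) - lambda*T, where F is time-varying
# radiative forcing, S is non-radiative forcing (e.g. changed ocean mixing),
# and the satellite-visible TOA radiative anomaly is F - lambda*T.  Parameter
# values are illustrative, not taken from the paper.
import numpy as np

def simulate(lam=3.0, c=1.0e8, years=10, f_amp=0.0, s_amp=1.0, seed=5):
    """Integrate the model and return (T, toa_flux) monthly anomalies."""
    rng = np.random.default_rng(seed)
    dt = 30 * 86400.0                        # one-month step (s)
    n = years * 12
    t_anom = 0.0
    temps, fluxes = [], []
    # Red-noise radiative (F) and non-radiative (S) forcings, in W m-2.
    f = np.zeros(n)
    s = np.zeros(n)
    for i in range(1, n):
        f[i] = 0.9 * f[i - 1] + f_amp * rng.normal()
        s[i] = 0.9 * s[i - 1] + s_amp * rng.normal()
    for i in range(n):
        t_anom += dt * (f[i] + s[i] - lam * t_anom) / c
        temps.append(t_anom)
        fluxes.append(f[i] - lam * t_anom)   # what the satellite would see
    return np.array(temps), np.array(fluxes)

# Regressing flux on temperature returns -lambda only when radiative forcing
# is absent; adding time-varying F biases the diagnosed feedback.
for f_amp, label in [(0.0, "non-radiative forcing only"),
                     (1.0, "with time-varying radiative forcing")]:
    t, flux = simulate(f_amp=f_amp)
    slope = np.polyfit(t, flux, 1)[0]
    print(f"{label:38s}: regression slope = {slope:+.2f} W m-2 K-1 "
          f"(true feedback = -3.00)")
```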
Article
The IPCC AR4 (2007) discussed bulk tropospheric temperatures as an indicator of atmospheric energy content. Here, we examine the latest publications about, and versions of, the AR4 data sets. The metric studied is the trend that represents the average rate of atmospheric energy accumulation that relates to increased greenhouse gas forcing. For temperatures from microwave instruments, UAHuntsville's indicates the lowest trend for 1979–2009 and National Oceanic and Atmospheric Administration – Center for Satellite Applications and Research (NOAA-STAR)'s the highest, being slightly higher than Remote Sensing Systems' (RSS). Updated analyses using radiosonde data suggest RSS and STAR experienced spurious warming after the mid-1990s. When satellite and radiosonde data sets are considered, the global trends for 1979–2009 of the lower and mid-troposphere are +0.15 and +0.06 °C decade⁻¹ respectively. Error ranges of these estimates, if we do not apply information that indicates some data sets contain noticeable trend problems, are at least ±0.05 °C decade⁻¹, which needs reduction to characterize forcing and response in the climate system accurately.
Article
El Niño/Southern Oscillation (ENSO) remains the most important coupled ocean–atmosphere phenomenon to cause global climate variability on seasonal to interannual time scales. This paper addresses the need for a reliable ENSO index that allows for the historical definition of ENSO events in the instrumental record back to 1871. The Multivariate ENSO Index (MEI) was originally defined as the first seasonally varying principal component of six atmosphere–ocean (COADS) variable fields in the tropical Pacific basin. It provides for a more complete and flexible description of the ENSO phenomenon than single variable ENSO indices such as the SOI or Niño 3.4 SST. Here we describe our effort to boil the MEI concept down to its most essential components (based on SLP, SST) to enable historical analyses that more than double its period of record to 1871–2005. The new MEI.ext confirms that ENSO activity went through a lull in the early- to mid-20th century, but was just about as prevalent one century ago as in recent decades. We diagnose strong relationships between peak amplitudes of ENSO events and their duration, as well as between their peak amplitudes and their spacing (periodicity). Our effort is designed to help with the assessment of ENSO conditions through as long a record as possible to be able to differentiate between ‘natural’ ENSO behaviour in all its rich facets, and the ‘Brave New World’ of this phenomenon under evolving GHG-related climate conditions. So far, none of the behaviour of recent ENSO events appears unprecedented, including duration, onset timing, and spacing in the last few decades compared to a full century before then. Copyright © 2011 Royal Meteorological Society
Article
Satellite data from the microwave sounding unit (MSU) channel 4, when carefully merged, provide daily zonal anomalies of lower-stratosphere temperature with a level of precision between 0.01 and 0.08 °C per 2.5° latitude band. Global averages of these daily zonal anomalies reveal the prominent warming events due to volcanic aerosol in 1982 (El Chichón) and 1991 (Mt. Pinatubo), which are on the order of 1 °C. The quasi-biennial oscillation (QBO) may be extracted from these zonal data by applying a spatial filter between 15°N and 15°S latitude, which resembles the meridional curvature. Previously published relationships between the QBO and the north polar stratospheric temperatures during northern winter are examined but were not found to be reproduced in the MSU4 data. Sudden stratospheric warmings in the north polar region are represented in the MSU4 data for latitudes poleward of 70°N. In the Southern Hemisphere, there appears to be a moderate relationship between total ozone concentration and MSU4 temperatures, though it has been less apparent in 1991 and 1992. In terms of empirical modes of variability, the authors find a strong tendency in EOF 1 (39.2% of the variance) for anomalies in the Northern Hemisphere polar regions to be counterbalanced by anomalies equatorward of 40°N and 40°S latitudes. In addition, most of the modes revealed significant power in the 15–20 day period band.
Lower and mid-tropospheric temperature
  • J. R. Christy
Detection and attribution of climate change: From global to regional
  • N. L. Bindoff
Technical Summary. In Climate Change 2013: The Physical Science Basis
  • T. F. Stocker
2016: Climate Data Explorer
  • G. J. van Oldenborgh
2017: Lower and mid-tropospheric temperature
  • J. R. Christy
Changes in atmospheric constituents and radiative forcing
  • P. Forster
  • S. Solomon
Climate Data Explorer
  • G. J. van Oldenborgh
Anthropogenic and natural radiative forcing
  • G. Myhre
Evaluations of cloud and water vapor simulations in CMIP5 climate models using NASA “A-Train” satellite observations
  • J. H. Jiang