Applied Economics and Finance; Vol. 9, No. 3; 2022
ISSN 2332-7294 E-ISSN 2332-7308
Published by Redfame Publishing
A Review of the Anthropogenic Global Warming Consensus: An
Econometric Forecast Based on the ARIMA Model of Paleoclimate Series
Gilmar Veriato Fluzer Santos1, Lucas Gamalel Cordeiro1, Claudio Antonio Rojo1, & Edison Luiz Leismann1
1 PPGAdm, Universidade do Oeste do Paraná, Cascavel, Brasil
Correspondence: Gilmar Veriato Fluzer Santos, PPGAdm, Universidade do Oeste do Paraná, Cascavel, Brasil.
Received: July 19, 2022 Accepted: August 26, 2022 Available online: September 2, 2022
doi:10.11114/aef.v9i3.5703 URL:
Abstract
This paper projects a climate change scenario using a stochastic paleotemperature time series model and compares it to the prevailing consensus using the Autoregressive Integrated Moving Average (ARIMA) process model. The parameter estimates of the model were below those established by anthropogenic-warming experts and governmental organs, such as the IPCC (UN), over a 100-year scenario. Results from the ARIMA model suggest a current period of temperature reduction and a probable cooling. The results from this study add a statistical element of paleoclimate to the debate that contradicts the current scientific consensus.
Keywords: global warming, paleoclimatology, time series, ARIMA model, climate scenarios, forecasting
1. Introduction
Controversies regarding global warming and its effects on the economy and the environment are the subject of global
discussion and debate. These controversies also partially determine how governments and companies develop their
policies and conduct their business.
According to the followers of the anthropogenic thesis and international bodies such as the IPCC (Intergovernmental Panel on Climate Change - UN), human action is responsible for climate change and global warming (the greenhouse effect), a position echoed by most scientific publications (more than 90% of research studies), which hold that global warming is anthropogenic. This explanation has been established as the "official version" by IPCC advocates (Salzer, Neske & Rojo, 2019; Cook et al., 2013; Bray, 2010; Anderegg et al., 2010; Oreskes, 2004). IPCC Working Group III Co-Chair Jim Skea stated: "Limiting warming to 1.5°C is possible within the laws of chemistry and physics but doing so requires unprecedented changes" (IPCC Special Report, 2019).
Nevertheless, as Shwed and Bearman (2010) posit, it is essential to assess the state of scientific contestation on an issue precisely when the scientific community considers a proposition a fact. The same authors also explain how internal dissent diminishes in the face of consensus.
The defenders of the naturalistic cause challenge the published research, claiming that anthropogenic global warming is theoretically fragile, rests on calculated misinformation, and draws on a historical sample of only 150 years, insufficient to establish a consensus that is often propped up by agnotology and metric uncertainties (Molion, 2008; Legates et al., 2015; Legates, Soon & Briggs, 2013; Reisinger et al., 2010).
Notably, the further back in time research explores, the weaker the anthropogenic thesis becomes. Davis (2017) and Harde (2019) found that changes in the atmospheric CO2 concentration did not cause changes in ancient climate temperature. They also found that climate change was related not to the carbon cycle but to native impacts. Easterbrook (2016), in his evidence-based book, presented data opposing CO2 emissions as the primary source of global warming, arguing that this thesis has been captured by politics and dubious computer modeling.
Furthermore, other anthropogenic studies either ignore the paleoclimatology factor or treat it only as an element of uncertainty, such as the research by Haustein et al. (2017), Cook et al. (2013), Mitchell et al. (2016), and Medhaug et al. (2017), much like the studies at the genesis of the IPCC (Solomon et al., 2007). However, scientists such as Easterbrook (2016), and the arguments presented in Koonin's book (2021), increasingly point to data suggesting that climate changes result from natural cycles that have been occurring for thousands of years.
Thus, there is a gap in this debate: the absence of a broader time horizon and of statistical predictability for climate change. This study set out to establish a climate prediction scenario for the next 100 years based on a 12,000-year paleotemperature series (the Holocene period), taking into account the uncertainties within the data used to make this prediction.
We adopted the Autoregressive Integrated Moving Average (ARIMA) model, also known as the Box-Jenkins model, whose objective is to provide a valid basis for forecasting once all tests, parameter estimates, and diagnostics have been performed. We obtained the database from the article by Kaufman et al. (2020), who applied five statistical methods of thermal reconstruction to estimate global mean surface temperature (GMST) from the early Holocene to the present day.
Our results indicate the fragility of the anthropogenic thesis by diverging significantly from the latest scenario projected by the IPCC, which predicts an increase of more than 1.5°C in the planet's temperature by 2050 (IPCC, 2019). We thereby add a variable to the global warming debate, aiming to stimulate critical discussion of the consensus that prevails today.
2. Data, Method, and Its Justification
The data used in this research were obtained from Kaufman et al.'s (2020) unprecedented multi-method reconstruction of global mean surface temperature (GMST) over the Holocene (the past 12,000 years), "whose database is the most comprehensive global compilation of previously available published Holocene proxy temperature time series" (Kaufman et al., 2020, p. 1).
Primary data are available as individual CSV files and merged as a netCDF file at figshare and at NOAA Paleoclimatology. A CSV file with the multi-method joint median and the 5th and 95th percentiles is also available in both data repositories. All were used as input data to compose the 12k paleotemperature time series in the two variables (the median and the uncertainty set) and fed into IBM SPSS Statistics software (v. 22). The data generated in the course of this research are available in a supplementary file.
The ARIMA model is an established predictive tool, as demonstrated in the works of Babu & Reddy (2014), Valipour (2015), and Katimon, Shahid & Mohsenipour (2018).
Our methodology rests on the use of this model for long-term forecasting of time series, covering both economic and stochastic variables such as temperature, as suggested in the book by Gujarati & Porter (2011, ch. 21-22).
To answer our research question, the data were represented graphically and fed into IBM SPSS Statistics, v. 22, for the ARIMA (Box-Jenkins) methodology. Figure 1 shows the evolution of the 12k median of the data set extracted from Kaufman et al. (2020, p. 8) on a 100-year scale, with milestone "0" being the year 2019, calculated from the different reconstruction methods.
Figure 1. Evolution of the Global Median 12k years temperature
Source: Author elaboration (adapted from Kaufman et al., 2020, p. 6, CSV file data).
Figure 2 depicts the 5th to 95th percentile range of the ensemble, which takes into consideration various sources of uncertainty, including proxy temperature, chronology, and methodological choices, as per Kaufman et al. (2020, p. 3).
Figure 2. Evolution of the parameters 5th and 95th global percentiles (uncertainties)
Source: Author elaboration (adapted from CSV file data - temp 12k all methods percentiles).
The average temperature of the 1800-1900 period for each composite was used as the pre-industrial reference period, defined by the authors as an anomaly of 0°C; this average plays a role analogous to the reference period used by the IPCC (1850-1900). This mean was subtracted from each member of the ensemble to place the individual records and different reconstructions on a common basis (see Kaufman et al., 2020, p. 4). The forecasts of the two time series, median and uncertainties, were generated in IBM SPSS Statistics, version 22, in a specific session for ARIMA modeling.
2.1 Stochastic Processes and the Stationarity Test
To introduce the forecast, we present graphically and mathematically the results that the SPSS software generated for the two variables of this study, the median and the uncertainty set. The graphs are presented in this section; the mathematical formulation of their results and the structuring of the uncertainty set (same pattern) appear in a supplementary file.
First, we applied two tests to verify the stationarity of the time series: (1) graphical analysis and (2) the correlogram test, a precondition for using the ARIMA (Box-Jenkins) model.
An important condition for model reliability is that the residuals of the ACF (autocorrelation function) and PACF (partial autocorrelation function) behave as white noise. For the model to be validated as the most adequate one, the ACF and PACF residuals should be concentrated around the mean, and the degree of significance should be absolute (0 or close to it), as represented in Figure 3. (Note: "retardo" means lag; "de resíduo" means residual.)
Figure 3. Residuals of the ACF and PACF correlograms (White noise)
Source: prepared by the author (SPSS - Statistics v. 22)
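As an illustration of this white-noise check outside SPSS, the sketch below applies the same diagnostics, ACF/PACF plots and the Ljung-Box test, in Python with statsmodels; the `residuals` array is a hypothetical stand-in for the residuals of the fitted model, not the paper's data.

```python
import numpy as np
import matplotlib.pyplot as plt
from statsmodels.stats.diagnostic import acorr_ljungbox
from statsmodels.graphics.tsaplots import plot_acf, plot_pacf

rng = np.random.default_rng(42)
residuals = rng.normal(0.0, 1.0, 500)  # placeholder for the model residuals

# ACF/PACF of adequate residuals should stay inside the confidence band.
plot_acf(residuals, lags=24)
plot_pacf(residuals, lags=24)
plt.show()

# Ljung-Box test: H0 = the residuals are independent (white noise).
# Large p-values mean H0 cannot be rejected, i.e. the fit is adequate.
print(acorr_ljungbox(residuals, lags=[10, 20], return_df=True))
```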
Thus, once stationarity was achieved (see pp. 7-9), we could model the series with an autoregressive (AR) process. Letting $Y_t$ represent the median (Md) at period t (Holocene), we have:

$(Y_t - \delta) = \alpha_1 (Y_{t-1} - \delta) + u_t \qquad (1)$

where $\delta$ is the mean of $Y$ and $u_t$ is an uncorrelated random error with zero mean and constant variance $\sigma^2$ (i.e., white noise); then $Y_t$ follows a first-order stochastic autoregressive, or AR(1), process.
The AR process just discussed is not the only mechanism that may have generated $Y$. $Y$ may instead follow a first-order moving average, or MA(1), process, modeled as:

$Y_t = \mu + \beta_0 u_t + \beta_1 u_{t-1} \qquad (2)$

where $\mu$ is a constant and $u$, as before, is a white-noise stochastic error term. Here $Y$ at period t is equal to a constant plus a moving average of the current and past error terms. More generally, we can write:

$Y_t = \mu + \beta_0 u_t + \beta_1 u_{t-1} + \beta_2 u_{t-2} + \dots + \beta_q u_{t-q} \qquad (3)$

which is an MA(q) process. In short, a moving average process is a linear combination of white-noise error terms.
$Y$ most likely has characteristics of both AR and MA and is therefore ARMA. Then $Y_t$ follows an ARMA(1,1) process, which can be written as:

$Y_t = \theta + \alpha_1 Y_{t-1} + \beta_0 u_t + \beta_1 u_{t-1} \qquad (4)$

because there is one autoregressive term and one moving average term; in this equation, $\theta$ represents a constant term. In general, an ARMA(p, q) process contains p autoregressive terms and q moving average terms.
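To make equations (1)-(4) concrete, the sketch below simulates the ARMA(1,1) process of equation (4) with illustrative coefficients (not the paper's estimates) and recovers the parameters with Python's statsmodels rather than SPSS.

```python
import numpy as np
from statsmodels.tsa.arima.model import ARIMA

rng = np.random.default_rng(0)
n, theta, alpha1, beta1 = 2000, 0.1, 0.6, 0.4
u = rng.normal(0.0, 1.0, n)  # white noise: zero mean, constant variance
y = np.zeros(n)
for t in range(1, n):
    # Equation (4) with beta0 = 1: Y_t = theta + alpha1*Y_{t-1} + u_t + beta1*u_{t-1}
    y[t] = theta + alpha1 * y[t - 1] + u[t] + beta1 * u[t - 1]

# order=(1, 0, 1) is the ARMA(1,1) of equation (4) in ARIMA notation.
print(ARIMA(y, order=(1, 0, 1)).fit().summary())
```

With a sample this long, the estimated AR and MA coefficients should land close to the 0.6 and 0.4 used in the simulation.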
In the fit chart, shown in Figure 4, the two lines coincide and almost overlap, indicating that this is the best of the models tested. The outliers present between dates 1 and 5 were retained to keep the series robust and to preserve its impartiality and uncertainty with respect to future events (Stockinger & Dutter, 1987). (Note: "observado" means observed; "ajuste" means fit; UCL: upper control limit; LCL: lower control limit.)
Figure 4. Graph of the adjusted 12k median series
Source: elaborated by the author (SPSS- Statistics)
As Figures 1, 2, and 4 show, the series is not stationary: the data do not fluctuate around a mean line but instead express a trend, as seen in the 12k global temperature median series (Figure 5). (Note: "número de sequência" means sequence number.)
Figure 5. Graphical test for stationarity
Next, we applied the correlation-function tests, ACF (autocorrelation) and PACF (partial autocorrelation), as steps toward making the series stationary, as shown in Figures 6 and 7 and in Table 1. (Note: "coeficiente" means coefficient; "número de retardo" means lag number.)
Figure 6. Graphical test of autocorrelation (automatic)
Table 1. Ljung-Box statistical report (H₀ and H₁ hypotheses)
Autocorrelations
Series: MdTempGlob
Box-Ljung statistics
a. The underlying process considered is independence (white noise).
b. Based on the asymptotic chi-square approximation.
Figure 7. Graphical test of partial autocorrelation PACF
Graphical and correlation analysis indicated that we had to transform the series to make it stationary. This was done by choosing the first lag, which exceeded the confidence interval in both tests and had the highest degree of graphical significance, i.e., the highest correlation and the lowest value according to the Ljung-Box statistic. The lag that met these criteria was therefore number 1, highlighted in Table 1.
From these results, we graphically represented (Figure 8) the stationarity achieved as a function of the first difference (lag 1):
Figure 8. Adjusted stationarity as a function of lag 1
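For readers who want to replicate the lag-1 differencing step outside SPSS, the sketch below is a minimal Python illustration on a synthetic trending series (a hypothetical stand-in for the 12k median); it adds an Augmented Dickey-Fuller test, which the paper itself does not use, as a numerical complement to the graphical and correlogram checks.

```python
import numpy as np
import pandas as pd
from statsmodels.tsa.stattools import adfuller

t = np.arange(120)  # 120 century-scale points, mimicking the 12k series
temp = pd.Series(0.01 * t + np.random.default_rng(1).normal(0.0, 0.1, 120))

diff1 = temp.diff().dropna()  # first difference: Y_t - Y_{t-1} (lag 1)

# Augmented Dickey-Fuller test: H0 = the series has a unit root
# (non-stationary). Differencing once should push the p-value down.
for label, series in (("level", temp), ("first difference", diff1)):
    stat, pvalue = adfuller(series)[:2]
    print(f"{label}: ADF = {stat:.2f}, p = {pvalue:.3f}")
```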
We then replicated this modeling for the probabilistic analysis of the uncertainties, represented by the 5th and 95th percentiles at a 90% confidence level, since the same stationarity criteria and tests (graph and correlogram) applied as for the median. The graphical representation of the uncertainty set is described in the supplementary file.
2.2 Applying the Box- Jenkins Model
Box-Jenkins’s method aims (Figure 9) is to estimate a statistical model and interpret it according to the sample data. If
this estimated model is used for forecasting, we should assume that its characteristics are constant over the period and
particularly in future periods. Any model that is inferred based on stationary data can be interpreted as stationary or
stable and therefore provides a valid basis for prediction (Pokorny, 1987, Gujarati and Porter, 2011). Applied Economics and Finance Vol. 9, No. 3; 2022
Figure 9. The Box-Jenkins methodology
Step 1: Identification. We concluded this step by finding the appropriate values of p, d, and q, showing how the correlogram and the partial correlogram facilitate the task.
Step 2: Estimation. Having identified the appropriate p and q values, we estimated the parameters of the autoregressive and moving average terms included in the model.
Step 3: Diagnostic checking. Having chosen a particular ARIMA model and estimated its parameters, we verified that the chosen model fit the data well, as attested by the white noise of the residuals.
Step 4: Forecasting. One of the reasons for the popularity of ARIMA modeling is its success in forecasting. In many cases, ARIMA forecasts, for both long- and short-term variables, are more reliable than those obtained from traditional econometric modeling (Gujarati and Porter, 2011, pp. 762, 778).
We concluded that the MedTempGlobal time series (as described in the data/figures) was not stationary, so we had to transform it, making it stationary with constant mean and variance and a covariance invariant over time. It is therefore an integrated time series, i.e., one that combines the autoregressive (AR) and moving average (MA) processes in the same set.
An important point to note is that, to use the Box-Jenkins methodology, we must have either a stationary time series or a time series that becomes stationary after one or more differencings (Gujarati and Porter, 2011).
Thus, if a time series is integrated of order 1, it is I(1), and after differencing it becomes I(0), that is, stationary. In general, if a time series is I(d), differencing it d times yields an I(0) series.
If one has to difference a time series d times to make it stationary and then apply the ARMA(p, q) model to it, one says that the original time series is ARIMA(p, d, q), that is, an autoregressive integrated moving average time series, where p denotes the number of autoregressive terms, d the number of times the series must be differenced before it becomes stationary, and q the number of moving average terms.
This time series is therefore an ARIMA(1,1,1) model: it was differenced once (d = 1) before becoming stationary (first difference), and the differenced series can be modeled as an ARMA(1,1) process, as it has one AR term and one MA term.
Finally, it is important to emphasize that, to optimize the results, it was necessary to run in SPSS Statistics all possible combinations of the ARIMA (p, d, q) model for the two parameters, arriving at the statistically optimal model after decomposing the data and meeting the analysis and execution criteria.
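The exhaustive search just described can be illustrated outside SPSS. The sketch below is a minimal Python/statsmodels analogue, assuming AIC as the selection criterion (the paper does not state which criterion SPSS applied) and using a placeholder series rather than the paper's data; Step 4 (forecasting) is included at the end.

```python
import itertools
import warnings
import numpy as np
from statsmodels.tsa.arima.model import ARIMA

rng = np.random.default_rng(7)
series = np.cumsum(rng.normal(0.0, 0.1, 120))  # placeholder trending series

warnings.filterwarnings("ignore")  # silence convergence warnings in the loop
best_aic, best_order = np.inf, None
for order in itertools.product(range(3), range(2), range(3)):
    try:
        aic = ARIMA(series, order=order).fit().aic
    except Exception:
        continue  # skip (p, d, q) combinations that fail to estimate
    if aic < best_aic:
        best_aic, best_order = aic, order

print(f"best (p, d, q) by AIC: {best_order}, AIC = {best_aic:.1f}")

# Step 4 (forecasting): project 100 periods ahead with the selected model.
fit = ARIMA(series, order=best_order).fit()
print(fit.get_forecast(steps=100).predicted_mean[:5])
```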
3. Results
The parameters used in this study were the median and the 5th and 95th percentiles, the latter representing the estimate of uncertainties at 90% confidence, as recommended by the authors:
"Future users of this reconstruction [should] use the full ensemble when considering the plausible Holocene GMST evolution. By representing the multi-method reconstruction as a single time series, the median of the ensemble may be best, along with the 90% range of the ensemble to represent uncertainty." (Kaufman et al., 2020, p. 4)
As per the model parameters, predictions for the median were expressed as temperature estimates for the next 100 years, represented by the AR and MA terms. For statistical reliability, the degree of significance (Marôco, 2018) of the measured parameters must be extremely significant for AR and very significant for MA, as shown in the output of Table 2.
Table 2. 100-year scale temperature estimates of AR and MA parameters
(ARIMA model parameters for the median: no transformation applied; AR term at lag 1 and MA term at lag 1. The numerical estimates are reported in the text below.)
Source: Author elaboration with Software SPSS - Statistics v. 22.
For the uncertainty results, the 5th and 95th percentiles, a procedure similar to that used for the median was applied, whose configuration is described in a supplementary file. The following parameters were generated for the uncertainty results, as shown in Table 3:
Table 3. Output of the parameters of the 5th and 95th percentile temperatures (model uncertainty)
(ARIMA model parameters for the 5th and 95th percentile series: no transformation applied; AR and MA terms at lag 1 for each percentile series. The numerical estimates are reported in the text below.)
Source: Author elaboration (SPSS - Statistics v. 22).
We then had a set of six extremely significant temperature estimates from the two models: 0.932°C and -0.266°C (Table 2); 0.999°C, -0.70°C, 0.996°C, and -0.382°C (Table 3). The median of this set of estimates, 0.333°C, was the most appropriate statistical measure to fulfill this study's objective of calculating and adopting a reference standard measure. Based on our results, by the end of this century we will have an average global temperature below the 1.50°C to 2.00°C increase projected by the IPCC. Thus, our results indicate that, contrary to warming, the world may experience a period of decreasing temperatures over the next hundred years.
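As a quick arithmetic check, the sketch below reproduces the 0.333°C reference measure as the sample median of the six estimates above (a minimal Python illustration; the values are those reported in Tables 2 and 3).

```python
from statistics import median

# The six significant parameter estimates reported in Tables 2 and 3 (°C):
estimates = [0.932, -0.266, 0.999, -0.70, 0.996, -0.382]

# With six values the median is the mean of the two middle ones:
# (-0.266 + 0.932) / 2 = 0.333
print(f"{median(estimates):.3f} °C")
```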
4. Discussion
Why is there so much consensus around a scenario that leaves so much room for doubt? Why is there such scientific unanimity around anthropogenic warming (97.2% according to Cook et al., 2013), now called "climate change"?
When recent temperatures are compared to the distribution of global maximum temperatures during the Holocene, there has been on average a 1°C increase over the pre-industrial period (1850-1900) for most members of the ensemble. Furthermore, no 200-year interval in these series exceeded the warmth of the most recent decade (Kaufman et al., 2020, p. 5). The time horizon of the anthropogenic thesis is brief when compared to the time of man's existence on earth (the Holocene), and when it is set against the results of this research, the thesis lacks substantiation if analyzed in the light of statistical science.
On the other hand, Kaufman et al. (2020), who relied on the IPCC projections, admit that this century's temperatures are likely to exceed by 1°C those of the pre-industrial era (1800-1900), which they treated as an anomaly of 0°C. Although the authors claim that the Holocene GMST reconstruction is comparable to the IPCC long-term projections and to those seen in the last decade, the results presented here show a different and antagonistic scenario, especially if one considers a hundred-year scale and the historical temporality present in the statistical series.
Furthermore, in the graphical temporal observations of the studies by Kaufman et al. (2020, p. 6, fig. 3), Davis (2017, p. 6, fig. 5), and Moberg et al. (2005, p. 3, fig. 2), there is significant climate variability every 2k years, casting doubt on establishing anthropogenicity as a criterion for the last 150 years. These observations are confirmed by Moberg et al. (2005), who conclude that "the resulting model reconstruction supports the case that multicentennial natural variability has been larger than is commonly thought, and that considerable natural climate variation can be expected in future." One of the villains of the anthropogenic genesis, the greenhouse effect, was unveiled in 1896 by Arrhenius as a natural phenomenon beneficial to the development of biological life on the earth's surface (troposphere). Arrhenius' studies were subsequently confirmed by Miller & Spoolman (2016), who stated: "Our climate, lives, and economies depend on the natural greenhouse effect. Greenhouse gases absorb heat radiated by the earth and the gases then emit infrared radiation that warms the atmosphere. Without the natural greenhouse effect, the earth would become cold and uninhabitable." By contrast, other studies on the threat of the man-made greenhouse effect, such as those by Gillett & Matthews (2010) and Anderson, Hawkins & Jones (2016), are inconclusive regarding the magnitude of these effects and demonstrate uncertainty when set against the complexity of the earth's geophysical and climate systems. Therefore, reinventing this evidence, as the proponents of anthropogeny orthodoxly claim to do, is something that does not hold up against the historical veracity of science.
It is important to say that our results do not ignore the impact of human action on recent climate change, which nevertheless appears insignificant in the face of the millennial variability of the climate, the size and complexity of the universe, and all the natural and astronomical phenomena that interact with the earth in the planetary system. Lastly, our predicted climate scenario cannot determine the true causes of recent climate change, whether natural or anthropogenic, since the two may be complementary, not divergent. Further studies on paleoclimate and its variability are needed to corroborate the estimates resulting from this research and to provide more evidence in the search for scientific truth.
5. Conclusion and Policy Recommendation
In view of our findings, it is unreasonable for governments and organizations to be held hostage to a doubtful thesis, with all its consequences, in the face of scientific relativity. Subjecting organizations and governments to the current consensus may condemn humanity to an unjustifiable climate catastrophism.
All actors involved in the global climate change movement ought to be heard and their opinions taken into consideration, especially regarding phenomena not yet proven by time and human experimentation. No scientific consensus should be treated as absolute truth; doubt is one of the pillars of science, which has always thrived on seeking truth by questioning it.
Nevertheless, this article does not belittle public, governmental, and corporate policies adopted in response to the alarm being raised over climate change. Any public or private initiative to mitigate the impact that the planet suffers from human action is laudable and necessary. But the future and progress of the next generations, especially in third-world countries, should not be compromised by an "official" scientific narrative maintained to the detriment of a minority that thinks and researches differently, but with foundations and arguments as important as its own.
Data Records
The data collected for this research were extracted from Kaufman et al. (2020), as referenced in the text. After processing, the data were fed into IBM SPSS Statistics v. 22 for analysis and generation of the results. These data are available in the figshare repository:
Technical Validation
All data validation was done on the ARIMA platform of SPSS Statistics, as described in the body of the manuscript and in the data repository. For logistical reasons, only the data that satisfied the criteria of the research methodology are available in the repository.
Acknowledgements
Special thanks and recognition to professors Claudio Antonio Rojo and Edison Luiz Leismann, who provided all the support required to carry out this research. We also wish to acknowledge the legacy left by the authors cited in this paper, besides Kaufman et al. (2020): Routson et al. (2019), Cook et al. (2013), PAGES 2k Consortium, Marcott et al. (2013), Harde (2019), Box & Jenkins (1978), and Gujarati & Porter (2011), to whom we are indebted for the knowledge.
Author contributions statement
G. V. F. S. and L. G. C. designed the experiment(s); C. A. R. and E. L. L. conducted the experiments; E. L. L. and C. A. R. analyzed the results. All authors reviewed the manuscript and approved the final version for publication.
Competing interests and funding sources
The authors declare no competing interests. The authors did not receive any grants from funding agencies in the public,
commercial, or non-profit sectors to do this research.
Additional information
Correspondence and requests for materials should be addressed to G.V. F. S.
Authors' information
C. A. R:
E. L. L:
References
Anderegg, W. R. L., Prall, J. W., Harold, J., & Schneider, S. H. (2010). Expert credibility in climate change. Proceedings of the National Academy of Sciences, 107(27), 12107-12109.
Anderson, T. R., Hawkins, E., & Jones, P. D. (2016). CO2, the greenhouse effect and global warming: from the
pioneering work of Arrhenius and Callendar to today's Earth System Models. Endeavour, 40(3), 178-187.
Babu, C. N., & Reddy, B. E. (2014). A moving-average filter based hybrid ARIMA-ANN model for forecasting time series data. Applied Soft Computing, 23, 27-38.
Box, G. E. P., Jenkins, G. M., & Reinsel, G. C. (2008). Time Series Analysis: Forecasting and Control (4th ed.). Wiley.
Box, G. E. P., & Jenkins, G. M. (1978). Time Series Analysis: Forecasting and Control (rev. ed.). San Francisco: Holden-Day.
Bray, D. (2010). The scientific consensus of climate change revisited. Environmental Science & Policy, 13(5), 340-350.
Cook, J., et al. (2013). Quantifying the consensus on anthropogenic global warming in the scientific literature. Environmental Research Letters, 8, 024024.
Davis, W. J. (2017). The Relationship between Atmospheric Carbon Dioxide Concentration and Global Temperature for the Last 425 Million Years. Climate, 5(4), 76.
Easterbrook, D. (Ed.). (2016). Evidence-based climate science: data opposing CO2 emissions as the primary source of
global warming. Elsevier.
Gillett, N. P., & Matthews, H. D. (2010). Accounting for carbon cycle feedbacks in a comparison of the global warming
effects of greenhouse gases. Environmental Research Letters, 5(3), 034011. Retrieved from
Gujarati, D. N., & Porter, D. C. (2011). Basic Econometrics (5th ed.), 767-778. AMGH Editora.
Harde, H. (2019). What Humans Contribute to Atmospheric CO2: Comparison of Carbon Cycle Models with
Observations. Earth Sciences, 8(3), 139-158.
Haustein, K., Allen, M. R, & Forster, P. M. et al. (2017). A real-time Global Warming Index. Sci Rep, 7, 15417.
IBM SPSS Statistics v. 22. (2020). Retrieved from
IPCC - Intergovernmental Panel on Climate Change. United Nations. N.Y. (2019). Retrieved from
Katimon, A., Shahid, S., & Mohsenipour, M. (2018). Modeling water quality and hydrological variables using ARIMA:
a case study of Johor River, Malaysia. Sustainable Water Resources Management, 4(4), 991-998.
Kaufman, D., McKay, N., & Routson, C. et al. (2020). Holocene global mean surface temperature, a multi-method
reconstruction approach. Sci Data, 7, 201.
Koonin, S. E. (2021). Unsettled: What Climate Science Tells Us, What It Doesn't, and Why It Matters. BenBella Books.
Legates, D. R., Soon, W., & Briggs, W. M. (2013). Learning and Teaching Climate Science: The Perils of Consensus Knowledge Using Agnotology. Sci & Educ, 22, 2007-2017.
Legates, D. R., Soon, W., Briggs, W. M., & Monckton of Brenchley, C. (2015). Climate Consensus and 'Misinformation': A Rejoinder to Agnotology, Scientific Consensus, and the Teaching and Learning of Climate Change. Sci & Educ, 24, 299-318.
Marcott, S. A., Shakun, J. D., Clark, P. U., & Mix, A. C. (2013). A reconstruction of regional and global temperature for
the past 11,300 years. Science, 339, 1198.
Marôco, J. (2018). Análise estatística com o SPSS Statistics (7th ed.). Pero Pinheiro, 54-60.
Medhaug, I., Stolpe, M., & Fischer, E. et al. (2017). Reconciling controversies about the 'global warming hiatus'. Nature,
545, 41-47.
Miller, G. T., & Spoolman, S. E. (2016). Environmental Science. Cengage Learning, 22-24.
Mitchell, D., James, R., Forster, P., et al. (2016). Realizing the impacts of a 1.5 °C warmer world. Nature Climate Change, 6, 735-737.
Moberg, A., Sonechkin, D., Holmgren, K., et al. (2005). Highly variable Northern Hemisphere temperatures reconstructed from low- and high-resolution proxy data. Nature, 433, 613-617.
Molion, L. C. B. (2008). Aquecimento global: Uma visão crítica. Revista Brasileira de Climatologia, 3. Retrieved from
Oreskes, N. (2004). The scientific consensus on climate change. Science, 306(5702), 1686.
Pages 2k Consortium. (2019). Consistent multi-decadal variability in global temperature reconstructions and
simulations over the Common Era. Nat. Geosci., 12, 643-649.
Pokorny, M. (1987). An Introduction to Econometrics. Oxford: Basil Blackwell, p. 343.
Arrhenius, S. (1896). XXXI. On the influence of carbonic acid in the air upon the temperature of the ground. The London, Edinburgh, and Dublin Philosophical Magazine and Journal of Science, 41(251), 237-276.
Reisinger, A., Meinshausen, M., Manning, M., & Bodeker, G. (2010). Uncertainties of global warming metrics: CO2 and CH4. Geophysical Research Letters, 37, L14707.
Routson, C. C., McKay, N. P., & Kaufman, D. S. et al. (2019). Mid-latitude net precipitation decreased with Arctic
warming during the Holocene. Nature, 568, 83-87.
Salzer, E., Neske, D. A. L., & Rojo, C. A. (2019). Global warming: bias analysis in divergent strategic scenarios.
Journal Multi-Science Research (Msr), Vitoria, 2(2), 144-158. Retrieved from
Shwed, U., & Bearman, P. (2010). The Temporal Structure of Scientific Consensus Formation. American Sociological
Review, 75(6), 817-840.
Solomon, S., et al. (2007). Climate Change 2007: The Physical Science Basis. Working Group I Contribution to the Fourth Assessment Report of the IPCC. Cambridge University Press.
Stockinger, N., & Dutter, R. (1987). Robust time series analysis: a survey. Kybernetika, 23(Suppl. 1), 3-88.
Valipour, M. (2015). Long‐term runoff study using SARIMA and ARIMA models in the United States. Meteorological
Applications, 22(3), 592-598.
Copyright for this article is retained by the author(s), with first publication rights granted to the journal.
This is an open-access article distributed under the terms and conditions of the Creative Commons Attribution license, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.