Article

The application of machine learning for evaluating anthropogenic versus natural climate change

Authors: John Abbot, Jennifer Marohasy

Abstract

Time-series profiles derived from temperature proxies such as tree rings can provide information about past climate. Signal analysis was undertaken on six such datasets, and the resulting component sine waves were used as input to an artificial neural network (ANN), a form of machine learning. By optimizing spectral features of the component sine waves, such as periodicity, amplitude and phase, the original temperature profiles were approximately simulated for the late Holocene period to 1830 AD. The ANN models were then used to generate projections of temperatures through the 20th century. The largest deviation between the ANN projections and measured temperatures for six geographically distinct regions was approximately 0.2°C, and from this an Equilibrium Climate Sensitivity (ECS) of approximately 0.6°C was estimated. This is considerably less than estimates from the General Circulation Models (GCMs) used by the Intergovernmental Panel on Climate Change (IPCC), and similar to estimates from spectroscopic methods.
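The workflow described in the abstract can be illustrated with a minimal sketch: decompose a proxy series into its dominant sine-wave components, train a small neural network on the pre-industrial portion, and project forward. Everything below (the synthetic series, the number of retained components, the network size and the cut-off years) is an illustrative assumption, not a detail taken from the paper.

```python
# Minimal sketch (not the authors' code): decompose a proxy temperature series
# into dominant sine-wave components, train a small ANN on the pre-industrial
# portion, then project the natural signal forward. All inputs are synthetic.
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(0)
years = np.arange(0, 1830)                       # hypothetical annual record, AD 0-1829
proxy = (0.3 * np.sin(2 * np.pi * years / 210)   # synthetic stand-in for a proxy series
         + 0.2 * np.sin(2 * np.pi * years / 65)
         + 0.05 * rng.standard_normal(years.size))

# 1. Spectral decomposition: keep the k strongest Fourier components.
k = 6
spec = np.fft.rfft(proxy - proxy.mean())
freqs = np.fft.rfftfreq(proxy.size, d=1.0)
top = np.argsort(np.abs(spec))[::-1][:k]

def components(t):
    # Evaluate the k retained sine waves (amplitude, frequency, phase) at times t.
    cols = []
    for i in top:
        amp = 2 * np.abs(spec[i]) / proxy.size
        cols.append(amp * np.cos(2 * np.pi * freqs[i] * t + np.angle(spec[i])))
    return np.column_stack(cols)

# 2. Train an ANN to map the component values onto the observed proxy.
ann = MLPRegressor(hidden_layer_sizes=(16, 16), max_iter=5000, random_state=0)
ann.fit(components(years), proxy)

# 3. Project the oscillatory (natural) signal beyond the training period.
future_years = np.arange(1830, 2000)
print(ann.predict(components(future_years))[:5])
```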


... Identification of continuing oscillatory patterns can potentially be valuable in forecasting temperatures. A number of studies have found evidence for oscillations in proxy records at millennial, centennial and decadal time scales by applying spectral analysis [18][19][20]. It has been suggested that some proxy records may not include low-frequency oscillations because certain types of proxy (particularly tree rings) do not capture them as well as other proxies such as ice cores. ...
... The occurrence of oscillatory processes in past temperature records is important because it potentially provides a method of projecting temperature patterns based on pre-industrial influences forward into the industrial era. The availability of this forecasting capability potentially enables separation of natural and anthropogenic influences on temperature during the industrial era [19]. Identification of oscillatory patterns in temperature records is also useful in understanding potential linkages with solar activity, which is known to exhibit oscillatory behaviour on similar time scales. ...
... The results from decomposition into sets of oscillations can then be used to forecast temperatures into the industrial era, based on oscillatory patterns up to 1880 AD as input data. The accuracy of fitting the data and simulating the natural patterns present can generally be improved by using the sine wave components as input to train an artificial neural network (ANN) [19]. This allows forecasts to be made beyond 1880 AD into the 20th century, extending up to the present. ...
... Machine learning is a subfield of artificial intelligence whose purpose is to create and improve computer programs that can learn dynamically from data by employing certain algorithms, and such methods can be useful for forecasting purposes in the field of energy [9][10][11] as well as in climate analysis [5,[12][13][14][15][16]. However, in comparison to many other fields of study, climate science has been relatively slow to adopt machine learning-based methods, and the number of applications is quite limited [17]. For example, Abbot and Marohasy applied artificial neural networks to generate temperature projections for the period between 1880 and 2000, and their results showed that warming due to natural climate cycles would be in the range of 0.6 to 1°C depending on geographical location [17]. Mansfield et al. employed Ridge, Gaussian Process, Random Forest and Lasso regressions to estimate the correlations between short-term and long-term temperature responses to various climatic forcing scenarios [18]. ...
Article
Full-text available
In this work, anthropogenic and natural factors were used to evaluate and forecast climate change on a global scale by using a variety of machine-learning techniques. First, significance analysis using the Shapley method was conducted to compare the importance of each variable. Accordingly, it was determined that the equivalent CO2 concentration in the atmosphere was the most important variable, which was proposed as further evidence of climate change due to fossil fuel-based energy generation. Following that, a variety of machine learning approaches were utilized to simulate and forecast the temperature anomaly until 2100 based on six distinct scenarios. Compared to the preindustrial period, the temperature anomaly for the best-case scenario was found to increase by a mean value of 1.23 °C and 1.11 °C for the mid and end of the century, respectively. On the other hand, the anomaly for the worst-case scenario was estimated to reach a mean value of 2.52 °C and 4.97 °C for the same periods. It was then concluded that machine learning approaches can assist researchers in predicting climate change and developing policies for national governments, such as committing firmly to renewable energy regulations.
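The Shapley-based significance analysis this abstract describes can be sketched with the third-party shap package; the feature names, random data and random-forest model below are placeholders, not the study's actual drivers or dataset.

```python
# Sketch of Shapley-value feature importance for a climate-driver regression,
# assuming the third-party `shap` package is installed. Feature names and the
# random data are placeholders, not the study's actual inputs.
import numpy as np
import shap
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(1)
features = ["eq_CO2_ppm", "solar_irradiance", "aerosol_index", "ENSO_index"]
X = rng.standard_normal((500, len(features)))
y = 0.8 * X[:, 0] + 0.2 * X[:, 1] + 0.05 * rng.standard_normal(500)  # toy anomaly

model = RandomForestRegressor(n_estimators=200, random_state=0).fit(X, y)

explainer = shap.TreeExplainer(model)          # Shapley values for tree ensembles
shap_values = explainer.shap_values(X)

# Mean absolute Shapley value per feature = its overall importance.
importance = np.abs(shap_values).mean(axis=0)
for name, imp in sorted(zip(features, importance), key=lambda p: -p[1]):
    print(f"{name:18s} {imp:.3f}")
```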
... The results in autumn show that among the six models, the RF and SVM models had the best predictions, followed by the NN models, then the GPR models, with the LSTM and BiLSTM models giving the worst results. ...
Article
Full-text available
Near-land surface air temperature (NLSAT) is an important meteorological and climatic parameter widely used in climate change, urban heat island and environmental science, in addition to being an important input parameter for various earth system simulation models. However, the spatial distribution and the limited number of ground-based meteorological stations make it difficult to obtain a large range of high-precision NLSAT values. This paper constructs neural network, long short-term memory, bi-directional long short-term memory, support vector machine, random forest, and Gaussian process regression models by combining MODIS data, DEM data, and meteorological station data to estimate the NLSAT in China’s mainland and compare them with actual NLSAT observations. The results show that there is a significant correlation between the model estimates and the actual temperature observations. Among the tested models, the random forest performed the best, followed by the support vector machine and the Gaussian process regression, then the neural network, the long short-term memory, and the bi-directional long short-term memory models. Overall, for estimates in different seasons, the best results were obtained in winter, followed by spring, autumn, and summer successively. According to different geographic areas, random forest was the best model for Northeast, Northwest, North, Southwest, and Central China, and the support vector machine was the best model for South and East China.
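A compact sketch of the kind of multi-model comparison described above (random forest, SVM, Gaussian process regression and a neural network), using synthetic stand-ins for the MODIS, DEM and station predictors; none of the numbers reflect the study's data.

```python
# Compact sketch comparing the regressors named above (RF, SVM, GPR, neural
# network) on placeholder features standing in for MODIS LST, DEM elevation
# and station covariates; the data here are synthetic, not the study's.
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.svm import SVR
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.neural_network import MLPRegressor
from sklearn.model_selection import train_test_split
from sklearn.metrics import mean_squared_error, r2_score

rng = np.random.default_rng(2)
X = rng.standard_normal((800, 3))                 # [lst, elevation, latitude] stand-ins
y = 25 - 6.5 * X[:, 1] + 2.0 * X[:, 0] + rng.standard_normal(800)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

models = {
    "RF":  RandomForestRegressor(n_estimators=300, random_state=0),
    "SVM": SVR(kernel="rbf", C=10.0),
    "GPR": GaussianProcessRegressor(),
    "NN":  MLPRegressor(hidden_layer_sizes=(32,), max_iter=3000, random_state=0),
}
for name, model in models.items():
    pred = model.fit(X_tr, y_tr).predict(X_te)
    rmse = mean_squared_error(y_te, pred) ** 0.5
    print(f"{name}: RMSE={rmse:.2f}  R2={r2_score(y_te, pred):.3f}")
```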
... Machine learning-based methods are able to exploit the correlations among meteorological factors to achieve excellent predictive performance. Common machine learning methods include SVM [11,12], Genetic Algorithms (GA) [13], Artificial Neural Networks (ANN) [14], etc. Radhika et al. [12] used SVM to predict atmospheric temperature and compared the results with a multilayer perceptron (MLP); SVM outperformed the MLP. Abbot et al. [14] used an ANN model to predict temperature in six regions, including the Swiss Alps, the Canadian Rockies and Tasmania, with good performance. Venkadesh et al. [13] applied GA to determine the optimal duration and resolution of the input variables and used them as inputs for the ANN model. ...
Article
Full-text available
Temperature is an important meteorological factor that is affected by local and surrounding meteorological factors. To address the significant prediction errors and insufficient extraction of spatial features in current temperature prediction research, this study proposes a temperature prediction model based on the Graph Convolutional Network (GCN) and Bidirectional Long Short-Term Memory (BiLSTM) and studies the influence of temperature time-series characteristics, urban spatial location, and other meteorological factors on temperature change in the study area. In this research, multiple meteorological influencing factors and temperature time-series characteristics are used instead of a single temperature time series as influencing factors, improving the time dimension of the input data through time-sliding windows. Meanwhile, considering the influence of meteorological factors in the surrounding area on temperature change in the study area, we use GCN to extract urban geospatial location features. The experimental results demonstrate that our model outperforms other models and has the smallest root mean squared error (RMSE) and mean absolute error (MAE) in the following 14-day and multi-region temperature forecasts, with higher accuracy than the baseline models in areas with stable temperature fluctuations and small temperature differences.
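The time-sliding-window input construction mentioned above can be sketched as follows; the window length, the feature count and the assumption that temperature is stored in column 0 are illustrative choices, not details from the paper.

```python
# Sketch of a time-sliding-window construction: each sample stacks `window`
# past days of multiple meteorological factors to predict the next day's
# temperature. Window length and feature count are illustrative.
import numpy as np

def make_windows(series: np.ndarray, window: int):
    """series: (T, F) array of T daily observations of F meteorological factors.
    Returns X of shape (T - window, window, F) and y = next-day temperature
    (assumed to be stored in column 0)."""
    X = np.stack([series[i:i + window] for i in range(len(series) - window)])
    y = series[window:, 0]
    return X, y

# toy data: 365 days x 4 factors (temperature, humidity, pressure, wind)
rng = np.random.default_rng(3)
data = rng.standard_normal((365, 4))
X, y = make_windows(data, window=14)
print(X.shape, y.shape)   # (351, 14, 4) (351,)
```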
... Therefore, the need to improve the accuracy and robustness of predictions motivated the use of Artificial Intelligence (AI) based ML algorithms. These algorithms use ensemble models of regression trees [bagging, random forest (RF), rotation forest, boosted regression tree (BRT), gradient boosted regression tree (GBRT) and extremely randomized trees (ETREES)] and ANN, using both the traditional and MI approaches (Abbot and Marohasy, 2017; Bochenek and Ustrnul, 2022; Dechelle-Marquet, 2019; Huang et al., 2020; Huntingford et al., 2019; Juggins et al., 2015; Liang et al., 2021; Salonen et al., 2012, 2014, 2016, 2019; Wei et al., 2021; Zhang et al., 2022). These algorithms are capable of modeling nonlinear data with complex interactions and have been used successfully to address various problems in Earth Sciences. ...
... Apart from the classical approaches discussed so far, recent advances have been made in AI-based ML methods, in which ANNs have been successfully used for paleo-reconstruction and future projection of climatic and meteorological archives such as temperature, rainfall, wind speed, solar radiation and ECS computations (Abbot and Marohasy, 2017; Wei et al., 2021). Wei et al. (2021) developed five ML-based (tree-based and SVM) models for mean annual temperature and precipitation, which were compared with multilinear regression (MLR) models. ...
Chapter
Sediments are a ubiquitous archive for paleoclimate reconstruction, and marine sediments in particular are considered largely undisturbed by anthropogenic encroachment. Unlike continental records, deep-sea sediments preserve evidence of at least 50 glacial and interglacial stages during the Quaternary Period, demonstrating their applicability to Quaternary climate reconstruction. Geochemical variations in marine sediments are often used as proxies to decode past productivity, redox conditions, weathering and provenance changes as a function of past climatic and oceanographic perturbations. The geochemical behaviour of elements in a sedimentary environment depends primarily on ionic potential, along with redox potential and pH, which control the mobility and enrichment of selected elements and thus make them potential evidence of ambient temporal changes. The present chapter aims to provide an overview of frequently used geochemical proxies and their applicability. The chapter also discusses the significance of statistical and machine learning approaches applied to geochemical datasets for understanding the climatic processes that drove changes in geochemical variability.
... Instrumental records of rainfall and temperature generally extend back only about 100 years in Australia. Reconstructions of past temperatures, extending back hundreds or thousands of years, are available for many parts of the world [49]. These are derived from palaeo data, including tree rings, corals, ice cores and stalagmites. ...
... These three Australian palaeoclimate proxies of rainfall enabled reconstruction of rainfall in the Murray Darling Basin of south-eastern Australia [49,50], although all three lie outside the basin itself. The results reveal several extended periods that were likely drier than indicated by the instrumental record of approximately the last century. ...
... This same result is also reported by Harde (2014) using the spectral analysis method. Abbot and Marohasy (2017) have used six temperature proxy data sets and an artificial neural network to create temperature projections throughout the twentieth century. Based on deviations between the projections and the measured temperatures, their estimate for TCS is 0.6°C. ...
... The research studies of Ollila (2014) and Harde (2014) show a TCS value of 0.6°C instead of the IPCC's TCS value of 1.9°C. The TCS value of 0.6°C is also supported by the study of Abbot and Marohasy (2017) based on empirical data analysis. This lower TCS for CO2 can explain why the IPCC model temperature of 1.27°C in 2016 exceeds the present-day temperature of 0.85°C: the lower warming value for CO2 gives only a 0.28°C increase for GH gases, leaving room for natural forces. ...
Article
Full-text available
Purpose The purpose of this paper is to analyze the scientific basis of the Paris climate agreement. Design/methodology/approach The analyses are based on the IPCC’s own reports, the observed temperatures versus the IPCC model-calculated temperatures and the warming effects of greenhouse gases based on the critical studies of climate sensitivity (CS). Findings The future emission and temperature trends are calculated according to a baseline scenario by the IPCC, which is the worst-case scenario RCP8.5. The selection of RCP8.5 can be criticized because it requires the present CO2 growth rate of 2.2 ppm y⁻¹ to become 2.8 times greater, implying a CO2 increase from 402 to 936 ppm. The emission target scenario of COP21 is 40 GtCO2 equivalent, and the results of this study confirm that the temperature increase stays below 2°C by 2100 per the IPCC calculations. The IPCC-calculated temperature for 2016 is 1.27°C, 49 per cent higher than the observed average of 0.85°C in 2000. Originality/value Two explanations have been identified for this significant difference in the IPCC’s calculations: the model is too sensitive to CO2 increase, and the positive water feedback does not exist. The CS of 0.6°C found in some critical research studies means that the temperature increase would stay below the 2°C target even if emissions followed the baseline scenario. This is highly unlikely because the estimated conventional oil and gas reserves would be exhausted around the 2060s if the present consumption rate continues.
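The headline figures in this abstract can be reproduced with simple arithmetic; the roughly 84-year horizon from 2016 to 2100 used below is an assumption introduced only to check the stated ratio.

```python
# Quick arithmetic check of the figures quoted in the abstract above; the
# ~84-year horizon (2016 to 2100) is an assumption used only for this check.
growth_now = 2.2                          # ppm per year, present CO2 growth rate
required = (936 - 402) / 84               # ppm/yr needed to reach 936 ppm by 2100
print(required, required / growth_now)    # ~6.4 ppm/yr, roughly 2.8-2.9x the present rate

ipcc_2016, observed = 1.27, 0.85          # deg C
print((ipcc_2016 - observed) / observed)  # ~0.49, i.e. about 49 per cent higher
```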
... Furthermore, machine learning (ML) architectures applied to weather forecasting can be grouped into static and dynamic (recurrent) architectures, according to whether or not the prediction is generated sequentially [44][45][46][47]. Static ML architectures include clustering (K-means, PCA) [48,49], artificial neural networks (ANN) [45,50,51], graph neural networks (GNN) [52], clustering + neural networks [53,54], decision trees such as Gradient Boosting (XGBoost [55], AdaBoost [46], CatBoost [46,56,57]), and Random Forest [58,59]. Dynamic ML architectures for weather forecasting, on the other hand, make use of the sequential nature of the training data in generating the forecast. ...
Preprint
Full-text available
Earth Observatory is a growing research area that can capitalize on the powers of AI for short time forecasting, a Now-casting scenario. In this work, we tackle the challenge of weather forecasting using a video transformer network. Vision transformer architectures have been explored in various applications, with major constraints being the computational complexity of Attention and data-hungry training. To address these issues, we propose the use of Video Swin-Transformer, coupled with a dedicated augmentation scheme. Moreover, we employ gradual spatial reduction on the encoder side and cross-attention on the decoder. The proposed approach is tested on the Weather4Cast2021 weather forecasting challenge data, which requires the prediction of 8 hours of future frames (4 per hour) from an hourly weather product sequence. The dataset was normalized to 0-1 to facilitate using the evaluation metrics across different datasets. The model achieves an MSE score of 0.4750 when provided with training data, and 0.4420 during transfer learning without using training data.
... Temperature proxies such as tree rings can be used to construct time-series profiles that illuminate past climates (Abbot & Marohasy, 2017). An artificial neural network (ANN) was trained using the component sine waves from the six datasets subjected to signal analysis. ...
Article
Full-text available
Every country's population will have to deal with the effects of climate change. The meteorological department needs to implement effective forecasting methods to deal with climate changes. Accurate temperature forecasts help protect people and property and are an essential aspect of planning for government, business, and the general public. Early predictions help farmers and industrialists to plan and store crops more effectively. When the climate continuously changes, it is not easy for the meteorological department and government authorities to make accurate predictions. Artificial intelligence (AI) algorithms have stimulated improvements in various fields. Machine learning (ML) may find teleconnections where complicated feedbacks make them challenging to determine from straightforward analysis and observations. Our proposed research uses the SARIMA model to comprehend and utilize existing datasets and simulations.
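A minimal sketch of SARIMA-based temperature forecasting with statsmodels is given below; the model orders, the monthly seasonality and the synthetic series are placeholders rather than the configuration used in this study.

```python
# Minimal sketch of SARIMA-based temperature forecasting with statsmodels.
# The (p,d,q)(P,D,Q,s) orders, the monthly seasonality and the synthetic data
# are placeholders, not the configuration used in the cited study.
import numpy as np
from statsmodels.tsa.statespace.sarimax import SARIMAX

rng = np.random.default_rng(4)
months = np.arange(240)                                   # 20 years of monthly data
temps = 20 + 8 * np.sin(2 * np.pi * months / 12) + rng.standard_normal(240)

model = SARIMAX(temps, order=(1, 0, 1), seasonal_order=(1, 1, 1, 12))
fit = model.fit(disp=False)
forecast = fit.forecast(steps=12)                         # next 12 months
print(np.round(forecast, 1))
```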
... That function can then be applied to a different dataset, the testing set, to evaluate the model, and if the results are satisfactory, it can be used for classification or regression in any kind of application needed. In that group we find methods such as Decision Trees, e.g., Random Forest (RF) [5] or XGBoost (XGB) [6], Artificial Neural Networks (ANN) [7], Deep Learning (DL) [8], and Support Vector Machine (SVM) [9]. The second group in machine learning is unsupervised learning (Figure 1), in which algorithms do not have labelled data to train from, and must decide upon other ways to divide a given dataset, or reduce the dimensions ...
Article
Full-text available
In this paper, we performed an analysis of the 500 most relevant scientific articles published since 2018, concerning machine learning methods in the field of climate and numerical weather prediction using the Google Scholar search engine. The most common topics of interest in the abstracts were identified, and some of them examined in detail: in numerical weather prediction research—photovoltaic and wind energy, atmospheric physics and processes; in climate research—parametrizations, extreme events, and climate change. With the created database, it was also possible to extract the most commonly examined meteorological fields (wind, precipitation, temperature, pressure, and radiation), methods (Deep Learning, Random Forest, Artificial Neural Networks, Support Vector Machine, and XGBoost), and countries (China, USA, Australia, India, and Germany) in these topics. By performing critical reviews of the literature, the authors attempt to predict the future research directions of these fields, with the main conclusion being that machine learning methods will be a key feature in future weather forecasting.
... Thus, the steady progress of these techniques over the past two decades has led to better results in terms of their suitability for flood prediction (Mekanik et al. 2013;Mosavi et al. 2017). Indeed, Abbot and Marohasy (2017) found ML models to be of higher accuracy when compared to conventional methods in a recent study. ...
Article
Floods constitute one of the most devastating and destructive natural forces in the world. They have a considerable impact on the economy and can result in significant loss of life. Several strategies, including studies by advanced data analysis methods, have been adopted to curb this phenomenon and ultimately limit the accompanying damage. In this study, four supervised models based on machine learning (ML) algorithms were used to map flood vulnerability in the Souss watershed located in southern Morocco. They include random forest, x-gradient boost, k-nearest neighbors and artificial neural network. Thirteen predisposing factors including aspect, curvature, digital elevation model (DEM), distance to rivers, drainage density, flow accumulation, flow direction, geology, land use, rainfall, slope, soil type, and topographic wetness index (TWI) were selected as inputs to achieve this. Four different models were developed for each ML algorithm based on variable selection and one-hot encoding. Overall, all the models of the four algorithms have an AUC score above 80% for the testing data, which means that they all performed very well. The ranking of the ML algorithms used by considering only the most efficient model of each algorithm is as follows: KNN (98.6%), RF (98.1%), XGB (97.2%), and NNET (95.9%). Finally, all the models were overlaid to identify points of agreement or disagreement. This study provides evidence of ML models being successful in mapping flood vulnerability. These findings can be beneficial, serving as an important resource in mitigating the impacts of floods in the highlighted vulnerable areas presented in the flood vulnerability maps.
... As a result, the atmosphere contained 3120 gigatons CO2 in 1998 and 3450 gigatons in 2018. That corresponds to a CO2 residence time of about 30 years, in line with various papers in the literature, e.g., Ref. 20. Consequently, only a fraction of the emitted CO2 remains in the atmosphere (equal to 0.918), while the remaining CO2 will be absorbed by oceans, seas, and land, including photosynthesis activities. ...
... Downpours, storms, rising temperatures, rising sea levels, and retreating glaciers are considered the main indicators of climate change [1][2][3][4][5][6]. Thanks to the popularity of Twitter and its easily accessible Application Program Interface (API) [7][8][9], tweets can be stored by topics related to hashtags. ...
Article
Full-text available
As the usage of social media has increased, the size of shared data has surged, and this has become an important source of research for environmental issues, as it has for popular topics. Sentiment analysis has been used to determine people's sensitivity and behavior regarding environmental issues. However, the analysis of Turkish texts has not been investigated much in the literature. In this article, sentiment analysis of Turkish tweets about global warming and climate change is performed using machine learning methods. In this regard, using algorithms from supervised methods (linear classifiers and probabilistic classifiers) trained on thirty thousand randomly selected Turkish tweets, sentiment intensity (positive, negative, and neutral) has been detected and algorithm performance ratios have been compared. This study also provides benchmarking results for future sentiment analysis studies on Turkish texts.
... Reported values of climate sensitivity in the literature have been steadily decreasing for decades, with some recent papers pointing to a very low sensitivity of much less than 1°C [20][21][22]. A careful reading of these papers (for example the most recent one) clearly indicates that the 0.6°C cited is in fact an absolute maximum. ...
Article
Full-text available
It has always been mathematically complicated to calculate the average near-surface atmospheric temperature on planetary bodies with a thick atmosphere. Usually, the Stefan-Boltzmann (S-B) black body law is used to provide the effective temperature, and debate then arises about the size or relevance of additional complicating factors, including the albedo and the greenhouse effect. Presented here is a simple and reliable method of accurately calculating the average near-surface atmospheric temperature on planetary bodies which possess a surface atmospheric pressure of over 10 kPa. The formula used is the molar mass version of the ideal gas law. This method requires a gas constant and the measurement of only three gas parameters: the average near-surface atmospheric pressure, the average near-surface atmospheric density and the average mean molar mass of the near-surface atmosphere. This indicates that all information on the effective plus the residual near-surface atmospheric temperature on planetary bodies with thick atmospheres is automatically ‘baked-in’ to the three mentioned gas parameters. It is known that whenever an atmospheric pressure exceeds 10 kPa, convection dominates over radiative interactions as the main method of energy transfer, and a rising thermal gradient is formed. This rising thermal gradient continues on down (if there is a depression or a mine shaft) to even below the average surface level. Given this thermodynamic situation, it is very likely that no one gas has an anomalous effect on atmospheric temperatures that is significantly more than any other gas. In short, there is unlikely to be any significant warming from the greenhouse effect on any planetary body in the parts of atmospheres which are >10 kPa. Instead, it is proposed that the residual temperature difference between the S-B effective temperature and a measured near-surface temperature (the atmospheric effect) is a thermal enhancement which is actually caused by auto-compression.
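The molar-mass form of the ideal gas law that this abstract relies on, T = PM/(ρR), can be checked with representative Earth near-surface values; the inputs below are standard textbook approximations, not figures taken from the paper.

```python
# Worked example of the molar-mass form of the ideal gas law described above,
# T = P * M / (rho * R), using representative Earth near-surface values
# (standard approximations, not figures taken from the paper itself).
R = 8.314          # J mol^-1 K^-1, universal gas constant
P = 101325.0       # Pa, mean sea-level pressure
rho = 1.225        # kg m^-3, mean near-surface air density
M = 0.02897        # kg mol^-1, mean molar mass of dry air

T = P * M / (rho * R)
print(f"{T:.1f} K")   # ~288 K, close to the observed mean near-surface temperature
```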
... Even though underground parking garages are popular in many urbanized living environments, little is known about air quality monitoring in underground parking garages. The Intergovernmental Panel on Climate Change (IPCC) has presented artificial intelligence (AI), big data, and the Internet of Things (IoT) technologies as solutions to climate change [26]. Currently, Korea is attempting weather forecasting using AI. ...
Article
Full-text available
Exposure to particulate materials (PM) is known to cause respiratory and cardiovascular diseases. Respirable particles generated in closed spaces, such as underground parking garages (UPGs), have been reported to be a potential threat to respiratory health. This study reports the concentration of pollutants (PM, TVOC, CO) in UPGs under various operating conditions of heating, ventilation and air-conditioning (HVAC) systems using a real-time monitoring system with a prototype made up of integrated sensors. In addition, prediction of PM concentration was implemented using modeling from vehicle traffic volumes and an artificial neural network (ANN) based on environmental factors. The predicted PM concentrations were compared with the levels acquired from the real-time monitoring. The measured PM10 concentrations in UPGs were higher than the modeled PM10 due to short-term sources induced by vehicles. The average inhalable and respirable dosages for adults were calculated for the evaluation of health effects. The ANN-predicted PM concentrations showed a close correlation with measurements, with R2 ranging from 0.69 to 0.87. This study demonstrates the feasibility of using the air quality monitoring system for personal exposure to vehicle-induced pollutants in UPGs and the potential application of modeling and ANN for the evaluation of indoor air quality.
... Examples of ML are land classification using remote sensing [20][21][22], amending satellite data assimilation [23], or decomposing the causes of climate change [24]. Starting with the simplest example, that is, linear regression, the objective of both SVR [26] and LS-SVR [27] is to fit a linear relation y = wᵀx + b between the x regressors and the dependent variable y in the so-called feature space. ...
... These are the same models used by the Intergovernmental Panel on Climate Change (IPCC) to forecast climate change over decades. Though recent studies suggest ANNs have considerable application here, including to evaluate natural versus anthropogenic climate change over millennia, and also to better understand equilibrium climate sensitivity [84]. While machine learning is now a well-established discipline, and artificial neural networks a well understood subcomponent, this technology is only beginning to be applied to rainfall forecasting, so far with most of this effort concentrated in China, India and Australia as shown in Table 6. ...
... The reported figures for equilibrium climate sensitivity to CO2 in the literature have already been steadily decreasing for decades, with recent papers pointing to a very low sensitivity of less than 1°C [50,51,52,68]. A careful reading of these papers (for example the most recent ones) clearly indicates that the 0.6°C cited is in fact an absolute maximum. ...
Article
Full-text available
Presented here is a simple and reliable method of accurately calculating the average near-surface atmospheric temperature on all planetary bodies which possess a surface atmospheric pressure of over 0.69 kPa, by the use of the molar mass version of the ideal gas law. This method requires a gas constant and the near-surface averages of only three gas parameters: the atmospheric pressure, the atmospheric density and the mean molar mass. The accuracy of this method proves that all information on the effective plus the residual near-surface atmospheric temperature on planetary bodies with thick atmospheres is automatically 'baked-in' to the three mentioned gas parameters. It is also known that whenever an atmospheric pressure exceeds 10 kPa, convection and other modes of energy transfer will totally dominate over radiative interactions in the transfer of energy, and that a rising thermal gradient always forms from that level. This rising thermal gradient continues down to the surface, and even below it if there is a depression or a mine-shaft present. This measured thermodynamic situation, coupled with other empirical science presented herein, means that it is very likely that no one gas has an anomalous effect on atmospheric temperatures that is significantly more than any other gas. In short, there is unlikely to be any significant net warming from the greenhouse effect on any planetary body in the parts of atmospheres which are >10 kPa. Instead, it is proposed that the residual temperature difference between the effective temperature and the measured near-surface temperature is a thermal enhancement caused by gravitationally-induced adiabatic auto-compression, powered by convection. A new null hypothesis of global warming or climate change is therefore proposed and argued for; one which does not include any anomalous or net warming from greenhouse gases in the tropospheric atmospheres of any planetary body.
... Reported values of climate sensitivity in the literature have been steadily decreasing for decades, with many recent papers pointing to a very low sensitivity of much less than 1°C [20,21,22]. A careful reading of these papers (for example the most recent one) clearly indicates that the 0.6°C cited is in fact an absolute maximum. ...
Article
Full-text available
It has always been mathematically complicated to calculate the average near-surface atmospheric temperature on planetary bodies with a thick atmosphere. Usually, the Stefan-Boltzmann (S-B) black body law is used to provide the effective temperature, and debate then arises about the size or relevance of additional factors, including the ‘greenhouse effect’. Presented here is a simple and reliable method of accurately calculating the average near-surface atmospheric temperature on planetary bodies which possess a surface atmospheric pressure of over 10 kPa. This method requires a gas constant and the knowledge of only three gas parameters: the average near-surface atmospheric pressure, the average near-surface atmospheric density and the average mean molar mass of the near-surface atmosphere. The formula used is the molar version of the ideal gas law. It is here demonstrated that the information contained in just these three gas parameters alone is an extremely accurate predictor of atmospheric temperatures on planets with atmospheres >10 kPa. This indicates that all information on the effective plus the residual near-surface atmospheric temperature on planetary bodies with thick atmospheres is automatically ‘baked-in’ to the three mentioned gas parameters. Given this, it is shown that no one gas has an anomalous effect on atmospheric temperatures that is significantly more than any other gas. In short, there can be no 33°C ‘greenhouse effect’ on Earth, or any significant ‘greenhouse effect’ on any other planetary body with an atmosphere of >10 kPa. Instead, it is a postulate of this hypothesis that the residual temperature difference of 33°C between the S-B effective temperature and the measured near-surface temperature is actually caused by adiabatic auto-compression.
Preprint
Full-text available
Big climate change data have become a pressing issue for organizations faced with analysing data generated from various data types. The storage, processing, and analysis of data generated from climate change activities are massive tasks, which is challenging for current algorithms to handle. Big data analytics methods are therefore needed to enhance seasonal change monitoring and understanding, ascertain the health risks of climate change, and improve the allocation and utilisation of natural resources. This paper provides an outlook on big data analytics methods and describes how climate change and sustainability issues can be analysed through these methods. We extensively discuss big data analytics methods and their strengths and weaknesses. The purpose of analysing big climate change data using these methods, the common datasets, and the implementation frameworks for climate change modeling using the big data analytics approach are also discussed. These big data analytics methods are well timed to solve the inherent issues of data analysis and to ease the realization of sustainable development goals.
Chapter
Climate change effects on water, manifested by climate pattern fluctuations, have been worsening. Consequently, the globe is expected to witness even drier areas and water-related disasters by the end of the twenty-first century. Moreover, the intensification of the water cycle is imposing a global socioeconomic impact, transforming agricultural, industrial, and municipal water sector nexuses. Such consequences emerge as a challenging hurdle for institutions' ability to overcome failures in water sector management. Thus, the present paper underlines prominent findings surrounding the multifaceted effects of climate change on water status, stemming from selected case studies. The scope of research includes a thorough state-of-the-art analysis, tracked via desk research and comparative analysis. The main driving factors, the impact of climate change on water resources, climate fluctuations, and precipitation intensity were most emphasized. Furthermore, the agricultural, industrial, and municipal sectors have been highlighted as the major water-consuming fields. Finally, the applications of sustainable technological features, mainly artificial intelligence (AI), remote sensing, and biostimulants, were highlighted as effective tools for circular economy (CE) framework implementation. Both probable and discrepant results on projected water quantity and quality were noted and critically discussed. The paper concludes that developing prediction models and the enactment of the proposed CE framework go hand in hand. Keywords: Water; Climate change; Circular Economy (CE)
Article
In order to improve the prediction accuracy of air temperature forecasting, a temperature prediction model based on the hybrid SARIMA (seasonal autoregressive integrated moving average)-LSTM (long short-term memory) model is constructed. First, this method decomposes the temperature series into trend, seasonal, and residual series through the seasonal-trend decomposition procedure based on Loess (STL). It establishes SARIMA to predict the trend and seasonal series and extracts the linear information contained in the time series to the maximum extent. Then, the LSTM model is used to fit the residual series and the hidden nonlinear information is further extracted. Finally, the prediction results of the two parts are added together to obtain the prediction result of the final hybrid model. Three indexes, namely root mean square error, mean absolute error, and mean absolute percentage error, are used to evaluate the prediction accuracy of the single models (ARIMA, SARIMA, and LSTM) and the hybrid models (ARIMA-LSTM and SARIMA-LSTM). The Kupiec index is also used to show tail performance. The empirical results show that the SARIMA-LSTM combination model is more accurate than the single prediction methods and the other combination model, with its accuracy increasing by 10.0-27.7%.
Chapter
Full-text available
Climate models have continued to be developed and improved since the AR4, and many models have been extended into Earth System models by including the representation of biogeochemical cycles important to climate change. These models allow for policy-relevant calculations such as the carbon dioxide (CO2) emissions compatible with a specified climate stabilization target. In addition, the range of climate variables and processes that have been evaluated has greatly expanded, and differences between models and observations are increasingly quantified using ‘performance metrics’. In this chapter, model evaluation covers simulation of the mean climate, of historical climate change, of variability on multiple time scales and of regional modes of variability. This evaluation is based on recent internationally coordinated model experiments, including simulations of historic and paleo climate, specialized experiments designed to provide insight into key climate processes and feedbacks and regional climate downscaling. Figure 9.44 provides an overview of model capabilities as assessed in this chapter, including improvements, or lack thereof, relative to models assessed in the AR4. The chapter concludes with an assessment of recent work connecting model performance to the detection and attribution of climate change as well as to future projections.
Article
Full-text available
Since 1850 the global surface temperature has warmed by about 0.9 °C. The CMIP5 computer climate models adopted by the IPCC have projected that the global surface temperature could rise by 2-5 °C from 2000 to 2100 for anthropogenic reasons. These projections are currently used to justify expensive mitigation policies to reduce the emission of anthropogenic greenhouse gases such as CO2. However, recent scientific research has pointed out that the IPCC climate models fail to properly reconstruct the natural variability of the climate. Indeed, advanced techniques of analysis have revealed that the natural variability of the climate is made of several oscillations spanning from the decadal to the millennial scales (e.g. with periods of about 9.1, 10.4, 20, 60, 115, 1000 years and others). These oscillations likely have an astronomical origin. The same considerations lead to the conclusion that the IPCC climate models severely overestimate the anthropogenic climatic warming by about two times. Herein I demonstrate a number of failures of the IPCC models and I propose a semi-empirical climate model able to reconstruct the natural climatic variability since Medieval times. I show that this model projects a very moderate warming until 2040 and a warming less than 2 °C from 2000 to 2100 using the same anthropogenic emission scenarios used by the CMIP5 models. This result suggests that climatic adaptation policies, which are less expensive than the mitigation ones, could be sufficient to address most of the consequences of a climatic change during the 21st century. Finally, I show that a temperature forecast made in 2011 by Scafetta (Ref. 25) based on harmonic oscillations has agreed well with the global surface temperature data up to August 2016.
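The harmonic-model idea described here can be sketched by fitting sinusoids with the listed periods to a temperature series by least squares; the synthetic series and the linear-trend term below are illustrative assumptions, not Scafetta's actual model.

```python
# Sketch of the harmonic-model idea described above: fit sinusoids with the
# fixed periods listed in the abstract (9.1, 10.4, 20, 60, 115, 1000 years)
# to a temperature series by ordinary least squares. The series is synthetic.
import numpy as np

periods = [9.1, 10.4, 20.0, 60.0, 115.0, 1000.0]           # years, from the abstract
years = np.arange(1850, 2017)
rng = np.random.default_rng(5)
temps = 0.005 * (years - 1850) + 0.1 * np.sin(2 * np.pi * years / 60) \
        + 0.05 * rng.standard_normal(years.size)            # toy anomaly series

# Design matrix: intercept, linear trend, and a sine/cosine pair per period.
cols = [np.ones_like(years, dtype=float), years - years.mean()]
for p in periods:
    cols += [np.sin(2 * np.pi * years / p), np.cos(2 * np.pi * years / p)]
A = np.column_stack(cols)

coef, *_ = np.linalg.lstsq(A, temps, rcond=None)
fitted = A @ coef
print(f"residual std: {np.std(temps - fitted):.3f} degC")
```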
Article
Full-text available
Wind energy is increasingly being utilized globally, in part as it is a renewable and environmental-friendly energy source. The uncertainty caused by the discontinuous nature of wind energy affects the power grid. Hence, forecasting wind behavior (e.g., wind speed) is important for energy managers and electricity traders, to overcome the risks of unpredictability when using wind energy. Forecasted wind values can be utilized in various applications, such as evaluating wind energy potential, designing wind farms, performing wind turbine predictive control, and wind power planning. In this study, four methods of forecasting using artificial intelligence (artificial neural networks with radial basis function, adaptive neuro-fuzzy inference system, artificial neural network-genetic algorithm hybrid and artificial neural network-particle swarm optimization) are utilized to accurately forecast short-term wind speed data for Tehran, Iran. A large set of wind speed data measured at 1-h intervals, provided by the Iran Renewable Energy Organization (SUNA), is utilized as input in algorithm development. Comparisons of statistical indices for both predicted and actual test data indicate that the artificial neural network-particle swarm optimization hybrid model with the lowest root mean square error and mean square error values outperforms other methods. Nonetheless, all of the models can be used to predict wind speed with reasonable accuracy.
Conference Paper
Full-text available
Accurate rainfall forecasting is very important for agriculture-dependent countries like India. Rainfall prediction is important for analyzing crop productivity and for the use and pre-planning of water resources. Statistical techniques for rainfall forecasting cannot perform well for long-term rainfall forecasting due to the dynamic nature of climate phenomena. Artificial Neural Networks (ANNs) have become very popular, and prediction using ANN is one of the most widely used techniques for rainfall forecasting. This paper provides a detailed survey and comparison of different neural network architectures used by researchers for rainfall forecasting. The paper also discusses the issues in applying different neural networks for yearly/monthly/daily rainfall forecasting. Moreover, the paper presents different accuracy measures used by researchers for evaluating the performance of ANNs.
Article
Full-text available
The world’s mega-deltas are extremely important from a human perspective and attract considerable effort to reveal their evolution, growth-related driving forces, and human impacts. Here, we report a case study on the Holocene deltaic evolution of the Yellow River, through the development of a conceptual model, which is compared with paleo-proxy records to analyze the forcing acting on the delta. The main conclusion is that superlobe switching was modulated by the 1500-year cycle. Cooling in Mongolia in response to strong Bond IRD events, coincident with warming in eastern China due to a strong Kuroshio Current, enhances the meridional temperature gradient, which then increases cyclone frequency and activates dust storms and terrestrial erosion throughout the catchment. Enhanced erosion supplies great amounts of material to the Yellow River and causes channel avulsion and superlobe development, expressed as a dominant 1500-year cycle. At the same time, summer monsoon and solar forcing are uncorrelated with deltaic evolution on these timescales. Therefore, we conclude that Holocene dynamics of the delta on a millennial timescale were dominated by winter cyclone activity across northern China and Mongolia.
Article
Full-text available
The Beer-Lambert law does not apply strictly to the relationship between radiative forcing (RF) of CO2 and concentration in the atmosphere, i.e., ΔRF = 5.35ln(C/Co). It is an approximation because water vapour competes unevenly with CO2 over the IR absorption wavelength range. We propose a quadratic model as an improved approximation. It links concentration to RF thereby allowing RF calculation at any concentration, not just ΔRF. For example, at 378 ppmv of CO2, the level in 2005, it calculates RF = 8.67 W m-2, or approximately 2.7% of the total RF of all the greenhouse gases. A second and independent method based on worldwide hourly measurements of atmospheric temperature and relative humidity confirms this percentage. Each method shows that, on average, water vapour contributes approximately 96% of current greenhouse gas warming. Thus, the factors controlling the amount of water vapour in the air also control the earth's temperature.
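The logarithmic expression quoted in this abstract, ΔRF = 5.35 ln(C/C0), can be evaluated directly; the 280 ppmv pre-industrial baseline below is an assumption, and the abstract's 8.67 W m-2 figure comes from its quadratic model for total forcing, which this expression is not meant to reproduce.

```python
# The logarithmic forcing expression quoted above, dRF = 5.35 * ln(C / C0),
# evaluated at 378 ppmv against an assumed 280 ppmv pre-industrial baseline.
# Note the abstract's 8.67 W m^-2 figure comes from its quadratic model for
# total (not incremental) forcing, which this expression does not reproduce.
import math

C0, C = 280.0, 378.0                    # ppmv (the baseline is an assumption)
delta_rf = 5.35 * math.log(C / C0)
print(f"{delta_rf:.2f} W m^-2")         # ~1.61 W m^-2 incremental forcing
```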
Article
Full-text available
Due to the dramatic increase in the global mean surface temperature (GMST) during the twentieth century, the climate science community has endeavored to determine which mechanisms are responsible for global warming. By analyzing a millennium simulation (the period 1000–1990 AD) of a global climate model and a global climate proxy network dataset, we estimate the contribution of solar and greenhouse gas forcings to the increase in GMST during the present warm period (1891–1990 AD). Linear regression analysis reveals that both solar and greenhouse gas forcing considerably explain the increase in global mean temperature during the present warm period in the global climate model. Using the global climate proxy network dataset, on the other hand, a statistical approach suggests that the contribution of greenhouse gas forcing is slightly larger than that of solar forcing to the increase in global mean temperature during the present warm period. Overall, our result indicates that solar forcing as well as anthropogenic greenhouse gas forcing plays an important role in increasing the global mean temperature during the present warm period.
Article
Full-text available
We attribute climate variability in four independent reconstructions of Greenland-average temperature and precipitation over the twentieth century. The reconstructions exhibit substantial differences in the timing and amplitudes of climate variations. Linear, empirical models of Greenland-average temperature and precipitation variations on multi-decadal timescales are established from a suite of Community Climate System Model 3 simulations of the preindustrial millennium. They are compared against observational reconstructions after being tested against simulations of the industrial and future periods. Empirical estimates of variations over the industrial and future periods are correlated at greater than 0.95 with simulated values. Greenhouse gas increases account for the majority of the temperature and precipitation increases after the mid-1900s. In contrast to the simulations, observed temperatures and precipitation do not increase until the mid-1990s. Thus, the empirical models over-predict the response to greenhouse gases over the twentieth century. We conclude that CCSM3 is not capturing processes that are proving important to Greenland surface conditions. Furthermore, modes of North Atlantic variability exhibit opposite relationships with some observations compared with the simulations. In those cases, reversing the sign of this component of variability yields significant correlations between the estimated and observed accumulation values.
Article
Full-text available
For the understanding of current and future climate change it is a basic prerequisite to properly understand the mechanisms that controlled climate change after the Last Ice Age. According to the IPCC 5th assessment report (in prep.) the Sun has not been a major driver of climate change during the post-Little Ice Age slow warming, and particularly not during the last 40 years. This statement requires critical review as the IPCC neglects strong paleo-climatologic evidence for the high sensitivity of the climate system to changes in solar activity. This high climate sensitivity is not due solely to variations in total solar irradiance-related direct solar forcing, but also to additional, so-called indirect solar forcings. These include solar-related, chemically based UV irradiance-related variations in stratospheric temperatures and galactic cosmic ray-related changes in cloud cover and surface temperatures, as well as ocean oscillations, such as the Pacific Decadal Oscillation and the North Atlantic Oscillation, that significantly affect the climate. As it is still difficult to quantify the relative contribution of combined direct and indirect solar forcing and of increased atmospheric CO2 concentrations to the slow warming of the last 40 years, predictions about future global warming based exclusively on anthropogenic CO2 emission scenarios are premature. Nevertheless, the cyclical temperature increase of the 20th century coincided with the buildup and culmination of the Grand Solar Maximum that commenced in 1924 and ended in 2008. The anticipated phase of declining solar activity of the coming decades will be a welcome 'natural laboratory' to clarify and quantify the present and future role of solar variation in climate change.
Article
Full-text available
Observed upper-tropospheric temperature over the tropics (TTUT) shows a slowdown in warming rate during 1997–2011 despite the continuous warming projected by coupled atmosphere–ocean general circulation models (AOGCMs). This observation–model discord is an underlying issue regarding the reliability of future climate projections based on AOGCMs. To investigate the slowdown, we conducted ensemble historical simulations using an atmospheric general circulation model (AGCM) forced by observed sea surface temperature both with and without anthropogenic influences. The historical AGCM run reproduced a muted TTUT change over the central Pacific (CP) found in multiple observations, while the multi-AOGCM mean did not. Recent tropical Pacific cooling, which is considered natural variability, contributes to the muted trend over the CP and the resultant slowdown of TTUT increase. The results of this study suggest that difficulties in simulating the recent “upper-tropospheric warming hiatus” do not indicate low reliability of AOGCM-based future climate projections.
Article
Full-text available
The climate of the Earth, like planetary climates in general, is broadly controlled by solar irradiation, planetary albedo and emissivity as well as its rotation rate and distribution of land (with its orography) and oceans. However, the majority of climate fluctuations that affect mankind are internal modes of the general circulation of the atmosphere and the oceans. Some of these modes, such as El Nino-Southern Oscillation (ENSO), are quasi-regular and have some longer-term predictive skill; others like the Arctic and Antarctic Oscillation are chaotic and generally unpredictable beyond a few weeks. Studies using general circulation models indicate that internal processes dominate the regional climate and that some like ENSO events have even distinct global signatures. This is one of the reasons why it is so difficult to separate internal climate processes from external ones caused, for example, by changes in greenhouse gases and solar irradiation. However, the accumulation of the warmest seasons during the latest two decades is lending strong support to the forcing of the greenhouse gases. As models are getting more comprehensive, they show a gradually broader range of internal processes including those on longer time scales, challenging the interpretation of the causes of past and present climate events further.
Conference Paper
Full-text available
The Bowen Basin contains the largest coal reserves in Australia. Prolonged heavy rainfall during the 2010-2011 wet season severely affected industry operations, with an estimated economic loss of A$5.7 billion (£3.8 billion). There was no explicit warning of the exceptionally wet conditions in the seasonal forecast from the Australian Bureau of Meteorology, which simply suggested a 50-55% probability of above median rainfall for the Bowen Basin. In this study, the value of using neural networks, a form of artificial intelligence, to forecast monthly rainfall for the town of Nebo in the Bowen Basin is explored. Neural networks facilitate the input of multiple climate indices and the exploration of their non-linear relationships. Through genetic optimisation of input variables related to temperatures, including atmospheric temperatures and sea surface temperatures expressed through the Inter-decadal Pacific Oscillation and Niño 3.4, it is possible to develop monthly rainfall forecasts for Nebo superior to the best seasonal forecasts from the Bureau of Meteorology. As neural networks employ far superior technology for exploring the patterns and relationships within historical data, including climate indices, they are to be preferred. Keywords: rainfall, neural network, forecast, Southern Oscillation Index, Interdecadal Pacific Oscillation, coal, mining.
Article
Full-text available
Transient and equilibrium sensitivity of Earth's climate has been calculated using global temperature, forcing and heating rate data for the period 1970-2010. We have assumed increased long-wave radiative forcing in the period due to the increase of the long-lived greenhouse gases. By assuming the change in aerosol forcing in the period to be zero, we calculate what we consider to be lower bounds to these sensitivities, as the magnitude of the negative aerosol forcing is unlikely to have diminished in this period. The radiation imbalance necessary to calculate equilibrium sensitivity is estimated from the rate of ocean heat accumulation as 0.37±0.03 W m-2 (all uncertainty estimates are 1-σ). With these data, we obtain best estimates for transient climate sensitivity of 0.39±0.07 K (W m-2)-1 and equilibrium climate sensitivity of 0.54±0.14 K (W m-2)-1, equivalent to 1.5±0.3 and 2.0±0.5 K (3.7 W m-2)-1, respectively. The latter quantity is equal to the lower bound of the 'likely' range for this quantity given by the 2007 IPCC Assessment Report. The uncertainty attached to the lower-bound equilibrium sensitivity permits us to state, within the assumptions of this analysis, that the equilibrium sensitivity is greater than 0.31 K (W m-2)-1, equivalent to 1.16 K (3.7 W m-2)-1, at the 95% confidence level.
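The conversion used in this abstract, from sensitivities in K per W m-2 to K per CO2 doubling, is a single multiplication by the canonical doubling forcing of 3.7 W m-2, as the quick check below shows.

```python
# Quick check of the unit conversion in the abstract above: sensitivities in
# K per (W m^-2) are converted to K per CO2 doubling by multiplying by the
# canonical doubling forcing of 3.7 W m^-2.
F_2x = 3.7                             # W m^-2 per CO2 doubling
transient, equilibrium = 0.39, 0.54    # K per (W m^-2), from the abstract
print(transient * F_2x, equilibrium * F_2x)   # ~1.4 K and ~2.0 K per doubling
```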
Article
Full-text available
Aiming to describe spatio-temporal climate variability on decadal-to-centennial time scales and longer, we analyzed a data set of 26 proxy records extending back 1,000-5,000 years; all records chosen were calibrated to yield temperatures. The seven irregularly sampled series in the data set were interpolated to a regular grid by optimized methods and then two advanced spectral methods—namely singular-spectrum analysis (SSA) and the continuous wavelet transform—were applied to individual series to separate significant oscillations from the high noise background. This univariate analysis identified several common periods across many of the 26 proxy records: a millennial trend, as well as oscillations of about 100 and 200 years, and a broad peak in the 40-70-year band. To study common NH oscillations, we then applied Multichannel SSA. Temperature variations on time scales longer than 600 years appear in our analysis as a dominant trend component, which shows climate features consistent with the Medieval Warm Period and the Little Ice Age. Statistically significant NH-wide peaks appear at 330, 250 and 110 years, as well as in a broad 50-80-year band. Strong variability centers in several bands are located around the North Atlantic basin and are in phase opposition between Greenland and Western Europe.
Article
Full-text available
Comparison of simulated and reconstructed past climate variability within the last millennium provides an opportunity to aid the understanding and interpretation of palaeoclimate proxy data and to test hypotheses regarding external forcings, feedback mechanisms and internal climate variability under conditions close to those of the present day. Most such comparisons have been made at the Northern Hemispheric scale, and a selection of recent results is briefly discussed here. Uncertainties in climate and forcing reconstructions, along with the simplified representations of the true climate system in climate models, limit our ability to draw firm conclusions regarding the nature of forced and unforced climate variability. Additionally, hemispheric-scale temperature variations have been comparatively small, so the last millennium is apparently not a particularly useful period for estimating climate sensitivity. Nevertheless, several investigators have concluded that Northern Hemispheric-scale decadal-mean temperatures in the last millennium show a significant influence from natural external forcing, where volcanic forcing is significantly detectable while solar forcing is less robustly detected. The amplitude of centennial-scale variations in solar forcing has been a subject of much debate, but current understanding of solar physics implies that these variations have been small - similar in magnitude to those within recent sunspot cycles - and thus they have not been a main driver of climate in the last millennium. This interpretation is supported by various comparisons between forced climate model simulations and temperature proxy data. Anthropogenic greenhouse gas and aerosol forcing has been detected by the end of Northern Hemispheric temperature reconstructions.
Article
Full-text available
We present an advanced two-layer climate model, especially appropriate to calculate the influence of an increasing CO2 concentration and a varying solar activity on global warming. The model describes the atmosphere and the ground as two layers acting simultaneously as absorbers and Planck radiators, and it includes additional heat transfer between these layers due to convection and evaporation. The model considers all relevant feedback processes caused by changes of water vapour, lapse rate, surface albedo or convection and evaporation. In particular, the influence of clouds with a thermally or solar induced feedback is investigated in some detail. The short- and long-wave absorptivities of the most important greenhouse gases water vapour, carbon dioxide, methane and ozone are derived from line-by-line calculations based on the HITRAN08 database and are integrated in the model. Simulations including an increased solar activity over the last century give a CO2-initiated warming of 0.2 °C and a solar influence of 0.54 °C over this period, corresponding to a CO2 climate sensitivity of 0.6 °C (doubling of CO2) and a solar sensitivity of 0.5 °C (0.1% increase of the solar constant).
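The authors' two-layer model is considerably more detailed than anything reproduced here; purely to illustrate the layered-absorber idea, the sketch below solves the textbook balance for a single grey atmospheric layer over a radiating surface. The emissivity values are tuning placeholders, not parameters taken from the paper.

```python
# The paper's two-layer model is far more detailed; this is only a textbook
# single-grey-layer radiative balance, included to illustrate the layered-absorber
# idea. The emissivity values are tuning placeholders, not taken from the paper.
SIGMA = 5.67e-8      # Stefan-Boltzmann constant, W m^-2 K^-4
S0 = 1361.0          # solar constant, W m^-2
ALBEDO = 0.30        # planetary albedo

def surface_temperature(eps):
    """Surface temperature under a single grey atmospheric layer of emissivity eps.
    Balance: sigma*Ts^4 = absorbed solar / (1 - eps/2)."""
    absorbed = S0 * (1.0 - ALBEDO) / 4.0
    return (absorbed / (SIGMA * (1.0 - eps / 2.0))) ** 0.25

for eps in (0.0, 0.78, 0.80):
    print(f"emissivity {eps:.2f}: Ts = {surface_temperature(eps):.1f} K")
```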
Article
Full-text available
We present two new multi-proxy reconstructions of the extra-tropical Northern Hemisphere (30–90° N) mean temperature: a two-millennia long reconstruction reaching back to 1 AD and a 500-yr long reconstruction reaching back to 1500 AD. The reconstructions are based on compilations of 32 and 91 proxies, respectively, of which only little more than half pass a screening procedure and are included in the actual reconstructions. The proxies are of different types and of different resolutions (annual, annual-to-decadal, and decadal) but all have previously been shown to relate to local or regional temperature. We use a reconstruction method, LOCal (LOC), that recently has been shown to confidently reproduce low-frequency variability. Confidence intervals are obtained by an ensemble pseudo-proxy method that estimates both the variance and the bias of the reconstructions. The two-millennia long reconstruction shows a well defined Medieval Warm Period, with a peak warming ca. 950–1050 AD reaching 0.6 °C relative to the reference period 1880–1960 AD. The 500-yr long reconstruction confirms previous results obtained with the LOC method applied to a smaller proxy compilation; in particular it shows the Little Ice Age culminating in 1580–1720 AD with a temperature minimum of −1.0 °C below the reference period. The reconstructed local temperatures, the magnitude of which is subject to wide confidence intervals, show a rather geographically homogeneous Little Ice Age, while more geographical inhomogeneities are found for the Medieval Warm Period. Reconstructions based on different subsets of proxies show only small differences, suggesting that LOC reconstructs 50-yr smoothed extra-tropical NH mean temperatures well and that low-frequency noise in the proxies is a relatively small problem.
Article
Full-text available
We analyse the spatio-temporal patterns of temperature variability over Northern Hemisphere land areas, on centennial time-scales, for the last 12 centuries using an unprecedentedly large network of temperature-sensitive proxy records. Geographically widespread positive temperature anomalies are observed from the 9th to 11th centuries, similar in extent and magnitude to the 20th century mean. A dominance of widespread negative anomalies is observed from the 16th to 18th centuries. Though we find the amplitude and spatial extent of the 20th century warming are within the range of natural variability over the last 12 centuries, we also find that the rate of warming from the 19th to the 20th century is unprecedented in the context of the last 1200 yr. The positive Northern Hemisphere temperature change from the 19th to the 20th century is clearly the largest between any two consecutive centuries in the past 12 centuries. These results remain robust even after removing a significant number of proxies in various tests of robustness, showing that the choice of proxies has no particular influence on the overall conclusions of this study.
Article
Full-text available
A 1343-year tree-ring chronology was developed from Qilian junipers in the central Qilian Mountains of the northeastern Tibetan Plateau (TP), China. The climatic implications of this chronology were investigated using simple correlation, partial correlation and response function analyses. The chronology was significantly positively correlated with temperature variables prior to and during the growing season, especially with monthly minimum temperature. Minimum temperature anomalies from January to August since AD 670 were then reconstructed based on the tree ring chronology. The reconstruction explained 58% of the variance in the instrumental temperature records during the calibration period (1960–2012) and captured the variation patterns in minimum temperature at the annual to centennial timescales over the past millennium. The most recent 50 years were the warmest period, while 1690–1880 was the coldest period since AD 670. Comparisons with other temperature series from neighbouring regions and for the Northern Hemisphere as a whole supported the validity of our reconstruction and suggested that it provided a good regional representation of temperature change in the northeastern Tibetan Plateau. The results of wavelet analysis showed the occurrence of significant quasi-periodic patterns at a number of recurring periods (2–4, 40–50, and 90–170 years), which were consistent with those associated with El Niño–Southern Oscillation (ENSO), Pacific Decadal Oscillation (PDO) and solar activity. The comparison between the reconstructed temperature and the index of tropical volcanic radiative forcing indicated that some cold events recorded by tree rings may be due to the impact of tropical volcanic eruptions.
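Reconstructions of this kind rest on a calibration step in which the ring-width chronology is regressed against instrumental temperature over the overlap period and the fitted relation is then applied to the full chronology. The sketch below shows that step with ordinary least squares on synthetic placeholder data; it is not the authors' code.

```python
# Minimal sketch of the calibration step behind reconstructions of this kind:
# regress instrumental temperature on the ring-width index over the overlap period,
# then apply the fitted relation to the full chronology. Synthetic data only.
import numpy as np

rng = np.random.default_rng(2)
years = np.arange(670, 2013)
ring_index = rng.normal(loc=1.0, scale=0.2, size=years.size)    # placeholder chronology

calib = (years >= 1960) & (years <= 2012)                       # calibration window
instrumental = 0.6 + 5.0 * (ring_index[calib] - 1.0) + rng.normal(scale=0.3, size=calib.sum())

slope, intercept = np.polyfit(ring_index[calib], instrumental, 1)
reconstruction = intercept + slope * ring_index                  # full-length reconstruction

r = np.corrcoef(intercept + slope * ring_index[calib], instrumental)[0, 1]
print(f"variance explained in calibration period: {r**2:.2f}")
```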
Article
Full-text available
The geographical location (latitude: 24° 16′ N and longitude: 55° 36′ E) of Al Ain city in the southwest of United Arab Emirates (UAE) favors the development and utilization of solar energy. This paper presents an artificial neural network (ANN) approach for predicting monthly global solar radiation (MGSR) on a horizontal surface in Al Ain. The ANN models are presented and implemented on 13-year measured meteorological data for Al Ain such as maximum temperature, mean wind speed, sunshine, and mean relative humidity between 1995 and 2007. The meteorological data between 1995 and 2004 are used for training the ANN and data between 2004 and 2007 are used for testing the predicted values. Multilayer perceptron (MLP) and radial basis function (RBF) neural networks are used for the modeling. Models for the MGSR were obtained using eleven combinations of data sets based on the above mentioned measured data for Al Ain city. Forecasting performance parameters such as root mean square error (RMSE), mean bias error (MBE), mean absolute percentage error (MAPE), and correlation coefficient (R2) are presented for the model. The values of RMSE, MBE, MAPE, and R2 are found to be, respectively, 35%, 0.307%, 3.88%, and 92%. A comparison of estimated MGSR with regression models is carried out. The ANN model predicts better than other models. The estimated MGSR data are in reasonable agreement with the actual values. The results indicate the capability of the ANN technique over unseen data and its ability to produce accurate prediction models.
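The verification statistics named above (RMSE, MBE, MAPE and the squared correlation) can be computed as in the short sketch below; the observed and predicted vectors are placeholders, and the formulas are the common textbook definitions rather than necessarily the exact ones used in the paper.

```python
# Sketch of the verification statistics named above (RMSE, MBE, MAPE and a
# correlation-based R^2), applied to placeholder predicted/observed vectors.
import numpy as np

def forecast_metrics(obs, pred):
    obs, pred = np.asarray(obs, float), np.asarray(pred, float)
    err = pred - obs
    rmse = np.sqrt(np.mean(err ** 2))                 # root mean square error
    mbe = np.mean(err)                                # mean bias error
    mape = 100 * np.mean(np.abs(err / obs))           # mean absolute percentage error
    r2 = np.corrcoef(obs, pred)[0, 1] ** 2            # squared correlation coefficient
    return {"RMSE": rmse, "MBE": mbe, "MAPE": mape, "R2": r2}

obs = [20.1, 21.5, 23.0, 24.8, 26.0, 25.2]            # e.g. monthly radiation, MJ m^-2 day^-1
pred = [19.8, 22.0, 22.6, 25.1, 25.5, 25.9]
print(forecast_metrics(obs, pred))
```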
Article
Full-text available
A 1342 yr-long tree-ring chronology was developed from Qilian junipers in the central Qilian Mountains of the north-eastern Tibetan Plateau, China. The climatic implications of this chronology were investigated using simple correlation, partial correlation and response function analyses. The chronology was significantly positively correlated with temperature variables during the pre- and current growing seasons, especially with minimum temperature. The variability of the mean minimum temperature from January to August since 670 AD was then reconstructed based on the tree-ring chronology. The reconstruction explained 58.5% of the variance in the instrumental temperature records during the calibration period (1960-2011) and captured the variation patterns in minimum temperature at the annual to centennial time scales over the past millennium. The most recent 50 yr were the warmest period, while 1690-1880 was the coldest period since 670 AD. Comparisons with other temperature series from neighbouring regions and for the Northern Hemisphere as a whole supported the validity of our reconstruction and suggested that it provided a good regional representation of temperature change in the north-eastern Tibetan Plateau. The results of multi-taper spectral analysis showed the occurrence of significant quasi-periodic behaviour at a number of periods (2-3, 28.8-66.2, 113.6-169.5, and 500 yr), which were consistent with those associated with El Niño-Southern Oscillation (ENSO), Pacific Decadal Oscillation (PDO) and solar activity. Some reconstructed cold events may have close relationship with the volcanic eruptions.
Article
Full-text available
Equilibrium climate sensitivity (ECS) is constrained based on observed near-surface temperature change, changes in ocean heat content (OHC) and detailed radiative forcing (RF) time series from pre-industrial times to 2010 for all main anthropogenic and natural forcing mechanisms. The RF time series are linked to the observations of OHC and temperature change through an energy balance model (EBM) and a stochastic model, using a Bayesian approach to estimate the ECS and other unknown parameters from the data. For the net anthropogenic RF the posterior mean in 2010 is 2.0 W m⁻², with a 90% credible interval (C.I.) of 1.3 to 2.8 W m⁻², excluding present-day total aerosol effects (direct + indirect) stronger than −1.7 W m⁻². The posterior mean of the ECS is 1.8 °C, with a 90% C.I. ranging from 0.9 to 3.2 °C, which is tighter than most previously published estimates. We find that using three OHC data sets simultaneously and data for global mean temperature and OHC up to 2010 substantially narrows the range in ECS compared to using less updated data and only one OHC data set. Using only one OHC set and data up to 2000 produces results comparable to previously published estimates using observations in the 20th century, including the heavy tail in the probability function. The analyses show a significant contribution of internal variability on a multi-decadal scale to the global mean temperature change. If we do not explicitly account for long-term internal variability, the 90% C.I. is 40% narrower than in the main analysis and the mean ECS becomes slightly lower, which demonstrates that the uncertainty in ECS may be severely underestimated if the method is too simple. In addition to the uncertainties represented through the estimated probability density functions, there may be uncertainties due to limitations in the treatment of the temporal development in RF and structural uncertainties in the EBM.
Much of Australia regularly experiences extremes of drought and flooding, with high variability in rainfall in many regions of the continent. Development of reliable and accurate medium-term rainfall forecasts is important, particularly for agriculture. Monthly rainfall forecasts 12 months in advance were made with artificial neural networks (ANNs), a form of artificial intelligence, for the locations of Bathurst, Deniliquin and Miles, which are agricultural hubs in the Murray Darling Basin in southeastern Australia. Two different approaches were used for the optimisation of the ANN models. In the first, all months in each calendar year were optimised together, while in the second approach, rainfall forecasts for each month of the year were made individually. For each of the three locations, for most months, higher forecast skill scores were achieved using single-month optimisations. In the case of Bathurst, however, for the months of November and December, the root mean square error (RMSE) for all-month optimisation was lower than for single-month optimisation. The best overall rainfall forecasts for each site were obtained by generating a composite of the two approaches, selecting for each month the forecast with the lowest errors. Composite model skill scores of at least 40% above climatology were achieved for all three locations, whereas the skill derived from forecasts using general circulation models is generally only comparable to climatology at the long lead time of 8 months.
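The skill comparison described above can be sketched as follows: score each forecast variant against a climatological baseline and, month by month, retain whichever variant performs better. The RMSE-based skill formula and the synthetic data are assumptions for illustration, not necessarily the authors' exact metric; in practice the per-month selection would be based on hindcast errors rather than the verification data themselves.

```python
# Sketch of a climatology-referenced skill score and month-by-month selection of the
# better of two forecast variants, as described above. The RMSE-based skill formula
# is a common convention and an assumption here, not necessarily the authors' metric.
import numpy as np

def rmse(a, b):
    return float(np.sqrt(np.mean((np.asarray(a) - np.asarray(b)) ** 2)))

def skill_vs_climatology(obs, forecast, climatology):
    """Positive values mean the forecast beats the climatological mean (1.0 = perfect)."""
    return 1.0 - rmse(obs, forecast) / rmse(obs, climatology)

rng = np.random.default_rng(3)
obs = 60 + 20 * rng.random(12)                      # placeholder monthly rainfall, mm
clim = np.full(12, obs.mean())                      # climatological forecast
all_month = obs + rng.normal(scale=8, size=12)      # placeholder all-month-optimised forecast
single_month = obs + rng.normal(scale=6, size=12)   # placeholder single-month-optimised forecast

# Composite: for each month keep whichever variant has the smaller absolute error
# (here judged against the verification data for brevity; in practice use hindcasts).
composite = np.where(np.abs(single_month - obs) <= np.abs(all_month - obs),
                     single_month, all_month)
print("composite skill:", round(skill_vs_climatology(obs, composite, clim), 2))
```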
Article
Wind speed forecasting plays a pivotal role in power dispatching and the normal operation of power grids. However, achieving high-precision forecasts of wind speed is both difficult and challenging because the original sequence includes many nonlinear stochastic signals. Conventional forecasting methods are better suited to capturing linear trends, and artificial neural networks easily fall into a local optimum. This paper proposes a model that combines a denoising method with a dynamic fuzzy neural network to address these problems. Singular spectrum analysis optimized by brain storm optimization is applied to preprocess the original wind speed data to obtain a smoother sequence, and a generalized dynamic fuzzy neural network is utilized to perform the forecasting. With a smaller and simpler neural network structure, the model can effectively achieve a rapid learning rate and accurate forecasting. Three experiments, covering 10-minute, 30-minute and 60-minute interval wind speed time series, demonstrate that the model both satisfactorily approximates the actual values and can be used as an effective and simple tool for the planning of smart grids.
Chapter
Surface air temperatures as measured at weather stations around the world are routinely homogenized before they are used to report anthropogenic global warming. The adjustment methodology relies on algorithms that determine homogeneity relative to other locations, and typically results in significant remodeling of individual temperature series. We demonstrate a new technique on temperature trends for southeast Australia from 1887 to 2013, using more traditional quality-control methods based on analysis of statistical variation within and between years. Considering a weighted mean of the five highest-quality maximum temperature time series, trends from 1887 show statistically significant cooling of −1.5°C per century to 1950, followed by rapid warming of 1.9°C per century to 2013. The cooling trend is more pronounced where irrigation development for large-scale rice cultivation has occurred. Neither the cooling nor the magnitude of the recent warming can be explained by anthropogenic global warming theory.
Article
The evolution of industrial-era warming across the continents and oceans provides a context for future climate change and is important for determining climate sensitivity and the processes that control regional warming. Here we use post-1500 palaeoclimate records to show that sustained industrial-era warming of the tropical oceans first developed during the mid-nineteenth century and was nearly synchronous with Northern Hemisphere continental warming. The early onset of sustained, significant warming in palaeoclimate records and model simulations suggests that greenhouse forcing of industrial-era warming commenced as early as the mid-nineteenth century and included an enhanced equatorial ocean response mechanism. The development of Southern Hemisphere warming is delayed in reconstructions, but this apparent delay is not reproduced in climate simulations. Our findings imply that instrumental records are too short to comprehensively assess anthropogenic climate change and that, in some regions, about 180 years of industrial-era warming has already caused surface temperatures to emerge above pre-industrial values, even when taking natural variability into account.
Article
We present multi-proxy warm season (September–February) temperature reconstructions for the combined land–ocean region of Australasia (0°S–50°S, 110°E–180°E) covering A.D. 1000-2001. Using between two (R2) and 28 (R28) proxy records we compare four 1000-member ensemble reconstructions of regional temperature using four statistical methods: Principal Component Regression (PCR), Composite Plus Scale (CPS), Bayesian Hierarchical Models (LNA) and Pairwise Comparison (PaiCo). The reconstructions are compared with a three-member ensemble of GISS-E2-R model simulations and independent palaeoclimate records. Decadal fluctuations in Australasian temperatures are remarkably similar between the four reconstruction methods. There are, however, differences in the amplitude of temperature variations between the different statistical methods and proxy networks. When the R28 network is used, the warmest 30-year periods occur after 1950 in more than 70% of ensemble members for all methods. However, reconstructions based on only the longest records (R2 and R3 networks) indicate that single 30-year and 10-year periods of similar or slightly higher temperatures than in the late 20th century may have occurred during the first half of the millennium. Regardless, the most recent instrumental temperatures (1985–2014) are above the 90th percentile of all twelve reconstruction ensembles (four reconstruction methods based on three proxy networks, R28, R3, R2). The reconstructed 20th century warming cannot be explained by natural variability alone using the GISS E-2 model. In this climate model, anthropogenic forcing is required to produce the rate and magnitude of post-1950 warming observed in the Australasian region. Our palaeoclimate results are consistent with other studies that attribute the post-1950 warming in Australian temperature records to increases in atmospheric greenhouse gas concentrations.
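Of the four statistical methods compared above, Composite Plus Scale (CPS) is the simplest to sketch: z-score each proxy over the calibration window, average the standardized proxies into a composite, and rescale the composite to the mean and variance of the instrumental series. The example below does exactly that on synthetic placeholder data; it is illustrative only.

```python
# Minimal sketch of the Composite Plus Scale (CPS) idea mentioned above: z-score each
# proxy over the calibration window, average into a composite, then rescale to the
# instrumental mean and standard deviation. Synthetic placeholder data only.
import numpy as np

rng = np.random.default_rng(4)
years = np.arange(1000, 2002)
n_proxies = 5
target = 0.3 * np.sin(2 * np.pi * (years - 1000) / 300)               # hidden "true" signal
proxies = target[None, :] + rng.normal(scale=0.5, size=(n_proxies, years.size))

calib = years >= 1900
instrumental = target[calib] + rng.normal(scale=0.1, size=calib.sum())

z = (proxies - proxies[:, calib].mean(axis=1, keepdims=True)) \
    / proxies[:, calib].std(axis=1, keepdims=True)
composite = z.mean(axis=0)

scaled = instrumental.mean() + (composite - composite[calib].mean()) \
         * instrumental.std() / composite[calib].std()
print("calibration correlation:", round(np.corrcoef(scaled[calib], instrumental)[0, 1], 2))
```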
Article
Time series of sea-level rise are fitted by a sinusoid of period ~60 years, confirming the cycle reported for the global mean temperature of the earth. This cycle appears in phase with the Atlantic Multidecadal Oscillation (AMO). The last maximum of the sinusoid coincides with the temperature plateau observed since the end of the 20th century. The onset of the declining phase of the AMO, the recent excess of the global sea-ice area anomaly and the negative slope of global mean temperature measured by satellite from 2002 to 2015 all signal the onset of the declining phase of the 60-year cycle. Once this cycle is subtracted from observations, the transient climate response is revised downwards, consistent with the latest observations, with the latest evaluations based on atmospheric infrared absorption and with a general tendency of published climate sensitivity estimates. The amplitude of the CO2 seasonal oscillations is found to be increasing up to 71% faster than atmospheric CO2 itself, pointing to Earth greening and a benefit to crop yields from the supplementary photosynthesis, further minimizing the consequences of the tiny anthropogenic contribution to warming.
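Fitting a trend plus a roughly 60-year sinusoid, as described above, can be sketched with a standard nonlinear least-squares routine; the synthetic sea-level-like series and the initial parameter guesses below are placeholders, not the data or procedure of the paper.

```python
# Sketch of fitting a ~60-year sinusoid plus linear trend with scipy, in the spirit of
# the decomposition described above. The synthetic sea-level-like data are placeholders.
import numpy as np
from scipy.optimize import curve_fit

rng = np.random.default_rng(5)
years = np.arange(1900, 2016)
data = 2.0 * (years - 1900) + 8.0 * np.sin(2 * np.pi * (years - 1900) / 60.0) \
       + rng.normal(scale=3.0, size=years.size)          # trend + 60-yr cycle + noise, mm

def trend_plus_cycle(t, a, b, amp, period, phase):
    return a + b * t + amp * np.sin(2 * np.pi * t / period + phase)

p0 = [0.0, 2.0, 5.0, 60.0, 0.0]                          # initial guess, period near 60 yr
params, _ = curve_fit(trend_plus_cycle, years - 1900, data, p0=p0)
print(f"fitted period: {params[3]:.1f} years, amplitude: {params[2]:.1f} mm")
```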
Article
A 2650-year (BC665-AD1985) warm season (MJJA: May, June, July, August) temperature reconstruction is derived from a correlation between thickness variations in annual layers of a stalagmite from Shihua Cave, Beijing, China and instrumental meteorological records. Observations of soil CO2 and drip water suggest that the temperature signal is amplified by the soil-organism-CO2 system and recorded by the annual layer series. Our reconstruction reveals that centennial-scale rapid warming occurred repeatedly following multicentenial cooling trends during the last millennia. These results correlate with different records from the Northern Hemisphere, indicating that the periodic alternation between cool and warm periods on a sub-millennial scale had a sub-hemispherical influence.
Article
A simplified mathematical model has been developed to understand the effect of carbon dioxide on the mechanism of global warming. The heat transfer in the atmosphere, in particular the radiation, can be described by known relations in thermal engineering. Here, the radiation exchange among the gas layer containing water vapor and carbon dioxide, the Earth's surface and the clouds is considered. The emissivity of the gases is a function of temperature, gas concentration and the beam length of the atmospheric layer defined through the model. The model is validated against the known average temperature of the Earth, with the emissivity of the clouds acting as an adjustment parameter. The temperature of the Earth increases significantly with the CO2 concentration: when the concentration of CO2 is doubled, the temperature of the Earth increases by 0.4 K.
Article
Climate is discussed as an integral part of 'System Earth', determined by a complex interplay of numerous geological, biological and solar processes. The historical and geological record of changing climate and atmospheric CO2 pressure does not support the current popular vision that this greenhouse gas is the dominant climate-controlling agent. When tested empirically, after the fact, against past global climate changes, the 'forecasts' of climate models based mainly on forcing by atmospheric CO2 are not borne out. On the other hand, recent studies show that solar variability, rather than changing CO2 pressure, is an important, and probably the dominant, climate forcing factor.
Article
Long-term temperature variability has significant effects on runoff into the upper reaches of inland rivers. This paper developed a tree-ring chronology of Qilian juniper (Sabina przewalskii Kom.) from the upper tree-line of the middle Qilian Mountains within the upper reaches of the Heihe River Basin, Northwest China, for a long-term reconstruction of temperature at the study site. The chronology was used to examine climate-growth associations using local climate data from the Qilian Meteorological Station. The results showed that temperatures correlated well with standardized tree-growth indices (r=0.564, P<0.001), and the chronology was most highly correlated with annual mean temperature (r=0.641, P<0.0001). Annual mean temperature spanning the period 1445–2011 was reconstructed and explained 57.8% of the inter-annual to decadal temperature variance at the regional scale for the period 1961–2011. Spatial correlation patterns revealed a significant correlation between the reconstructed and gridded temperature data on a regional scale, indicating that the reconstruction represents climatic variations for an extended area surrounding the sampling sites. Analysis of the temperature reconstruction indicated that major cold periods occurred during the 1450s–1480s, 1590s–1770s, 1810s–1890s, 1920s–1940s, and 1960s–1970s. Warm intervals occurred during the 1490s–1580s, 1780s–1800s, 1900s–1910s, 1950s, and 1980s to present. The coldest 100-year and decadal periods occurred from 1490s–1580s and 1780s–1800s, respectively, while the warmest 100 years within the studied time period was the 20th century. Cold events and intervals coincided with wet or moist conditions in and near the study region. The reconstructed temperature agreed well with other temperature series reconstructed across the surrounding areas, demonstrating that it could be used to evaluate regional climate change. Compared to tree-ring reconstructed temperatures from nearby regions and to records of glacier fluctuations from the surrounding high mountains, this reconstruction was reliable and could aid in the evaluation of regional climate variability. Spectral analyses suggested that the reconstructed annual mean temperature variation may be related to large-scale atmospheric–oceanic variability such as solar activity, the Pacific Decadal Oscillation (PDO) and the El Niño–Southern Oscillation (ENSO).
Article
We present a large-scale dendroclimatic reconstruction of July temperatures from 42–52°N to 140–145°E in the Northwest Pacific region for the period from 1800 to 1996. A multiple regression model with principal components (PCs) of a tree-ring chronology network was used for the reconstruction, which accounted for 31.7% of the temperature variance in the calibration period (1901–1996). The reconstructed spatially-averaged July temperatures show large fluctuations, which are comparable to previously published dendroclimatic reconstruction of spring temperatures in northeast Asia. It also shows stable relationships with other datasets, notably sea surface temperatures (SSTs) in a wide area of the North Pacific and the Pacific Decadal Oscillation (PDO), indicating atmospheric–oceanic interaction in the Northwest Pacific region since AD 1800.
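The principal-components regression step described above can be sketched as follows: reduce the chronology network to its leading principal components, then regress instrumental temperature on those components over the calibration period and apply the fitted model to the full length of the PCs. The network size, calibration window and data below are synthetic placeholders.

```python
# Sketch of the principal-components regression step described above: reduce a network
# of tree-ring chronologies to leading PCs, then regress instrumental temperature on
# those PCs over the calibration period. Synthetic placeholder data only.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(6)
n_years, n_chronologies = 197, 20                       # e.g. 1800-1996, 20 sites
signal = np.sin(2 * np.pi * np.arange(n_years) / 50.0)
network = signal[:, None] * rng.uniform(0.5, 1.5, n_chronologies) \
          + rng.normal(scale=0.8, size=(n_years, n_chronologies))

pcs = PCA(n_components=3).fit_transform(network)        # leading PCs of the network

calib = np.arange(n_years) >= 101                       # e.g. 1901-1996 calibration
temperature = 10 + 0.8 * signal[calib] + rng.normal(scale=0.3, size=calib.sum())

model = LinearRegression().fit(pcs[calib], temperature)
reconstruction = model.predict(pcs)                     # full-period reconstruction
print("calibration R^2:", round(model.score(pcs[calib], temperature), 2))
```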
Article
Equilibrium climate sensitivity (ECS) is inferred from estimates of instrumental-period warming attributable solely to greenhouse gases (AW), as derived in two recent multi-model detection and attribution (D&A) studies that apply optimal fingerprint methods with high spatial resolution to 3D global climate model simulations. This approach minimises the key uncertainty regarding aerosol forcing without relying on low-dimensional models. The “observed” AW distributions from the D&A studies together with an observationally-based estimate of effective planetary heat capacity (EHC) are applied as observational constraints in (AW, EHC) space. By varying two key parameters—ECS and effective ocean diffusivity—in an energy balance model forced solely by greenhouse gases, an invertible map from the bivariate model parameter space to (AW, EHC) space is generated. Inversion of the constrained (AW, EHC) space through a transformation of variables allows unique recovery of the observationally-constrained joint distribution for the two model parameters, from which the marginal distribution of ECS can readily be derived. The method is extended to provide estimated distributions for transient climate response (TCR). The AW distributions from the two D&A studies produce almost identical results. Combining the two sets of results provides best estimates (5-95 % ranges) of 1.66 (0.7-3.2) K for ECS and 1.37 (0.65-2.2) K for TCR, in line with those from several recent studies based on observed warming from all causes but with tighter uncertainty ranges than for some of those studies. Almost identical results are obtained from application of an alternative profile likelihood statistical methodology.
Article
We present a decadal-scale late Holocene climate record based on diatoms, biogenic silica, and grain size from a 12-m sediment core (VEC02A04) obtained from Frederick Sound in the Seymour-Belize Inlet Complex of British Columbia, Canada. Sediments are characterized by graded, massive, and laminated intervals. Laminated intervals are most common between c. 2948-2708 cal. yr BP and c. 1992-1727 cal. yr BP. Increased preservation of laminated sediments and diatom assemblage changes at this time suggest that climate became moderately drier and cooler relative to the preceding and succeeding intervals. Spectral and wavelet analyses are used to test for statistically significant periodicities in time series of proxies of primary production (total diatom abundance, biogenic silica) and hydrology (grain size) preserved in the Frederick Sound record. Periodicities of c. 42-53, 60-70, 82-89, 241-243, and 380 yrs are present. Results are compared to reconstructed sunspot number data of Solanki et al. (2004) using cross wavelet transform to evaluate the role of solar forcing on NE Pacific climate. Significant common power of periodicities between c. 42-60, 70-89, 241-243, and of 380 yrs occur, suggesting that celestial forcing impacted late Holocene climate at Frederick Sound. Replication of the c. 241-243 yr periodicity in sunspot time series is most pronounced between c. 2900 cal. yr BP and c. 2000 cal. yr BP, broadly correlative to the timing of maximum preservation of laminated sedimentary successions and diatom assemblage changes. High solar activity at the Suess/de Vries band may have been manifested as a prolonged westward shift and/or weakening of the Aleutian Low in the mid-late Holocene, which would have diverted fewer North Pacific storms and resulted in the relatively dry conditions reconstructed for the Seymour-Belize Inlet Complex.
Article
Energy budget estimates of equilibrium climate sensitivity (ECS) and transient climate response (TCR) are derived using the comprehensive 1750–2011 time series and the uncertainty ranges for forcing components provided in the Intergovernmental Panel on Climate Change Fifth Assessment Working Group I Report, along with its estimates of heat accumulation in the climate system. The resulting estimates are less dependent on global climate models and allow more realistically for forcing uncertainties than similar estimates based on forcings diagnosed from simulations by such models. Base and final periods are selected that have well matched volcanic activity and influence from internal variability. Using 1859–1882 for the base period and 1995–2011 for the final period, thus avoiding major volcanic activity, median estimates are derived for ECS of 1.64 K and for TCR of 1.33 K. ECS 17–83 and 5–95 % uncertainty ranges are 1.25–2.45 and 1.05–4.05 K; the corresponding TCR ranges are 1.05–1.80 and 0.90–2.50 K. Results using alternative well-matched base and final periods provide similar best estimates but give wider uncertainty ranges, principally reflecting smaller changes in average forcing. Uncertainty in aerosol forcing is the dominant contribution to the ECS and TCR uncertainty ranges.
Conference Paper
This paper presents a novel neural network-based approach to short-term, multi-step-ahead wind speed forecasting. The methodology combines predictions from a set of feed-forward neural networks whose inputs comprise 11 explanatory variables related to past averages of wind speed, direction, temperature and time of day, and whose outputs represent estimates of specific wind speed averages. Forecast horizons range from 30 minutes up to 6.5 hours ahead, in 30-minute time steps. Final forecasts at specific horizons are combinations of the corresponding neural network predictions. Data used in the experiments are telemetric measurements of weather variables from five wind farms in eastern Canada, covering the period from November 2011 to April 2013. Results show that the methodology is effective and outperforms established reference models, particularly at longer horizons. The method performed consistently across sites, leading to more than 60% improvement over persistence and 50% over a more realistic MA-based reference.
Article
A new summer temperature proxy was built for northern Fennoscandia in AD 1000-2004 using parameters of tree growth from a large region, extending from the Swedish Scandes to the Kola Peninsula. It was found that century-scale (55-140 year) cyclicity is present in this series during the entire time interval. This periodicity is highly significant and has a bi-modal structure, i.e. it consists of two oscillation modes, 55-100 year and 100-140 year variations. A comparison of the century-long variation in the northern Fennoscandian temperature proxy with the corresponding variations in Wolf numbers and concentration of cosmogenic 10Be in glacial ice shows that a probable cause of this periodicity is the modulation of regional climate by the secular solar cycle of Gleissberg. This is in line with the results obtained previously for a more limited part of the region (Finnish Lapland: 68-70° N, 20-30° E). Thus the reality of a link between long-term changes in solar activity and climate in Fennoscandia has been confirmed. Possible mechanisms of solar influence on the lower troposphere are discussed.
Article
Predictions of climate change are uncertain mainly because of uncertainties in the emissions of greenhouse gases and how sensitive the climate is to changes in the abundance of the atmospheric constituents. The equilibrium climate sensitivity is defined as the temperature increase because of a doubling of the CO2 concentration in the atmosphere when the climate reaches a new steady state. CO2 is only one out of the several external factors that affect the global temperature, called radiative forcing mechanisms as a collective term. In this paper, we present a model framework for estimating the climate sensitivity. The core of the model is a simple, deterministic climate model based on elementary physical laws such as energy balance. It models yearly hemispheric surface temperature and global ocean heat content as a function of historical radiative forcing. This deterministic model is combined with an empirical, stochastic model and fitted to observations on global temperature and ocean heat content, conditioned on estimates of historical radiative forcing. We use a Bayesian framework, with informative priors on a subset of the parameters and flat priors on the climate sensitivity and the remaining parameters. The model is estimated by Markov Chain Monte Carlo techniques. Copyright © 2012 John Wiley & Sons, Ltd.
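A toy version of such a Bayesian framework is sketched below: a zero-dimensional energy-balance model, C dT/dt = F(t) − λT, is fitted to synthetic 'observations' with a random-walk Metropolis sampler over the feedback parameter λ, from which a sensitivity follows as F_2x/λ. The forcing ramp, heat capacity, noise level and prior range are all placeholders, and the model is far simpler than the hemispheric EBM and MCMC machinery described in the paper.

```python
# A toy illustration of the Bayesian approach described above: a zero-dimensional
# energy-balance model C dT/dt = F(t) - lambda*T, with a Metropolis sampler over the
# feedback parameter lambda (ECS = F_2x / lambda). Everything here (forcing ramp,
# observational noise, priors) is a placeholder, far simpler than the paper's framework.
import numpy as np

rng = np.random.default_rng(7)
years = np.arange(1850, 2011)
forcing = np.linspace(0.0, 2.5, years.size)          # idealised forcing ramp, W m^-2
C = 8.0                                              # heat capacity, W yr m^-2 K^-1

def simulate(lam):
    T = np.zeros(years.size)
    for i in range(1, years.size):                   # explicit Euler, 1-year step
        T[i] = T[i - 1] + (forcing[i - 1] - lam * T[i - 1]) / C
    return T

obs = simulate(1.2) + rng.normal(scale=0.1, size=years.size)   # synthetic "observations"

def log_post(lam):
    if lam <= 0.1 or lam > 5.0:                      # flat prior on a plausible range
        return -np.inf
    return -0.5 * np.sum((simulate(lam) - obs) ** 2) / 0.1 ** 2

lam, samples = 2.0, []
for _ in range(5000):                                # random-walk Metropolis sampler
    prop = lam + rng.normal(scale=0.05)
    if np.log(rng.random()) < log_post(prop) - log_post(lam):
        lam = prop
    samples.append(lam)

ecs = 3.7 / np.array(samples[1000:])                 # discard burn-in, convert to ECS
print(f"posterior median ECS ~ {np.median(ecs):.2f} K")
```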