Math Geosci (2022) 54:459–465
https://doi.org/10.1007/s11004-022-09998-6
SPECIAL ISSUE
Special Issue: Geostatistics and Machine Learning
Sandra De Iaco¹ · Dionissios T. Hristopulos² · Guang Lin³
Received: 14 February 2022 / Accepted: 15 February 2022 / Published online: 21 March 2022
© The Author(s) 2022, corrected publication 2022
Abstract Recent years have seen a steady growth in the number of papers that apply
machine learning methods to problems in the earth sciences. Although they have
different origins, machine learning and geostatistics share concepts and methods. For
example, the kriging formalism can be cast in the machine learning framework of
Gaussian process regression. Machine learning, with its focus on algorithms and ability
to seek, identify, and exploit hidden structures in big data sets, is providing new
tools for exploration and prediction in the earth sciences. Geostatistics, on the other
hand, offers interpretable models of spatial (and spatiotemporal) dependence. This
special issue on Geostatistics and Machine Learning aims to investigate applications of
machine learning methods as well as hybrid approaches combining machine learning
and geostatistics which advance our understanding and predictive ability of spatial
processes.
Keywords Geostatistics · Statistical learning · Machine learning · Spatial process · Gaussian process regression
✉ Sandra De Iaco
sandra.deiaco@unisalento.it
Dionissios T. Hristopulos
dchristopoulos@ece.tuc.gr
Guang Lin
guanglin@purdue.edu
1 Department of Economic Sciences, Sect. of Mathematics and Statistics, University of Salento, Lecce, Italy
2 School of Electrical and Computer Engineering, Technical University of Crete, 73100 Chania, Greece
3 Department of Mathematics & School of Mechanical Engineering, Purdue University, West Lafayette, IN 47907, USA
1 Introduction
This special issue explores connections between Geostatistics and Machine Learning, and their applications in spatial data processing and modeling. Applications of machine learning in the geosciences have become quite popular in recent years following the development of tools such as random forests and deep learning. A search in the databases of the IAMG journals Mathematical Geosciences and Computers and Geosciences with the keyword “machine learning” returns 105 and 319 hits, respectively. The majority of these contributions are dated after 2016, a fact which indicates the accelerating interest in machine learning for the analysis of spatial data. In the following paragraphs of this section we present a partial and undoubtedly biased account of some links between geostatistics and machine learning. We also briefly report on some recent developments in machine learning which we believe are relevant for the geosciences. Finally, we touch on remaining methodological and computational challenges.
The application of machine learning to earth science data has been spearheaded by Mikhail Kanevski and coworkers (Demyanov et al. 1998; Kanevski et al. 2004, 2009; Kanevski and Demyanov 2015). In recent years, following the increasing interest in machine learning, several review papers have discussed its potential uses in geosciences and remote sensing (Dramsch 2020; Karpatne et al. 2019; Lary et al. 2016; Shen et al. 2021). A nontechnical account which focuses on the challenges related to the extraction of information from earth science data sets and the opportunities created by machine learning is given in Maskey et al. (2020).
Modeling of spatial data in the earth sciences usually involves one of the following three fundamental problems: (i) the classification problem, which concerns predicting the class label for categorical data; (ii) the regression problem, which is related to the prediction of continuous data; and (iii) the problem of probability density function estimation for uncertain processes (Williams and Rasmussen 2006; Kanevski et al. 2009). These problems can be addressed by means of geostatistical methods or machine learning models and algorithms, or in terms of combined solutions. Both machine learning and geostatistics provide powerful frameworks for spatial data processing. Combinations of these two approaches can lead to flexible and computationally efficient spatial models, as some of the papers in this special issue highlight.
As mentioned in the abstract, certain machine learning methods share concepts with geostatistical approaches. For example, geostatistical interpolation by means of optimal linear estimation (kriging) and Gaussian process regression are both based on the theory of Gaussian random fields (Gaussian processes) (Adler and Taylor 2009; Yaglom 1987; Chilès and Delfiner 2012; Williams and Rasmussen 2006). Positive definite functions (i.e., “covariance functions” in geostatistics and “covariance kernels” in machine learning) play a key role in problems of interpolation, classification, clustering, and simulation, whether these are treated in the geostatistical or in the machine learning framework. Machine learning, however, also includes methods which are based on algorithms (procedures consisting of specific steps) instead of explicitly defined mathematical models.
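To make the kriging–Gaussian process correspondence concrete, the following minimal sketch computes the Gaussian process posterior mean and variance with a squared-exponential covariance, which coincides with simple kriging with a known (zero) mean; the kernel choice, its parameters, and the toy data are illustrative assumptions, not values from any of the cited studies.

```python
# Minimal sketch: simple kriging == Gaussian process posterior mean (zero-mean case).
# The kernel choice, its parameters, and the toy data are illustrative assumptions.
import numpy as np

def sq_exp_cov(x1, x2, sigma2=1.0, length=0.5):
    """Squared-exponential (Gaussian) covariance between two sets of 1-D locations."""
    d = x1[:, None] - x2[None, :]
    return sigma2 * np.exp(-0.5 * (d / length) ** 2)

rng = np.random.default_rng(0)
x_obs = rng.uniform(0.0, 1.0, size=20)                              # observation locations
y_obs = np.sin(2 * np.pi * x_obs) + 0.1 * rng.standard_normal(20)   # noisy observations
x_new = np.linspace(0.0, 1.0, 5)                                    # prediction locations

nugget = 0.01                                           # noise variance ("nugget" in geostatistics)
K = sq_exp_cov(x_obs, x_obs) + nugget * np.eye(len(x_obs))
k_star = sq_exp_cov(x_new, x_obs)

weights = np.linalg.solve(K, y_obs)                     # kriging weights applied to the data
mean = k_star @ weights                                 # posterior (kriging) mean
cov = sq_exp_cov(x_new, x_new) - k_star @ np.linalg.solve(K, k_star.T)
print(mean)
print(np.sqrt(np.diag(cov)))                            # kriging standard deviations
```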
A key issue that both machine learning and geostatistical approaches need to address in the face of big earth data sets is the scaling of the required computational resources with the data size N. For example, regression and classification tasks with spatially dependent data require the inversion of dense covariance (Gram) matrices, an operation which has a computational complexity of O(N³). This scaling affects both kriging and Gaussian process-based methods, and it is prohibitive for very big data sets (Chilès and Delfiner 2012; Hristopulos 2020; Williams and Rasmussen 2006). The problem can be alleviated by means of different methods such as the stochastic partial differential equation (SPDE) approach that relies on a sparse solution basis (Lindgren et al. 2011, 2021; Vergara et al. 2022), the stochastic local interaction approach (Hristopulos 2015; Hristopulos et al. 2021) that exploits sparse expressions for the precision (inverse covariance) matrix, or composite likelihood methods that break down the calculation of the likelihood in terms of smaller subsets of the data (Bevilacqua et al. 2012).
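As a rough illustration of why sparse precision (inverse covariance) matrices help, the sketch below conditions a Gaussian Markov random field on observed nodes using a sparse solve instead of a dense covariance inversion; the chain-graph precision matrix and the observed/unobserved split are assumptions made for the example, not the formulation of the SPDE, stochastic local interaction, or composite likelihood methods cited above.

```python
# Sketch: conditional mean of a Gaussian Markov random field via its sparse precision matrix.
# The chain-graph precision matrix and the observed/unobserved split are illustrative assumptions.
import numpy as np
import scipy.sparse as sp
import scipy.sparse.linalg as spla

n = 10_000                                   # number of nodes on a 1-D chain graph
main = 2.01 * np.ones(n)                     # small diagonal shift keeps Q positive definite
off = -1.0 * np.ones(n - 1)
Q = sp.diags([off, main, off], offsets=[-1, 0, 1], format="csc")   # sparse precision matrix

obs = np.arange(0, n, 100)                   # every 100th node is observed
unobs = np.setdiff1d(np.arange(n), obs)
y_obs = np.sin(obs / 500.0)                  # synthetic observed values (zero prior mean assumed)

# Conditional mean of the unobserved nodes: mu_u = -Q_uu^{-1} Q_uo y_obs (standard GMRF formula)
Q_uu = Q[unobs][:, unobs]
Q_uo = Q[unobs][:, obs]
mu_u = spla.spsolve(Q_uu, -(Q_uo @ y_obs))   # sparse solve instead of dense O(N^3) inversion
print(mu_u[:5])
```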
Neural networks are a cornerstone of modern machine learning. These models can be trained to discover features which are hidden in high-dimensional data sets. Neural networks comprise a large number of parameters which need to be tuned, so overfitting is a likely problem. However, this can be mitigated within the Bayesian framework by assigning probability distributions (instead of single values) to the weights of the connections between different neurons. Bayesian neural networks have the ability to capture cross-correlations and are therefore potentially useful in problems that involve data with spatial or spatiotemporal dependence. A Bayesian neural network contains a number of hidden layers where information is processed. “Deep neural networks” involve a high number of such layers. Surprisingly, the limit of an infinitely wide neural network is a Gaussian process (Neal 1996). Neural networks that are not infinitely wide can capture correlations between different output variables. This feature can lead to improved spatial prediction in the case of multivariate data sets (Wilson et al. 2011). A well-known spatial data set from the Swiss Jura Mountains comprises measurements of soil concentration for seven toxic metals (Goovaerts 1997). The Gaussian process regression network (GPRN) developed by Wilson et al. (2011) predicted cadmium concentration more accurately (i.e., with lower mean absolute error) than co-kriging. Improved Gaussian process regression models for multivariate problems (called “multi-output” in machine learning jargon) have since been developed; these include the Gaussian process autoregressive regression model (GPAR) (Requeima et al. 2019) and the multi-output Gaussian processes (MOGPs) (Bruinsma et al. 2020). To our knowledge, the strength of these methods has not yet been investigated in earth sciences applications.
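The following toy sketch illustrates the Neal (1996) limit numerically: it draws many random single-hidden-layer networks of increasing width and shows that the prior covariance of the outputs at two fixed inputs stabilizes (and the output distribution becomes approximately Gaussian) as the width grows; the tanh activation and the 1/width weight scaling are illustrative assumptions.

```python
# Sketch of the infinite-width limit (Neal 1996): outputs of random one-hidden-layer
# networks approach a Gaussian process prior as the width grows. The tanh activation
# and the 1/width output-weight scaling are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(42)
x = np.array([0.3, 0.7])                         # two fixed input locations

for width in (10, 100, 1000):
    n_draws = 2000
    # hidden-layer weights/biases and output weights for n_draws random networks
    w1 = rng.standard_normal((n_draws, width, 1))
    b1 = rng.standard_normal((n_draws, width, 1))
    w2 = rng.standard_normal((n_draws, 1, width)) / np.sqrt(width)  # scale keeps the variance finite
    h = np.tanh(w1 * x[None, None, :] + b1)      # hidden activations at both inputs
    f = (w2 @ h)[:, 0, :]                        # network outputs, shape (n_draws, 2)
    print(width, np.cov(f.T).round(3))           # empirical 2x2 output covariance stabilizes with width
```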
An important topic in geosciences is the spatiotemporal modeling of dynamic environmental processes. Reliable conceptual and quantitative models are necessary to achieve improved understanding, to better forecast potential environmental hazards, and to quantify uncertainties. This general problem can be pursued by means of two different approaches. The first one involves data-driven spatiotemporal prediction (e.g., regression and classification) by means of geostatistical and machine learning methods. One of the open problems is the formulation of epistemically adequate and computationally efficient methods of characterizing spatiotemporal dependence (Christakos 2000; De Iaco et al. 2001, 2002; Cappello et al. 2018; Hristopulos and Agou 2020; Cappello et al. 2020; Porcu et al. 2021). Addressing this problem requires the construction of space-time covariance functions or precision operators which are mathematically well defined and capture the dynamically generated correlations of realistic space-time systems. The issue of proper definition of covariance functions (i.e., functions that satisfy permissibility conditions) is well known in mathematics and statistics (De Iaco and Posa 2018), but it is not always recognized in the applied sciences literature. This oversight can lead to the use of non-permissible covariance models which result in numerical instabilities. Computational efficiency requires the implementation of methods that can alleviate the problem of numerical inversion of very large matrices resulting from extended space-time domains.
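A classical example of a permissible construction is the product-sum class of De Iaco et al. (2001), which builds a valid space-time covariance from valid purely spatial and purely temporal components; the notation below is a standard formulation of that model rather than a quotation from the paper.

```latex
% Product-sum space-time covariance (De Iaco et al. 2001):
% C_s and C_t are permissible spatial and temporal covariance functions,
% \mathbf{h} is the spatial lag, u the temporal lag, and the coefficients
% satisfy k_1 > 0, k_2 \ge 0, k_3 \ge 0, which guarantees permissibility.
C(\mathbf{h}, u) = k_1\, C_s(\mathbf{h})\, C_t(u) + k_2\, C_s(\mathbf{h}) + k_3\, C_t(u)
```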
A different approach to spatiotemporal modeling involves the solution of partial differential equations that model specific earth processes (e.g., transport of contaminants in the groundwater, or geophysical fluid dynamics). Machine learning is providing new tools, such as physics-informed neural networks (PINNs) (Karniadakis et al. 2021; Yang et al. 2021), for this type of problem. In the PINN framework, deep neural networks are trained using a combination of data and constraints imposed by the physical laws. This hybrid framework gives more weight to the model of the system when the data are sparse, but progressively shifts focus to the data when the latter are abundant. PINNs can be used for forward, inverse, and high-dimensional problems.
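As a rough illustration of the PINN training loop, the sketch below fits a small network to a toy one-dimensional Poisson equation by penalizing the PDE residual at collocation points together with the boundary conditions; the equation, architecture, and optimizer settings are illustrative assumptions and are not taken from the cited references.

```python
# Minimal PINN sketch (illustrative assumptions: PyTorch, toy 1-D Poisson problem
# u''(x) = -sin(x) on [0, pi] with u(0) = u(pi) = 0, exact solution u(x) = sin(x)).
import math
import torch

torch.manual_seed(0)
net = torch.nn.Sequential(
    torch.nn.Linear(1, 32), torch.nn.Tanh(),
    torch.nn.Linear(32, 32), torch.nn.Tanh(),
    torch.nn.Linear(32, 1),
)
optimizer = torch.optim.Adam(net.parameters(), lr=1e-3)

x_col = torch.linspace(0.0, math.pi, 64).reshape(-1, 1).requires_grad_(True)  # collocation points
x_bc = torch.tensor([[0.0], [math.pi]])                                       # boundary points

for step in range(2000):
    optimizer.zero_grad()
    u = net(x_col)
    # first and second derivatives of the network output via automatic differentiation
    du = torch.autograd.grad(u, x_col, torch.ones_like(u), create_graph=True)[0]
    d2u = torch.autograd.grad(du, x_col, torch.ones_like(du), create_graph=True)[0]
    pde_residual = d2u + torch.sin(x_col)        # u'' + sin(x) should vanish
    loss_pde = (pde_residual ** 2).mean()        # physics constraint at collocation points
    loss_bc = (net(x_bc) ** 2).mean()            # enforce u(0) = u(pi) = 0
    loss = loss_pde + loss_bc
    loss.backward()
    optimizer.step()

print(float(loss))                               # should be small after training
```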
In the next section, we present a short introduction to the six articles of this special issue on Geostatistics and Machine Learning. The topics covered in these papers represent an eclectic selection of practical problems and machine learning approaches used to tackle them. The contributions of the special issue also present fertile combinations of machine learning and geostatistical methods tailored to address problems that involve spatial dependence.
2 Summary of Articles in this Special Issue
The paper titled “A comparison between machine learning and functional geostatistics approaches for data-driven analyses of solid transport in a pre-Alpine stream” by Oleksandr Didkovskyi et al. focuses on predicting the probability of pebble movement in streams using two different approaches: the machine learning method of gradient boosting decision trees (based on the computationally efficient XGBoost algorithm) and the geostatistical method of functional kriging. Both approaches take into account geometrical features of pebbles and the stream flow rate as input variables. The performance of the two methods is compared in terms of the accuracy with which they classify the motion (or lack of mobility) of pebbles. The probability of movement has a highly nonlinear dependence on the morphological features and the stream’s flow rate and is thus difficult to predict using physics-based methods. In spite of the quite different perspectives of XGBoost and functional kriging, analysis of the results shows that both methods perform similarly well and can provide useful modeling frameworks for sediment transport.
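For orientation, the generic sketch below shows how a gradient boosting classifier of this kind is typically set up with the XGBoost library; the synthetic stand-in features and labels are assumptions for illustration only and do not reflect the data or model configuration of the paper.

```python
# Generic XGBoost classification sketch (synthetic stand-in features and labels,
# not the data or configuration used in the paper): predict a binary "pebble moved"
# label from assumed geometrical features and stream flow rate.
import numpy as np
from xgboost import XGBClassifier

rng = np.random.default_rng(1)
n = 500
X = np.column_stack([
    rng.lognormal(3.0, 0.5, n),    # assumed pebble size proxy (e.g., axis length)
    rng.uniform(0.1, 1.0, n),      # assumed shape/flatness index
    rng.uniform(0.5, 5.0, n),      # assumed stream flow rate
])
# synthetic label: larger flow and smaller pebbles move more often
logit = 1.5 * X[:, 2] - 0.05 * X[:, 0] + rng.normal(0, 1, n)
y = (logit > np.median(logit)).astype(int)

model = XGBClassifier(n_estimators=200, max_depth=3, learning_rate=0.1)
model.fit(X[:400], y[:400])                       # train on the first 400 samples
prob_move = model.predict_proba(X[400:])[:, 1]    # predicted probability of movement
print(prob_move[:5])
```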
The paper titled “Bayesian deep learning for spatial interpolation in the presence of auxiliary information” by Charlie Kirkwood et al. focuses on feature learning in a geostatistical context, by showing how deep neural networks can automatically learn the complex high-order patterns by which point-sampled target variables relate to gridded auxiliary variables, and in doing so produce detailed maps. This work demonstrates how both aleatoric and epistemic uncertainty can be quantified in the deep learning approach via a Bayesian approximation known as Monte Carlo dropout. Numerical results indicate the suitability of Bayesian deep learning and its feature learning capabilities for large-scale geostatistical applications.
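For readers unfamiliar with the technique, the generic sketch below shows the basic Monte Carlo dropout recipe (keep dropout active at prediction time and summarize the spread of repeated stochastic forward passes); it illustrates the general idea under an assumed architecture and input shape, and is not the authors’ implementation.

```python
# Generic Monte Carlo dropout sketch (not the implementation of the cited paper):
# keep dropout layers active at prediction time and treat the spread of repeated
# stochastic forward passes as an approximate measure of epistemic uncertainty.
import torch

torch.manual_seed(0)
net = torch.nn.Sequential(
    torch.nn.Linear(4, 64), torch.nn.ReLU(), torch.nn.Dropout(p=0.2),
    torch.nn.Linear(64, 64), torch.nn.ReLU(), torch.nn.Dropout(p=0.2),
    torch.nn.Linear(64, 1),
)
# ... train the network on (coordinates + auxiliary covariates, target) pairs here ...

x_new = torch.randn(8, 4)        # 8 prediction locations with 4 assumed input features
net.train()                      # keep dropout stochastic during prediction
with torch.no_grad():
    samples = torch.stack([net(x_new) for _ in range(100)])   # 100 stochastic forward passes
pred_mean = samples.mean(dim=0).squeeze(-1)
pred_std = samples.std(dim=0).squeeze(-1)    # larger spread = higher predictive uncertainty
print(pred_mean, pred_std)
```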
The paper titled “Surface Warping Incorporating Machine Learning Assisted Domain Likelihood Estimation: A New Paradigm in Mine Geology Modelling and Automation” by Raymond Leung et al. introduces the use of machine learning to support a Bayesian warping technique applied to reshape modeled surfaces on the basis of new geochemical observations and spatial constraints. This helps to improve the identification of boundaries between different spatial domains for grade estimation in mining, a complex problem which is set here in a Bayesian framework. A strength of the manuscript is the assessment of the effectiveness of a range of classifiers: the machine learning performance is evaluated for neural network, random forest, gradient boosting, and other classifiers in both binary and multi-class settings. The manuscript represents progress in this evolving field, and further research will continue to address the problems presented.
The paper titled “A Hybrid Estimation Technique Using Elliptical Radial Basis Neural Networks and Cokriging” by Matthew Samson and Clayton V. Deutsch focuses on a hybrid machine learning and geostatistical algorithm to improve estimation in complex domains. The hybrid estimation technique integrates both elliptical radial basis neural networks and cokriging. Elliptical radial basis function neural networks (ERBFN) take advantage of nonstationary functions to generate geological estimates. An ERBFN does not require the assumption of stationarity, and the only input features required are the spatial coordinates of the known data. The proposed hybrid estimation considers the machine learning estimate as exhaustive secondary data in ordinary intrinsic collocated cokriging, taking advantage of kriging’s exactitude while including the nonstationary features modeled in the ERBFN. The numerical results demonstrate that this hybrid method can greatly improve mineral resource estimation.
The paper titled “Stochastic Modelling of Mineral Exploration Targets” by Hasan Talebi et al. focuses on the topic of mineral prospectivity mapping and proposes a method that can handle various types of uncertainties. The authors propose a multivariate stochastic model which can be used for prediction and uncertainty quantification of mineral exploration targets. The model combines multivariate geostatistical simulations with a spatial machine learning (random forest) algorithm. The latter incorporates information from higher-order spatial statistics. The proposed approach is tested using a synthetic case study with multiple geochemical, geophysical, and lithological attributes. The new hybrid (geostatistics/machine learning) method demonstrates enhanced detection capabilities and thus provides a promising tool for investigating mineral prospectivity.
The paper titled “Robust Feature Extraction for Geochemical Anomaly Recognition Using a Stacked Convolutional Denoising Autoencoder” by Yihui Xiong and Renguang Zuo focuses on an optimized deep neural network for the recognition of multivariate geochemical anomalies, especially in the presence of missing values. In particular, the authors propose a stacked convolutional denoising autoencoder (SCDAE) to extract robust features and decrease the sensitivity to partially corrupted data. The corresponding parameters, which include the network depth, the number of convolution and pooling layers, the number of filters and their respective sizes (i.e., the number and size of the convolution and pooling kernels), and the sliding stride, are optimized using trial-and-error experiments. The performance of the optimal SCDAE architecture in recognizing multivariate geochemical anomalies, based on the differences in reconstruction errors between sample populations, is discussed through a case study of mineralization in southwestern Fujian Province. The authors also show that the SCDAE has a better feature representation capacity than both the stacked convolutional autoencoder and the stacked denoising autoencoder for geochemical anomaly recognition at different corruption levels. The robustness of the SCDAE encourages its application to various geochemical exploration scenarios, especially when there are incomplete or missing data.
Acknowledgements GL gratefully acknowledges the support of the National Science Foundation (DMS-1555072, DMS-1736364, CMMI-1634832, and CMMI-1560834), the Brookhaven National Laboratory Subcontract 382247, the ARO/MURI grant W911NF-15-1-0562, and the U.S. Department of Energy (DOE) Office of Science Advanced Scientific Computing Research program DE-SC0021142.
Funding Open access funding provided by Università degli Studi di Milano within the CRUI-CARE Agreement.
Open Access This article is licensed under a Creative Commons Attribution 4.0 International License,
which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long
as you give appropriate credit to the original author(s) and the source, provide a link to the Creative
Commons licence, and indicate if changes were made. The images or other third party material in this
article are included in the article’s Creative Commons licence, unless indicated otherwise in a credit line
to the material. If material is not included in the article’s Creative Commons licence and your intended use
is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission
directly from the copyright holder. To view a copy of this licence, visit http://creativecommons.org/licenses/by/4.0/.
References
Adler RJ, Taylor JE (2009) Random fields and geometry. Springer, Berlin
Bevilacqua M, Gaetan C, Mateu J, Porcu E (2012) Estimating space and space-time covariance functions
for large data sets: a weighted composite likelihood approach. J Am Stat Assoc 107(497):268–280
Bruinsma W, Perim E, Tebbutt W, Hosking S, Solin A, Turner R (2020) Scalable exact inference in
multi-output Gaussian processes. In: Daumé H, Singh A (eds) Proceedings of the 37th international
conference on machine learning, volume 119 of Proceedings of Machine Learning Research, PMLR,
pp 1190–1201
Cappello C, De Iaco S, Posa D (2018) Testing the type of non-separability and some classes of space-time
covariance function models. Stoch Environ Res Risk Assess 32:17–35
Cappello C, De Iaco S, Posa D (2020) covatest: an R package for selecting a class of space-time covariance
functions. J Stat Softw 94(1):1–42
Chilès JP, Delfiner P (2012) Geostatistics: modeling spatial uncertainty, 2nd edn. Wiley, New York
Christakos G (2000) Modern spatiotemporal geostatistics. Oxford University Press, Oxford
De Iaco S, Myers DE, Posa D (2001) Space-time analysis using a general product-sum model. Stat Probab
Lett 52(1):21–28
De Iaco S, Myers DE, Posa D (2002) Nonseparable space-time covariance models: some parametric families.
Math Geol 34(1):23–42
De Iaco S, Posa D (2018) Strict positive definiteness in geostatistics. Stoch Environ Res Risk Assess
32:577–590
Demyanov V, Kanevsky M, Chernov S, Savelieva E, Timonin V (1998) Neural network residual kriging
application for climatic data. J Geogr Inf Decis Anal 2(2):215–232
Dramsch JS (2020) 70 years of machine learning in geoscience in review. Adv Geophys 61:1–55
Goovaerts P (1997) Geostatistics for natural resources evaluation. Oxford University Press, New York, NY
Hristopulos DT (2015) Stochastic local interaction (SLI) model: bridging machine learning and geostatistics. Comput Geosci 85(Part B):26–37
Hristopulos DT (2020) Random fields for spatial data modeling. Springer, Dordrecht
Hristopulos DT, Agou VD (2020) Stochastic local interaction model with sparse precision matrix for space–time interpolation. Spat Stat 40:100403 (Space-time modeling of rare events and environmental risks: METMA conference)
Hristopulos DT, Pavlides A, Agou VD, Gkafa P (2021) Stochastic local interaction model: an alternative
to kriging for massive datasets. Math Geosci 53:1907–1949
Kanevski M, Demyanov V (2015) Statistical learning in geoscience modelling: novel algorithms and challenging case studies. Comput Geosci 85:1–2
Kanevski M, Kanevski MF, Maignan M (2004) Analysis and modelling of spatial environmental data, vol
6501. EPFL Press, Lausanne
Kanevski M, Timonin V, Pozdnukhov A (2009) Machine learning for spatial environmental data: theory,
applications, and software. EPFL Press, Lausanne
Karniadakis GE, Kevrekidis IG, Lu L, Perdikaris P, Wang S, Yang L (2021) Physics-informed machine learning. Nat Rev Phys 3(6):422–440
Karpatne A, Ebert-Uphoff I, Ravela S, Babaie HA, Kumar V (2019) Machine learning for the geosciences:
challenges and opportunities. IEEE Trans Knowl Data Eng 31(8):1544–1554
Lary DJ, Alavi AH, Gandomi AH, Walker AL (2016) Machine learning in geosciences and remote sensing.
Geosci Front 7(1):3–10
Lindgren F, Bolin D, Rue H (2021) The SPDE approach for Gaussian and non-Gaussian fields: 10 years and still running
Lindgren F, Rue H, Lindström J (2011) An explicit link between Gaussian fields and Gaussian Markov random fields: the stochastic partial differential equation approach. J R Stat Soc Ser B (Stat Methodol) 73(4):423–498
Maskey M, Alemohammad H, Murphy K, Ramachandran R (2020) Advancing AI for Earth science: a data
systems perspective. Eos 101
Neal RM (1996) Bayesian learning for neural networks, vol 118. Springer, New York
Porcu E, Furrer R, Nychka D (2021) 30 years of space-time covariance functions. WIREs Comput Stat
13(2):e1512
Requeima J, Tebbutt W, Bruinsma W, Turner RE (2019) The Gaussian process autoregressive regression model (GPAR). In: Chaudhuri K, Sugiyama M (eds) Proceedings of the twenty-second international conference on artificial intelligence and statistics, volume 89 of Proceedings of Machine Learning Research, PMLR, pp 1860–1869
Shen C, Chen X, Laloy E (2021) Editorial: Broadening the use of machine learning in hydrology. Frontiers
in Water 3
Vergara RC, Allard D, Desassis N (2022) A general framework for SPDE-based stationary random fields.
Bernoulli 28(1):1–32
Williams CKI, Rasmussen CE (2006) Gaussian processes for machine learning. MIT Press, Cambridge,
MA
Wilson AG, Knowles DA, Ghahramani Z (2011) Gaussian process regression networks. arXiv preprint arXiv:1110.4411
Yaglom AM (1987) Correlation theory of stationary and related random functions, vol I. Springer, New
York
Yang L, Meng X, Karniadakis GE (2021) B-PINNs: Bayesian physics-informed neural networks for forward
and inverse PDE problems with noisy data. J Comput Phys 425:109913