Computers & Geosciences

Published by Elsevier
Print ISSN: 0098-3004
Publications
Indicator kriging provides a flexible interpolation approach that is well suited for datasets where: 1) many observations are below the detection limit, 2) the histogram is strongly skewed, or 3) specific classes of attribute values are better connected in space than others (e.g. low pollutant concentrations). Applying indicator kriging to its full potential, however, requires the tedious inference and modeling of multiple indicator semivariograms, as well as post-processing of the results to retrieve attribute estimates and associated measures of uncertainty. This paper presents a computer code that automatically performs the following tasks: selection of thresholds for binary coding of continuous data, computation and modeling of indicator semivariograms, modeling of probability distributions at unmonitored locations (regular or irregular grids), and estimation of the mean and variance of these distributions. The program also offers tools for quantifying the goodness of the model of uncertainty within cross-validation and jack-knife frameworks. The different functionalities are illustrated using heavy metal concentrations from the well-known Jura soil dataset. A sensitivity analysis demonstrates the benefit of using more thresholds when indicator kriging is implemented with a linear interpolation model, in particular for variables with positively skewed histograms.
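A minimal sketch of the two steps that bracket this workflow, binary indicator coding at decile thresholds and post-processing of kriged probabilities into the mean and variance of the local distribution, assuming linear interpolation between thresholds; the toy data and the probability vector are synthetic stand-ins for kriging output, not the paper's program:

```python
# Sketch only: indicator coding and ccdf post-processing, not the paper's code.
import numpy as np

def indicator_transform(values, thresholds):
    """Binary coding: 1 if value <= threshold, one column per threshold."""
    return (values[:, None] <= thresholds[None, :]).astype(float)

def mean_var_from_ccdf(thresholds, probs, zmin, zmax):
    """Mean/variance of a local distribution defined by P(Z <= z_k) = probs[k],
    linearly interpolated between thresholds and within the two tails."""
    z = np.concatenate(([zmin], thresholds, [zmax]))
    F = np.concatenate(([0.0], np.clip(np.sort(probs), 0.0, 1.0), [1.0]))  # sort = order correction
    mids = 0.5 * (z[1:] + z[:-1])        # class midpoints
    w = np.diff(F)                       # probability mass of each class
    mean = np.sum(w * mids)
    return mean, np.sum(w * (mids - mean) ** 2)

data = np.random.lognormal(size=200)                # skewed toy variable
thr = np.quantile(data, np.linspace(0.1, 0.9, 9))   # nine decile thresholds
ind = indicator_transform(data, thr)                # input to indicator kriging
# 'probs' would normally come from kriging 'ind' at an unmonitored location:
probs = np.linspace(0.05, 0.95, 9)
print(mean_var_from_ccdf(thr, probs, data.min(), data.max()))
```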
 
This paper describes the development of a census data-mapping package for the Sirius microcomputer. The major objective was to display data from the 1981 United Kingdom Census Small Area Statistics on various map bases for use in published reports and for interactive comparisons among census variables.
 
Personal WaveLab 1.0 is intended as the starting point for a from-scratch development of seismic time-series analysis procedures for Windows-based personal computers. Our objective is two-fold. First, as a stand-alone application, it supports “basic” analysis of digital or digitised seismic waveforms. Second, thanks to its architectural characteristics, it can serve as the basis for developing more complex and more powerful applications. An expanded version of PWL, called SisPick!, is currently in use at the Istituto Nazionale di Geofisica e Vulcanologia (Italian Institute of Geophysics and Volcanology) for real-time monitoring for civil-protection purposes. This means that about 90 users have tested the application for more than 1 year, making its features more robust and efficient. SisPick! was also employed in the United Nations Nyiragongo Project, in Congo, and during the Stromboli emergency in the summer of 2002. The main appeals of the application package are ease of use, object-oriented design, good computational speed, minimal disk-space requirements and the complete absence of third-party components (including ActiveX). The Windows environment spares the user scripting or complex interaction with the system. The system is in constant development to answer the needs and suggestions of its users. Microsoft Visual Basic 6 source code, installation package, test data sets and documentation are available at no cost.
 
The ROCKMAG ANALYZER is a software package for determining rock magnetic parameters from a broad variety of rock magnetic measurements. It was particularly designed to visualize and evaluate data from isothermal remanent magnetization acquisition, coercivity curves, hysteresis loops and/or thermomagnetic curves. Various standard and non-standard rock magnetic parameters are calculated from these curves, thus accelerating and simplifying the quantitative analysis of the measured data. View options such as plotting of derivatives, para-/diamagnetic correction, etc. further enhance the data analysis. Procedures for smoothing and for data fitting by mathematical functions are implemented. Isothermal remanent magnetization acquisition and coercivity curves can be fitted by log-Gaussian functions, and hysteresis loops by hyperbolic basis functions. Curie temperature estimation from thermomagnetic curves is supported by two different automated approaches: a second-derivative method and an extrapolation method. A number of additional diagrams provide composite plots of parameters obtained from different measurements, such as the Day plot and the Henkel plot. The ROCKMAG ANALYZER was designed for the output file format of the Variable Field Translation Balance (MM VFTB). It also supports data from the PM VSM/AGFM. The ROCKMAG ANALYZER requires Win95/98/ME/2k/XP and is available at “http://www.geophysik.uni-muenchen.de/research/paleomagnetism/”.
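As a hedged illustration of the curve-fitting step, the sketch below fits a single cumulative log-Gaussian component to a synthetic IRM acquisition curve with SciPy; ROCKMAG ANALYZER's own fitting (possibly multi-component) is not reproduced here, and the parameter names are illustrative:

```python
# Sketch: one-component log-Gaussian fit to a synthetic IRM acquisition curve.
import numpy as np
from scipy.optimize import curve_fit
from scipy.stats import norm

def irm_component(B, Mrs, B_half, DP):
    """Cumulative log-Gaussian: Mrs = saturation IRM, B_half = median
    acquisition field, DP = dispersion in log10 field units."""
    return Mrs * norm.cdf(np.log10(B), loc=np.log10(B_half), scale=DP)

B = np.logspace(0, 3, 40)                         # applied field, mT
M = irm_component(B, 1.0, 60.0, 0.3)              # synthetic "measurement"
M = M + np.random.normal(scale=0.01, size=B.size) # add noise

popt, pcov = curve_fit(irm_component, B, M,
                       p0=(M.max(), 50.0, 0.3), bounds=(0, np.inf))
print("Mrs=%.3f  B1/2=%.1f mT  DP=%.3f" % tuple(popt))
```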
 
Numerical simulators of the dynamics of strata formation of continental margins fuse information from the atmosphere, ocean and regional geology. Such models can provide information for areas and times for which actual measurements are not available, or for when purely statistical estimates are not adequate by themselves. SEDFLUX is such a basin-fill model, written in ANSI-standard C, able to simulate the delivery of sediment and its accumulation over time scales of tens of thousands of years. SEDFLUX includes the effects of sea-level fluctuations, river floods, ocean storms, and other relevant environmental factors (climate trends, random catastrophic events), at a time step (daily to yearly) that is sensitive to short-term variations of the seafloor. SEDFLUX combines individual process-response models into one fully interactive model, delivering a multi-sized sediment load onto and across a continental margin, including sediment redistribution by (1) river mouth dynamics, (2) buoyant surface plumes, (3) hyperpycnal flows, (4) ocean storms, (5) slope instabilities, (6) turbidity currents, and (7) debris flows. The model allows the deposit to compact, and to undergo tectonic processes (faults, uplift) and isostatic subsidence under the sediment load. The modeled architecture has a typical vertical resolution of 1–25 cm, and a typical horizontal resolution of between 1 and 100 m.
 
Recent advances in theoretical geochemistry permit calculation of the standard molal thermodynamic properties of a wide variety of minerals, gases, aqueous species, and reactions from 1 to 5000 bar and 0 to 1000°C. The SUPCRT92 software package facilitates practical application of these recent theories, equations, and data to define equilibrium constraints on geochemical processes in a wide variety of geologic systems. The SUPCRT92 package is composed of three interactive FORTRAN 77 programs, SUPCRT92, MPRONS92, and CPRONS92, and a sequential-access thermodynamic database, SPRONS92.DAT. The SUPCRT92 program reads or permits user-generation of its two input files, CON and RXN, retrieves data from the direct-access equivalent of SPRONS92.DAT, calculates the standard molal Gibbs free energy, enthalpy, entropy, heat capacity, and volume of each reaction specified on the RXN file through a range of conditions specified on the CON file, and writes the calculated reaction properties to the output TAB file and, optionally, to PLT files that facilitate their graphical depiction. Calculations can be performed along the liquid side of the H2O vaporization boundary by specifying either temperature (T) or pressure (P), and in the single-phase regions of fluid H2O by specifying either T and P, T and H2O density, T and log K, or P and log K. SPRONS92.DAT, which contains standard molal thermodynamic properties at 25°C and 1 bar, equation-of-state parameters, heat capacity coefficients, and phase transition data for approximately 500 minerals, gases, and aqueous species, can be augmented or otherwise modified using MPRONS92, and converted to its direct-access equivalent using CPRONS92.
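For orientation, the sketch below shows the standard-state skeleton of such a calculation: the Gibbs energy of reaction at elevated temperature from 25°C reference values and Maier-Kelley heat capacity coefficients, converted to log K. The pressure-volume terms and the H2O equation of state, which SUPCRT92 handles rigorously, are deliberately omitted, and the coefficients are hypothetical:

```python
# Sketch of the standard-state thermodynamics behind a SUPCRT-type run:
# dG(T) from dG(Tr), dS(Tr) and Cp = a + b*T + c/T**2 (Maier-Kelley),
# then log K = -dG / (ln(10)*R*T). Pressure terms are omitted here.
import numpy as np

R = 8.31446          # gas constant, J/(mol K)
Tr = 298.15          # reference temperature, K

def delta_g(T, dG_r, dS_r, a, b, c):
    """Reaction Gibbs energy at T (reference pressure only)."""
    cp_int = a*(T - Tr) + b/2*(T**2 - Tr**2) - c*(1/T - 1/Tr)             # int Cp dT
    cp_int_over_T = a*np.log(T/Tr) + b*(T - Tr) - c/2*(1/T**2 - 1/Tr**2)  # int Cp/T dT
    return dG_r - dS_r*(T - Tr) + cp_int - T*cp_int_over_T

def log_k(T, *args):
    return -delta_g(T, *args) / (np.log(10.0) * R * T)

# hypothetical reaction coefficients (J and J/(mol K) units)
print(log_k(473.15, -20000.0, 35.0, 50.0, 0.01, -1.0e6))
```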
 
Although embryonic, quantitative biostratigraphy is alive and hopefully well. The potential for future growth clearly is present and will be fostered by IGCP Project 148, which is charged with the development of algorithms as well as the evaluation of methods through rigorous case studies. Sequencing methods produce a dimensionless list of events. Such a list is effective for correlation but yields little information about environments or discontinuities between different faunal zones. Quantified assemblage zones provide information about environments and stratigraphy although the resulting correlations usually are less precise. Assemblage zones also point out the locations of faunal discontinuities. The information contained in an evolutionary sequence consists of a continuous or discontinuous spectrum of morphological or taxonomic changes in time and space, depending on the method of analysis and the evolutionary lineage or lineages under study.
 
This paper recommends computational procedures for employing auxiliary maps, such as maps of drainage patterns, land cover and remote-sensing-based indices, directly in the geostatistical modeling of topography. The methodology is based on the regression-kriging technique, as implemented in the R package gstat. The computational procedures are illustrated using a case study in the south-west part of Serbia. Two point data sets were used for geostatistical modeling: (1) 2051 elevation points were used to generate DEMs and (2) an independent data set (1020 points) was used to assess errors in the topo-DEM and the SRTM-DEM. Four auxiliary maps were used to improve the generation of DEMs from point data: (1) distance to streams, (2) terrain complexity measured with a standard deviation filter, (3) an analytical hillshading map and (4) an NDVI map derived from a Landsat image. The auxiliary predictors were significantly correlated with elevations (adj. R²=0.20) and DEM errors (adj. R²=0.27). By including auxiliary maps in the geostatistical modeling of topography, realizations of DEMs can be generated that represent the geomorphology of a terrain more accurately. In addition, downscaling of a coarse 3 arcsec SRTM DEM using auxiliary maps and regression-kriging is demonstrated using the same case study. A methodological advantage of regression-kriging, compared to splines, is the possibility to automate the data processing and incorporate multiple auxiliary predictors. The remaining open issues are computational efficiency, the application of local regression-kriging algorithms and the preparation of suitable auxiliary data layers for such analyses.
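The paper's workflow relies on R/gstat; purely as an illustration of the regression-kriging idea, here is a compact numpy analogue that fits the regression (drift) part and then simple-kriges the residuals under an assumed exponential covariance model, with synthetic coordinates, predictors and variogram parameters:

```python
# Regression-kriging sketch: drift by least squares, residuals by simple
# kriging with an assumed exponential covariance. All data are synthetic.
import numpy as np

rng = np.random.default_rng(0)
n = 100
xy = rng.uniform(0, 1000, size=(n, 2))                       # sample locations (m)
X = np.column_stack([np.ones(n), rng.normal(size=(n, 3))])   # intercept + predictors
z = X @ np.array([500.0, 20.0, -5.0, 3.0]) + rng.normal(scale=4.0, size=n)

beta, *_ = np.linalg.lstsq(X, z, rcond=None)                 # regression part
resid = z - X @ beta

sill, rng_par, nugget = resid.var(), 300.0, 0.5              # assumed variogram
def cov(h):                                                  # exponential model
    return (sill - nugget) * np.exp(-3.0 * h / rng_par)

d = np.linalg.norm(xy[:, None] - xy[None, :], axis=2)
C = cov(d) + nugget * np.eye(n)                              # data covariance

x0 = np.array([500.0, 500.0])                                # prediction location
c0 = cov(np.linalg.norm(xy - x0, axis=1))
lam = np.linalg.solve(C, c0)                                 # simple-kriging weights
x0_pred = np.array([1.0, 0.0, 0.0, 0.0])                     # predictors at x0 (assumed)
z_hat = x0_pred @ beta + lam @ resid                         # RK = drift + residual
print(z_hat)
```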
 
With advances in computational capabilities and refinement of seismic wave-propagation models in the past decade, large three-dimensional simulations of earthquake ground motion have become possible. The resulting datasets from these simulations are multivariate, temporal and multi-terabyte in size. Past visual representations of results from seismic studies have been largely confined to static two-dimensional maps. New visual representations provide scientists with alternative ways of viewing and interacting with these results, potentially leading to new and significant insight into the physical phenomena. Visualizations can also be used for pedagogic and general dissemination purposes. We present a workflow for visual representation of the data from a ground-motion simulation of the great 1906 San Francisco earthquake. We have employed state-of-the-art animation tools for visualization of the ground motions with a high degree of accuracy and visual realism.
 
A bibliography of computer applications in the earth sciences from 1948 to 1970 is presented, along with an index by method, geologic subject, geographic area, geologic age, programming language, and language of publication.
 
This is the first article in a series in which we introduce methods to determine the permanent and transient deformation induced by earthquakes or similar sources. The point-like or extended foci can be located in a stratified elastic half-space. The software includes a tool to combine point sources into one or more extended sources with arbitrary strike and dip. The number of layers is essentially unlimited. The output covers surface and subsurface displacements, strain, tilt and stress. Several effective techniques have been used to solve the stability and convergence problems in the computation of the Green's functions, so that the programs are small, fast and accurate. To control the accuracy of the numerical results, the software provides an optional link to Okada's analytical solutions for the special case of a homogeneous half-space. This can also be used to create zeroth-order approximations, i.e. starting models.
 
A brief history of the Kansas Geological Survey's Computer Contributions is presented along with the people involved in development of the first geological computer freeware.
 
This review covers rock, mineral and isotope geochemistry, mineralogy, igneous and metamorphic petrology, and volcanology. Crystallography, exploration geochemistry, and mineral exploration are excluded. Fairly extended comments on software availability, and on computerization of the publication process and of specimen collection indexes, may interest a wider audience. A proliferation of both published and commercial software in the past 3 years indicates increasing interest in what traditionally has been a rather reluctant sphere of geoscience computer activity. However, much of this software duplicates the same old functions (Harker and triangular plots, mineral recalculations, etc.). It usually is more efficient nowadays to use someone else's program, or to employ the command language in one of many general-purpose spreadsheet or statistical packages available, than to program a specialist operation from scratch in, say, FORTRAN. Greatest activity has been in mineralogy, where several journals specifically encourage publication of computer-related activities, and IMA and MSA Working Groups on microcomputers have been convened. In petrology and geochemistry, large national databases of rock and mineral analyses continue to multiply, whereas the international database IGBA grows slowly; some form of integration is necessary to make these disparate systems of lasting value to the global “hard-rock” community. Total merging or separate addressing via an intelligent “front-end” are both possibilities. In volcanology, the BBC's videodisk Volcanoes and the Smithsonian Institution's Global Volcanism Project use the most up-to-date computer technology in an exciting and innovative way, to promote public education.
 
During the past few years there has been a major increase in the publication of books and papers concerned with computer applications in stratigraphy. Many computer programs also have been published. This review concentrates on selected significant developments in lithostratigraphy and biostratigraphy. Topics to be discussed include techniques of classification, interpolation, extrapolation, scaling, data integration, and the sequencing of stratigraphic events.
 
The 1999 Mw=7.6 Chi-Chi earthquake was the strongest inland earthquake in Taiwan in the 20th century. Five radar images acquired with the C-band SARs onboard the ERS-1/2 satellites are combined to study the pre- and co-seismic surface deformations in the epicentral area of about . The pre-seismic interferograms over 2–3 years show consistent fringe patterns that are equivalent to LOS displacement variations of up to about . The deformations are likely caused by the east-west tectonic compression in the region. The short-term co-seismic interferograms show clear arc-shaped fringe patterns of about 10 fringes, equivalent to displacement variations of about . The co-seismic deformation results fit well with both GPS measurements and a simulated interferogram computed based on a fault-dislocation model. This study demonstrates the capability of short-wavelength InSAR systems for monitoring ground deformations of flat terrain in tropical regions.
 
Spatial scale plays an important role in many fields. As a scale-dependent measure for spatial heterogeneity, lacunarity describes the distribution of gaps within a set at multiple scales. In Earth science, environmental science, and ecology, lacunarity has been increasingly used for multiscale modeling of spatial patterns. This paper presents the development and implementation of a geographic information system (GIS) software extension for lacunarity analysis of raster datasets and 1D, 2D, and 3D point patterns. Depending on the application requirement, lacunarity analysis can be performed in two modes: global mode or local mode. The extension works for: (1) binary (1-bit) and grey-scale datasets in any raster format supported by ArcGIS and (2) 1D, 2D, and 3D point datasets as shapefiles or geodatabase feature classes. For more effective measurement of lacunarity for different patterns or processes in raster datasets, the extension allows users to define an area of interest (AOI) in four different ways, including using a polygon in an existing feature layer. Additionally, directionality can be taken into account when grey-scale datasets are used for local lacunarity analysis. The methodology and graphical user interface (GUI) are described. The application of the extension is demonstrated using both simulated and real datasets, including Brodatz texture images, a Spaceborne Imaging Radar (SIR-C) image, simulated 1D points on a drainage network, and 3D random and clustered point patterns. The options of lacunarity analysis and the effects of polyline arrangement on lacunarity of 1D points are also discussed. Results from sample data suggest that the lacunarity analysis extension can be used for efficient modeling of spatial patterns at multiple scales.
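The measure itself is the gliding-box lacunarity of Allain and Cloitre: at box size r, Λ(r) = E[S²]/E[S]², where S is the box mass. A standalone numpy sketch of the global 2D binary-raster case (not the extension's ArcGIS code) is:

```python
# Gliding-box lacunarity of a binary raster, global mode, 2D only.
import numpy as np

def lacunarity(img, box):
    """Lacunarity at one box size: E[S^2]/E[S]^2 over all gliding boxes,
    where S is the box mass (count of 1-cells). Uses an integral image."""
    c = np.cumsum(np.cumsum(np.pad(img, ((1, 0), (1, 0))), axis=0), axis=1)
    s = (c[box:, box:] - c[:-box, box:] - c[box:, :-box] + c[:-box, :-box])
    s = s.astype(float)
    return (s ** 2).mean() / s.mean() ** 2

img = (np.random.random((256, 256)) < 0.2).astype(int)   # toy binary pattern
for r in (2, 4, 8, 16, 32):
    print(r, lacunarity(img, r))
```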
 
We have constructed a forward modelling code in Matlab, capable of handling several commonly used electrical and electromagnetic methods in a 1D environment. We review the implemented electromagnetic field equations for grounded wires, frequency and transient soundings, and present new solutions for the case of a non-magnetic first layer. The CR1Dmod code evaluates the Hankel transforms occurring in the field equations using either the Fast Hankel Transform, based on digital filter theory, or a numerical integration scheme applied between the zeros of the Bessel function. A graphical user interface allows easy construction of 1D models and control of the parameters. Modelling results agree with those of other authors, but the computation times are longer than those of other available codes. Nevertheless, the CR1Dmod routine handles complex resistivities and offers solutions based on the full EM equations as well as the quasi-static approximation. Thus, modelling of effects based on changes in the magnetic permeability and the permittivity is also possible.
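The second evaluation scheme can be illustrated in a few lines: split the semi-infinite Hankel integral at consecutive zeros of J0 and sum the panel integrals, here checked against the closed form ∫₀^∞ e^(-λ) J0(λr) dλ = 1/√(1+r²). This naive partial summation (without the convergence acceleration a production code might add) is a sketch, not CR1Dmod itself:

```python
# Hankel integral by integration between Bessel-function zeros.
import numpy as np
from scipy.integrate import quad
from scipy.special import j0, jn_zeros

def hankel_between_zeros(kernel, r, n_intervals=60):
    # breakpoints where J0(lam * r) = 0
    breaks = np.concatenate(([0.0], jn_zeros(0, n_intervals) / r))
    total = 0.0
    for a, b in zip(breaks[:-1], breaks[1:]):
        part, _ = quad(lambda lam: kernel(lam) * j0(lam * r), a, b)
        total += part          # alternating-sign panels, kernel must decay
    return total

r = 3.0
approx = hankel_between_zeros(lambda lam: np.exp(-lam), r)
print(approx, 1.0 / np.sqrt(1.0 + r ** 2))   # should agree closely
```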
 
Solar surface insolation (SSI) quantifies how much solar radiation reaches the Earth's surface at a specified location during the daytime. The amount of insolation reaching the surface is a critical parameter for climate change estimation and numerical weather prediction (NWP). We calculated SSI from MTSAT-1R data using a neural network (NN) model to obtain more accurate results than those of empirical and physical methods. The usefulness of the retrieved SSI data depends on their accuracy. Thus, before passing the input parameters to the NN, a principal component transformation was applied, using the eigenvectors and normalized input data, to eliminate data redundancy. An NN model with one hidden layer was then used to simulate SSI using the early-stopping and Levenberg–Marquardt back-propagation (LMBP) methods. We separated the NN architecture into two parts according to cloudy or clear-sky conditions, which require different processing because of complicated cloud physical characteristics. The SSI estimates from the NN model were compared with pyranometer measurements and showed better agreement with ground-truth values than did estimates obtained using conventional methods, especially under clear-sky conditions.
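A hedged stand-in for this retrieval chain, using scikit-learn (which offers Adam/L-BFGS training rather than the paper's Levenberg-Marquardt, and synthetic predictors in place of MTSAT-1R channels), might look like:

```python
# Sketch: normalization -> PCA (drop redundancy) -> one-hidden-layer NN
# with early stopping. Data and architecture sizes are illustrative.
import numpy as np
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.decomposition import PCA
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(1)
X = rng.normal(size=(2000, 8))              # stand-in satellite predictors
y = 2.0 * X[:, 0] + np.sin(X[:, 1]) + rng.normal(scale=0.1, size=2000)

model = make_pipeline(
    StandardScaler(),                       # normalize inputs
    PCA(n_components=0.99, svd_solver="full"),  # keep 99% of variance
    MLPRegressor(hidden_layer_sizes=(20,),  # one hidden layer
                 early_stopping=True, max_iter=2000, random_state=0),
)
model.fit(X, y)
print(model.score(X, y))
```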
 
We present a 2.5-D forward modeling algorithm for electrical resistivity data. The algorithm incorporates a boundary condition and source singularity correction that greatly reduce the need to pad the model space. In addition, the algorithm includes an optimization method for estimating the appropriate Fourier coefficients to achieve an accurate 2.5-D approximation. The optimization scheme uses a gradient-based search to find the optimal coefficients. We compare results from our algorithm to two analytical solutions, and are able to achieve errors on the order of 1% relative to these models. We have implemented the algorithm as an open-source MATLAB-based forward modeling package. The MATLAB code is useful for exploring subsurface current flow. It has been coded so that it can be easily ported to work with inversion routines. The code is computationally efficient and suitable for solving 2-D problems with a large number of model parameters.
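One way to see the coefficient-selection problem: for a point source over a homogeneous half-space the Fourier-transformed potential is proportional to K0(kr), and since ∫₀^∞ K0(kr) dk = π/(2r), weights for a fixed set of wavenumbers can be fitted so that the weighted sum of K0(k_i r) reproduces π/(2r) over the survey's range of electrode spacings. The paper optimizes with a gradient-based search; the sketch below, with assumed wavenumbers, uses plain linear least squares instead:

```python
# Fit 2.5-D wavenumber weights against the half-space identity
# sum_i w_i * K0(k_i * r)  ~  pi / (2 r). The k_i here are assumed fixed.
import numpy as np
from scipy.special import k0

r = np.logspace(0, 2, 50)              # source-receiver distances, m
k = np.logspace(-3, 0, 8)              # assumed wavenumbers, 1/m
A = k0(np.outer(r, k))                 # design matrix K0(k_j * r_i)
target = np.pi / (2.0 * r)
w, *_ = np.linalg.lstsq(A, target, rcond=None)
print("max relative error: %.2e" % np.max(np.abs(A @ w - target) / target))
```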
 
The MAGFLOW cellular automata (CA) model reproduced the timing of the lava flow advance during the 2006 Etna eruption fairly accurately, leading to very plausible flow predictions. MAGFLOW is intended for use in emergency response situations during an eruption, to quickly forecast the lava flow path over time intervals ranging from the immediate future to a long-term forecast. Major discrepancies between the observed and simulated paths occurred in the early phase of the 2006 eruption, due to an underestimation of the initial flow rate, and at the time of overlap with the 2004–2005 lava flow. Very good representations of the areas likely to be inundated by lava flows were obtained when a time-varying effusion rate was adopted and the 2004–2005 lava flow field was included in the Digital Elevation Model (DEM) of the topography.
 
In a wide range of applications involving geological modelling, the geological data available at low cost usually consist of documents such as cross-sections or geological maps and point data such as borehole logs or outcrop descriptions. In order to build accurate 3D geological models based on this information, it is necessary to develop a methodology that takes into account the variety of available data. Such models of the geometry of geological bodies should also be easy to edit and update to integrate new data. This kind of model should produce a consistent representation of subsurface geology that can support the modelling of other subsoil characteristics, such as the hydrogeological or geothermal properties of the geological bodies. This paper presents a methodology developed to process geological information in this context. The aims of this methodology are comprehensive data description, effective data validation and easier model updates. Thus, special attention has been given to data structures and processing flows. The adopted methodology is implemented on a system architecture formed by a geographic information system, a geomodeler and a database communicating by file transfers. An application of this methodology, to build a 3D geological model of the subsoil over former coal mines used to store natural gas, is then presented. This model integrates the available geological information and is representative of the geological context. It supports the environmental follow-up required after the end of gas-storage operations.
 
An algorithm and associated codes are developed to determine the depths to the bottom of a 2.5-D sedimentary basin in which the density contrast varies parabolically with depth. The algorithm estimates initial depths of the sedimentary basin automatically and thereafter modifies them appropriately, within permissible limits, in an iterative approach as described in the text. The efficacy of the method and of the code is illustrated by interpreting a gravity anomaly of a synthetic model. Further, the applicability of the method is exemplified with the derived density–depth model of the Godavari sub-basin, India, used to interpret the gravity anomalies over the basin. Interpretations based on the parabolic density profile are more consistent with existing geological information than those obtained with a constant density profile.
 
This paper presents a new methodology and the corresponding GSLIB-type program to integrate 2D average information, such as vertically averaged lithofacies proportions, into the estimation/simulation of 3D lithofacies distributions. Seismic information, available in 2D, is used in the cokriging of the 2D average lithofacies proportions. A data set from a siltstone–dolomite carbonate reservoir in West Texas is used as a case study. Three steps were taken: first, collocated cokriging is used to incorporate the 2D seismic information into the estimation of vertical lithofacies proportions; second, indicator kriging is used to derive the facies conditional probabilities at each node of the 3D simulation grid; this 3D indicator kriging uses the hard well data and the previously estimated vertical proportions; last, a p-field simulation algorithm is used to draw simulated lithofacies indicators from the previously obtained distributions. Results showed that even with limited well data, the input vertical lithofacies proportions (which carry the seismic information) are honored almost exactly.
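The final p-field draw is simple to sketch: invert each node's IK-derived conditional distribution with a probability-field value. In the sketch below, plain uniform numbers stand in for the spatially correlated p-field of the real algorithm, and the probabilities are random placeholders:

```python
# Sketch of the p-field step: invert local facies cdfs with a p-field.
import numpy as np

rng = np.random.default_rng(2)
n_nodes, n_facies = 10, 3
probs = rng.dirichlet(np.ones(n_facies), size=n_nodes)   # stand-in IK ccdfs
p_field = rng.random(n_nodes)                            # stand-in p-field

cdf = np.cumsum(probs, axis=1)                 # local cdf per node
facies = (p_field[:, None] > cdf).sum(axis=1)  # invert cdf -> facies index
print(facies)
```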
 
bh_tomo is an open-source borehole georadar data processing and ray-based 2D tomography software package developed at the École Polytechnique de Montréal. bh_tomo runs under Matlab version 7.0 and above, and is therefore portable between the computer operating systems supported by Matlab. To perform the tomographic inversions, bh_tomo includes an implementation of the classical LSQR algorithm, as well as an implementation of the recent geostatistical inversion scheme developed at the École Polytechnique de Montréal. One important motivation behind the development of bh_tomo was to ease the data processing sequence necessary to perform tomographic inversion of georadar amplitude data, especially when measured between many adjacent boreholes. The software package relies on a mini database and comprises interactive modules to manage, process and interpret the data.
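The LSQR step can be sketched independently of the package: travel times t relate to cell slownesses s through a sparse matrix L of ray-path lengths, and a damped least-squares solution follows directly; the ray matrix below is random rather than traced, and the geostatistical inversion option is not shown:

```python
# Sketch of LSQR ray tomography: solve t = L s for slowness s.
import numpy as np
from scipy.sparse import random as sparse_random
from scipy.sparse.linalg import lsqr

rng = np.random.default_rng(3)
n_rays, n_cells = 400, 900
L = sparse_random(n_rays, n_cells, density=0.05, random_state=3) * 10.0  # path lengths, m
s_true = 0.01 + 0.002 * rng.random(n_cells)      # slowness, s/m
t = L @ s_true + rng.normal(scale=1e-4, size=n_rays)

s_est = lsqr(L, t, damp=1e-3)[0]                 # damped least squares
print(np.abs(s_est - s_true).mean())
```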
 
A tool called GeoVR has been designed and developed under a client/server architecture to enable the interactive creation of a 3D scene and virtual reality modeling language (VRML) model from 2D spatial data by integrating Internet geographical information system (GIS) and HTML programming. The client front-end of this tool provides an HTML form to set properties for building 3D scenes, while the server back-end, supported by off-the-shelf software (ArcView Internet Map Server and ArcView 3D Analyst through Avenue programming), processes the parameters and generates a 3D scene. This 3D scene is then transformed into a VRML model, which, together with its legend, is sent back to the VRML-enabled WWW browser for display and navigation. It is demonstrated that this tool not only automates the conversion of conventional 2D GIS data into VRML, but also adapts the current GIS 3D capabilities to the increasingly popular Web environment. The development of GeoVR offers new opportunities for geoscientists to build applications that benefit from virtual reality presentation based upon existing GIS spatial databases.
 
A prerequisite for the numerical simulation of water flow in heterogeneous soils is to build a discrete model of the soil matrix that is a fair representation of the heterogeneities under study, while being compatible with the numerical equations used to compute unsaturated water flow. When introducing significant amounts of very coarse solid elements (gravels) into a discrete, multi-dimensional soil matrix model, in the form of internal boundaries that occupy a certain fraction of the grid nodes, the need arises to eliminate from the model the possible occurrence of isolated areas of soil, surrounded by continuous gravel barriers that keep them separate from the other regular grid nodes. This situation is obviously an artefact resulting from the discrete representation of the physical system, since all non-gravel areas ought to be considered as subject to at least some hydrodynamic linkage with each other and with the outer boundary conditions (rain and other water input at the top, gravity drainage or a water table at the bottom). Computational nodes that do not connect in some way, through the computational grid, to these outer boundary conditions are a source of computational problems, due to system indetermination when solving the hydrodynamic equations. A method has thus been devised to automatically detect and eliminate such situations, in order to produce plausible soil models for hydrodynamic simulation in the presence of highly contrasted grain sizes. The method is based on a proposed recursive algorithm for cluster analysis, an attractive and very simple alternative to the existing methods generally used to handle cluster problems. This method and its application to the heterogeneous soil modeling problem are presented as pseudo-code that can be implemented in any current programming language. Performance figures and simulation results of the hydrodynamic behavior of such soil models are shown. A parallel implementation of the algorithm is also proposed.
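The cluster-detection idea can be sketched as a flood fill: seed all non-gravel nodes on the outer boundary, propagate through face-connected non-gravel neighbours, and flag whatever remains unreached as an isolated pocket. The paper presents a recursive formulation; the stack-based version below (with a random toy grid) avoids recursion-depth limits:

```python
# Flood-fill sketch: find non-gravel nodes sealed off from the boundary.
import numpy as np

def isolated_nodes(gravel):
    """Boolean mask of non-gravel nodes with no path to the grid boundary."""
    ny, nx = gravel.shape
    seen = np.zeros_like(gravel, dtype=bool)
    # seed with every non-gravel node on the outer boundary
    stack = [(i, j) for i in range(ny) for j in range(nx)
             if (i in (0, ny - 1) or j in (0, nx - 1)) and not gravel[i, j]]
    while stack:
        i, j = stack.pop()
        if seen[i, j]:
            continue
        seen[i, j] = True
        for di, dj in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            a, b = i + di, j + dj
            if 0 <= a < ny and 0 <= b < nx and not gravel[a, b] and not seen[a, b]:
                stack.append((a, b))
    return ~gravel & ~seen

gravel = np.random.default_rng(4).random((50, 50)) < 0.4   # toy gravel mask
print(isolated_nodes(gravel).sum(), "isolated nodes")
```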
 
A new 3D/2D interactive display server was developed for the IGeoS geophysical data processing framework presented earlier. With the introduction of this major component, the framework becomes conceptually complete and potentially bridges the gap between traditional processing and interpretation geophysical software. The display server utilizes the Qt toolkit and OpenGL graphics libraries while taking advantage of the object-oriented design of the core data processing system. It operates by creating image object trees that are automatically propagated to the server(s) residing on the same or remote hosts, producing complex, structured, and interactive data displays. The displays support custom interactive graphical user interfaces, which are constructed entirely by the user and do not require computer coding. With over 200 specialized processing tools available, this approach allows creating 3D visualizations and building custom interactive data analysis, interpretation, and modeling tools in many areas of application. In particular, we show examples of the integration of seismic ray tracing, gravity, and receiver function modeling and inversion in deep crustal studies.
 
Channel modeling is a popular topic in the application of geostatistics to fluvial reservoir modeling. This paper presents an approach to designing channels which have a general flow direction through sand well locations and which avoid shale well locations. This approach, named the random walk on graphs of well locations, is applied to model channel reservoirs. The modeling process consists of two parts: one-direction walk modeling and two-direction walk modeling. The first model aims to determine each channel location by the use of a transition probability, with a random walk essentially in the main flow direction, say the north–south direction, while the second model simulates different channels that can be oriented in both directions, either from north to south or from south to north. In both parts of the model, the transition probability is estimated from two coefficients: one is the correlation coefficient of channel observations; the other is the obstacle coefficient of non-channel observations. A case study with a dense array of 332 wells is presented using the proposed random walk model. For the purpose of model verification, channel maps created by the random walk are compared to hand-drawn channel maps made by geologists. The results show a good agreement between the two types of maps, but in contrast to the single map supplied by the geologists, the random walk model is capable of generating many realizations of the channel configuration, hence allowing for uncertainty evaluation. A limitation of this approach, related to the influence of the number of wells, is discussed.
 
The solution of potential problems is not only fundamental for the geosciences, but also an essential part of related subjects like electro- and fluid-mechanics. In all fields, solution algorithms are needed that should be as accurate as possible, robust, simple to program, easy to use, fast and small in computer memory. An ideal technique to fulfill these criteria is the boundary element method (BEM), which applies Green's identities to transform volume integrals into boundary integrals. This work describes a linear analytical BEM for 2D homogeneous potential problems that is more robust and precise than numerical methods because it avoids numerical integration schemes and coordinate transformations. After deriving the solution algorithm, the introduced approach is tested against different benchmarks. Finally, the resulting method was incorporated into an existing software program described earlier in this journal by the same author.
 
A method of image processing is presented to extract a sequential profile from a map showing a folded pattern of laminations with boudinage, overlapping noise and deficits, in order to obtain reliable data for the sequential analysis of sedimentary rocks on their cross-section. The raw map data is a digital photograph of any striped pattern of such rocks, represented by the 2D distribution of the density of a certain quantity. The image processing consists of several steps. First, we derive the distribution of local slopes (strikes) of the laminations by means of statistical analysis of the first and second spatial derivatives of the density map. Filtering of the local slopes leads to the elimination of noise amplified by the differential operations. Secondly, a set of curved isochronous lines is determined by integrating the local slopes to trace smoothly the local strikes of the laminae. Thirdly, the original folded pattern of laminations is converted to an “unfolded image” with straight laminations by using the isochronous lines. Finally, a reliable 1D sequential profile of density and its uncertainty profile are derived by computing the mean and variance of density along each of the isochronous lines on the unfolded pattern of laminations. The computer code is tested successfully on synthetic map data and applied to a set of abundance maps of major elements in an Archean banded iron formation to characterize the striped pattern. High resolution is achieved objectively, recognizing even thin and faint seams in the laminations. The computer code, lamination tracer, is expected to be useful for the analysis of other types of laminated patterns in various fields, such as dendrochronology and sequence stratigraphy, to decode the environmental variations in the geologic past.
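The first step can be illustrated with the classical structure tensor, in which the dominant local orientation follows from smoothed products of the first derivatives as 0.5*arctan2(2Jxy, Jxx - Jyy); this is a generic stand-in for the paper's statistical slope analysis, applied here to a synthetic folded pattern:

```python
# Local orientation of laminations via the structure tensor (a stand-in
# for the paper's derivative-based slope statistics).
import numpy as np
from scipy.ndimage import gaussian_filter, sobel

def local_orientation(img, sigma=3.0):
    gx, gy = sobel(img, axis=1), sobel(img, axis=0)   # first derivatives
    Jxx = gaussian_filter(gx * gx, sigma)             # smoothed products act
    Jyy = gaussian_filter(gy * gy, sigma)             # as the noise filter
    Jxy = gaussian_filter(gx * gy, sigma)
    return 0.5 * np.arctan2(2.0 * Jxy, Jxx - Jyy)     # radians, per pixel

y, x = np.mgrid[0:128, 0:128]
img = np.sin(0.3 * y + 2.0 * np.sin(0.05 * x))        # synthetic folded laminae
theta = local_orientation(img)
print(theta.mean(), theta.std())
```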
 
The purpose of the paper is to provide details of the application of the INFOREX-3.0 database, a package designed to store, retrieve, and process phase equilibria information. This most recent release of the system accesses data from 162 experimental studies, conducted from 1962 to 1994, including a total of 6174 experiments, with 5188 addressing natural igneous rocks and 986 runs carried out in synthetic systems, mostly CMAS. The total database divides into 3893 experiments at “dry” conditions and 2281 runs performed in the presence of H2O and/or CO2; 1618 of the “wet” runs represented are water saturated. The number of 1 atm experiments (3750) is greater than the number of high-pressure runs (2474). The INFOREX database contains 8311 coexisting phase compositions: 3197 glasses, 1247 olivines, 1429 pyroxenes, 501 spinels and 842 plagioclases. One block of the INFOREX information includes 298 liquid compositions for which the ratio was determined. Data for sulfur and water solubility experiments have also been systematized. The INFOREX data management system allows users to find and print out data on a specific set of mineral-melt or two-mineral equilibrium experiments, requested for a given range of temperatures, pressures, oxygen fugacities, and compositions, in a matter of a few seconds. In addition, one can use subsets of the data to develop mineral-melt geothermometers for equilibria including olivine, plagioclase, pyroxenes, and spinels for any specific system type. Two examples illustrate the use of INFOREX for testing empirical equations proposed for the calculation of water solubility and the ratio in basic to acid melts.
 
HydroTrend v.3.0 is a climate-driven hydrological water balance and transport model that simulates water discharge and sediment load at a river outlet, by incorporating drainage basin properties (river networks, hypsometry, relief, reservoirs) together with biophysical parameters (temperature, precipitation, evapotranspiration, and glacier characteristics). HydroTrend generates daily discharge values through: snow accumulation and melt, glacier growth and ablation, surface runoff, and groundwater evaporation, retention and recharge. The long-term sediment load is predicted either by the ART-QRT module, based on drainage area, discharge, relief, and temperature, or by the BQART module, which also incorporates basin-average lithology and anthropogenic influences on soil erosion. The sediment trapping efficiency of reservoirs is based on the reservoir's location in the river network and its volume, which determines the residence time of water within the reservoir. Glacial influence is based on the extent of ice cover, the equilibrium altitude, and freezing-line mobility. HydroTrend v.3.0 captures the inter- and intra-annual variability of sediment flux by using either high-resolution climate observations or a stochastic climate generator for simulations over longer geological intervals. A distributary channel module simulates the flow conditions and transport capacity across a multiple deltaic channel system. Simulations of the Metauro and Po rivers, in Italy, are used as case studies to demonstrate the applicability of the new model.
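For reference, the warm-basin form of the BQART relation (Syvitski and Milliman, 2007) is Qs = 0.02·B·Q^0.31·A^0.5·R·T, with Q in km³/yr, A in km², R in km, T in °C and Qs in kg/s; B lumps lithology, ice cover and human influence. A minimal sketch with illustrative numbers, omitting the cold-basin branch and the B sub-factors (units and prefactor should be checked against the source):

```python
# Sketch of the BQART long-term sediment-load relation (T >= 2 C branch).
def bqart_qs(B, Q, A, R, T):
    """Qs in kg/s, assuming Q in km^3/yr, A in km^2, R in km, T in deg C."""
    if T < 2.0:
        raise ValueError("cold-basin branch of BQART not implemented here")
    return 0.02 * B * Q**0.31 * A**0.5 * R * T

# e.g. a mid-sized temperate basin (hypothetical numbers)
print(bqart_qs(B=1.0, Q=30.0, A=12000.0, R=1.5, T=12.0), "kg/s")
```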
 
CSpace is a program for the graphical and algebraic analysis of composition relations within chemical systems. The program is particularly suited to the needs of petrologists, but could also prove useful for mineralogists, geochemists and other environmental scientists. A few examples of what can be accomplished with CSpace are the mapping of compositions into some desired set of system/phase components, the estimation of reaction/mixing coefficients and assessment of phase-rule compatibility relations within or between complex mineral assemblages. The program also allows dynamic inspection of compositional relations by means of barycentric plots. CSpace provides an integrated workplace for data management, manipulation and plotting. Data management is done through a built-in spreadsheet-like editor, which also acts as a data repository for the graphical and algebraic procedures. Algebraic capabilities are provided by a mapping engine and a matrix analysis tool, both of which are based on singular-value decomposition. The mapping engine uses a general approach to linear mapping, capable of handling determined, underdetermined and overdetermined problems. The matrix analysis tool is implemented as a task “wizard” that guides the user through a number of steps to perform matrix approximation (finding nearest rank-deficient models of an input composition matrix), and inspection of null-reaction space relationships (i.e. of implicit linear relations among the elements of the composition matrix). Graphical capabilities are provided by a graph engine that directly links with the contents of the data editor. The graph engine can generate sophisticated 2-D ternary (triangular) and 3D quaternary (tetrahedral) barycentric plots and includes features such as interactive re-sizing and rotation, on-the-fly coordinate scaling and support for automated drawing of tie lines.
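The singular-value machinery can be illustrated with the classic mass-balance task: a reaction among phases is a null-space vector of the composition matrix (phases in rows, components in columns). The toy system below, forsterite + quartz = 2 enstatite in MgO-SiO2, is an illustration, not an example from the paper:

```python
# Reaction coefficients as a null-space vector, via SVD (numpy only).
import numpy as np

phases = ["forsterite", "quartz", "enstatite"]
# columns: MgO, SiO2 (moles per formula unit)
C = np.array([[2.0, 1.0],    # Mg2SiO4
              [0.0, 1.0],    # SiO2
              [1.0, 1.0]])   # MgSiO3

U, s, Vt = np.linalg.svd(C.T)    # null space of C^T holds reaction vectors
x = Vt[-1]                       # vector with C.T @ x ~ 0
x = x / np.min(np.abs(x[np.abs(x) > 1e-9]))   # scale smallest coeff to 1
print(dict(zip(phases, np.round(x, 3))))       # expect 1, 1, -2 (up to sign)
```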
 
ArArCALC is a Microsoft Excel® 97-2000-XP application for performing calculations in 40Ar/39Ar geochronology. It is coded in Visual Basic for Applications and can be used under the Windows® 95/98/NT/2000/ME/XP operating systems. ArArCALC provides an easy-to-use graphical interface for the calculation of age plateaus, total fusion ages and isochrons following the regression of 40Ar/39Ar mass spectrometry data. Results are stored in single Excel workbooks including nine different data tables and four different diagrams. Analytical, internal and external errors are calculated based on the error propagation of all input parameters, analytical data and applied corrections. Finally, the age calculation results can be recalibrated with reference to the primary K–Ar standards (e.g. GA-1550, MMhb-1) in order to obtain more consistent absolute 40Ar/39Ar age determinations. ArArCALC is distributed as freeware.
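At the core of such calculations is the age equation t = (1/λ) ln(1 + J·R), with R the radiogenic 40Ar/39ArK ratio and J the irradiation parameter. The sketch below evaluates it with first-order error propagation in R and J only, whereas ArArCALC propagates all input parameters and corrections:

```python
# 40Ar/39Ar age equation with simple first-order error propagation.
import numpy as np

LAMBDA = 5.543e-10   # total 40K decay constant, 1/yr (Steiger & Jaeger, 1977)

def arar_age(R, J, sR=0.0, sJ=0.0):
    t = np.log(1.0 + J * R) / LAMBDA
    dt_dR = J / (LAMBDA * (1.0 + J * R))
    dt_dJ = R / (LAMBDA * (1.0 + J * R))
    st = np.sqrt((dt_dR * sR) ** 2 + (dt_dJ * sJ) ** 2)  # correlations ignored
    return t, st

t, st = arar_age(R=12.34, J=0.004, sR=0.05, sJ=1e-5)     # hypothetical inputs
print("%.3f +/- %.3f Ma" % (t / 1e6, st / 1e6))
```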
 
CASQUS is a numerical simulation tool for modeling the feedback mechanism between surface and tectonic processes. It integrates the surface processes model CASCADE into the finite element solver ABAQUS/Standard™. The finite element method allows for geomechanical simulations of the subsurface with geometrically complex structures in 3D. Additionally, various types of rheological behavior are already implemented in the commercial software ABAQUS™. CASCADE simulates erosion and sedimentation as the combination of fluvial transport and hillslope processes. For the integration of CASCADE into ABAQUS/Standard™, an Arbitrary Lagrangian–Eulerian modeling technique is used, which makes a coupled and automated simulation possible. Two benchmark models that are easy to reproduce demonstrate the functionality of CASQUS. Our tool aims at a better understanding of the feedback between mass redistribution at the Earth's surface and processes within a heterogeneous subsurface, and at a quantification of the processes involved.
 
In this paper we introduce a new, precise and adaptive method for the implicit reconstruction of faulted surfaces with complex geometry from scattered, unorganized points as obtained from seismic data or laser scanners. We embed the point set into a 3d-complex on which a 3d-implicit function is interpolated. The 3d-complex is a set of tetrahedrons, and the implicit function represents a surface that lies as close as possible to the input data points. The density of the 3d-complex can be adapted to efficiently control both the precision of the implicit function and the size of the triangles of the reconstructed surface. Discontinuities in the topology of the tetrahedral mesh make it possible to reconstruct discontinuous, bounded surfaces and very close parallel patches without introducing unwanted connections (topological “handles”) between these regions. To compute the implicit function we use the discrete smooth interpolation (DSI) method with a set of boundary, off-boundary and smoothness constraints. The interpolation problem does not depend primarily on the number of input data points but on the size of the 3d-complex. This method can be applied to the construction of faulted horizons and salt-top surfaces.
 
In this study, an extension called Tunneling Analyst (TA) has been developed in ArcScene 3D GIS software, part of the ArcGIS software package. It dramatically extends the functionalities of ArcScene because it allows: (1) estimation of the 3D distribution of rock mass rating (RMR) values using borehole and geophysical exploration data, (2) the modeling of 3D discontinuity planes such as faults from field-based structural measurements, and (3) analysis of 3D intersections and 3D buffer zones between proposed tunnel alignments and some discontinuities. Because TA can handle and visualize both 2D and 3D geological data in a single GIS environment, the tedious tasks required for data conversion between various software packages can be reduced significantly. The application to the Daecheong tunneling project in Korea shows that TA could present a rational solution to evaluating the rock mass classes along a proposed tunnel alignment and can also provide specific 3D spatial query tools to support the tunnel design work. This paper describes the concept and details of the development and implementation of TA.
 
A robust finite-element code (Pecube) has been developed to solve the three-dimensional heat transport equation in a crustal/lithospheric block undergoing uplift and surface erosion, and characterized by an evolving, finite-amplitude surface topography. The time derivative of the temperature field is approximated by a second-order accurate, mid-point, implicit scheme that takes into account the changing geometry of the problem. The method is based on a mixed Eulerian–Lagrangian approach that requires frequent re-interpolation of the temperature field in the vertical direction to ensure accuracy. From the computed crustal thermal structure, the temperature history of rock particles that, following an imposed tectonic scenario, are exhumed at the Earth's surface, is derived. These T−t paths can then be used to compute apparent isotopic ages for a range of geochronometers. The usefulness of the code is demonstrated by computing the predicted distribution of (U–Th)/He apatite ages in a high relief area of the Sierra Nevada, California, for a range of tectonic scenarios and comparing them to existing data.
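A one-dimensional analogue conveys the numerical core: an implicit finite-difference solve of dT/dt = κ d²T/dz² - v dT/dz (advection by rock uplift at rate v, z positive upward) with fixed surface and basal temperatures; Pecube's evolving 3D topography and particle T-t tracking are beyond this sketch, and all parameter values are illustrative:

```python
# 1D implicit (backward Euler) heat transport with uplift advection.
import numpy as np

nz, H = 101, 30e3                  # nodes, crustal thickness (m), z positive up
kappa = 1e-6                       # thermal diffusivity, m^2/s
v = 1.0e-3 / 3.15e7                # ~1 mm/yr rock uplift, in m/s
dz = H / (nz - 1)
dt = 1e4 * 3.15e7                  # 10 kyr time step, s
T = np.linspace(600.0, 0.0, nz)    # initial geotherm: base 600 C, surface 0 C

r = kappa * dt / dz**2             # diffusion number
p = v * dt / (2.0 * dz)            # advection number (central difference)
A = (np.diag(np.full(nz, 1.0 + 2.0 * r))
     + np.diag(np.full(nz - 1, -r - p), -1)
     + np.diag(np.full(nz - 1, -r + p), 1))
A[0, :], A[-1, :] = 0.0, 0.0
A[0, 0] = A[-1, -1] = 1.0          # Dirichlet rows: fixed base and surface T

for _ in range(500):               # advance 5 Myr
    rhs = T.copy()
    rhs[0], rhs[-1] = 600.0, 0.0
    T = np.linalg.solve(A, rhs)
print(T[::20])                     # advected geotherm, base to surface
```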
 
This study is the follow-up to a previous one devoted to soil pore space modelling. In the previous study, we proposed algorithms to represent soil pore space by means of optimal piecewise approximation using simple 3D geometrical primitives: balls, cylinders, cones, etc. In the present study, we use the ball-based piecewise approximation to simulate biological activity. The basic idea for modelling pore space consists in representing pore space using a minimal set of maximal balls (Delaunay spheres) recovering the shape skeleton. In this representation, each ball is considered as a maximal local cavity corresponding to the “intuitive” notion of a pore as described in the literature. The space segmentation induced by the network of balls (pores) is then used to spatialise biological dynamics. Organic matter and microbial decomposers are distributed within the balls (pores). A valuated graph representing the pore network, organic matter and microorganism distribution is then defined. Microbial soil organic matter decomposition is simulated by updating this valuated graph. The method has been implemented and tested on real data. As far as we know, this approach is the first one to formally link pore space geometry and biological dynamics. The long-term goal is to define geometrical typologies of pore space shape that can be attached to specific biological dynamic properties. This paper is a first attempt to achieve this goal.
 
This paper presents a method that integrates image knowledge and Light Detection And Ranging (LiDAR) point cloud data for urban digital terrain model (DTM) and digital building model (DBM) generation. The DBM is an object-oriented data structure, in which each building is considered a building object, i.e., an entity of the building class. The attributes of each building include roof types, polygons of the roof surfaces, height, parameters describing the roof surfaces, and the LiDAR point array within the roof surfaces. Each polygon represents a roof surface of a building. This type of data structure is flexible enough for adding other building attributes in the future, such as texture information and wall information. Using the extracted image knowledge, we developed a new method of interpolating LiDAR raw data into a grid digital surface model (DSM) that considers the steep discontinuities of buildings. In this interpolation method, the LiDAR data points located within the roof-surface polygons are first determined, and then interpolation via planar equations is employed for grid DSM generation. The basic steps of our research are: (1) edge detection by digital image processing algorithms; (2) complete extraction of the building roof edges by digital image processing and human–computer interactive operation; (3) establishment of the DBM; (4) generation of the DTM by removing surface objects. Finally, we implement the above functions in MS VC++. The resulting urban 3D DSM, DTM and DBM are exported into an urban database for urban 3D GIS.
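The "interpolation via planar equation" step reduces to fitting z = ax + by + c to the LiDAR returns of one roof polygon by least squares and evaluating the plane on the grid cells inside it; a sketch with synthetic points:

```python
# Least-squares plane fit to roof-polygon LiDAR returns (synthetic data).
import numpy as np

rng = np.random.default_rng(7)
pts = rng.uniform(0, 20, size=(200, 2))        # x, y of roof returns, m
z = 0.5 * pts[:, 0] - 0.2 * pts[:, 1] + 35.0 + rng.normal(scale=0.05, size=200)

A = np.column_stack([pts, np.ones(len(pts))])  # design matrix [x y 1]
(a, b, c), *_ = np.linalg.lstsq(A, z, rcond=None)

gx, gy = np.meshgrid(np.arange(0, 20, 1.0), np.arange(0, 20, 1.0))
dsm_patch = a * gx + b * gy + c                # gridded roof surface
print(a, b, c)
```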
 
Tensor3D is a geometric modeling program with the capacity to simulate and visualize, in real time, the deformation specified through a tensor matrix and applied to triangulated models representing geological bodies. 3D visualization allows the study of deformational processes that are traditionally analysed in 2D, such as simple and pure shear. Besides the geometric objects that are immediately available in the program window, the program can read other models from disk, and is thus able to import objects created with different open-source or proprietary programs. A strain ellipsoid and a bounding box are shown simultaneously and deformed instantly with the main object. The principal axes of strain are visualized as well, to provide graphical information about the orientation of the tensor's normal components. The deformed models can also be saved, retrieved later and deformed again, in order to study different steps of progressive strain, or to make the data available to other programs. The shapes of the stress ellipsoids and the corresponding Mohr circles defined by any stress tensor can also be represented. The application was written using the Visualization ToolKit, a powerful scientific visualization library in the public domain. This development choice, combined with the use of the Tcl/Tk programming language, which is independent of the host computational platform, makes the program a useful tool for the study of geometric deformations directly in three dimensions, in teaching as well as in research activities.
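The underlying geometry is compact: vertices transform as x' = Fx, and the singular value decomposition of the deformation tensor F gives the strain ellipsoid (singular values are the principal stretches, the columns of U their orientations). A numpy sketch with a simple-shear tensor, independent of Tensor3D's VTK implementation:

```python
# Deform vertices with a tensor and read off the strain ellipsoid via SVD.
import numpy as np

gamma = 1.0
F = np.array([[1.0, gamma, 0.0],      # simple shear in the x-y plane
              [0.0, 1.0,  0.0],
              [0.0, 0.0,  1.0]])

verts = np.random.default_rng(5).normal(size=(1000, 3))
verts /= np.linalg.norm(verts, axis=1, keepdims=True)   # points on unit sphere
deformed = verts @ F.T                                  # x' = F x

U, S, Vt = np.linalg.svd(F)
print("principal stretches:", S)       # strain-ellipsoid semi-axes
print("axis orientations:\n", U)       # columns of U
```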
 
A 3D geoscience modeling system (3D GMS) embodying topological relations is of extreme importance for the geosciences. This paper presents a universal 3D model, the generalized tri-prism (GTP), for 3D GMS and real-3D GIS, which is a modification and improvement of the previously presented analogous tri-prism (ATP) model and generalizes the pyramid, tetrahedron and tri-prism (TP) models. The GTP model takes the divergent drill holes, rather than a triangulation network after interpolation or vertical parallel drill holes after projection transformation, as its direct data source. Hence, the reliability and quality of the 3D model are maximally ensured. The GTP component comprises six primitives: node, TIN-edge, side-edge, TIN-face, side-face and GTP. Besides these, three intermediary diagonal lines in each GTP component are temporarily applied for spatial operations. Six groups of topological relations between the six primitives are carefully designed for geo-spatial inquiry and geo-spatial analysis. The mechanisms of the chipping, dynamic updating and local refining operations on the 3D geological model so constructed are introduced. A real-3D software platform, GeoMo3D®, developed with VC++, OpenGL and SQL Server, demonstrates most of the 3D geo-spatial operations, including clipping, separating, uncovering and geo-fence diagram generation, based on an actual 3D geological model of a coal mine in Tangshan, P.R. China.
 
This paper is addressed to the TOUGH2 user community. It presents a new tool for handling simulations run with the TOUGH2 code, with specific application to CO2 geological storage. The tool is composed of separate FORTRAN subroutines (or modules) that can be run independently, using TOUGH2 input and output files in ASCII format. The modules have been developed specifically for the modeling of carbon dioxide geological storage; we describe their use with TOUGH2 and the Equation of State module ECO2N, dedicated to CO2–water–salt mixture systems, with TOUGHREACT, an adaptation of TOUGH2 with ECO2N and geochemical fluid–rock interactions, and with TOUGH2 and the EOS7C module, dedicated to CO2–CH4 gas mixtures. The objective is to save time in the pre-processing, execution and visualization of complex geometry for geological system representation. The workflow is rapid and user-friendly, and future implementation for other TOUGH2 EOS modules in other contexts (e.g. nuclear waste disposal, geothermal production) is straightforward. Three examples are shown for validation: (i) leakage of CO2 up through an abandoned well; (ii) 3D reactive transport modeling of CO2 in a sandy aquifer formation of the Sleipner gas field (North Sea, Norway); and (iii) an estimation of enhanced gas recovery technology using CO2 as the injected and stored gas to produce methane in the K12B gas field (North Sea, the Netherlands).
 
Existing 2D data structures are often insufficient for analysing the dynamics of saturation excess overland flow (SEOF) within a basin. Moreover, all stream networks and soil surface structures in a GIS must be preserved within appropriate projection-plane fitting techniques, known as georeferencing. Including the 3D volumetric structure of current soft geo-object simulation models would be a substantial step towards representing the 3D soft geo-objects of SEOF dynamically within a basin, by visualising saturated flow and overland flow volume. This research attempts to visualise the influence of the georeference system on the dynamics of overland flow coverage and total overland flow volume generated by the SEOF process, using the VSG data structure. The data structure is driven by Green–Ampt methods and the Topographic Wetness Index (TWI). VSGs are analysed by focusing on the spatial-object preservation techniques of the conformal-based Malaysian Rectified Skew Orthomorphic (MRSO) and the equidistant-based Cassini-Soldner projection planes, under the existing geodetic Malaysian Revised Triangulation 1948 (MRT48) datum and the newly implemented Geocentric Datum for Malaysia (GDM2000). The simulated result visualises the deformation of SEOF coverage under different georeference systems via their projection planes, which delineate dissimilar computations of SEOF areas and overland flow volumes. The integration of georeferencing, 3D GIS and the saturation excess mechanism provides unifying evidence towards successful landslide and flood disaster management, by envisioning the streamflow generating process (mainly SEOF) in a 3D environment.
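Of the two ingredients driving the VSG simulation, the Topographic Wetness Index is the easiest to sketch: TWI = ln(a / tan β), with a the specific upslope contributing area and tan β the local slope; below, the contributing-area grid is a random stand-in for the output of a proper flow-routing algorithm:

```python
# TWI sketch on a synthetic DEM; 'area' stands in for routed upslope area.
import numpy as np

rng = np.random.default_rng(6)
dem = np.add.outer(np.linspace(50.0, 0.0, 100), np.linspace(20.0, 0.0, 100))
dem += rng.normal(scale=0.2, size=dem.shape)      # synthetic sloping DEM

gy, gx = np.gradient(dem, 10.0)                   # 10 m cells; gradient = tan(beta)
tan_beta = np.sqrt(gx ** 2 + gy ** 2)
area = rng.lognormal(mean=5.0, sigma=1.0, size=dem.shape)   # stand-in upslope area

twi = np.log(area / np.maximum(tan_beta, 1e-6))   # TWI = ln(a / tan beta)
print(twi.min(), twi.mean(), twi.max())
```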
 
In order to improve the railway connection between Austria and Italy, a base tunnel extending from Fortezza to Innsbruck (57 km) is under study. The design corridor crosscuts a large and strongly tectonized section of the Eastern Alpine chain, characterized by complex metamorphic and igneous lithologies and polyphase structures developed under ductile to brittle deformation conditions. In order to model the sub-surface geology of the area, surface and sub-surface geological data have been integrated in a spatial database. 3D geological models of the Italian part of the corridor have been constructed on the basis of these data using two approaches. The first is a more traditional approach, involving the reconstruction of several parallel and intersecting cross-sections. It has been implemented using ArcGIS® software with custom-developed scripts that enable one to automatically project structural data, collected at the surface and along boreholes, onto cross-sections. The projection direction can be controlled and is based on structural trends obtained from a detailed statistical analysis of orientation data. Other ArcGIS® scripts enable linking of the network of crosscutting profiles and help ensure their consistency. The second approach involves the compilation of a true 3D geological model in gOcad®. As far as time efficiency and visualization are concerned, the second approach is more powerful. The basic structural geology assumptions, however, are similar to those applied in the first approach. In addition to the 3D model, compilation scripts (ArcGIS® and gOcad®) have been developed which allow estimation of the uncertainties in the depth extrapolation of structures observed at the surface or along boreholes. These scripts permit the assignment, to each projected structural element (i.e., geological boundaries, faults and shear zones), of a parameter estimating its reliability. Basic differences between “data-driven” interpolation and “knowledge-based” extrapolation of geological features at depth are also discussed, and the consequences for the uncertainty estimates of 3D geological models are evaluated.
 
Two-dimensional GIS are extensively used in geology to create, analyse, and interpret geological map models. However, these systems are unable to represent the Earth's subsurface in three spatial dimensions. The objective of this article is to overcome this deficiency and to provide a general framework for a 3d GIS. The presented approach is based on existing 3d geomodelling theory and software, and is characterized by an integrated data model for geological observation data and geomodels, data management supported by an XML-enabled database server, and functionality for querying observation data and 3d geomodels based on their 3d spatial and geological properties. The resulting 3d GIS framework enables geologists to manipulate, analyse and interpret 3d geomodels analogously to the way they work with 2d geological maps.
 
True 3D geological models are instrumental in addressing practical geology problems. A 3D geological modeling method is a vital module that converts raw data in lower dimensions into 3D bodies. To be geologically practical, the method must take cross-sections as the main data source and must be capable of modeling areas with complex faults, maintaining data consistency, and carrying out multi-body modeling. To realize such a practical geological modeling system, GSIS (Geological Spatial Information System) has been developed, based on the core method of 3D geological multi-body modeling from netty cross-sections with topology. Following the intersecting netty cross-section approach, the presented method divides the modeling area into several cross-section grids. The modeling is thus simplified into the independent modeling of each cross-section grid and the merger of the grids into an integrated model. Different approaches are employed when constructing models in a specific cross-section grid, depending on whether a fault is present in the grid in question. In the absence of a fault, the basic approach is employed; otherwise, the basic approach is extended with fault-related treatments. In both approaches, arcs are classified into subsets by their connections, attributes, and topological relationships. Arcs in the same subset at the lowest level are triangulated and interpolated to generate stratum interfaces. In this work, the application of GSIS in the Beijing multi-parameter stereo geological survey project is presented. Among seven successfully constructed models, the engineering geological model of the new city zone in Shunyi district and the bedrock geological model of the central city zone are cited as examples. These demonstrate GSIS's capacity for modeling large areas with complex geology.
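The triangulation step is not detailed in the abstract. Purely as an illustration (all names are our assumptions, not GSIS code), a greedy "zipper" triangulation between two corresponding stratum arcs on neighbouring cross-sections could look like this:

    import math

    def zipper_triangulate(arc_a, arc_b):
        # Build a band of triangles between two sampled arcs. At each step,
        # advance along whichever arc gives the shorter new diagonal, which
        # keeps the triangles reasonably well shaped. Vertices are returned
        # as (side, index) pairs into the input arcs.
        tris, i, j = [], 0, 0
        while i < len(arc_a) - 1 or j < len(arc_b) - 1:
            can_a = i < len(arc_a) - 1
            can_b = j < len(arc_b) - 1
            if can_a and (not can_b or
                          math.dist(arc_a[i + 1], arc_b[j]) <= math.dist(arc_a[i], arc_b[j + 1])):
                tris.append((("a", i), ("a", i + 1), ("b", j)))
                i += 1
            else:
                tris.append((("a", i), ("b", j + 1), ("b", j)))
                j += 1
        return tris

The paper's method additionally has to respect arc subsets, fault treatments, and topology across intersecting sections, none of which this sketch attempts.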
 
We describe visualization software, Visualizer, that was developed specifically for interactive, visual exploration in immersive virtual reality (VR) environments. Visualizer uses carefully optimized algorithms and data structures to support the high frame rates required for immersion and the real-time feedback required for interactivity. As an application developed for VR from the ground up, Visualizer realizes benefits that usually cannot be achieved by software initially developed for the desktop and later ported to VR. However, Visualizer can also be used on desktop systems (Unix/Linux-based operating systems, including Mac OS X) with a similar level of real-time interactivity, bridging the “software gap” between desktop and VR that has been an obstacle to the adoption of VR methods in the Geosciences. While many of the capabilities of Visualizer are already available in other software packages used in a desktop environment, the features that distinguish Visualizer are: (1) Visualizer can be used in any VR environment, including the desktop, GeoWall, or CAVE; (2) in non-desktop environments the user interacts with the data set directly, using a wand or other input devices, instead of working indirectly via dialog boxes or text input; (3) on the desktop, Visualizer provides real-time interaction with very large data sets that cannot easily be viewed or manipulated in other software packages. Three case studies are presented that illustrate the direct scientific benefits realized by analyzing data or simulation results with Visualizer in a VR environment. We also address some of the main obstacles to widespread use of VR environments in scientific research with a user study showing that Visualizer is easy to learn and use in a VR environment and can be as effective on desktop systems as native desktop applications.
 
3D voxelized images can be manipulated if their component parts can be identified, cataloged, and measured. To accomplish this, it is necessary to separate individual convex objects from the complex structures that result from digital observation techniques such as X-ray tomography. Toward this end, we have developed schemes that peel away sequential layers of voxels from complex structures until the narrow waists that connect individual objects disappear and each component object can be identified. These peeling schemes provide the most uniform possible cumulative thickness of removed layers regardless of the orientation of the voxel grid pattern. Consequently, they yield the most accurate results for inter-object interfaces, medial-axis analysis, and individual object statistics such as volumes, orientations, and interconnectivity. Peeling schemes can be categorized by the number of steps involved in each peeling iteration. Each step removes voxels according to one of three possible criteria for defining the exterior of a voxel: exposed faces, edges, or corners. These ultimately cause an initial sphere, for example, to evolve into a cube, dodecahedron, or octahedron, respectively. Combinations of steps can be used to create more complex polyhedra (tetrahexahedra, trisoctahedra, trapezohedra, and hexoctahedra). Which resulting polyhedron most closely resembles the starting sphere depends on the definition of “sphericity” adopted. Using a metric based on the standard deviation of the polyhedral surface from that of a concentric sphere of equal volume, the optimal scheme is peeling by faces 7 times, by edges 3 times, and by corners 4 times. This leads to a hexoctahedron with Miller indices (14 7 4) and a standard deviation of 0.025. Using a metric based on minimizing surface area, the optimal scheme is peeling by faces 9 times, by edges 6 times, and by corners 5 times, leading to a hexoctahedron with Miller indices (20 11 5). In the past, only 1-step peeling has been used (by faces or corners). If computational or conceptual constraints limit peeling to one step, the edge criterion should be used, as the resulting dodecahedron deviates from a sphere by only half as much as either the cube or octahedron produced by 1-step peeling of faces or corners, respectively. We also determined the best criteria for 2-step and 3-step peeling. The peeling schemes we identify can be used to separate objects from complex structures in a range of geological and other problems. Information that emerges from the analysis includes object volumes, which can be used to determine grain- or bubble-size distributions in volcanologic, petrologic, and sedimentary applications, among others.
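Interpreting the face, edge, and corner criteria as 6-, 18-, and 26-connected morphological erosion (our reading, not necessarily the authors' implementation), a peeling schedule can be sketched in a few lines of Python with SciPy:

    import numpy as np
    from scipy import ndimage

    def peel(volume, schedule):
        # One binary erosion per scheduled step. A voxel is removed when the
        # structuring element reaches it across a face (6-connected), an edge
        # (18-connected), or a corner (26-connected).
        conn = {"faces": 1, "edges": 2, "corners": 3}
        out = np.asarray(volume, dtype=bool)
        for step in schedule:
            se = ndimage.generate_binary_structure(3, conn[step])
            out = ndimage.binary_erosion(out, structure=se)
        return out

    # e.g. the (faces x 7, edges x 3, corners x 4) scheme from the text,
    # followed by labelling the now-separated objects:
    # peeled = peel(volume, ["faces"] * 7 + ["edges"] * 3 + ["corners"] * 4)
    # labels, n_objects = ndimage.label(peeled)

Once the connecting waists have been eroded away, a connected-component labelling of the peeled volume identifies the individual objects whose volumes and orientations the analysis then reports.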
 
Existing 3D geological modeling systems rely heavily on large numbers of borehole and cross-section data. However, it is well known that the available geological data are generally sparse and undersampled. In this paper, we propose a stepwise refinement method for 3D modeling with multi-source data integration. The method can simulate geological structures naturally whether or not the available geological data are sufficient. By stepwise refinement over multiple data sources, the method increases the accuracy of 3D models gradually and effectively. In addition, the mechanisms used in the method for organizing and manipulating information can have an equally important impact on geologists' thinking, the interpretation of geological data, and 3D modeling methodology. A concrete example of applying the method to the Huai Bei fault and fold belt shows that it can be used in broad and complex geological areas.
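The abstract leaves the refinement operators unspecified. Purely as an illustration of stepwise refinement by data integration (inverse-distance weighting is our placeholder interpolator, not the paper's method), one horizon surface could be rebuilt as each data source is merged in:

    import numpy as np

    def idw_surface(xy_grid, pts, vals, power=2.0):
        # Inverse-distance-weighted elevation surface from scattered picks.
        d = np.linalg.norm(xy_grid[:, None, :] - pts[None, :, :], axis=2)
        w = 1.0 / np.maximum(d, 1e-9) ** power
        return (w * vals).sum(axis=1) / w.sum(axis=1)

    def stepwise_refine(xy_grid, data_batches):
        # Rebuild the surface as each source (e.g. boreholes, then
        # cross-section picks, then outcrop data) is integrated -- a toy
        # analogue of refining the model step by step over multiple data.
        pts, vals = np.empty((0, 2)), np.empty(0)
        surfaces = []
        for batch_pts, batch_vals in data_batches:
            pts = np.vstack([pts, batch_pts])
            vals = np.concatenate([vals, batch_vals])
            surfaces.append(idw_surface(xy_grid, pts, vals))
        return surfaces  # one increasingly refined surface per step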
 
Top-cited authors
James C. Bezdek
  • University of Missouri
William Full
  • GXStat LLC
Edzer Pebesma
  • University of Münster
Eric Christopher Grimm
  • University of Minnesota Twin Cities
Biswajeet Pradhan
  • University of Technology Sydney