Conference Paper

Validation and Improvement of Numerical Methods to Simulate the Well-Test Response of Reservoir Models for Model Calibration Purposes


Abstract

Using geostatistical modelling to populate reservoir properties is nowadays the most common approach in the industry and has received a great deal of attention. A geostatistical reservoir model defines a space of spatial uncertainty, which can be explored by generating many equiprobable realisations of the reservoir properties, each a possible reservoir model honouring the static data. Among them, the relevant models are those that also match the dynamic data, which complete the data available for reservoir model calibration. Finding the relevant reservoir models in the space of spatial uncertainty is a time-consuming process that requires simulating the dynamic (flow) response of many reservoir models. A fast and reliable simulation method is therefore highly desirable to speed up reservoir model calibration. In this context, a new approach has been developed and tested. The method allows easy and fast comparison between interpreted well-test results and equivalent (average) reservoir model properties in terms of transmissivity (k.h) and permeability. The comparison can be used to validate or reject a reservoir model, and to indicate how to modify it to fit the well-test data. This paper presents the method and the results obtained to evaluate its performance and validate it. Well-test-interpreted permeabilities (or transmissivities) are essentially weighted average permeabilities, computed over closed surfaces suitably defined around the well, with weights that depend on the flow geometry. The proposed method is based on a steady-state flow simulation in which the tested well is a source term (producing or injecting well) at the centre of a simulation domain (a region of the reservoir model). The domain must be large enough to contain, or at least overlap, the stabilisation area of the well test in which the average transmissivities are to be estimated.
The method relies on three key aspects: (1) defining a simulation domain (extension and shape) consistent with the actual well-test drainage area; (2) defining boundary conditions that reproduce flow paths consistent with those generated by the actual well test; and (3) using the new effective-gradient-based averaging method to compute average permeabilities over suitably defined closed surfaces. The method is tested on various synthetic and partly real field cases, for which the transient well-test responses are first simulated and interpreted, then compared with the transmissivities predicted by the new method. A sensitivity analysis is also carried out on the calculation parameters (flow simulation domain, flow rates…) to check the robustness of the method and identify avenues for improvement. All these results tend to confirm the effectiveness of the method, which combines speed and accuracy. The method is intended to be used as an objective function for automatic or assisted calibration of reservoir models against interpreted well-test data. It is expected to be particularly useful for calibrating naturally fractured reservoir models, for which permeability tensors must be calculated from uncertain, locally defined fracture property statistics.
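The steady-state comparison between simulated pressures and an equivalent transmissivity can be illustrated with a minimal sketch. This is a hypothetical example, not the paper's effective-gradient averaging: for ideal radial steady-state flow, the equivalent k·h between two closed circular surfaces around the well follows Thiem's equation.

```python
import math

def equivalent_kh(q, mu, r1, r2, p1, p2):
    """Equivalent transmissivity k*h between two closed circular surfaces
    around a producing well, from Thiem's steady-state radial-flow equation:
        k*h = q * mu * ln(r2/r1) / (2*pi*(p2 - p1))
    q      : downhole flow rate (m^3/s), positive for production
    mu     : fluid viscosity (Pa.s)
    r1, r2 : radii of the inner and outer closed surfaces (m), r1 < r2
    p1, p2 : average pressures (Pa) on those surfaces (p2 > p1 in production)
    """
    return q * mu * math.log(r2 / r1) / (2.0 * math.pi * (p2 - p1))
```

In the spirit of the proposed method, such an equivalent k·h computed from the steady-state simulation over successive closed surfaces would be compared against the well-test-interpreted transmissivity to accept, reject, or adjust a realisation.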


References
Conference Paper
We present an approach for the simultaneous estimation of permeability and porosity from time-lapse seismic, time-lapse electromagnetic, and production data. Permeability and porosity are among the most important parameters for reservoir characterization because of their significant influence on oil recovery. Both leave a strong footprint on the distribution of saturation and pressure in the reservoir during production, which can be captured by time-lapse seismic and time-lapse electromagnetic measurements. Production data are the traditional source used for reservoir model conditioning, but they provide only a limited constraint on the reservoir model, localized to the near-wellbore regions. In contrast, time-lapse seismic and time-lapse electromagnetic data impose additional constraints on the reservoir model conditioning problem that cover the inter-well regions. We have developed a reservoir-centric workflow to jointly invert these multiphysics data, effectively reducing the non-uniqueness of the inverse problem. We use a synthetic case to demonstrate that the proposed approach can significantly improve the estimation of permeability and porosity as well as the monitoring of fluid-front movement.
Article
We have set up a general framework to estimate geostatistical parameters (GP) such as the correlation length, l_c, and the log-permeability variance, σ²_ln k, from well-test data. Most often, in practical studies, the GP are estimated from geological and petrophysical data, but in many cases these data are too scarce to give high-confidence results. The method was tested on synthetic well-test data generated on training images, and correct estimates of the underlying correlation length and log-permeability variance were recovered. Once the GP are estimated, other well-established techniques can be used to obtain well-test-matched reservoir images consistent with the geostatistical model. In practical applications, excellent well-test data are needed, and the method could be improved by using multiple-well test data.
Conference Paper
The aim of this work is to demonstrate the effectiveness of a fully integrated approach to ensemble-based history matching on a complex real field application. We show that the predictive ability of the ensemble of models is greatly enhanced through an integrated workflow promoting multidisciplinary collaboration between all subsurface disciplines. Consistent integration of geological and engineering understanding within the dynamic data conditioning phase of a reservoir study is a challenging task. The ensemble-based approach offers an efficient solution to this challenge, especially when it is tied to an appropriate reservoir modeling approach, suitable parameterization, and repeatable workflows. These are key factors for ensuring geological consistency while improving the match quality and the reliability of the generated reservoir models. One key feature of the ensemble-based method is that it overcomes the typical limitation of traditional approaches, where the number of uncertainty parameters often has to be reduced owing to practical or algorithmic constraints. This is especially important for complex reservoirs, where the subsurface uncertainties cannot be represented by a handful of scalar multipliers while honoring the static and dynamic data measurements. The proposed methodology has four main steps. First, an initial ensemble of models able to capture the model uncertainties in all parts of the modeling process is generated. Next, the match level for the different observed data, specifying the likelihood error, is assigned. Then, the uncertainty parameters to be modified during history matching are identified. Finally, a computationally intensive step is performed on large-scale computing facilities, where a state-of-the-art iterative ensemble algorithm, tightly connected with standard geo-modeling tools, is used to consistently update the full set of models.
The result is an ensemble of reservoir models from which we can quantify the uncertainty both in the production forecast and in the reservoir model parameters. The methodology has been applied to a challenging history-matching problem for a turbiditic channel-complex reservoir. We demonstrate the efficiency of the method by generating an ensemble of reservoir models that offer geologically consistent explanations of the currently measured static and dynamic data. The reliability of the model forecasts is confirmed through a validation process using dynamic data that were left out of the history-matching process. More important, however, is that the proposed workflow and the resulting models provide a good starting point for a more efficient update of the reservoir models as new static and dynamic data become available. With the proposed approach it is possible to move from sporadic updates of reservoir models towards frequent model updates as new data arrive. Utilizing all available data in a consistent manner, as the data are collected, is the key to making more robust reservoir management decisions, especially for complex reservoirs.
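The abstract names an iterative ensemble algorithm without giving details. A single ensemble-smoother update, the building block of common iterative schemes such as ES-MDA (an assumption here, not necessarily the algorithm used in the study), can be sketched as follows with synthetic arrays:

```python
import numpy as np

rng = np.random.default_rng(0)

def ensemble_smoother_update(M, D, d_obs, cd):
    """One ensemble-smoother update step.
    M     : (n_param, n_ens) ensemble of model parameters
    D     : (n_data, n_ens) simulated responses of each ensemble member
    d_obs : (n_data,) observed data
    cd    : observation-error variance (scalar, for simplicity)
    """
    n_ens = M.shape[1]
    dM = M - M.mean(axis=1, keepdims=True)   # parameter anomalies
    dD = D - D.mean(axis=1, keepdims=True)   # response anomalies
    C_md = dM @ dD.T / (n_ens - 1)           # parameter-data covariance
    C_dd = dD @ dD.T / (n_ens - 1)           # data-data covariance
    K = C_md @ np.linalg.inv(C_dd + cd * np.eye(len(d_obs)))
    # perturb the observations independently for each ensemble member
    D_obs = d_obs[:, None] + np.sqrt(cd) * rng.standard_normal(D.shape)
    return M + K @ (D_obs - D)
```

Iterative schemes apply this update several times with inflated error variances; here only the single linear update is shown.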
Conference Paper
When creating reservoir models for fields under development, dynamic data measurements often have limited impact compared with static (geophysical and geological) data. This is not necessarily true for the Johan Sverdrup field offshore Norway, where exceptional reservoir properties make the data from eight drill-stem tests (DSTs) particularly interesting. For this reason, it is important to utilize the information in the collected static and dynamic data in a consistent manner to improve the understanding of the reservoir. This is especially true for the Avaldsnes High area, located in the southeastern part of the Johan Sverdrup field, where the observed thickness is below the seismic resolution and the DST data from four wells indicate permeabilities in the range of 20 to 80 darcies, with overlapping radii of investigation. In this paper, we apply an ensemble-based approach to generate a large set of reservoir models for the Avaldsnes High area of the Johan Sverdrup field, all of which are plausible given the currently observed static and dynamic data. We consider multiple modelling scenarios, introducing uncertainty in the sand thickness, the facies (rock-type) description and the permeability modelling. In conventional pressure transient analysis (PTA), the DSTs are analyzed separately and the non-uniqueness of the data interpretation is hard to address and quantify; this is not the case with the ensemble-based approach. Since we condition on the static and dynamic data simultaneously, we can consistently address possible ambiguities in the interpreted permeabilities, thicknesses and flow barriers seen in conventional PTA. The study reveals that conditioning the generated models to dynamic data introduces clear spatial trends in both the sand thickness and the permeability. In particular, we greatly reduce the potential downside with respect to the sand thickness in the Avaldsnes High area.
Conference Paper
Generating realizations of reservoir permeability and porosity fields that are conditioned to both static and dynamic data is difficult. The constraints imposed by dynamic data are typically nonlinear, and the relationship between the observed data and the petrophysical parameters is given by a flow simulator that is expensive to run. In addition, the spatial organization of real rock properties is quite complex. Thus, most attempts at conditioning reservoir properties to dynamic data have either approximated the relationship between data and parameters so that complex geologic models could be used, or have used simplified spatial models with actual production data. In this paper, we describe a multistep procedure for efficiently generating realizations of reservoir properties that honor dynamic data from complex stochastic models. First, we generate a realization of the rock properties that is conditioned to the static data but not to the pressure data. Second, we generate a realization of the production data (i.e., we add random errors to the production data). Third, we find the property field that is as close as possible to the uncalibrated realization while also honoring the realization of the production data. The ensemble of realizations generated by this procedure often provides a good empirical approximation to the posterior probability density function for reservoir models and can be used for Monte Carlo inference. We apply this procedure to the problem of conditioning a three-dimensional stochastic model to data from two well tests. The real-field example contains two facies. Permeabilities within each facies were generated using a "cloud transform" that honored the observed scatter in the crossplot of permeability and porosity. We cut a volume containing both test wells from the full-field model, then scaled it up to about 9,000 cells before calibrating to the pressure data. Although the well-test data were of poor quality, they provided information to modify the permeabilities within the regions of investigation and the overall permeability average.
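The three-step procedure described above (unconditional realization, perturbed data, constrained minimisation) can be sketched in a linear-Gaussian toy setting, where the third step has a closed-form solution. All matrices below are synthetic placeholders, not the paper's model:

```python
import numpy as np

rng = np.random.default_rng(1)

def rml_realization(G, d_obs, C_m, C_d, m_pr):
    """One calibrated realization via the three-step procedure:
    1) draw an unconditional realization  m_uc ~ N(m_pr, C_m)
    2) perturb the data                   d_uc ~ N(d_obs, C_d)
    3) minimise ||m - m_uc||^2_{C_m^-1} + ||G m - d_uc||^2_{C_d^-1}
    For a linear forward model G the minimiser solves the normal equations.
    """
    m_uc = m_pr + np.linalg.cholesky(C_m) @ rng.standard_normal(len(m_pr))
    d_uc = d_obs + np.linalg.cholesky(C_d) @ rng.standard_normal(len(d_obs))
    # (C_m^-1 + G^T C_d^-1 G) m = C_m^-1 m_uc + G^T C_d^-1 d_uc
    Cm_inv, Cd_inv = np.linalg.inv(C_m), np.linalg.inv(C_d)
    A = Cm_inv + G.T @ Cd_inv @ G
    b = Cm_inv @ m_uc + G.T @ Cd_inv @ d_uc
    return np.linalg.solve(A, b)
```

With a nonlinear flow simulator, step 3 becomes an iterative minimisation; each call to `rml_realization` yields one member of the empirical posterior ensemble.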
Article
We present a method to integrate log, core, and well-test pressure data to describe reservoir heterogeneities. The conditional simulation method of simulated annealing is used to incorporate diverse sources of data. We use analytical solutions for radially heterogeneous reservoirs to define an equivalent radial permeability and a corresponding region of investigation. By numerical experimentation on drawdown well-test simulations in heterogeneous permeability fields, we determine that a weighted-area-based geometric average of the grid-block permeabilities within the region of investigation best defines the equivalent radial permeability. This information, along with the spatial statistics from core/log data, is coded into the overall objective function of the simulated-annealing algorithm to yield a consistent reservoir description.
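The weighted-area-based geometric average named above can be sketched directly; the grid values and area weights below are hypothetical:

```python
import numpy as np

def weighted_geometric_mean(k, w):
    """Weighted geometric average of grid-block permeabilities k with
    (e.g. area-based) weights w:  exp( sum(w * ln k) / sum(w) )."""
    k = np.asarray(k, dtype=float)
    w = np.asarray(w, dtype=float)
    return float(np.exp(np.sum(w * np.log(k)) / np.sum(w)))
```

In the cited approach, k would be the grid-block permeabilities falling inside the region of investigation and w the areas each block contributes to it.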
Article
Stochastic simulation techniques are increasingly used to model reservoir heterogeneity. However, the value of such techniques depends largely on their ability to provide models of reservoir properties that are conditioned to all available information. In particular, pressure-test data impose constraints on the permeability distribution around the corresponding wells. More precisely, the permeability derived from the analysis of a pressure curve is viewed as a permeability average in the vicinity of the well. A Monte Carlo methodology is proposed in this study to characterize this averaging. Several permeability fields corresponding to various distribution models are generated using geostatistical simulation techniques. A pumping test is numerically simulated in each of these laterally non-uniform reservoirs, then analyzed automatically with a type-curve-fitting algorithm to estimate the apparent permeability around the well. For each permeability distribution model considered, this analysis allows the volume of averaging to be defined, as well as the type of permeability averaging to be applied within that volume. The volume of averaging is found to be approximately independent of the underlying permeability distribution model and smaller than that given by traditional definitions of the radius of drainage. Within this volume, and for practical test durations, the averaging relating small-scale permeabilities to well-test permeabilities largely depends on the statistical characteristics of the permeability field and may differ significantly from the traditional geometric mean. The averaging is particularly sensitive to the intensity and spatial connectivity of the permeability variations. The averaging formula quantified by this study can be used for conditioning stochastic permeability fields to pressure-test information. Selective sampling schemes based on simulated annealing can be applied within the geostatistical simulation algorithm to force the generated permeability fields to locally match any particular average imposed by the well-test data.
Article
This paper demonstrates an integrated approach to conditioning models for fractured petroleum reservoirs, through application of discrete fracture network (DFN) methods. The approach is built on the observation that discrete fractures controlling reservoir production can also influence the anisotropy of seismic response. The paper extends previous work by the above authors by considering realistic fractured reservoir geometries including multiple fracture sets. The presence of a system of natural fractures in a reservoir induces anisotropy in its hydraulic and elastic properties. Fracture induced hydraulic anisotropy and related heterogeneous connectivity is evidenced through systematic non-uniform production performance. Elastic anisotropy may be observed in seismic data as azimuthally dependent elastic attributes such as compressional wave velocity and reflection amplitude versus azimuth which may be inverted from 3D seismic data. Previous work by the above authors demonstrated a method for simultaneous inversion of production and seismic observations through a gradient-based optimization scheme. In that work, the improvement in conditioning of the DFN model was demonstrated in a synthetic case containing a single fracture set. The current work builds on previous work by the authors through application of the new method to reservoirs containing two fracture systems. The robustness of the method with respect to host rock type is tested through use of matrix petrophysical and elastic properties representative of typical fractured reservoir lithologies. DFN model preconditioning requirements are explored through sensitivity testing with respect to hydraulic, elastic, and geometrical model parameters. The added value of seismic anisotropy in the inversion is demonstrated through comparison with a similar inversion process using only production data in the objective function. 
The results show that the new method for integration of production and seismic anisotropy improves the resolution of the geometrical properties of multiple fracture systems over what is achievable using production observations alone. The speed and stability of model convergence using the new process depend on both the fracture network and the host rock properties. Key issues in model preconditioning observed in the sensitivity tests are discussed.

Introduction
It is known from theory and practice that both the permeability and the elastic response of fractured reservoirs exhibit anisotropic behavior that is strongly correlated with the characteristics of the fracture system. Historically, geophysical and reservoir engineering techniques for the characterization of natural fracture systems have been developed and pursued independently. Seismic anisotropy, which is predicted by elastic theory, has been used by several authors [1-5] to estimate fracture orientation and density from 3D seismic datasets. Sophisticated reservoir engineering techniques have been developed for characterizing the hydraulic response of naturally fractured reservoirs [6-9]. More recently, Parney and LaPointe [10,11] used stochastic discrete feature network (DFN) modeling techniques to forward-model both the elastic and the hydraulic response of a common fracture system. However, this work did not solve the inverse problem. Previous work by Will et al. [12] demonstrated a novel method of solving the inverse problem for the geometrical characteristics of the fracture system. This method, based on DFN modeling techniques, simultaneously employed both the elastic and the hydraulic response of a reservoir having a single fracture set. An initial estimate of the fracture system parameters, P32 intensity and fracture trend, was refined through an iterative least-squares optimization technique in which the objective function contained both elastic and hydraulic parameters. Stochastic DFN modeling techniques were used to forward-model both the elastic and hydraulic responses of the fracture set at each iteration using appropriate effective-media models. The method was evaluated by comparing optimizations carried out using (a) seismic and hydraulic data against (b) hydraulic data alone. The inversion including both seismic and hydraulic data produced significantly better results, with faster and more stable convergence of both the P32 intensity and the fracture set orientation.
Article
This paper describes a method to condition facies (rock-type) permeabilities in geostatistical models to dynamic reservoir data: the flow-meter log and the well-test KH value. The proposed approach consists of two steps. In the first step, the flow-meter logs are transformed into (reference) permeability profiles by use of the well-test permeability–thickness (KH). This transformation is analytical, meaning that a geostatistical reservoir realization and numerical flow simulation are not required. In the second step, synthetic permeability profiles are computed for the known rock-type classification at the well locations. An optimization procedure is applied to improve the match of these synthetic permeability profiles with the reference (flow-meter) profiles. In this process, the initial (core-based) rock-type permeability model is modified. The optimization procedure simultaneously takes into account all the wells, and is subject to geological constraints imposed by the user (ranking, permeability bounds). Although a number of assumptions must be verified, this fully analytical approach leads to a fast, flexible, and practical optimization routine that is relatively easy to implement. The up-front integration of dynamic data into the modeling process leads to a more representative permeability description than is obtained using only plug values. Because of the global nature of the optimization method, a satisfactory match can be extrapolated throughout the reservoir with a reasonable degree of confidence. The paper presents a successful application of the procedure to a Middle East reservoir.
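The analytical transformation is not spelled out in the abstract. One common way to allocate an interpreted well-test KH across layers from flow-meter data (assumed here for illustration only) is in proportion to each layer's fractional flow:

```python
import numpy as np

def flowmeter_to_permeability(frac_flow, thickness, kh_welltest):
    """Allocate the interpreted well-test KH over layers in proportion to
    the flow-meter fractional flow (assumed transformation, not the
    paper's exact one):
        k_j = KH * f_j / h_j,  with sum(f_j) = 1  so  sum(k_j * h_j) = KH.
    frac_flow : per-layer fractional flow from the flow-meter log
    thickness : per-layer thickness h_j
    """
    f = np.asarray(frac_flow, dtype=float)
    h = np.asarray(thickness, dtype=float)
    f = f / f.sum()                # normalise the fractional contributions
    return kh_welltest * f / h
```

By construction the resulting reference profile honours the well-test KH exactly, which is the stated purpose of the first step.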
Article
In order to predict future production performance accurately, a reservoir model should reflect the actual patterns of permeability connectivity (flow paths and barriers). Information about such connectivity patterns is carried by the flow-response data recorded at wells. However, flow-response data are influenced by many factors other than permeability connectivity, such as boundary conditions and fluid-property variations. This paper presents a neural-network-based procedure for filtering out the information related to the permeability field in flow-response data. The flow-response data are modeled as specific multiple-point averages of the permeability values in the neighborhood of the well. Such multiple-point averages account for the spatial connectivity of the permeability field. The superiority of multiple-point averages over single-point permeability averages for representing flow-response data is demonstrated on several reservoir examples. The ultimate goal is to integrate the permeability-connectivity information contained in the flow-response data into numerical reservoir models, which amounts to ensuring that the permeability models reproduce these multiple-point averages. A Markov chain Monte Carlo simulation algorithm is implemented to perform this identification. Alternative equiprobable permeability fields are generated which, in addition to reproducing the production data, conform to a prior model for the spatial variability of the permeability field. The results demonstrate that flow simulations on the simulated permeability fields do indeed match historic well-test data accurately. More importantly, future reservoir performance predictions are rendered more accurate.
Article
This review examines the single-phase flow of fluids to wells in heterogeneous porous media and explores procedures to evaluate pumping-test or pressure-response curves. The paper examines how these curves may be used to improve descriptions of reservoir properties obtained from geology, geophysics, core analysis, outcrop measurements, and rock physics. We begin our discussion with a summary of the classical attempts to handle the issue of heterogeneity in well-test analysis. We then review more recent advances concerning the evaluation of conductivity or permeability in terms of statistical variables and touch on perturbation techniques. Our current view on addressing the issue of heterogeneity by pumping tests may be summarized simply as follows. We assume a three-dimensional array (ordered set) of values for the properties of the porous medium as a function of the coordinates, obtained as a result of measurements and interpretations. We presume that this array of values contains all relevant information available from prior geological and geophysical interpretations, core and outcrop measurements, and rock physics. These arrays consist of several million property values, and the information available is usually on a very fine scale (often
Article
This paper describes a new method for gradually deforming realizations of Gaussian-related stochastic models while preserving their spatial variability. The method consists of building a stochastic process whose state space is the ensemble of realizations of a spatial stochastic model. In particular, a stochastic process built by combining independent Gaussian random functions is proposed to perform the gradual deformation of realizations. The gradual deformation algorithm is then coupled with an optimization algorithm to calibrate realizations of stochastic models to nonlinear data. The method is applied to calibrate continuous and discrete synthetic permeability fields to well-test pressure data. The examples illustrate the efficiency of the proposed method. Furthermore, we present some extensions of this method (multidimensional gradual deformation, gradual deformation with respect to structural parameters, and local gradual deformation) that are useful in practice. Although the method described in this paper is operational only in the Gaussian framework (e.g., lognormal model, truncated Gaussian model, etc.), the idea of gradually deforming realizations through a stochastic process remains general and therefore promising even for calibrating non-Gaussian models.
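The basic combination of two independent Gaussian realizations used in gradual deformation can be sketched as follows (a minimal illustration of the principle, not the paper's full algorithm):

```python
import numpy as np

rng = np.random.default_rng(2)

def gradual_deformation(y1, y2, t):
    """Combine two independent standard Gaussian realizations into a chain
        y(t) = y1 * cos(pi*t) + y2 * sin(pi*t).
    Because cos^2 + sin^2 = 1, y(t) has the same mean and covariance as
    y1 and y2 for every t, so spatial variability is preserved while t
    continuously deforms the realization (t = 0 gives y1 back)."""
    return y1 * np.cos(np.pi * t) + y2 * np.sin(np.pi * t)
```

In calibration, the scalar t becomes a one-dimensional optimization variable: t is tuned against the well-test mismatch, and the best y(t) is combined with a fresh independent realization for the next iteration.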