Adaptive rejection Metropolis sampling using Lagrange interpolation polynomials of degree 2

Department of Statistics, University of Auckland, Private Bag 92019, Auckland, New Zealand
Computational Statistics & Data Analysis, 03/2008; DOI: 10.1016/j.csda.2008.01.005
Source: DBLP

ABSTRACT A crucial problem in Bayesian posterior computation is efficient sampling from a univariate distribution, e.g. a full conditional distribution in applications of the Gibbs sampler. This full conditional distribution is usually non-conjugate, algebraically complex and computationally expensive to evaluate. We propose an alternative algorithm, called ARMS2, to the widely used adaptive rejection sampling technique ARS [Gilks, W.R., Wild, P., 1992. Adaptive rejection sampling for Gibbs sampling. Applied Statistics 41 (2), 337–348; Gilks, W.R., 1992. Derivative-free adaptive rejection sampling for Gibbs sampling. In: Bernardo, J.M., Berger, J.O., Dawid, A.P., Smith, A.F.M. (Eds.), Bayesian Statistics, Vol. 4. Clarendon, Oxford, pp. 641–649] for generating a sample from univariate log-concave densities. Whereas ARS is based on sampling from piecewise exponentials, the new algorithm uses truncated normal distributions and makes use of a clever auxiliary variable technique [Damien, P., Walker, S.G., 2001. Sampling truncated normal, beta, and gamma densities. Journal of Computational and Graphical Statistics 10 (2), 206–215]. Furthermore, we extend this algorithm to deal with non-log-concave densities to provide an enhanced alternative to adaptive rejection Metropolis sampling, ARMS [Gilks, W.R., Best, N.G., Tan, K.K.C., 1995. Adaptive rejection Metropolis sampling within Gibbs sampling. Applied Statistics 44, 455–472]. The performance of ARMS and ARMS2 is compared in simulations of standard univariate distributions as well as in Gibbs sampling of a Bayesian hierarchical state-space model used for fisheries stock assessment.
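The Damien & Walker (2001) auxiliary-variable technique cited above can be sketched as a two-step Gibbs scheme: given the current point, draw a uniform auxiliary variable under the normal kernel, then redraw the point uniformly on the resulting slice intersected with the truncation interval. The sketch below is an illustrative stand-in (not the ARMS2 implementation itself); the function name and parameter choices are hypothetical.

```python
import math
import random

def trunc_normal_gibbs(mu, sigma, a, b, n, seed=0):
    """Draws from N(mu, sigma^2) truncated to [a, b] via the
    auxiliary-variable scheme of Damien & Walker (2001):
    alternate y | x ~ U(0, exp(-(x - mu)^2 / (2 sigma^2)))
    and x | y uniform on the slice intersected with [a, b]."""
    rng = random.Random(seed)
    x = min(max(mu, a), b)  # start inside the support
    out = []
    for _ in range(n):
        # y | x: uniform under the (unnormalised) normal kernel at x
        y = rng.uniform(0.0, math.exp(-((x - mu) ** 2) / (2 * sigma ** 2)))
        # x | y: uniform on {x : kernel(x) >= y} intersected with [a, b]
        half = sigma * math.sqrt(-2.0 * math.log(y))
        lo, hi = max(a, mu - half), min(b, mu + half)
        x = rng.uniform(lo, hi)
        out.append(x)
    return out

draws = trunc_normal_gibbs(mu=0.0, sigma=1.0, a=1.0, b=3.0, n=2000)
```

Because both conditional draws are uniforms, no rejection loop is needed, which is what makes truncated-normal envelope pieces attractive compared with direct inversion.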

  • ABSTRACT: Adaptive Rejection Metropolis Sampling (ARMS) is a well-known MCMC scheme for generating samples from one-dimensional target distributions. ARMS is widely used within Gibbs sampling, where automatic and fast samplers are often needed to draw from univariate full-conditional densities. In this work, we propose an alternative adaptive algorithm (IA2RMS) that overcomes the main drawback of ARMS (an incomplete adaptation of the proposal in some cases), speeding up the convergence of the chain to the target. Numerical results show that IA2RMS outperforms the standard ARMS, providing a correlation among samples close to zero. Index Terms — Monte Carlo methods, Gibbs sampler, adaptive rejection Metropolis sampling (ARMS).
    IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP); 01/2014
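The ARMS transition that both of these papers build on combines a rejection step against a pseudo-envelope g with a Metropolis correction that keeps the chain exact even where g underestimates the target f. The following minimal sketch uses a fixed (non-adaptive) Gaussian pseudo-envelope purely for illustration; in real ARMS, g is a piecewise proposal that is refined as samples are rejected, and all names here are hypothetical.

```python
import math
import random

def arms_step(x_cur, f, g, sample_g, rng):
    """One ARMS-style transition (Gilks, Best & Tan, 1995):
    rejection step against the pseudo-envelope g, then a Metropolis
    correction for regions where g does not dominate the target f."""
    while True:
        x = sample_g(rng)
        # Rejection step: keep x as a candidate w.p. min(1, f(x)/g(x)).
        if rng.random() <= min(1.0, f(x) / g(x)):
            break
    # Metropolis step: exact accept/reject between candidate and current.
    num = f(x) * min(f(x_cur), g(x_cur))
    den = f(x_cur) * min(f(x), g(x))
    return x if rng.random() <= min(1.0, num / den) else x_cur

# Illustrative target N(0,1) with a deliberately offset pseudo-envelope.
f = lambda x: math.exp(-0.5 * x * x)             # unnormalised N(0, 1)
g = lambda x: math.exp(-0.5 * (x - 0.5) ** 2)    # unnormalised N(0.5, 1)
sample_g = lambda rng: rng.normalvariate(0.5, 1.0)

rng = random.Random(1)
x, chain = 0.0, []
for _ in range(5000):
    x = arms_step(x, f, g, sample_g, rng)
    chain.append(x)
```

Note that when g does dominate f everywhere, the Metropolis step always accepts and the scheme reduces to plain rejection sampling; the correction only bites where the envelope is too low, which is exactly the situation IA2RMS targets with its improved adaptation.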
  • ABSTRACT: Finite-element modeling has applications in various disciplines of engineering. Finite-element models do not necessarily predict the measured data sufficiently accurately, which is why we need to update these models to better reflect the measured data. Applications to engineering problems are considered, especially the updating of finite-element models and its application to model detection. This book introduces the concepts of computational intelligence for finite-element-model updating. The computational intelligence methods used for finite-element-model updating are divided into three classes: optimization methods, machine-learning methods and Monte-Carlo-based methods. The optimization methods used include the Broyden-Fletcher-Goldfarb-Shanno (BFGS) method, the Nelder-Mead (NM) simplex method, the genetic algorithm, particle swarm optimization, simulated annealing, the response-surface method and hybrid methods. The optimization techniques are mainly global optimum methods, with the exception of the BFGS and NM methods, which are local methods. The machine-learning methods used in this book are multi-layer perceptron neural networks, and the Monte Carlo methods are used to sample from a probability distribution obtained by formulating the problem in a Bayesian framework. In Chapter 2 the BFGS and NM methods are applied to the finite-element-model updating process. Chapter 3 compares the NM simplex method, which is found to be better than BFGS in Chapter 2, to a genetic algorithm (GA) for finite-element-model updating. The GA is a global optimum method inspired by the evolutionary process of different species. In Chapter 4 the particle-swarm-optimization (PSO) method is applied to finite-element-model updating. This method is tested against a GA and performs better. Chapter 5 implements simulated annealing (SA) for finite-element-model updating and compares it with PSO; PSO is observed to give more accurately updated finite-element results than SA. Chapter 6 presents the response-surface method (RSM) for finite-element-model updating, and Chapter 7 introduces a hybrid of particle-swarm optimization and the NM simplex method for finite-element-model updating. Chapter 8 presents a multiple criterion method (MCM) for finite-element-model updating, and Chapter 9 implements a Bayesian neural network for finite-element-model updating and compares it with the response-surface method; the Bayesian neural-network method performs better than the response-surface technique. Chapter 10 implements the Bayesian approach for finite-element-model updating, using a genetic Markov chain Monte Carlo method inspired by genetic-programming techniques. In Chapter 11 a multiple criterion method (MCM) is presented and tested. This book opens new research directions in the field of computational intelligence applied to mathematical models that use finite-element updating methods. I warmly recommend this book for undergraduate and graduate students, researchers and all people interested in the fields of computational intelligence and the finite element method.
  • ABSTRACT: Bayesian inference often requires efficient numerical approximation algorithms, such as sequential Monte Carlo (SMC) and Markov chain Monte Carlo (MCMC) methods. The Gibbs sampler is a well-known MCMC technique, widely applied in many signal processing problems. Drawing samples from univariate full-conditional distributions efficiently is essential for the practical application of the Gibbs sampler. In this work, we present a simple, self-tuned and extremely efficient MCMC algorithm which produces virtually independent samples from these univariate target densities. The proposal density used is self-tuned and tailored to the specific target, but it is not adaptive. Instead, the proposal is adjusted during an initial optimization stage, following a simple and extremely effective procedure. Hence, we have named the newly proposed approach FUSS (Fast Universal Self-tuned Sampler), as it can be used to sample from any bounded univariate distribution and also from any bounded multivariate distribution, either directly or by embedding it within a Gibbs sampler. Numerical experiments, on several synthetic data sets (including a challenging parameter estimation problem in a chaotic system) and a high-dimensional financial signal processing problem, show its good performance in terms of speed and estimation accuracy.
    Digital Signal Processing 04/2015; DOI: 10.1016/j.dsp.2015.04.005
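The FUSS idea of a proposal tuned once, up front, rather than adapted on the fly, can be illustrated with a piecewise-constant proposal built from target evaluations on a grid and then used inside an independence Metropolis-Hastings loop. This is only a toy stand-in under stated assumptions, not the published FUSS construction; all names and the grid size are hypothetical.

```python
import bisect
import math
import random

def build_grid_proposal(logf, lo, hi, m=200):
    """Builds a piecewise-constant proposal on [lo, hi] from m target
    evaluations -- a crude stand-in for a pre-tuned proposal."""
    w = (hi - lo) / m
    heights = [math.exp(logf(lo + (i + 0.5) * w)) for i in range(m)]
    total = sum(heights) * w
    dens = [h / total for h in heights]        # normalised proposal density
    cum, s = [], 0.0
    for h in heights:
        s += h * w / total
        cum.append(s)

    def sample(rng):
        # inverse-CDF draw: pick a bin, then a uniform point inside it
        i = min(bisect.bisect_left(cum, rng.random()), m - 1)
        return lo + (i + rng.random()) * w, dens[i]

    return sample

# Independence MH with the fixed, pre-tuned proposal; target is N(0, 1).
logf = lambda x: -0.5 * x * x
sample_q = build_grid_proposal(logf, -6.0, 6.0)

rng = random.Random(3)
x, qx = sample_q(rng)
chain = []
for _ in range(5000):
    z, qz = sample_q(rng)
    # Independence MH acceptance ratio: f(z) q(x) / (f(x) q(z))
    if math.log(rng.random()) <= logf(z) - logf(x) + math.log(qx) - math.log(qz):
        x, qx = z, qz
    chain.append(x)
```

When the pre-tuned proposal closely matches the target, the acceptance rate is near one and successive samples are nearly independent, which is the "virtually independent samples" behaviour the abstract describes.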