Article
Adaptive rejection Metropolis sampling using Lagrange interpolation polynomials of degree 2
Department of Statistics, University of Auckland, Private Bag 92019, Auckland, New Zealand
Computational Statistics & Data Analysis (Impact Factor: 1.4). 03/2008; 52(7):3408-3423. DOI: 10.1016/j.csda.2008.01.005. Source: DBLP
Full-text available from sciencedirect.com

 "Iterations (while n < N): 2. Build a proposal, πt(x), given the set St = {s1, . . . , sm_t}, using a convenient procedure (e.g. the ones described in [28], [30] or the simpler ones proposed in Section III-C). 3. Draw x′ ∼ π̃t(x) ∝ πt(x) and u′ ∼ U([0, 1])."
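The quoted steps describe an independent Metropolis-Hastings update: draw a candidate from the current proposal πt, draw a uniform variate, and accept with the usual independence-sampler ratio. A minimal sketch of one such iteration, using a fixed Cauchy proposal as a stand-in for the adaptive πt and a standard normal target (both are illustrative assumptions, not the paper's construction):

```python
import numpy as np

rng = np.random.default_rng(42)

def target(x):
    # Unnormalized target density p(x); a standard normal stand-in.
    return np.exp(-0.5 * x * x)

def proposal_pdf(x):
    # Density of the fixed Cauchy proposal standing in for pi_t(x).
    return 1.0 / (np.pi * (1.0 + x * x))

def im_step(x):
    """One independent-Metropolis iteration: draw x' ~ pi_t and u ~ U(0, 1),
    then accept x' with probability min(1, p(x') pi_t(x) / (p(x) pi_t(x')))."""
    x_new = rng.standard_cauchy()
    u = rng.uniform()
    alpha = (target(x_new) * proposal_pdf(x)) / (target(x) * proposal_pdf(x_new))
    return x_new if u < min(1.0, alpha) else x

# Run a short chain from x0 = 0 and collect the samples.
x = 0.0
samples = []
for _ in range(20000):
    x = im_step(x)
    samples.append(x)
samples = np.asarray(samples)
```

In the adaptive schemes discussed here, the fixed proposal above would be replaced by a πt rebuilt from the growing support set St at each iteration, which is what drives the proposal toward the target.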
Dataset: IA2RMS (in press)

 "2. Build a proposal, πt(x), given the set St = {s1, . . . , sm_t}, using a convenient procedure (e.g. the ones described in [28], [30] or the simpler ones proposed in Section III-C)."
ABSTRACT: Bayesian methods have become very popular in signal processing lately, even though performing exact Bayesian inference is often unfeasible due to the lack of analytical expressions for optimal Bayesian estimators. In order to overcome this problem, Monte Carlo (MC) techniques are frequently used. Several classes of MC schemes have been developed, including Markov Chain Monte Carlo (MCMC) methods, particle filters and population Monte Carlo approaches. In this paper, we concentrate on the Gibbs-type approach, where automatic and fast samplers are needed to draw from univariate (full-conditional) densities. The Adaptive Rejection Metropolis Sampling (ARMS) technique is widely used within Gibbs sampling, but suffers from an important drawback: an incomplete adaptation of the proposal in some cases. In this work, we propose an alternative adaptive MCMC algorithm (IA2RMS) that overcomes this limitation, speeding up the convergence of the chain to the target, allowing us to simplify the construction of the sequence of proposals, and thus reducing the computational cost of the entire algorithm. Note that, although IA2RMS has been developed as an extremely efficient MCMC-within-Gibbs sampler, it also provides an excellent performance as a stand-alone algorithm when sampling from univariate distributions. In this case, the convergence of the proposal to the target is proved and a bound on the complexity of the proposal is provided. Numerical results, both for univariate (stand-alone IA2RMS) and multivariate (IA2RMS-within-Gibbs) distributions, show that IA2RMS outperforms ARMS and other classical techniques, providing a correlation among samples close to zero.
 "(a) illustrates an example of this construction. More sophisticated approaches to build Wt(x) (e.g., using quadratic segments when possible [12]"
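The "quadratic segments" cited here correspond to the construction in the titled paper: on each interval, the proposal is built from a degree-2 Lagrange polynomial passing through three support points of the (log-)density. A minimal sketch of evaluating such an interpolant (the Gaussian log-density used below is an illustrative choice, not taken from the paper):

```python
def lagrange2(xs, ys, x):
    """Evaluate the degree-2 Lagrange interpolant through the three
    support points (xs[i], ys[i]) at location x."""
    x0, x1, x2 = xs
    y0, y1, y2 = ys
    # Lagrange basis polynomials: l_i is 1 at xs[i] and 0 at the others.
    l0 = (x - x1) * (x - x2) / ((x0 - x1) * (x0 - x2))
    l1 = (x - x0) * (x - x2) / ((x1 - x0) * (x1 - x2))
    l2 = (x - x0) * (x - x1) / ((x2 - x0) * (x2 - x1))
    return y0 * l0 + y1 * l1 + y2 * l2

# For a Gaussian, log p(x) = -x**2 / 2 (up to a constant) is itself
# quadratic, so the degree-2 interpolant reproduces it exactly,
# even away from the support points.
log_p = lambda x: -0.5 * x * x
xs = (-1.0, 0.0, 1.0)
ys = tuple(log_p(s) for s in xs)
value = lagrange2(xs, ys, 0.5)  # equals log_p(0.5) = -0.125
```

Exponentiating a quadratic fit of the log-density yields a Gaussian-shaped piece on each interval, which is why these segments can track unimodal targets much more closely than the piecewise-linear or piecewise-constant constructions.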
Conference Paper: Independent doubly Adaptive Rejection Metropolis Sampling
ABSTRACT: Adaptive Rejection Metropolis Sampling (ARMS) is a well-known MCMC scheme for generating samples from one-dimensional target distributions. ARMS is widely used within Gibbs sampling, where automatic and fast samplers are often needed to draw from univariate full-conditional densities. In this work, we propose an alternative adaptive algorithm (IA2RMS) that overcomes the main drawback of ARMS (an incomplete adaptation of the proposal in some cases), speeding up the convergence of the chain to the target. Numerical results show that IA2RMS outperforms the standard ARMS, providing a correlation among samples close to zero. Index Terms: Monte Carlo methods, Gibbs sampler, adaptive rejection Metropolis sampling (ARMS).