Estimation of cosmological parameters using adaptive importance sampling

Physical Review D: Particles and Fields. 03/2009; 80(2). DOI: 10.1103/PhysRevD.80.023507
Source: arXiv


We present a Bayesian sampling algorithm called adaptive importance sampling or Population Monte Carlo (PMC), whose computational workload is easily parallelizable and which thus has the potential to considerably reduce the wall-clock time required for sampling, along with other benefits. To assess the performance of the approach for cosmological problems, we use both simulated and actual data consisting of CMB anisotropies, type Ia supernovae, and weak cosmological lensing, and compare the results to those obtained using state-of-the-art Markov chain Monte Carlo (MCMC). For both types of data sets, we find comparable parameter estimates for PMC and MCMC, with the advantage of a significantly lower computational time for PMC. In the case of WMAP5 data, for example, the wall-clock time reduces from several days for MCMC to a few hours using PMC on a cluster of processors. Other benefits of the PMC approach, along with potential difficulties in using it, are analysed and discussed. Comment: 17 pages, 11 figures
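As a rough illustration of the adaptive importance sampling idea described above (not the authors' actual PMC implementation or cosmological likelihood), the sketch below adapts a single Gaussian proposal to a toy two-dimensional target by iterated importance-weighted moment matching; the target, sample size, and number of iterations are illustrative assumptions.

    # Minimal adaptive importance sampling (PMC-flavoured) sketch.
    # The 2-D Gaussian "posterior" and all tuning numbers are illustrative
    # assumptions, not the paper's actual likelihood or settings.
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(0)

    def log_posterior(theta):
        # Stand-in target: a correlated 2-D Gaussian.
        cov = np.array([[1.0, 0.8], [0.8, 1.0]])
        return stats.multivariate_normal(mean=[0.0, 0.0], cov=cov).logpdf(theta)

    # Initial (deliberately poor) proposal.
    mu, cov = np.array([3.0, -3.0]), 4.0 * np.eye(2)

    for it in range(5):
        proposal = stats.multivariate_normal(mean=mu, cov=cov)
        samples = proposal.rvs(size=2000, random_state=rng)   # this step parallelizes trivially
        log_w = log_posterior(samples) - proposal.logpdf(samples)
        w = np.exp(log_w - log_w.max())
        w /= w.sum()
        # Adapt the proposal to the weighted sample (moment matching).
        mu = w @ samples
        cov = (samples - mu).T @ ((samples - mu) * w[:, None]) + 1e-6 * np.eye(2)
        ess = 1.0 / np.sum(w**2)          # effective sample size as a convergence diagnostic
        print(f"iteration {it}: ESS = {ess:.0f} / {len(w)}")

    # The weighted samples (samples, w) from the last iteration approximate the posterior.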

  • Source
    • ", we chose to implement the adaptive Metropolis algorithm suggested in [19] for the examples discussed below. It uses a multivariate Gaussian proposal that is centered on the current point and updated based on the covariance of previous iterations with the cooling scheme described in [12]. Note that the resulting chain strictly speaking is not a Markov chain because the proposal is continuously adapted after every batch of N update iterations, but for simplicity, we continue to use the term " Markov " . "
    ABSTRACT: Adaptive importance sampling is a powerful tool to sample from complicated target densities, but its success depends sensitively on the initial proposal density. An algorithm is presented to automatically perform the initialization using Markov chains and hierarchical clustering. The performance is checked on challenging multimodal examples in up to 20 dimensions and compared to results from nested sampling. Our approach yields a proposal that leads to rapid convergence and accurate estimation of overall normalization and marginal distributions.
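The excerpt above describes an adaptive Metropolis scheme: a multivariate Gaussian proposal centered on the current point, with its covariance re-estimated from previous iterations after every batch of updates. A minimal sketch of that idea follows; the toy target, batch size, and scaling constant are illustrative assumptions, and the cooling scheme cited in the passage is not reproduced.

    # Minimal adaptive Metropolis sketch: the Gaussian proposal covariance is
    # re-estimated from the chain history after every batch of n_update steps.
    # Target, batch size, and scale factor are illustrative assumptions.
    import numpy as np

    rng = np.random.default_rng(1)

    def log_target(x):
        # Stand-in target: standard 2-D Gaussian.
        return -0.5 * np.dot(x, x)

    dim, n_update, n_steps = 2, 500, 5000
    scale = 2.38**2 / dim                   # common adaptive-Metropolis scaling
    cov = np.eye(dim)                       # initial proposal covariance
    x = np.zeros(dim)
    chain = [x]

    for step in range(1, n_steps + 1):
        proposal = rng.multivariate_normal(x, scale * cov)   # centered on current point
        if np.log(rng.random()) < log_target(proposal) - log_target(x):
            x = proposal                                      # Metropolis accept
        chain.append(x)
        if step % n_update == 0:                              # adapt after each batch
            cov = np.cov(np.asarray(chain).T) + 1e-6 * np.eye(dim)

    chain = np.asarray(chain)
    print("posterior mean estimate:", chain[len(chain) // 2:].mean(axis=0))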
  • Source
    • "Our MCMC chains are obtained by computing the Bayesian likelihood at random points selected using the Metropolis Hastings (MH) algorithm [44] [45]. For other sampling algorithms used in the context of cosmological parameter estimation we refer the reader to [46] [47] [48] [49] [50] [51] [52]. At each point Θ = (θ 1 , θ 2 , ...θ n ) the Bayesian posterior likelihood, L(Θ | D) is computed given the data D as, L(Θ | D) ∝ L(D | Θ)L(Θ). "
    ABSTRACT: We make the first detailed MCMC likelihood study of cosmological constraints that are expected from some of the largest ongoing and proposed cluster surveys in different wave-bands, and compare the estimates to the prevalent Fisher matrix forecasts. Mock catalogs of cluster counts expected from the surveys -- eROSITA, WFXT, RCS2, DES and Planck -- along with a mock dataset of follow-up mass calibrations, are analyzed for this purpose. A fair agreement between MCMC and Fisher results is found only in the case of minimal models. However, for many cases, the marginalized constraints obtained from Fisher and MCMC methods can differ by factors of 30-100%. The discrepancy can be alarmingly large for a time-dependent dark energy equation of state, w(a); the Fisher methods are seen to underestimate the constraints by as much as a factor of 4--5. Typically, Fisher estimates become more and more inappropriate as we move away from LCDM, to a constant-w dark energy, to varying-w dark energy cosmologies. Fisher analysis also predicts incorrect parameter degeneracies. From the point of view of mass-calibration uncertainties, a high value of unknown scatter about the mean mass-observable relation, and its redshift dependence, are seen to have large degeneracies with the cosmological parameters sigma_8 and w(a) and can degrade the cosmological constraints considerably. We find that the addition of mass-calibrated cluster datasets can improve dark energy and sigma_8 constraints by factors of 2--3 compared to what can be obtained from CMB+SNe+BAO alone. Since details of future cluster surveys are still being planned, we emphasize that optimal survey design must be done using MCMC analysis rather than Fisher forecasting. [abridged]
    Journal of Cosmology and Astroparticle Physics 10/2012; 2013(02). DOI: 10.1088/1475-7516/2013/02/030
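The abstract above contrasts Fisher-matrix forecasts with full MCMC constraints. The sketch below shows the basic Fisher forecast recipe (numerical derivatives of a model observable around fiducial parameters, combined with assumed Gaussian errors); the model, fiducial values, and uncertainties are hypothetical stand-ins, not the cluster-count likelihoods analysed in the cited paper.

    # Minimal Fisher-matrix forecast sketch.  The observable model, fiducial
    # parameters, and per-bin errors are illustrative assumptions.
    import numpy as np

    def model(theta, x):
        # Hypothetical observable: amplitude * exp(-slope * x).
        amp, slope = theta
        return amp * np.exp(-slope * x)

    x = np.linspace(0.0, 2.0, 50)          # bins where the observable is measured
    sigma = 0.05 * np.ones_like(x)         # assumed Gaussian error per bin
    theta_fid = np.array([1.0, 0.5])       # fiducial parameter values

    # Numerical derivatives of the model with respect to each parameter.
    eps = 1e-5
    derivs = []
    for i in range(len(theta_fid)):
        dtheta = np.zeros_like(theta_fid)
        dtheta[i] = eps
        derivs.append((model(theta_fid + dtheta, x) - model(theta_fid - dtheta, x)) / (2 * eps))
    derivs = np.array(derivs)              # shape (n_params, n_bins)

    # Fisher matrix F_ij = sum_k (dm_k/dtheta_i)(dm_k/dtheta_j) / sigma_k^2
    fisher = derivs @ np.diag(1.0 / sigma**2) @ derivs.T
    errors = np.sqrt(np.diag(np.linalg.inv(fisher)))   # forecast marginalized 1-sigma errors
    print("forecast 1-sigma errors:", errors)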
  • Source
    • "The level of difficulty may be that the computation time of one single value of the likelihood function requires several seconds, as in the cosmology analysis of Wraith et al. (2009) where the likelihood is represented by an involved computer program. It may also be that the only possible representation of the likelihood function is as an integral over a possibly large number of latent variables of a joint (unobserved) likelihood. "
    ABSTRACT: While Robert and Rousseau (2010) addressed the foundational aspects of Bayesian analysis, the current chapter details its practical aspects through a review of the computational methods available for approximating Bayesian procedures. Recent innovations like Markov chain Monte Carlo, sequential Monte Carlo methods, and, more recently, approximate Bayesian computation (ABC) techniques have considerably increased the potential for Bayesian applications, and they have also opened new avenues for Bayesian inference, first and foremost Bayesian model choice. Comment: This is a chapter for the book "Bayesian Methods and Expert Elicitation" edited by Klaus Bocker, 23 pages, 9 figures
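The excerpt quoted above mentions likelihoods that are only available as integrals over latent variables. A minimal sketch of estimating such a likelihood by simple Monte Carlo over the latent variable is shown below; the Gaussian latent model and all constants are illustrative assumptions.

    # Minimal sketch of a likelihood defined as an integral over a latent variable,
    # estimated by simple Monte Carlo:
    #   L(y | theta) = integral of p(y | z, theta) p(z | theta) dz
    #                ~ average of p(y | z, theta) over draws z ~ p(z | theta).
    # The Gaussian latent model and all constants are illustrative assumptions.
    import numpy as np

    rng = np.random.default_rng(2)

    def log_marginal_likelihood(y, theta, n_draws=5000):
        # Latent variable z ~ Normal(theta, 1); observation y ~ Normal(z, 0.5).
        z = rng.normal(loc=theta, scale=1.0, size=n_draws)
        log_p_y_given_z = -0.5 * ((y - z) / 0.5) ** 2 - np.log(0.5 * np.sqrt(2 * np.pi))
        # Average p(y | z) over the latent draws, done stably in log space.
        return np.logaddexp.reduce(log_p_y_given_z) - np.log(n_draws)

    print(log_marginal_likelihood(y=1.2, theta=1.0))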