Article

Estimation of cosmological parameters using adaptive importance sampling

Physical Review D: Particles and Fields (Impact Factor: 4.86). 03/2009; 80(2). DOI: 10.1103/PhysRevD.80.023507
Source: arXiv

ABSTRACT We present a Bayesian sampling algorithm called adaptive importance sampling, or Population Monte Carlo (PMC), whose computational workload is easily parallelizable and which therefore has the potential to considerably reduce the wall-clock time required for sampling, along with providing other benefits. To assess the performance of the approach for cosmological problems, we use simulated and actual data consisting of CMB anisotropies, type Ia supernovae, and weak cosmological lensing, and compare the results to those obtained using state-of-the-art Markov chain Monte Carlo (MCMC). For both types of data set, we find comparable parameter estimates for PMC and MCMC, with the advantage of a significantly lower computational time for PMC. In the case of WMAP5 data, for example, the wall-clock time is reduced from several days for MCMC to a few hours using PMC on a cluster of processors. Other benefits of the PMC approach, along with potential difficulties in using it, are analysed and discussed. Comment: 17 pages, 11 figures
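The PMC scheme summarised above can be sketched in a few lines: draw a population from a proposal distribution, weight each draw by the ratio of posterior to proposal density, then refit the proposal to the weighted population and repeat. The code below is a minimal sketch with a single Gaussian proposal, not the authors' implementation (which uses mixture proposals); the two-parameter Gaussian "posterior" and all numerical values are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical 2-D Gaussian "posterior" standing in for a cosmological
# likelihood; the parameter values are illustrative only.
TRUE_MEAN = np.array([0.3, 0.8])
TRUE_COV = np.diag([0.01, 0.02])

def log_posterior(x):
    # Unnormalised log-density of the toy target, evaluated row-wise.
    d = x - TRUE_MEAN
    return -0.5 * np.einsum("ni,ij,nj->n", d, np.linalg.inv(TRUE_COV), d)

def pmc(n_samples=4000, n_iter=8):
    # Start from a deliberately poor proposal; PMC adapts it iteratively.
    mean, cov = np.zeros(2), np.eye(2)
    for _ in range(n_iter):
        samples = rng.multivariate_normal(mean, cov, size=n_samples)
        d = samples - mean
        # Proposal log-density (constants cancel in the normalised weights).
        log_q = (-0.5 * np.einsum("ni,ij,nj->n", d, np.linalg.inv(cov), d)
                 - 0.5 * np.log(np.linalg.det(cov)))
        log_w = log_posterior(samples) - log_q
        w = np.exp(log_w - log_w.max())
        w /= w.sum()
        # Adapt the proposal to the weighted moments of the population.
        mean = w @ samples
        d = samples - mean
        cov = (w[:, None] * d).T @ d + 1e-8 * np.eye(2)
    return mean, cov

mean, cov = pmc()
print(mean)
```

Each iteration is embarrassingly parallel: the posterior evaluations for the population can be farmed out across processors, which is the source of the wall-clock savings the abstract reports.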

Full-text

Available from: S. Prunet, Dec 31, 2013
  •
    ABSTRACT: In the context of the halo model, the simple but powerful assumption that the number of galaxies depends only on halo mass, the halo occupation distribution (HOD) model, leads to an accurate analytic prediction of the galaxy distribution. Conversely, interpreting galaxy clustering through the HOD model allows a direct comparison between galaxy properties and halo mass. Using accurate galaxy clustering measurements over 133 deg2 of the Wide component of the Canada-France-Hawaii Telescope Legacy Survey (CFHTLS), we performed a detailed investigation of the changing relationship between galaxies and the dark matter haloes they inhabit, from z = 1.2 to the local Universe. In this paper, we present our results and their implications for galaxy evolution.
    10/2013;
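The central HOD assumption in the abstract above, that the mean galaxy occupation of a halo depends only on its mass, can be illustrated with the widely used central-plus-satellite parametrisation. This is a generic form of the model, not the paper's fitted one, and every parameter value below is an assumed placeholder.

```python
import math

# Toy halo occupation distribution (HOD) sketch, assuming the common
# central + satellite parametrisation; all parameter values are
# illustrative placeholders, not fits from the paper.
LOG_M_MIN, SIGMA_LOG_M = 12.0, 0.3   # central threshold mass and scatter
M0, M1, ALPHA = 1e12, 1e13, 1.0      # satellite cut-off, scale mass, slope

def n_central(m):
    """Mean number of central galaxies in a halo of mass m (solar masses)."""
    return 0.5 * (1.0 + math.erf((math.log10(m) - LOG_M_MIN) / SIGMA_LOG_M))

def n_satellite(m):
    """Mean number of satellites: a power law above the cut-off mass M0."""
    return ((m - M0) / M1) ** ALPHA if m > M0 else 0.0

def mean_occupation(m):
    # The key HOD assumption: occupation depends on the halo mass m alone.
    return n_central(m) * (1.0 + n_satellite(m))

print(mean_occupation(1e12), mean_occupation(1e14))
```

Feeding such an occupation function through a halo mass function and halo bias is what yields the analytic clustering prediction the abstract refers to.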
  • Source
    ABSTRACT: Measurement of cosmic shear using weak gravitational lensing is a challenging task that involves a number of complicated procedures. We study in detail the systematic errors in the measurement of weak lensing Minkowski Functionals (MFs). Specifically, we focus on systematics associated with galaxy shape measurement, photometric redshift errors, and shear calibration correction. We first generate mock weak lensing catalogues that directly incorporate the actual observational characteristics of the Canada-France-Hawaii Telescope Lensing Survey (CFHTLenS). We then perform a Fisher analysis using the large set of mock catalogues for various cosmological models. We find that the statistical error associated with the observational effects degrades the cosmological parameter constraints by a factor of a few. The Subaru Hyper Suprime-Cam (HSC) survey, with a sky coverage of ~1400 deg2, will constrain the dark energy equation-of-state parameter with an error of Delta w_0 ~ 0.25 from the lensing MFs alone, but biases induced by the systematics can be comparable to the 1-sigma error. We conclude that the lensing MFs are powerful statistics beyond the two-point statistics, but only if well-calibrated measurements of both the redshifts and the shapes of the source galaxies are performed. Finally, we analyse the CFHTLenS data to explore the ability of the MFs to break degeneracies between cosmological parameters. Using a combined analysis of the MFs and the shear correlation function, we derive the matter density Omega_m0 = 0.256 +0.054/-0.046.
    The Astrophysical Journal 12/2013; 786(1). DOI: 10.1088/0004-637X/786/1/43 · 6.28 Impact Factor
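The Fisher analysis mentioned above reduces to F_ab = d_a^T C^-1 d_b, where d_a is the derivative of the data vector with respect to parameter a and C is the (mock-estimated) data covariance; the marginalised 1-sigma error on parameter a is sqrt((F^-1)_aa). The sketch below uses random placeholder derivatives and a diagonal covariance rather than actual lensing-MF quantities, purely to show the mechanics.

```python
import numpy as np

# Toy Fisher forecast. The data vector stands in for binned lensing
# statistics; derivatives and covariance here are random placeholders,
# not values from the paper.
rng = np.random.default_rng(1)
params = ["Omega_m", "w0"]
n_bins = 10

# Hypothetical derivatives d(data)/d(parameter), e.g. from finite
# differences of simulated catalogues.
deriv = rng.normal(size=(len(params), n_bins))
cov = np.diag(rng.uniform(0.5, 1.5, size=n_bins))  # mock-estimated covariance

cinv = np.linalg.inv(cov)
fisher = deriv @ cinv @ deriv.T          # F_ab = d_a^T C^-1 d_b

# Marginalised 1-sigma errors: sqrt of the diagonal of F^-1.
errors = np.sqrt(np.diag(np.linalg.inv(fisher)))
for name, err in zip(params, errors):
    print(f"sigma({name}) = {err:.3f}")
```

Systematic biases of the kind the abstract discusses enter as shifts of the data vector, which this purely statistical forecast does not capture; comparing such shifts against the Fisher errors is how "comparable to the 1-sigma error" is assessed.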
  • Source
    ABSTRACT: We describe an algorithm that adaptively provides mixture summaries of multi-modal posterior distributions. The parameter spaces of the posteriors involved range from a few dimensions to dozens of dimensions. This work was motivated by an astrophysical problem, the detection of extra-solar planets (exoplanets), in which a challenging issue is the computation of the stochastic integrals required for Bayesian model comparison. The difficulty comes from highly nonlinear models that lead to multi-modal posterior distributions. We resort to importance sampling (IS) to estimate the integrals, which recasts the problem as finding a parametric approximation of the posterior. To capture the multi-modal structure of the posterior, we initialize a mixture proposal distribution and then tailor its parameters so that it resembles the posterior as closely as possible. We use the effective sample size (ESS), calculated from the IS draws, to measure the quality of the approximation: the larger the ESS, the better the proposal resembles the posterior. One difficulty in this tailoring operation lies in adjusting the number of components in the mixture proposal. Brute-force methods simply preset it to a large constant, at a cost in computational resources. We provide an iterative delete/merge/add process, working in tandem with an expectation-maximisation (EM) step, to adjust this number online. The efficiency of the proposed method is demonstrated through both simulation studies and the analysis of real exoplanet data.
    The Astrophysical Journal Supplement Series 07/2014; 213(1):14. DOI: 10.1088/0067-0049/213/1/14 · 14.14 Impact Factor
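The ESS criterion used in the abstract above has a one-line definition: for normalised importance weights w_i, ESS = 1 / sum_i w_i^2, which equals the sample size when the weights are uniform (perfect proposal) and collapses towards 1 when a few weights dominate. A small sketch with a toy one-dimensional target; all distributions here are illustrative stand-ins, not the exoplanet posteriors.

```python
import numpy as np

rng = np.random.default_rng(2)

def ess(log_w):
    """Effective sample size of (unnormalised) log importance weights."""
    w = np.exp(log_w - np.max(log_w))
    w /= w.sum()
    return 1.0 / np.sum(w ** 2)

# Standard-normal target; constants cancel in the normalised weights.
def target_logpdf(x):
    return -0.5 * x ** 2

# A well-matched proposal, N(0, 1.2^2): weights are nearly uniform.
x_good = rng.normal(0.0, 1.2, size=5000)
log_w_good = target_logpdf(x_good) - (-0.5 * (x_good / 1.2) ** 2 - np.log(1.2))

# A badly mismatched proposal, N(3, 1): a few draws dominate the weights.
x_bad = rng.normal(3.0, 1.0, size=5000)
log_w_bad = target_logpdf(x_bad) - (-0.5 * (x_bad - 3.0) ** 2)

print(ess(log_w_good), ess(log_w_bad))
```

Maximising this quantity over the mixture parameters (and over the number of components, via the delete/merge/add moves) is what drives the adaptation the abstract describes.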