Use of the MultiNest algorithm for gravitational wave data analysis

Classical and Quantum Gravity (Impact Factor: 3.56). 04/2009; DOI: 10.1088/0264-9381/26/21/215003
Source: arXiv

ABSTRACT: We describe an application of the MultiNest algorithm to gravitational wave data analysis. MultiNest is a multimodal nested sampling algorithm designed to efficiently evaluate the Bayesian evidence and return posterior probability densities for likelihood surfaces containing multiple secondary modes. The algorithm employs a set of live points which are updated by partitioning the set into multiple overlapping ellipsoids and sampling uniformly from within them. This set of live points climbs up the likelihood surface through nested iso-likelihood contours, and the evidence and posterior distributions can be recovered from the evolution of the point set. The algorithm is model-independent in the sense that the specific problem being tackled enters only through the likelihood computation and does not change how the live point set is updated. In this paper, we consider the use of the algorithm for gravitational wave data analysis by searching a simulated LISA data set containing two non-spinning supermassive black hole binary signals. The algorithm rapidly identifies all the modes of the solution and recovers the true parameters of the sources to high precision. Comment: 18 pages, 4 figures, submitted to Class. Quantum Grav.; v2 includes various changes in light of the referee's comments
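The evidence-accumulation idea the abstract describes can be sketched in a few lines. This is a minimal, illustrative nested sampler for a toy one-dimensional Gaussian likelihood on a unit prior: the worst live point defines each iso-likelihood contour, the prior volume shrinks geometrically, and the evidence is summed shell by shell. Constrained draws here use plain rejection sampling rather than MultiNest's overlapping-ellipsoid decomposition, and all names and values are assumptions for the sketch, not the paper's implementation.

```python
import math
import random

def log_likelihood(theta):
    # Toy unimodal likelihood: a narrow Gaussian on the unit interval,
    # standing in for a gravitational-wave model likelihood.
    return -0.5 * ((theta - 0.5) / 0.05) ** 2

def logaddexp(a, b):
    # Numerically stable log(exp(a) + exp(b)).
    if a == -math.inf:
        return b
    m = max(a, b)
    return m + math.log(math.exp(a - m) + math.exp(b - m))

def nested_sampling(n_live=100, n_iter=600, seed=1):
    rng = random.Random(seed)
    # Live points drawn uniformly from the unit prior.
    live = [rng.random() for _ in range(n_live)]
    log_l = [log_likelihood(t) for t in live]
    log_z = -math.inf
    for i in range(n_iter):
        # The worst live point defines the current iso-likelihood contour.
        worst = min(range(n_live), key=lambda j: log_l[j])
        l_star = log_l[worst]
        # Prior volume shrinks geometrically, X_i ~ exp(-i / n_live),
        # so the shell weight is w_i = X_{i-1} - X_i.
        log_w = -(i + 1) / n_live + math.log(math.expm1(1.0 / n_live))
        log_z = logaddexp(log_z, l_star + log_w)
        # Replace the worst point with a uniform draw inside the contour.
        # MultiNest samples from overlapping ellipsoids here; simple
        # rejection sampling suffices for this 1-D toy problem.
        while True:
            t = rng.random()
            if log_likelihood(t) > l_star:
                live[worst], log_l[worst] = t, log_likelihood(t)
                break
    # Add the contribution of the remaining live points.
    m = max(log_l)
    log_l_mean = m + math.log(sum(math.exp(l - m) for l in log_l) / n_live)
    return logaddexp(log_z, log_l_mean - n_iter / n_live)
```

For this toy problem the analytic evidence is Z ≈ 0.05·√(2π) ≈ 0.125, i.e. log Z ≈ -2.08, and the estimate above should land within its statistical scatter (roughly √(H/n_live) in log Z) of that value.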

  • Source
    ABSTRACT: This review is focused on tests of Einstein's theory of General Relativity with gravitational waves that are detectable by ground-based interferometers and pulsar timing experiments. Einstein's theory has been greatly constrained in the quasi-linear, quasi-stationary regime, where gravity is weak and velocities are small. Gravitational waves will allow us to probe a complementary, yet previously unexplored regime: the non-linear and dynamical strong-field regime. Such a regime is, for example, applicable to coalescing compact binaries, where characteristic velocities can reach fifty percent of the speed of light and compactnesses can reach one half. This review begins with the theoretical basis and the predicted gravitational wave observables of modified gravity theories. It continues with a brief description of the detectors, including both gravitational wave interferometers and pulsar timing arrays, leading to a discussion of the data analysis formalism that is applicable for such tests. The review ends with a discussion of gravitational wave tests for compact binary systems.
  • Source
    ABSTRACT: The Galaxy is suspected to contain hundreds of millions of binary white dwarf systems, a large fraction of which will have sufficiently small orbital periods to emit gravitational radiation in band for space-based gravitational wave detectors such as the Laser Interferometer Space Antenna (LISA). LISA's main science goal is the detection of cosmological events (supermassive black hole mergers, etc.); however, the gravitational signal from the Galaxy will be the dominant contribution to the data, exceeding even the instrumental noise, over approximately two decades in frequency. The catalogue of detectable binary systems will serve as an unparalleled means of studying the Galaxy. Furthermore, to maximize the scientific return from the mission, the data must be "cleansed" of the galactic foreground. We present an algorithm that can accurately resolve and subtract more than 10,000 of these sources from simulated data supplied by the Mock LISA Data Challenge Task Force. Using the time evolution of the gravitational wave frequency, we reconstruct the positions of the recovered binaries and show how LISA will sample the entire compact binary population in the Galaxy.
    Physical Review D 06/2011; 84.
  • Source
    ABSTRACT: Model selection and parameter inference are complex problems that have yet to be fully addressed in systems biology. In contrast with parameter optimisation, parameter inference computes both the parameter means and their standard deviations (or full posterior distributions), thus yielding important information on the extent to which the data and the model topology constrain the inferred parameter values. We report on the application of nested sampling, a statistical approach to computing the Bayesian evidence Z, to the inference of parameters, and the estimation of log Z in an established model of circadian rhythms. A ten-fold difference in the coefficient of variation between degradation and transcription parameters is demonstrated. We further show that the uncertainty remaining in the parameter values is reduced by analysing increasing numbers of circadian cycles of data, up to four cycles, but is unaffected by sampling the data more frequently. Novel algorithms for calculating the likelihood of a model, and a characterisation of the performance of the nested sampling algorithm, are also reported. The methods we develop considerably improve the computational efficiency of the likelihood calculation and of the exploratory step within nested sampling. We have demonstrated in an exemplar circadian model that the estimates of posterior parameter densities (as summarised by parameter means and standard deviations) are influenced predominantly by the length of the time series, becoming more narrowly constrained as the number of circadian cycles considered increases. We have also shown the utility of the coefficient of variation for discriminating between highly constrained and less well constrained parameters.
    BMC Systems Biology 07/2013; 7(1):72.
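The coefficient-of-variation summary used in the abstract above to separate well-constrained from poorly constrained parameters is simply the posterior standard deviation divided by the posterior mean. A minimal sketch on synthetic posterior samples (illustrative values only, not drawn from the paper's circadian model):

```python
import random
import statistics

def coefficient_of_variation(samples):
    # CV = posterior standard deviation / posterior mean;
    # a small CV means the data tightly constrain the parameter.
    return statistics.stdev(samples) / statistics.mean(samples)

# Synthetic stand-ins for posterior samples of two model parameters,
# one tightly and one loosely constrained (assumed values).
rng = random.Random(0)
tight = [rng.gauss(1.0, 0.02) for _ in range(5000)]
loose = [rng.gauss(1.0, 0.2) for _ in range(5000)]

print(coefficient_of_variation(tight))  # ~0.02
print(coefficient_of_variation(loose))  # ~0.2
```

A ten-fold gap in CV, as between the two synthetic parameters here, is the kind of contrast the abstract reports between degradation and transcription parameters.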
