Bayesian and non-Bayesian probabilistic models for medical image analysis

Imaging Science and Biomedical Engineering Division, Medical School, University of Manchester, Stopford Building, Oxford Road, Manchester M13 9PT, UK
Image and Vision Computing (Impact Factor: 1.59). 09/2003; 21(10):851-864. DOI: 10.1016/S0262-8856(03)00072-6
Source: DBLP


Bayesian approaches to data analysis are popular in machine vision, and yet the main advantage of Bayes theory, the ability to incorporate prior knowledge in the form of the prior probabilities, may lead to problems in some quantitative tasks. In this paper we demonstrate examples of Bayesian and non-Bayesian techniques from the area of magnetic resonance image (MRI) analysis. Issues raised by these examples are used to illustrate difficulties in Bayesian methods and to motivate an approach based on frequentist methods. We believe this approach to be more suited to quantitative data analysis, and provide a general theory for the use of these methods in learning (Bayes risk) systems and for data fusion. Proofs are given for the more novel aspects of the theory. We conclude with a discussion of the strengths and weaknesses, and the fundamental suitability, of Bayesian and non-Bayesian approaches for MRI analysis in particular, and for machine vision systems in general.
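The quantitative pitfall the abstract alludes to (priors dominating the posterior when the likelihoods are uninformative) can be sketched in a few lines. All class parameters and priors below are hypothetical, chosen only to make the effect visible:

```python
import math

def gaussian_pdf(x, mu, sigma):
    """Gaussian likelihood p(x | class)."""
    return math.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * math.sqrt(2 * math.pi))

def bayes_posterior(x, params, priors):
    """Posterior P(class | x) via Bayes' theorem; the priors can bias the result."""
    likes = [gaussian_pdf(x, mu, s) * p for (mu, s), p in zip(params, priors)]
    total = sum(likes)
    return [l / total for l in likes]

# Hypothetical tissue intensity models (means/std-devs are illustrative only)
params = [(50.0, 10.0), (70.0, 10.0)]

x = 60.0                                        # equidistant from both means
flat = bayes_posterior(x, params, [0.5, 0.5])   # uninformative prior
skew = bayes_posterior(x, params, [0.9, 0.1])   # strong prior toward class 0

# With equal likelihoods the prior alone decides the classification,
# which is the quantitative-bias issue the paper raises.
print(flat)   # ≈ [0.5, 0.5]
print(skew)   # ≈ [0.9, 0.1]
```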

  • Source
    • "Some previous work has attempted to modify the Bayesian formalism to account for local structure, such as boundaries, by using estimates of prior probabilities based upon a local resampling of data or Markov Random Field (MRF) formulations [29]. The freedom to take such a step is linked to the classic problem of identification of prior probabilities in Bayesian methods [9]. However, in the work presented here we take an alternative view that local information regarding image structure can be included directly within the density model. "
    ABSTRACT: We present a new algorithm for the segmentation of medical image volumes, which addresses the problem of partial volume tissue estimation, where a mixture of tissues combine to form the intensity value for a particular voxel. In addition, the algorithm is capable of using multiple image volumes, and the associated multi-dimensional image gradient, to increase tissue separability. It uses the Expectation-Maximisation (EM) algorithm to perform clustering in image intensity and gradient histograms. Bayes theory is used to generate probability maps for the most likely tissue volume fraction within each voxel, in contrast to previous approaches, which typically compute the most likely tissue class label. Evaluation of the algorithm consisted of three stages, all of which used MR data sets of the normal human brain. First, the improvement in the model parameter stability gained through the inclusion of gradient information was evaluated. Second, the improved segmentation accuracy of multi-dimensional approaches was demonstrated by assessing the errors on reconstructed images produced from the segmentation result. Finally, the absolute accuracy of the segmentation when applied to an exemplar medical problem, the measurement of cerebrospinal fluid (CSF) volumes, was evaluated through comparison with a "bronze-standard" consisting of previously published measurements.
    Preview · Article · Jan 2008
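The clustering step described in the abstract above can be sketched as a minimal 1-D EM fit of a two-component Gaussian mixture to synthetic intensity data. This is only an illustration of the EM machinery; the paper's algorithm additionally uses image gradients and a partial-volume model, which are not reproduced here:

```python
import math
import random

def em_gmm_1d(data, iters=50):
    """Minimal EM for a two-component 1-D Gaussian mixture."""
    xs = sorted(data)
    mu = [xs[len(xs) // 4], xs[3 * len(xs) // 4]]   # quartile initialisation
    m = sum(data) / len(data)
    var = [sum((x - m) ** 2 for x in data) / len(data)] * 2
    w = [0.5, 0.5]
    for _ in range(iters):
        # E-step: responsibility of each component for each sample
        resp = []
        for x in data:
            p = [w[k] / math.sqrt(2 * math.pi * var[k]) *
                 math.exp(-0.5 * (x - mu[k]) ** 2 / var[k]) for k in range(2)]
            s = sum(p)
            resp.append([pk / s for pk in p])
        # M-step: re-estimate weights, means and variances
        for k in range(2):
            nk = sum(r[k] for r in resp)
            w[k] = nk / len(data)
            mu[k] = sum(r[k] * x for r, x in zip(resp, data)) / nk
            var[k] = max(sum(r[k] * (x - mu[k]) ** 2
                             for r, x in zip(resp, data)) / nk, 1e-6)
    return mu, var, w

random.seed(0)
data = ([random.gauss(50, 5) for _ in range(300)] +
        [random.gauss(80, 5) for _ in range(300)])
mu, var, w = em_gmm_1d(data)
print(sorted(mu))   # means recovered near the true values of 50 and 80
```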
  • Source
    • "It merely shows the statistical accuracy of the prediction. It is well known that a Bayesian probability approach provides a statistical calculation based framework for determining the likelihood, which is dependent on prior knowledge, accumulated experience, and empirical data [7]-[8]. The simple formulation of Bayesian probability approach is, "
    ABSTRACT: Economic pressure to reduce the cost of U.S. Navy ships has brought into focus the need to significantly reduce the size of a ship's crew. In order for an automated system to replace humans while making critical decisions, it is required that such a system be able to accurately predict future events. This paper presents a wavelet theory based prediction system to predict the occurrences of shipboard fires. Furthermore, while the prediction model predicts the future events, the accuracy of prediction has to be quantified by formulating a probability index that would mirror the confidence in the prediction. As such, a Bayesian theory based probability estimation model (BPEM) is developed for estimating the probability that the predicted values are within specified limits of tolerance. Tests with the U.S. Naval Research Laboratory (NRL) data, covering various fire scenarios, validate that the proposed methodology consistently provides earlier detection as compared to the published results from NRL's early warning fire detection (EWFD) system.
    Preview · Conference Paper · Jul 2005
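The Bayesian update the excerpt alludes to is just Bayes' theorem applied to a hypothesis about the prediction. A minimal sketch, with a purely illustrative base rate and sensor characteristics (none of these numbers come from the cited paper):

```python
def bayes_update(prior, p_e_given_h, p_e_given_not_h):
    """Posterior P(H | E) from Bayes' theorem:
    P(H | E) = P(E | H) P(H) / P(E)."""
    evidence = p_e_given_h * prior + p_e_given_not_h * (1 - prior)
    return p_e_given_h * prior / evidence

# Illustrative numbers only: 1% base rate of a real fire, a sensor that
# triggers 95% of the time during real fires and 5% of the time otherwise.
prior = 0.01
post = bayes_update(prior, p_e_given_h=0.95, p_e_given_not_h=0.05)
print(round(post, 3))   # ≈ 0.161: strong evidence, but the low prior dominates
```

This illustrates the dependence on prior knowledge that the quoted passage mentions: the same sensor reading yields very different posteriors under different base rates.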
  • Source
    • "Unknown tissues are accounted for in the Bayesian formulation by including a fixed extra term f O for infrequently occurring outlier data [2] in total probability which enables separation of pathological tissues. "
    ABSTRACT: One of the most common problems in image analysis is the estimation and removal of noise or other artefacts using spatial filters. Common techniques include Gaussian, Median and Anisotropic Filtering. Though these techniques are quite common they must be used with great care on medical data, as it is very easy to introduce artefacts into images due to spatial smoothing. The use of such techniques is further restricted by the absence of 'gold standard' data against which to test the behaviour of the filters. Following a general discussion of the equivalence of filtering techniques to likelihood based estimation using an assumed model, this paper describes an approach to noise filtering in multi-dimensional data using a partial volume data density model. The resulting data sets can then be taken as a gold standard for spatial filtering techniques which use the information from single images. We demonstrate equivalence between the results from this analysis and techniques for performance characterisation which do not require a 'gold standard'.
    Preview · Article · Jan 2004
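The care the abstract above urges can be seen even in one dimension: a median filter removes impulse noise while keeping a step edge sharp, whereas linear smoothing (here a box mean standing in for a Gaussian) blurs both the spike and the edge. The signal below is synthetic:

```python
def median_filter(signal, k=3):
    """Simple 1-D median filter (window clamped at the signal ends)."""
    half = k // 2
    out = []
    for i in range(len(signal)):
        lo, hi = max(0, i - half), min(len(signal), i + half + 1)
        window = sorted(signal[lo:hi])
        out.append(window[len(window) // 2])
    return out

def box_filter(signal, k=3):
    """Simple 1-D mean (box) filter, a stand-in for Gaussian smoothing."""
    half = k // 2
    out = []
    for i in range(len(signal)):
        lo, hi = max(0, i - half), min(len(signal), i + half + 1)
        window = signal[lo:hi]
        out.append(sum(window) / len(window))
    return out

# A step edge with one impulse-noise sample at index 3
step = [0, 0, 0, 9, 0, 0, 10, 10, 10, 10]
print(median_filter(step))   # removes the spike, keeps the edge sharp
print(box_filter(step))      # smears both the spike and the edge
```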