A Bayesian model of shape and appearance for subcortical brain segmentation

FMRIB Centre, Department of Clinical Neurology, University of Oxford, Oxford, UK.
NeuroImage (Impact Factor: 6.13). 02/2011; 56(3):907-22. DOI: 10.1016/j.neuroimage.2011.02.046
Source: PubMed

ABSTRACT: Automatic segmentation of subcortical structures in human brain MR images is an important but difficult task due to poor and variable intensity contrast. Clear, well-defined intensity features are absent in many places along typical structure boundaries and so extra information is required to achieve successful segmentation. A method is proposed here that uses manually labelled image data to provide anatomical training information. It utilises the principles of the Active Shape and Appearance Models but places them within a Bayesian framework, allowing probabilistic relationships between shape and intensity to be fully exploited. The model is trained for 15 different subcortical structures using 336 manually labelled T1-weighted MR images. Using the Bayesian approach, conditional probabilities can be calculated easily and efficiently, avoiding technical problems of ill-conditioned covariance matrices, even with weak priors, and eliminating the need for fitting extra empirical scaling parameters, as is required in standard Active Appearance Models. Furthermore, differences in boundary vertex locations provide a direct, purely local measure of geometric change in structure between groups that, unlike voxel-based morphometry, is not dependent on tissue classification methods or arbitrary smoothing. In this paper the fully-automated segmentation method is presented and assessed both quantitatively, using Leave-One-Out testing on the 336 training images, and qualitatively, using an independent clinical dataset involving Alzheimer's disease. Median Dice overlaps between 0.7 and 0.9 are obtained with this method, which is comparable to or better than other automated methods. An implementation of this method, called FIRST, is currently distributed with the freely-available FSL package.
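The Dice overlap used in the quantitative evaluation above has a standard definition: 2|A∩B| / (|A| + |B|) for two binary masks A and B. A minimal sketch of the metric in NumPy (illustrative function name, not the FIRST implementation):

```python
import numpy as np

def dice_overlap(seg_a, seg_b):
    """Dice coefficient between two binary segmentation masks:
    2 * |A intersect B| / (|A| + |B|), in [0, 1]."""
    a = np.asarray(seg_a, dtype=bool)
    b = np.asarray(seg_b, dtype=bool)
    denom = a.sum() + b.sum()
    if denom == 0:
        return 1.0  # both masks empty: perfect agreement by convention
    return 2.0 * np.logical_and(a, b).sum() / denom
```

A median Dice of 0.7-0.9 over Leave-One-Out folds means that, for the typical structure, at least 70-90% of the combined voxel mass of the automatic and manual masks is shared.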


Available from: Brian Patenaude, Apr 24, 2014

  • Source
    ABSTRACT: We propose a hierarchical pipeline for skull-stripping and segmentation of anatomical structures of interest from T1-weighted images of the human brain. The pipeline is constructed based on a two-level Bayesian parameter estimation algorithm called multi-atlas likelihood fusion (MALF). In MALF, estimation of the parameter of interest is performed via maximum a posteriori estimation using the expectation-maximization (EM) algorithm. The likelihoods of multiple atlases are fused in the E-step while the optimal estimator, a single maximizer of the fused likelihoods, is then obtained in the M-step. There are two stages in the proposed pipeline; first the input T1-weighted image is automatically skull-stripped via a fast MALF, then internal brain structures of interest are automatically extracted using a regular MALF. We assess the performance of each of the two modules in the pipeline based on two sets of images with markedly different anatomical and photometric contrasts: 3T MPRAGE scans of pediatric subjects with developmental disorders vs. 1.5T SPGR scans of elderly subjects with dementia. Evaluation is performed quantitatively using the Dice overlap as well as qualitatively via visual inspections. As a result, we demonstrate subject-level differences in the performance of the proposed pipeline, which may be accounted for by age, diagnosis, or the imaging parameters (particularly the field strength). For the subcortical and ventricular structures of the two datasets, the hierarchical pipeline is capable of producing automated segmentations with Dice overlaps ranging from 0.8 to 0.964 when compared with the gold standard. Comparisons with other representative segmentation algorithms are presented, relative to which the proposed hierarchical pipeline demonstrates comparable or superior accuracy.
    Frontiers in Neuroscience 01/2015; 9:61. DOI: 10.3389/fnins.2015.00061
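The fusion step described in the abstract above (per-atlas likelihoods combined in the E-step, a single maximizer chosen in the M-step) can be sketched in heavily simplified form as a per-voxel sum of log-likelihoods followed by an argmax. This is an illustrative sketch only: the actual MALF algorithm iterates EM over a full generative model, and the function below is hypothetical.

```python
import numpy as np

def fuse_atlas_likelihoods(likelihoods):
    """Fuse per-atlas label likelihoods and take the per-voxel maximizer.

    likelihoods: array-like of shape (n_atlases, n_labels, n_voxels)
    holding p(intensity | label) under each atlas. Log-likelihoods are
    summed across atlases (an E-step-like fusion of atlas evidence),
    then the label maximizing the fused score is chosen at each voxel
    (an M-step-like single maximizer).
    """
    log_lik = np.log(np.asarray(likelihoods, dtype=float) + 1e-12)
    fused = log_lik.sum(axis=0)   # combine evidence across atlases
    return fused.argmax(axis=0)   # per-voxel label estimate
```

With independent atlas likelihoods, summing logs is equivalent to multiplying probabilities, so the argmax picks the label best supported jointly by all atlases at that voxel.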
  • Source
    ABSTRACT: The majority of patients with anti-NMDA receptor (NMDAR) encephalitis suffer from persistent memory impairment despite unremarkable routine clinical MRI. With improved acute care in these patients, neurocognitive impairment represents the major contributor to long-term morbidity and has thus become a focus of attention.
  • Source
    ABSTRACT: Both high and low blood pressure (BP) have been positively as well as negatively associated with brain volumes in a variety of populations. The objective of this study was to investigate whether BP is associated with cortical and subcortical brain volumes in older old persons with mild cognitive deficits. Within the Discontinuation of Antihypertensive Treatment in the Elderly trial, the cross-sectional relation of BP parameters with both cortical and subcortical brain volumes was investigated in 220 older old persons with mild cognitive deficits (43% men, mean age = 80.7 (SD = 4.1), median Mini-Mental State Examination score = 26 (interquartile range: 25-27)), using linear regression analysis. All analyses were adjusted for age, gender, volume of white matter hyperintensities, and duration of antihypertensive treatment. Brain volumes were determined on 3D T1-weighted brain magnetic resonance imaging scans. Lower systolic BP, diastolic BP, and mean arterial pressure (MAP) were significantly associated with lower volumes of thalamus and putamen (all P ≤ 0.01). In addition, lower MAP was also associated with reduced hippocampal volume (P = 0.035). There were no associations between any of the BP parameters with cortical gray matter or white matter volume. In an older population using antihypertensive medication with mild cognitive deficits, a lower BP, rather than a high BP, is associated with reduced volumes of thalamus, putamen, and hippocampus. © American Journal of Hypertension, Ltd 2015.
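The adjusted association described in the abstract above amounts to a multiple linear regression of a structure volume on a BP parameter plus covariates (age, gender, white matter hyperintensity volume, treatment duration). A minimal sketch using NumPy least squares; the function and variable names are illustrative, not the study's code:

```python
import numpy as np

def adjusted_association(volume, bp, covariates):
    """Coefficient of a BP measure in the model
    volume ~ intercept + bp + covariates, fit by ordinary least squares.

    volume:     (n,) outcome, e.g. thalamic volume per subject
    bp:         (n,) blood pressure parameter (systolic, diastolic, or MAP)
    covariates: (n, k) adjustment variables, e.g. age, gender, WMH volume
    """
    X = np.column_stack([np.ones_like(bp), bp, covariates])
    beta, *_ = np.linalg.lstsq(X, volume, rcond=None)
    return beta[1]  # coefficient on the BP term, adjusted for covariates
```

A positive returned coefficient corresponds to the study's direction of effect: lower BP going with lower (not higher) subcortical volumes.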