The Human Connectome Project: A data acquisition perspective

Department of Anatomy & Neurobiology, Washington University, St. Louis, MO, USA.
NeuroImage (Impact Factor: 6.36). 02/2012; 62(4):2222-31. DOI: 10.1016/j.neuroimage.2012.02.018
Source: PubMed

ABSTRACT The Human Connectome Project (HCP) is an ambitious 5-year effort to characterize brain connectivity and function and their variability in healthy adults. This review summarizes the data acquisition plans being implemented by a consortium of HCP investigators who will study a population of 1200 subjects (twins and their non-twin siblings) using multiple imaging modalities along with extensive behavioral and genetic data. The imaging modalities will include diffusion imaging (dMRI), resting-state fMRI (R-fMRI), task-evoked fMRI (T-fMRI), T1- and T2-weighted MRI for structural and myelin mapping, plus combined magnetoencephalography and electroencephalography (MEG/EEG). Given the importance of obtaining the best possible data quality, we discuss the efforts underway during the first two years of the grant (Phase I) to refine and optimize many aspects of HCP data acquisition, including a new 7T scanner, a customized 3T scanner, and improved MR pulse sequences.
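The abstract lists T1- and T2-weighted MRI for structural and myelin mapping. As an illustration only (not the HCP pipeline itself), the sketch below computes a voxelwise T1w/T2w ratio, the contrast commonly used as a myelin-sensitive measure; it assumes two already bias-corrected and co-registered volumes, and the file names are placeholders.

    # Minimal sketch: voxelwise T1w/T2w ratio as a myelin-sensitive contrast.
    # Assumes the two volumes are bias-corrected, co-registered, and on the
    # same voxel grid; file names below are placeholders, not HCP release paths.
    import nibabel as nib
    import numpy as np

    t1w_img = nib.load("T1w_coregistered.nii.gz")   # hypothetical input
    t2w_img = nib.load("T2w_coregistered.nii.gz")   # hypothetical input

    t1w = t1w_img.get_fdata()
    t2w = t2w_img.get_fdata()

    # Divide only where there is T2w signal, to avoid division by zero
    # outside the head.
    ratio = np.zeros_like(t1w)
    mask = t2w > 0
    ratio[mask] = t1w[mask] / t2w[mask]

    nib.save(nib.Nifti1Image(ratio, t1w_img.affine, t1w_img.header),
             "t1w_t2w_ratio.nii.gz")

In practice the ratio is typically mapped to the cortical surface before interpretation; the voxelwise division above is only the core of the contrast.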

    • "the years that followed the publication of this paper, the idea to establish a " comprehensive structural description of the network of elements and connections forming the human brain " (Sporns et al., 2005) has gained rapidly growing interest in the field, yielding large-scale data-collection initiatives (Biswal et al., 2010; Nooner et al., 2012; Toga et al., 2012; Van Essen et al., 2013; 2012) as well as analyses (Glasser et al., 2013; Setsompop et al., 2013; Smith et al., 2013; Zuo et al., 2011), which illustrates the interest in structural connectivity databases of the human brain. Thus, in the past, several dMRIbased white matter atlases have been introduced (Mori et al., 2008) which were usually based on single subject data (Bürgel et al., 2006; Catani et al., 2002; Hagmann et al., 2003; Makris et al., 1997; Pajevic and Pierpaoli, 2000; Stieltjes et al., 2001; Wakana et al., 2004). "
    ABSTRACT: The analysis of the structural architecture of the human brain in terms of connectivity between its sub-regions has provided profound insights into its underlying functional organization and has given rise to the concept of the "connectome", a structural description of the elements forming the human brain and the connections among them. Here, as a proof of concept, we introduce a novel group connectome in standard space based on a large sample of 169 subjects from the Enhanced Nathan Kline Institute - Rockland Sample (eNKI-RS). Whole-brain structural connectomes of each subject were estimated with a global tracking approach, and the resulting fiber tracts were warped into standard stereotactic (MNI) space using DARTEL. Employing this group connectome, the results of published tracking studies (i.e., the JHU white matter and Oxford thalamic connectivity atlases) could be largely reproduced directly within MNI space. As a second experiment, a study that examined structural connectivity between regions of a functional network, namely the default mode network, was reproduced. Voxel-wise structural centrality was then calculated and compared to prior literature findings. Furthermore, using additional resting-state fMRI data from the same subjects, structural and functional connectivity matrices between approximately forty thousand nodes of the brain were calculated in order to estimate structure-function agreement indices of voxel-wise whole-brain connectivity. Taken together, the combination of a novel whole-brain fiber tracking approach and an advanced normalization method led to a group connectome that allowed fiber tracking to be performed, at least heuristically, directly within MNI space. Hence, it may be used for various purposes such as the analysis of structural connectivity and modeling experiments that aim at studying the structure-function relationship of the human connectome. Moreover, it may even represent a first step towards a standard DTI template of the human brain in stereotactic space. The standardized group connectome might thus be a promising new resource to better understand and further analyze the anatomical architecture of the human brain on a population level. (A minimal computational sketch of such a structure-function agreement index is given after this listing.)
    NeuroImage 08/2015; DOI:10.1016/j.neuroimage.2015.08.048 · 6.36 Impact Factor
    • "The number of participants in individual studies has grown for many reasons, including: the increasing availability of MRI scanners; a move from fixed-to random-effects designs (Friston et al., 1999; Mumford and Nichols, 2008); a demand for greater replication in neuroimaging ( " The dilemma of weak neuroimaging papers, " dilemma-weak-neuroimaging); the need to overcome statistical noise in studies of individual differences, genetics, aging, development or disease; large scale investments such as the Human Connectome Project (Van Essen et al., 2012), Alzheimer's Disease Neuroimaging Initiative (Mueller et al., 2005) or Cambridge Centre for Aging and Neuroscience (; and a growth in open data sharing (Van Horn et al., 2001; Biswal et al., 2010; Poldrack et al., 2013; "
    ABSTRACT: Recent years have seen neuroimaging data sets becoming richer, with larger cohorts of participants, a greater variety of acquisition techniques, and increasingly complex analyses. These advances have made data analysis pipelines complicated to set up and run (increasing the risk of human error) and time-consuming to execute (restricting what analyses are attempted). Here we present an open-source framework, automatic analysis (aa), to address these concerns. Human efficiency is increased by making code modular and reusable, and managing its execution with a processing engine that tracks what has been completed and what needs to be (re)done. Analysis is accelerated by optional parallel processing of independent tasks on cluster or cloud computing resources. A pipeline comprises a series of modules that each perform a specific task. The processing engine keeps track of the data, calculating a map of upstream and downstream dependencies for each module. Existing modules are available for many analysis tasks, such as SPM-based fMRI preprocessing, individual- and group-level statistics, voxel-based morphometry, tractography, and multi-voxel pattern analyses (MVPA). However, aa also allows for full customization, and encourages efficient management of code: new modules may be written with only a small code overhead. aa has been used by more than 50 researchers in hundreds of neuroimaging studies comprising thousands of subjects. It has been found to be robust, fast, and efficient, from simple single-subject studies up to multimodal pipelines on hundreds of subjects. It is attractive to both novice and experienced users. aa can reduce the amount of time neuroimaging laboratories spend performing analyses and reduce errors, expanding the range of scientific questions it is practical to address.
    Frontiers in Neuroinformatics 01/2015; 8:90. DOI:10.3389/fninf.2014.00090 · 3.26 Impact Factor
    • "A sample size of n = 1000 was selected as determination of correlations between variability in brain structure and function with increasing age and environmental and genetic risk factors requires a large cohort. This sample size is comparable to those of previous large-scale neuroimaging studies in Europe and the US (Mueller et al., 2005; Ikram et al., 2012; Van Essen et al., 2012), which achieved a uniform distribution of subjects in different age groups (Button et al., 2013). An adequate sample size is also required to detect genetic influences with small effect sizes in polygenic phenotypes (Stein et al., 2012). "
    ABSTRACT: The ongoing 1000 brains study (1000BRAINS) is an epidemiological and neuroscientific investigation of structural and functional variability in the human brain during aging. The two recruitment sources are the 10-year follow-up cohort of the German Heinz Nixdorf Recall (HNR) Study, and the HNR MultiGeneration Study cohort, which comprises spouses and offspring of HNR subjects. The HNR is a longitudinal epidemiological investigation of cardiovascular risk factors, with a comprehensive collection of clinical, laboratory, socioeconomic, and environmental data from population-based subjects aged 45-75 years on inclusion. HNR subjects underwent detailed assessments in 2000, 2006, and 2011, and completed annual postal questionnaires on health status. 1000BRAINS accesses these HNR data and applies a separate protocol comprising: neuropsychological tests of attention, memory, executive functions and language; examination of motor skills; ratings of personality, life quality, mood and daily activities; analysis of laboratory and genetic data; and state-of-the-art magnetic resonance imaging (MRI, 3 Tesla) of the brain. The latter includes (i) 3D-T1- and 3D-T2-weighted scans for structural analyses and myelin mapping; (ii) three diffusion imaging sequences optimized for diffusion tensor imaging, high-angular resolution diffusion imaging for detailed fiber tracking and for diffusion kurtosis imaging; (iii) resting-state and task-based functional MRI; and (iv) fluid-attenuated inversion recovery and MR angiography for the detection of vascular lesions and the mapping of white matter lesions. The unique design of 1000BRAINS allows: (i) comprehensive investigation of various influences including genetics, environment and health status on variability in brain structure and function during aging; and (ii) identification of the impact of selected influencing factors on specific cognitive subsystems and their anatomical correlates.
    Frontiers in Aging Neuroscience 07/2014; 6:149. DOI:10.3389/fnagi.2014.00149 · 4.00 Impact Factor
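The eNKI-RS group connectome abstract above describes computing structural and functional connectivity matrices over the same set of roughly forty thousand nodes and deriving structure-function agreement indices from them. The exact index used in that study is not given here; the sketch below shows one simple, commonly used choice, a per-node Spearman correlation between each node's structural and functional connectivity profiles, applied to small toy matrices (names and sizes are illustrative only).

    # Minimal sketch of a per-node structure-function agreement index:
    # correlate each node's row of the structural connectivity (SC) matrix
    # with the matching row of the functional connectivity (FC) matrix.
    # This is one simple choice of index, not necessarily the one used above.
    import numpy as np
    from scipy.stats import spearmanr

    def agreement_indices(sc, fc):
        """Spearman correlation of SC and FC profiles, one value per node."""
        n = sc.shape[0]
        agreement = np.full(n, np.nan)
        for i in range(n):
            off_diag = np.arange(n) != i          # exclude the self-connection
            rho, _ = spearmanr(sc[i, off_diag], fc[i, off_diag])
            agreement[i] = rho
        return agreement

    # Toy example on random symmetric matrices; the matrices in the study
    # above would be on the order of 40,000 x 40,000.
    rng = np.random.default_rng(0)
    sc = rng.random((100, 100)); sc = (sc + sc.T) / 2
    fc = rng.random((100, 100)); fc = (fc + fc.T) / 2
    print(agreement_indices(sc, fc)[:5])

At the scale mentioned in the abstract, the per-node loop would normally be vectorized or processed in chunks; it is written as an explicit loop here only for readability.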