Article

Data sharing in neuroimaging research

NeuroSpin, Commissariat à l'Energie Atomique et aux Energies Alternatives, Gif-sur-Yvette, France.
Frontiers in Neuroinformatics (Impact Factor: 3.26). 04/2012; 6:9. DOI: 10.3389/fninf.2012.00009
Source: PubMed

ABSTRACT

Significant resources around the world have been invested in neuroimaging studies of brain function and disease. Easier access to this large body of work should have a profound impact on research in cognitive neuroscience and psychiatry, leading to advances in the diagnosis and treatment of psychiatric and neurological disease. A trend toward increased sharing of neuroimaging data has emerged in recent years. Nevertheless, a number of barriers continue to impede momentum. Many researchers and institutions remain uncertain about how to share data or lack the tools and expertise to participate in data sharing. The use of electronic data capture (EDC) methods for neuroimaging greatly simplifies the task of data collection and has the potential to help standardize many aspects of data sharing. We review here the motivations for sharing neuroimaging data, the current data sharing landscape, and the sociological and technical barriers that still need to be addressed. The INCF Task Force on Neuroimaging Datasharing, in conjunction with several collaborative groups around the world, has started work on several tools to ease and eventually automate the practice of data sharing. It is hoped that such tools will allow researchers to easily share raw, processed, and derived neuroimaging data, with appropriate metadata and provenance records, and will improve the reproducibility of neuroimaging studies. By providing seamless integration of data sharing and analysis tools within a commodity research environment, the Task Force seeks to identify and minimize barriers to data sharing in the field of neuroimaging.
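To make the abstract's mention of "appropriate metadata and provenance records" more concrete, the sketch below shows what a minimal record accompanying a shared, derived image might contain. The field names, values, and structure are illustrative assumptions only; they are not the schema developed by the Task Force.

```python
"""Illustrative sketch of a metadata/provenance record that could travel with a
shared, derived neuroimaging file. Every field name and value here is a
hypothetical placeholder, not the Task Force's actual schema."""
import json

record = {
    "file": "sub-01_task-rest_zstat1.nii.gz",   # hypothetical derived image
    "data_type": "derived",                     # e.g., raw | processed | derived
    "acquisition": {
        "modality": "fMRI",
        "field_strength_T": 3.0,                # placeholder scanner detail
        "repetition_time_s": 2.0,
    },
    "provenance": [
        # One entry per processing step, in the order the steps were applied.
        {"step": "motion_correction", "software": "example-tool", "version": "x.y"},
        {"step": "first_level_glm", "software": "example-tool", "version": "x.y"},
    ],
    "license": "CC0",                           # example sharing license
}

print(json.dumps(record, indent=2))
```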

Available from: Jean-Baptiste Poline
  • Source
    • "This non-intrusive, self-regulatory intervention to prevent or mitigate bubbles could potentially be implemented without government involvement. Traders could monitor a shared (Poline et al., 2012;Poldrack et al., 2013), open-access aggregated data stream of processed brain activity, collected from consenting traders' wearable fNIRS technology (Kopton and Kenning, 2014;Piper et al., 2014). Realtime signs of over-heated markets (e.g., low levels of trade-related lateral neocortical activity) would warn traders to exit these markets and thereby prevent major bubbles voluntarily (Haracz and Acland, 2014). "
    DESCRIPTION: Asset-price bubbles challenge the explanatory and predictive power of standard economic theory, suggesting that neuroeconomic measures should be explored as potential tools for improving the predictive power of standard theory. This exploration is begun by reviewing results from functional magnetic resonance imaging (fMRI) studies of lab asset-price bubbles and herding behavior (i.e., following others' decisions). These results are consistent with a neuroeconomics-based hypothesis of asset-price bubbles. In this view, decision making during bubble or non-bubble periods of financial-market activity is driven by, respectively, evolutionarily ancient or new neurocircuitry. Neuroimaging studies that test this or other neuroeconomics-based hypotheses of asset-price bubbles may yield a bubble-related biomarker (e.g., low trade-related lateral neocortical activity associated with traders’ herding-based decisions). Wearable functional near-infrared spectroscopy (fNIRS) technology could determine the prevalence of such a biomarker among financial-market participants, thereby enabling the real-time detection of an emerging bubble. Mechanisms are described by which this early-warning signal could be exploited in self-regulatory or government-administered policies for financial-system stabilization. In summary, neuroimaging-based financial-system regulation may be useful for distinguishing bubbles from non-bubble periods and preventing major asset-price bubbles.
    Full-text · Working Paper · Jan 2016
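    As a purely illustrative aside, the monitoring mechanism described in the excerpt above (watch an aggregated brain-activity stream and warn when a biomarker stays low) reduces to a simple rolling-threshold check. The signal name, window size, and threshold below are hypothetical placeholders, not values proposed by the cited work.

```python
"""Toy sketch of the early-warning idea quoted above: flag readings whose
rolling mean of a (hypothetical) trade-related activity index drops below a
threshold. All names and numbers are made-up placeholders."""
from collections import deque
from statistics import mean


def bubble_warnings(readings, window=10, threshold=0.2):
    """Yield True whenever the rolling mean of `readings` falls below `threshold`."""
    recent = deque(maxlen=window)
    for value in readings:
        recent.append(value)
        # A sustained drop is treated as a warning sign, per the excerpt above.
        yield mean(recent) < threshold


if __name__ == "__main__":
    simulated_stream = [0.9, 0.8, 0.7, 0.3, 0.15, 0.1, 0.05]  # made-up values
    print(list(bubble_warnings(simulated_stream, window=3)))
```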
  • Source
    • "Sharing data within a larger community is a practical way to address these issues with greater statistical power and propose scientific questions beyond the scope of a single research group. Furthermore, the robustness of biological findings across different methods or processing architectures encourages confidence in the reproducibility of results, a fundamental requirement of good scientific practice (Glatard et al., 2015; Poline et al., 2012). At the same time, cross-site data sharing brings with it a broad range of issues in terms of site/scanner compatibility (Jovicich et al. 2009, 2013, 2014) and the logistical challenges of IT interoperability. "
    ABSTRACT: Neuroimaging has been facing a data deluge characterized by the exponential growth of both raw and processed data. As a result, mining the massive quantities of digital data collected in these studies offers unprecedented opportunities and has become paramount for today's research. As the neuroimaging community enters the world of "Big Data", there has been a concerted push for enhanced sharing initiatives, whether within a multisite study, across studies, or federated and shared publicly. This article will focus on the database and processing ecosystem developed at the Montreal Neurological Institute (MNI) to support multicenter data acquisition both nationally and internationally, create database repositories, facilitate data sharing initiatives, and leverage existing software toolkits for large-scale data processing.
    Full-text · Article · Sep 2015 · NeuroImage
  • Source
    • "As the number of studies using such techniques continues to grow exponentially, the challenge of assessing, summarizing, and condensing their findings poses ever-greater difficulty. Even though a single study can take years to conduct, cost hundreds of thousands of dollars, and require the effort of dozens of highly trained scientists and volunteers, the output is usually reduced to an academic article, and the original data are rarely shared (Poline et al., 2012). Unfortunately, due to the historical legacy of reporting knowledge in written form (of an academic paper), the final documented results consist mostly of subjective interpretation of data with very little machine-readable information. "
    ABSTRACT: NeuroVault.org is dedicated to storing outputs of analyses in the form of statistical maps, parcellations and atlases, a unique strategy that contrasts with most neuroimaging repositories that store raw acquisition data or stereotaxic coordinates. Such maps are indispensable for performing meta-analyses, validating novel methodology, and deciding on precise outlines for regions of interest (ROIs). NeuroVault is open to maps derived from both healthy and clinical populations, as well as from various imaging modalities (sMRI, fMRI, EEG, MEG, PET, etc.). The repository uses modern web technologies such as interactive web-based visualization, cognitive decoding, and comparison with other maps to provide researchers with efficient, intuitive tools to improve the understanding of their results. Each dataset and map is assigned a permanent Universal Resource Locator (URL), and all of the data is accessible through a REST Application Programming Interface (API). Additionally, the repository supports the NIDM-Results standard, and has the ability to parse outputs from popular FSL and SPM software packages to automatically extract relevant metadata. This ease of use, modern web-integration, and pioneering functionality holds promise to improve the workflow for making inferences about and sharing whole-brain statistical maps. (See the brief, hedged API sketch after this entry.)
    Full-text · Article · Apr 2015 · NeuroImage
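    Because the NeuroVault abstract above highlights that every dataset has a permanent URL and that the repository is reachable through a REST API, here is a minimal sketch of listing public collections. The base URL, endpoint path, query parameters, and JSON field names are assumptions based on a conventional paginated REST layout; check the current NeuroVault API documentation before relying on them.

```python
"""Minimal sketch of reading public collections from a NeuroVault-style REST
API. Endpoint path, parameters, and JSON field names are assumptions, not
confirmed details from the article above."""
import requests

BASE_URL = "https://neurovault.org/api"  # assumed base path for the REST API


def list_collections(limit=5):
    """Fetch one page of public collections and return (id, name) pairs."""
    resp = requests.get(f"{BASE_URL}/collections/",
                        params={"limit": limit}, timeout=30)
    resp.raise_for_status()
    payload = resp.json()
    # Paginated responses conventionally expose a "results" list; each entry
    # should correspond to a collection with its own permanent URL.
    return [(c.get("id"), c.get("name")) for c in payload.get("results", [])]


if __name__ == "__main__":
    for coll_id, name in list_collections():
        print(coll_id, name)
```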