Functional mapping with simultaneous MEG and EEG

Athinoula A. Martinos Center for Biomedical Imaging, Massachusetts General Hospital, MA, USA.
Journal of Visualized Experiments (Impact Factor: 1.33). 06/2010; 40(40). DOI: 10.3791/1668
Source: PubMed


We use magnetoencephalography (MEG) and electroencephalography (EEG) to locate brain areas involved in the processing of simple sensory stimuli and to characterize the temporal evolution of their activity. We use somatosensory stimuli to locate the hand somatosensory areas, auditory stimuli to locate the auditory cortices, and visual stimuli in the four quadrants of the visual field to locate the early visual areas. These types of experiments are used for functional mapping in epilepsy and brain tumor patients to locate eloquent cortices. In basic neuroscience, similar experimental protocols are used to study the orchestration of cortical activity. The acquisition protocol includes quality assurance procedures, subject preparation for the combined MEG/EEG study, and acquisition of evoked-response data with somatosensory, auditory, and visual stimuli. We also demonstrate analysis of the data using the equivalent current dipole model and cortically constrained minimum-norm estimates. Anatomical MRI data are employed in the analysis for visualization, for deriving the tissue boundaries needed in forward modeling, and for the cortical location and orientation constraints of the minimum-norm estimates.
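The analysis steps named above (a BEM forward model from MRI-derived tissue boundaries, an equivalent current dipole fit, and a cortically constrained minimum-norm estimate) can be sketched with the open-source MNE-Python package. The sketch below uses MNE-Python's bundled sample dataset (an auditory/visual MEG+EEG recording), not the data from this protocol; the file names and condition label belong to that dataset and are assumptions, not details from the article.

```python
# Minimal sketch of the analysis described above, using MNE-Python and its
# bundled sample dataset; file names below belong to that dataset, not to
# this protocol. Assumes a recent MNE-Python where data_path() is a Path.
import mne
from mne.minimum_norm import make_inverse_operator, apply_inverse

data_path = mne.datasets.sample.data_path()
meg_dir = data_path / "MEG" / "sample"

evoked = mne.read_evokeds(meg_dir / "sample_audvis-ave.fif",
                          condition="Left Auditory", baseline=(None, 0))
noise_cov = mne.read_cov(meg_dir / "sample_audvis-cov.fif")

# Forward model computed from BEM surfaces derived from the anatomical MRI
fwd = mne.read_forward_solution(meg_dir / "sample_audvis-meg-eeg-oct-6-fwd.fif")

# Equivalent current dipole fit over a short window around the ~100 ms peak
bem = mne.read_bem_solution(
    data_path / "subjects" / "sample" / "bem" / "sample-5120-5120-5120-bem-sol.fif")
dip, residual = mne.fit_dipole(evoked.copy().crop(0.08, 0.12), noise_cov, bem,
                               trans=meg_dir / "sample_audvis_raw-trans.fif")

# Cortically constrained minimum-norm estimate (loose orientation constraint)
inv = make_inverse_operator(evoked.info, fwd, noise_cov, loose=0.2, depth=0.8)
stc = apply_inverse(evoked, inv, lambda2=1.0 / 9.0, method="MNE")
```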

    • "The high temporal resolution of MEG as an electrophysiological measure positions it to be especially appropriate for use in analyzing auditory processing, where stimuli are necessarily characterized in terms of their fluctuations in time (e.g., see Shamma et al., 2011). Although multiple authors have covered standard practices for conducting MEG studies (Barkley, 2004; Liu et al., 2010; Lee et al., 2012) and reporting the methods and results (Gross et al., 2013) from MEG experiments, adoption of MEG as a tool to facilitate diagnosis has remained limited. Here we will provide a perspective on potential ways in which MEG could, with additional time, effort, and validation, influence clinical practice for developmental pathology. "
    ABSTRACT: Magnetoencephalography (MEG) provides a direct, non-invasive view of neural activity with millisecond temporal precision. Recent developments in MEG analysis allow for improved source localization and mapping of connectivity between brain regions, expanding the possibilities for using MEG as a diagnostic tool. In this paper, we first describe inverse imaging methods (e.g., minimum-norm estimation) and functional connectivity measures, and how they can provide insights into cortical processing. We then offer a perspective on how these techniques could be used to understand and evaluate auditory pathologies that often manifest during development. Here we focus specifically on how MEG inverse imaging, by providing anatomically based interpretation of neural activity, may allow us to test which aspects of cortical processing play a role in (central) auditory processing disorder [(C)APD]. Appropriately combining auditory paradigms with MEG analysis could eventually prove useful for a hypothesis-driven understanding and diagnosis of (C)APD or other disorders, as well as the evaluation of the effectiveness of intervention strategies.
    Frontiers in Human Neuroscience 03/2014; 8:151. DOI:10.3389/fnhum.2014.00151 · 2.99 Impact Factor
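As a concrete illustration of the inverse imaging and functional connectivity measures the abstract above discusses, the following sketch computes an all-to-all sensor-space coherence with MNE-Python plus the separate mne-connectivity package. The file name and event coding come from MNE-Python's sample dataset and are assumptions, not values from the cited study.

```python
# Sketch only: sensor-space coherence of the kind the abstract alludes to,
# using MNE-Python and the separate mne-connectivity package.
import mne
from mne_connectivity import spectral_connectivity_epochs

data_path = mne.datasets.sample.data_path()
raw = mne.io.read_raw_fif(data_path / "MEG" / "sample" / "sample_audvis_raw.fif")
events = mne.find_events(raw, stim_channel="STI 014")

# Epochs around one stimulus type (event code 1 = left auditory here),
# restricted to gradiometers for a clean channel-by-channel matrix
epochs = mne.Epochs(raw, events, event_id=1, tmin=-0.2, tmax=0.5,
                    baseline=(None, 0), picks="grad", preload=True)

# All-to-all coherence, averaged over the 8-12 Hz band
con = spectral_connectivity_epochs(epochs, method="coh",
                                   fmin=8.0, fmax=12.0, faverage=True)
print(con.get_data(output="dense").shape)  # (n_channels, n_channels, 1)
```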
    • "All data were recorded at a sampling rate of 600 Hz with a bandpass of 0.1–200 Hz. Four head position indicator coils were used to monitor head position (see, Liu et al., 2010; Lee et al., 2012). Samples containing artifacts associated with eye-movements and blinks were extracted by detecting peaks from the vertical EOG channel; samples with cardiac artifacts were similarly identified from ECG data. "
    ABSTRACT: Frequency tagging of sensory inputs (presenting stimuli that fluctuate periodically at rates to which the cortex can phase lock) has been used to study attentional modulation of neural responses to inputs in different sensory modalities. For visual inputs, the visual steady-state response (VSSR) at the frequency modulating an attended object is enhanced, while the VSSR to a distracting object is suppressed. In contrast, the effect of attention on the auditory steady-state response (ASSR) is inconsistent across studies. However, most auditory studies analyzed results at the sensor level or used only a small number of equivalent current dipoles to fit cortical responses. In addition, most studies of auditory spatial attention used dichotic stimuli (independent signals at the ears) rather than more natural, binaural stimuli. Here, we asked whether these methodological choices help explain discrepant results. Listeners attended to one of two competing speech streams, one simulated from the left and one from the right, that were modulated at different frequencies. Using distributed source modeling of magnetoencephalography results, we estimate how spatially directed attention modulates the ASSR in neural regions across the whole brain. Attention enhances the ASSR power at the frequency of the attended stream in contralateral auditory cortex. The attended-stream modulation frequency also drives phase-locked responses in the left (but not right) precentral sulcus (lPCS), a region implicated in control of eye gaze and visual spatial attention. Importantly, this region shows no phase locking to the distracting stream. Results suggest that the lPCS is engaged in an attention-specific manner. Modeling results that take account of the geometry and phases of the cortical sources phase locked to the two streams (including hemispheric asymmetry of lPCS activity) help to explain why past ASSR studies of auditory spatial attention yield seemingly contradictory results.
    Frontiers in Integrative Neuroscience 02/2014; 8:6. DOI:10.3389/fnint.2014.00006
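The excerpt above describes flagging eye-movement, blink, and cardiac artifacts by peak detection on the EOG and ECG channels. A minimal sketch of that step with MNE-Python's built-in detectors follows; the file name and channel names are placeholders, not values from the cited study.

```python
# Sketch of EOG/ECG artifact identification by peak detection, as in the
# excerpt above; "run01_raw.fif" and the channel names are placeholders.
import mne
from mne.preprocessing import find_eog_events, find_ecg_events

raw = mne.io.read_raw_fif("run01_raw.fif", preload=True)

# Blinks and eye movements: peaks on the vertical EOG channel
eog_events = find_eog_events(raw, ch_name="EOG 061")

# Cardiac artifacts: R-peaks detected on the ECG channel
ecg_events, ecg_ch, avg_pulse = find_ecg_events(raw, ch_name="ECG 063")

# Mark 0.5 s around each blink so those samples are rejected when epoching
onsets = raw.times[eog_events[:, 0] - raw.first_samp] - 0.25
annot = mne.Annotations(onset=onsets,
                        duration=[0.5] * len(onsets),
                        description=["bad_blink"] * len(onsets))
raw.set_annotations(annot)
```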
    • "The data were recorded at a sampling rate of 600 Hz with a bandpass of 0.1–200 Hz. Four head position indicator (HPI) coils were used to monitor head position (see Liu et al., 2010 for detailed description). At the beginning of each run, magnetic fields from the HPI coils were recorded to calculate the position and orientation of the head relative to the MEG sensor array. "
    ABSTRACT: In order to extract information in a rich environment, we focus on different features that allow us to direct attention to whatever source is of interest. The cortical network deployed during spatial attention, especially in vision, is well characterized. For example, visuospatial attention engages a frontoparietal network including the frontal eye fields (FEFs), which modulate activity in visual sensory areas to enhance the representation of an attended visual object. However, relatively little is known about the neural circuitry controlling attention directed to non-spatial features, or to auditory objects or features (either spatial or non-spatial). Here, using combined magnetoencephalography (MEG) and anatomical information obtained from MRI, we contrasted cortical activity when observers attended to different auditory features given the same acoustic mixture of two simultaneous spoken digits. Leveraging the fine temporal resolution of MEG, we establish that activity in left FEF is enhanced both prior to and throughout the auditory stimulus when listeners direct auditory attention to target location compared to when they focus on target pitch. In contrast, activity in the left posterior superior temporal sulcus (STS), a region previously associated with auditory pitch categorization, is greater when listeners direct attention to target pitch rather than target location. This differential enhancement is only significant after observers are instructed which cue to attend, but before the acoustic stimuli begin. We therefore argue that left FEF participates more strongly in directing auditory spatial attention, while the left STS aids auditory object selection based on the non-spatial acoustic feature of pitch.
    Frontiers in Neuroscience 01/2012; 6:190. DOI:10.3389/fnins.2012.00190 · 3.66 Impact Factor
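The excerpt above notes that head position is computed from HPI coil fields recorded at the start of each run. The sketch below shows where that result is stored in an MNE-Python recording and, when the coils are energized continuously, how head position over time can be estimated; the file name is a placeholder, and continuous HPI is an assumption, not a detail of the cited study.

```python
# Sketch: inspecting HPI-derived head position with MNE-Python; the file
# name is a placeholder, and the continuous-HPI steps assume the coils
# were kept energized throughout the run.
import mne

raw = mne.io.read_raw_fif("run01_raw.fif")

# Device-to-head transform estimated from the HPI coil fit at run start
print(raw.info["dev_head_t"])

# With continuous HPI, head position over time can be reconstructed
amps = mne.chpi.compute_chpi_amplitudes(raw)
locs = mne.chpi.compute_chpi_locs(raw.info, amps)
head_pos = mne.chpi.compute_head_pos(raw.info, locs)  # time, quaternions, fit stats
```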

