Article

Steady-state responses in MEG demonstrate information integration within but not across the auditory and visual senses

Max Planck Institute for Biological Cybernetics, Tuebingen, Germany.
NeuroImage (Impact Factor: 6.36). 01/2012; 60(2):1478-89. DOI: 10.1016/j.neuroimage.2012.01.114
Source: PubMed

ABSTRACT

To form a unified percept of our environment, the human brain integrates information within and across the senses. This MEG study investigated interactions within and between sensory modalities using a frequency analysis of steady-state responses (SSRs) that are elicited time-locked to periodically modulated stimuli. Critically, in the frequency domain, interactions between sensory signals are indexed by crossmodulation terms (i.e. the sums and differences of the fundamental frequencies). The 3 × 2 factorial design manipulated (1) modality: auditory, visual, or audiovisual; and (2) steady-state modulation: the auditory and visual signals were modulated either in only one sensory feature (e.g. visual gratings modulated in luminance at 6 Hz) or in two features (e.g. tones modulated in frequency at 40 Hz and in amplitude at 0.2 Hz). This design enabled us to investigate the crossmodulation frequencies that are elicited when two stimulus features are modulated concurrently (i) within one sensory modality or (ii) across the auditory and visual modalities. In support of within-modality integration, we reliably identified crossmodulation frequencies when two stimulus features in one sensory modality were modulated at different frequencies. In contrast, no crossmodulation frequencies were identified when information needed to be combined from the auditory and visual modalities. The absence of audiovisual crossmodulation frequencies suggests that the previously reported audiovisual interactions in primary sensory areas may mediate low-level spatiotemporal coincidence detection that is prominent for stimulus transients but less relevant for sustained SSRs. In conclusion, our results indicate that information in SSRs is integrated over multiple time scales within but not across sensory modalities at the primary cortical level.
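The crossmodulation logic described in the abstract can be illustrated with a short numerical sketch (the frequencies, sampling rate, and the simple multiplicative nonlinearity below are illustrative choices, not the study's actual stimulus settings): a purely linear combination of two frequency-tagged signals keeps the spectrum confined to the fundamentals, whereas a nonlinear interaction produces power at their sum and difference frequencies.

```python
import numpy as np

# Illustrative parameters: two tagging frequencies f1 and f2.
fs = 1000                       # sampling rate in Hz
t = np.arange(10 * fs) / fs     # 10 s of signal -> 0.1 Hz resolution
f1, f2 = 40.0, 6.0

# Linear superposition: spectrum contains only the fundamentals f1 and f2.
linear = np.sin(2 * np.pi * f1 * t) + np.sin(2 * np.pi * f2 * t)

# Nonlinear (multiplicative) interaction: spectrum contains the
# crossmodulation terms f1 - f2 and f1 + f2 instead.
nonlinear = np.sin(2 * np.pi * f1 * t) * np.sin(2 * np.pi * f2 * t)

def peak_freqs(x, fs, thresh=0.1):
    """Return frequencies whose amplitude exceeds thresh * max amplitude."""
    spec = np.abs(np.fft.rfft(x)) / len(x)
    freqs = np.fft.rfftfreq(len(x), 1 / fs)
    return freqs[spec > thresh * spec.max()]

print(peak_freqs(linear, fs))      # peaks near 6 and 40 Hz
print(peak_freqs(nonlinear, fs))   # peaks near 34 and 46 Hz (f1 -/+ f2)
```

The identity sin(a)·sin(b) = ½[cos(a−b) − cos(a+b)] is what moves the power to the sum and difference frequencies; this is the frequency-domain signature of interaction that the study searched for within and across modalities.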


Available from: Paolo Belardinelli
    • "If the visual stimulus is a perfect sine wave, it does not contain higher harmonics; if the visual response is linear, the response to this stimulus will be confined to the frequency bin corresponding to the stimulus frequency. However, even in that case, responses at harmonic frequencies are observed (Giani et al., 2012). In contrast, if the system is nonlinear, this will manifest itself in the presence of higher harmonic responses (for more details on nonlinearity in the SSVEP, see Norcia et al. (2015)). "
    ABSTRACT: Visual rhythmic stimulation evokes a robust power increase exactly at the stimulation frequency, the so-called steady-state response (SSR). Localization of visual SSRs normally shows a very focal modulation of power in visual cortex and led to the treatment and interpretation of SSRs as a local phenomenon. Given the brain network dynamics, we hypothesized that SSRs have additional large-scale effects on the brain functional network that can be revealed by means of graph theory. We used rhythmic visual stimulation at a range of frequencies (4–30 Hz), recorded MEG and investigated source level connectivity across the whole brain. Using graph theoretical measures we observed a frequency-unspecific reduction of global density in the alpha band “disconnecting” visual cortex from the rest of the network. Also, a frequency-specific increase of connectivity between occipital cortex and precuneus was found at the stimulation frequency that exhibited the highest resonance (30 Hz). In conclusion, we showed that SSRs dynamically re-organized the brain functional network. These large-scale effects should be taken into account not only when attempting to explain the nature of SSRs, but also when used in various experimental designs.
    No preview · Article · Feb 2016 · Brain Research
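The linearity argument quoted from this article (a linear response to a pure sine wave stays confined to the stimulus frequency; a nonlinear response adds higher harmonics) can be sketched numerically. The quadratic distortion below is an illustrative choice, not a model of any specific neural nonlinearity:

```python
import numpy as np

fs = 1000
t = np.arange(5 * fs) / fs          # 5 s at 1 kHz -> 0.2 Hz resolution
f = 10.0                            # illustrative stimulation frequency

stimulus = np.sin(2 * np.pi * f * t)
linear_resp = 2.0 * stimulus                    # pure gain: frequency preserved
nonlinear_resp = stimulus + 0.5 * stimulus**2   # quadratic distortion

def spectrum_peaks(x, fs, thresh=0.05):
    """Return nonzero frequencies whose amplitude exceeds thresh * max."""
    spec = np.abs(np.fft.rfft(x)) / len(x)
    freqs = np.fft.rfftfreq(len(x), 1 / fs)
    return freqs[(spec > thresh * spec.max()) & (freqs > 0)]

print(spectrum_peaks(linear_resp, fs))     # peak at f only
print(spectrum_peaks(nonlinear_resp, fs))  # peaks at f and the 2f harmonic
```

The quadratic term rewrites as sin²(a) = ½[1 − cos(2a)], so any even-order nonlinearity necessarily contributes energy at harmonic frequencies, which is why harmonic responses are taken as evidence of nonlinear processing in the SSVEP literature cited above.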
    • "Finally, like other previous studies (Giani et al. 2012), we did not identify crossmodulation frequencies between the concurrently presented visual and tactile stimuli. This raises the question of whether multisensory integration processes can actually be tagged using this approach, as only a few studies have shown evidence for the emergence of such crossmodulation SS-EPs across modalities (Nozaradan, Zerouali, Peretz, & Mouraux, 2015; M. P. Regan, et al., 1995). "
    ABSTRACT: The sustained periodic modulation of a stimulus induces an entrainment of cortical neurons responding to the stimulus, appearing as a steady-state evoked potential (SS-EP) in the EEG frequency spectrum. Here, we used frequency tagging of SS-EPs to study the crossmodal links in spatial attention between touch and vision. We hypothesized that a visual stimulus approaching the left or right hand orients spatial attention toward the approached hand, and thereby enhances the processing of vibrotactile input originating from that hand. Twenty-five subjects took part in the experiment: 16-s trains of vibrotactile stimuli (4.2 and 7.2 Hz) were applied simultaneously to the left and right hand, concomitantly with a punctate visual stimulus blinking at 9.8 Hz. The visual stimulus was approached toward the left or right hand. The hands were either uncrossed (left and right hands to the left and right of the participant) or crossed (left and right hands to the right and left of the participant). The vibrotactile stimuli elicited two distinct SS-EPs with scalp topographies compatible with activity in the contralateral primary somatosensory cortex. The visual stimulus elicited a third SS-EP with a topography compatible with activity in visual areas. When the visual stimulus was over one of the hands, the amplitude of the vibrotactile SS-EP elicited by stimulation of that hand was enhanced, regardless of whether the hands were uncrossed or crossed. This demonstrates a crossmodal effect of spatial attention between vision and touch, integrating proprioceptive and/or visual information to map the position of the limbs in external space. © 2015 Society for Psychophysiological Research.
    Full-text · Article · Aug 2015 · Psychophysiology
    • "Crucially, SSRs provide an index of relative attentional allocation to specific stimuli because attention modulates SSR amplitudes in visual (Müller et al. 1998, 2003; Kim et al. 2007), auditory (Ross et al. 2004; Bidet-Caulet et al. 2007; Saupe et al. 2009b) and audio-visual stimulus situations (Saupe et al. 2009a; Keitel et al. 2011, 2013). Frequency-tagging studies have also investigated effects of audio-visual synchrony on SSRs (Jenkins et al. 2011; Giani et al. 2012). Nozaradan et al. (2012) demonstrated enhanced amplitudes (and inter-trial phase coherence) of SSRs driven by an auditory and a visual stimulus when both obeyed a synchronous presentation. "
    ABSTRACT: Our brain relies on neural mechanisms of selective attention and converging sensory processing to efficiently cope with rich and unceasing multisensory inputs. One prominent assumption holds that audio-visual synchrony can act as a strong attractor for spatial attention. Here, we tested for a similar effect of audio-visual synchrony on feature-selective attention. We presented two superimposed Gabor patches that differed in colour and orientation. On each trial, participants were cued to selectively attend to one of the two patches. Over time, spatial frequencies of both patches varied sinusoidally at distinct rates (3.14 and 3.63 Hz), giving rise to pulse-like percepts. A simultaneously presented pure tone carried a frequency modulation at the pulse rate of one of the two visual stimuli to introduce audio-visual synchrony. Pulsed stimulation elicited distinct time-locked oscillatory electrophysiological brain responses. These steady-state responses were quantified in the spectral domain to examine individual stimulus processing under conditions of synchronous versus asynchronous tone presentation and when respective stimuli were attended versus unattended. We found that both, attending to the colour of a stimulus and its synchrony with the tone, enhanced its processing. Moreover, both gain effects combined linearly for attended in-sync stimuli. Our results suggest that audio-visual synchrony can attract attention to specific stimulus features when stimuli overlap in space.
    No preview · Article · Aug 2015 · Experimental Brain Research