Article

Brain responses to auditory and visual stimulus offset: Shared representations of temporal edges

University Hospital of Psychiatry Bern, Bern, Switzerland.
Human Brain Mapping (Impact Factor: 6.92). 03/2009; 30(3):725-33. DOI: 10.1002/hbm.20539
Source: PubMed

ABSTRACT: Edges are crucial for the formation of coherent objects from sequential sensory inputs within a single modality. Moreover, temporally coincident boundaries of perceptual objects across different sensory modalities facilitate crossmodal integration. Here, we used functional magnetic resonance imaging to examine the neural basis of temporal edge detection across modalities. Onsets of sensory inputs are related not only to the detection of an edge but also to the processing of novel sensory inputs. Thus, we used transitions from input to rest (offsets) as convenient stimuli for studying the neural underpinnings of visual and acoustic edge detection per se. We found, besides modality-specific patterns, shared visual and auditory offset-related activity in the superior temporal sulcus and insula of the right hemisphere. Our data suggest that right-hemispheric regions known to be involved in multisensory processing are crucial for the detection of edges in the temporal domain across both visual and auditory modalities. This operation is likely to facilitate crossmodal object feature binding based on temporal coincidence.

  • ABSTRACT: Understanding how the brain extracts and combines temporal structure (rhythm) information from events presented to different senses remains unresolved. Many neuroimaging beat perception studies have focused on the auditory domain and show that the presence of a highly regular beat (isochrony) in "auditory" stimulus streams enhances neural responses in a distributed brain network and affects perceptual performance. Here, we acquired functional magnetic resonance imaging (fMRI) measurements of brain activity while healthy human participants performed a visual task on isochronous versus randomly timed "visual" streams, with or without concurrent task-irrelevant sounds. We found that visual detection of higher-intensity oddball targets was better for isochronous than for randomly timed streams, extending previous auditory findings to vision. The impact of isochrony on visual target sensitivity correlated positively with fMRI signal changes not only in visual cortex but also in auditory sensory cortex during audiovisual presentations. Visual isochrony activated a timing-related brain network similar to that previously found primarily in auditory beat perception work. Finally, activity in the multisensory left posterior superior temporal sulcus increased specifically during concurrent isochronous audiovisual presentations. These results indicate that regular isochronous timing can modulate visual processing and that this can also involve multisensory audiovisual brain mechanisms.
    Cerebral Cortex 04/2012; 23(6). DOI:10.1093/cercor/bhs095 · 8.31 Impact Factor
  • ABSTRACT: Assessing the size of objects rapidly and accurately clearly has survival value. A central multisensory module for subjective magnitude assessment is therefore highly likely; it has been suggested by psychophysical studies and proposed on theoretical grounds. Given that pain perception is fundamentally an assessment of stimulus intensity, it must necessarily engage such a central module. Accordingly, we compared functional magnetic resonance imaging (fMRI) activity during pain magnitude ratings to that during matched visual magnitude ratings in 14 subjects. We show that brain activations segregate into two groups: one preferentially activated for pain and another equally activated for both visual and pain magnitude ratings. The properties of regions in the first group were consistent with encoding nociception, whereas those in the second group were consistent with attention and task control. Insular cortex responses similarly segregated into a pain-specific area and an area (extending to the lateral prefrontal cortex) that conjointly represents perceived magnitudes for pain and vision. These two insular areas were differentiated by their relationship to task variance, their ability to encode perceived magnitudes for each stimulus epoch, temporal delay differences, and brain intrinsic functional connectivity. In a second group of subjects (n = 11), we contrasted diffusion tensor imaging-based white matter connectivity for these two insular areas and observed anatomical connectivity closely corresponding to the functional connectivity identified with fMRI. These results demonstrate that pain perception arises from the transformation of nociceptive representation into subjective magnitude assessment within the insula. Moreover, we argue that we have identified a multisensory cortical area for "how much," complementary and analogous to the "where" and "what" areas described for central visual processing.
    Journal of Neurophysiology 02/2009; 101(2):875-87. DOI:10.1152/jn.91100.2008 · 3.04 Impact Factor
  • ABSTRACT: In the past, the mechanisms of sensory and perceptual processes have been studied intensively by both psychologists and neuroscientists. However, recent research has mostly investigated single senses (i.e., vision, audition, or touch), whereas real-world events often stimulate more than one modality concurrently. Moreover, pieces of information arising from one and the same object need to be joined together across distinct sensory modalities, as when we both feel and see an object in our hand, or both see and hear someone speak. The way information is bound across modalities determines perceptual judgments. For example, the temporal perception of external bimodal events does not necessarily correspond to their actual physical relations. Rather, the perception of subjective simultaneity is determined by implicit knowledge, stimulus properties, and the dynamics of the perceptual apparatus. Previous studies demonstrated the existence of a temporal window that can flexibly be widened or tightened: bimodal events will be temporally bound as long as they both fall within this window. This thesis investigates the neural basis of the audiovisual perception of temporal relations. In three fMRI studies, neural modulations of perceived temporal relations were identified while participants judged both semantic and non-semantic stimuli, and the influences of unisensory and bimodal cortices were investigated. Furthermore, the neural correlates of temporal plasticity during audiovisual perception were studied with the aid of an adaptation paradigm, which allowed perceptual and stimulus-driven effects of audiovisual temporal perception to be separated. Finally, temporal percepts of audiovisual speech were captured with ecologically valid stimuli, and the corresponding functional basis was located within bimodal areas. Results revealed distinct activation patterns for perceptual synchrony compared with asynchrony. Moreover, dynamic adaptation showed that different cortical networks are involved in establishing subjective synchrony versus asynchrony, and bimodal areas could be separated functionally by different time percepts. Taken together, these results indicate that stable synchrony percepts and stable asynchrony percepts are related to enhanced activity in distinct multisensory cortices; moreover, the relevant activation patterns were found to be chronotopically arranged.