
Object-based attention is multisensory: co-activation of an object's representations in ignored sensory modalities.

The Cognitive Neurophysiology Laboratory, Program in Cognitive Neuroscience and Schizophrenia, Nathan Kline Institute for Psychiatric Research, 140 Old Orangeburg Road, Orangeburg, NY 10962, USA.
European Journal of Neuroscience, 08/2007; 26(2):499-509. DOI: 10.1111/j.1460-9568.2007.05668.x
Source: PubMed

ABSTRACT: Within the visual modality, it has been shown that attention to a single visual feature of an object, such as speed of motion, results in an automatic transfer of attention to other task-irrelevant features (e.g. colour). An extension of this logic would predict that such mechanisms also operate across sensory systems. However, connectivity patterns between feature modules across sensory systems are thought to be sparser than those within a given sensory system, where interareal connectivity is extensive. It is therefore not clear whether transfer of attention between sensory systems operates as it does within a sensory system. Using high-density electrical mapping of the event-related potential (ERP) in humans, we tested whether attending to objects in one sensory modality resulted in preferential processing of those objects' features within another, task-irrelevant sensory modality. Clear evidence for cross-sensory attention effects was seen: for multisensory stimuli, responses to ignored task-irrelevant information in the auditory and visual domains were selectively enhanced when they were features of the explicitly attended object presented in the attended sensory modality. We conclude that attending to an object within one sensory modality results in co-activation of that object's representations in ignored sensory modalities. The data further suggest that transfer of attention from visual-to-auditory features operates in a fundamentally different manner than transfer from auditory-to-visual features, and indicate that visual object representations have a greater influence on their auditory counterparts than vice versa. These data are discussed in terms of 'priming' vs. 'spreading' accounts of attentional transfer.

Related publications:
  • ABSTRACT: Cues that involve a number of sensory modalities are processed in the brain in an interactive, multimodal manner rather than independently for each modality. We studied multimodal integration in a natural yet fully controlled scene, implemented as an interactive game in an auditory-haptic-visual virtual environment. In this imitation of a natural scene, the targets of perception were ecologically valid uni-, bi- and tri-modal manifestations of a simple event - a ball hitting a wall. Subjects were engaged in the game while their behavioral and early cortical electrophysiological responses were measured. Behavioral results confirmed that tri-modal cues were detected faster and more accurately than bi-modal cues, which likewise showed advantages over unimodal responses. Event-related potentials (ERPs) were recorded, and the first 200 ms following stimulus onset were analyzed to reveal the latencies of cortical multimodal interactions, as estimated by sLORETA. These electrophysiological findings indicated bi-modal as well as tri-modal interactions beginning very early (~30 ms), uniquely for each multimodal combination. The results suggest that early cortical multimodal integration accelerates cortical activity and, in turn, enhances performance measures. This acceleration registers on the scalp as sub-additive cortical activation.
    International Journal of Psychophysiology, 12/2013.
  • ABSTRACT: Successful integration of auditory and visual inputs is crucial both for basic perceptual functions and for higher-order processes related to social cognition. Autism spectrum disorders (ASD) are characterized by impairments in social cognition and are associated with abnormalities in sensory and perceptual processes. Several groups have reported that individuals with ASD are impaired in their ability to integrate socially relevant audiovisual (AV) information, and it has been suggested that this contributes to the higher-order social and cognitive deficits observed in ASD. However, successful integration of auditory and visual inputs also influences detection and perception of nonsocial stimuli, and integration deficits may impair earlier stages of information processing, with cascading downstream effects. To assess the integrity of basic AV integration, we recorded high-density electrophysiology from a cohort of high-functioning children with ASD (7-16 years) while they performed a simple AV reaction time task. Children with ASD showed considerably less behavioral facilitation to multisensory inputs, deficits that were paralleled by less effective neural integration. Evidence for processing differences relative to typically developing children was seen as early as 100 ms post-stimulation, and topographic analysis suggested that children with ASD relied on different cortical networks during this early multisensory processing stage.
    Cerebral Cortex, 06/2013; 23(6):1329-41.
  • ABSTRACT: We assessed the role of alpha-band oscillatory activity during a task-switching design that required participants to switch between an auditory and a visual task while task-relevant audiovisual inputs were simultaneously presented. Instructional cues informed participants which task to perform on a given trial, and we assessed alpha-band power in the short 1.35-s period intervening between the cue and the task-imperative stimuli, on the premise that attentional biasing mechanisms would be deployed to resolve competition between the auditory and visual inputs. Prior work had shown that alpha-band activity was differentially deployed depending on the modality of the cued task. Here, we asked whether this activity would, in turn, be differentially deployed depending on whether participants had just made a switch of task or were being asked to simply repeat the task. It is well established that performance speed and accuracy are poorer on switch than on repeat trials. Here, however, the use of instructional cues completely mitigated these classic switch costs. Measures of alpha-band synchronisation and desynchronisation showed that there was indeed greater and earlier differential deployment of alpha-band activity on switch vs. repeat trials. Contrary to our hypothesis, this differential effect was entirely due to changes in the amount of desynchronisation observed during switch and repeat trials of the visual task, with more desynchronisation over both posterior and frontal scalp regions during switch-visual trials. These data imply that particularly vigorous, and essentially fully effective, anticipatory biasing mechanisms resolved the competition between competing auditory and visual inputs when a rapid switch of task was required.
    European Journal of Neuroscience, 04/2014.
