Article

Object-based attention is multisensory: co-activation of an object's representations in ignored sensory modalities.

The Cognitive Neurophysiology Laboratory, Program in Cognitive Neuroscience and Schizophrenia, Nathan Kline Institute for Psychiatric Research, 140 Old Orangeburg Road, Orangeburg, NY 10962, USA.
European Journal of Neuroscience (Impact Factor: 3.75). 08/2007; 26(2):499-509. DOI: 10.1111/j.1460-9568.2007.05668.x
Source: PubMed

ABSTRACT: Within the visual modality, it has been shown that attention to a single visual feature of an object, such as speed of motion, results in an automatic transfer of attention to other task-irrelevant features (e.g. colour). An extension of this logic might lead one to predict that such mechanisms also operate across sensory systems. However, connectivity patterns between feature modules across sensory systems are thought to be sparser than those within a given sensory system, where interareal connectivity is extensive, so it is not clear that transfer of attention between sensory systems operates as it does within a sensory system. Using high-density electrical mapping of the event-related potential (ERP) in humans, we tested whether attending to objects in one sensory modality resulted in preferential processing of that object's features within another, task-irrelevant, sensory modality. Clear evidence for cross-sensory attention effects was seen: for multisensory stimuli, responses to ignored task-irrelevant information in the auditory and visual domains were selectively enhanced when that information was a feature of the explicitly attended object presented in the attended sensory modality. We conclude that attending to an object within one sensory modality results in co-activation of that object's representations in ignored sensory modalities. The data further suggest that transfer of attention from visual to auditory features operates in a fundamentally different manner than transfer from auditory to visual features, and indicate that visual-object representations have a greater influence on their auditory counterparts than vice versa. These data are discussed in terms of 'priming' vs. 'spreading' accounts of attentional transfer.
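The core ERP comparison described above is between responses to the same ignored-modality input when it is, versus is not, a feature of the attended object. As a rough illustration only (a minimal sketch, not the authors' analysis code; all array shapes and names are assumptions), the average-and-subtract logic in Python:

```python
import numpy as np

def erp_attention_effect(epochs_attended_obj, epochs_other_obj):
    """Difference wave: ERP to an ignored-modality feature when it belongs to
    the attended object minus ERP to the same feature when it does not.

    Both inputs are hypothetical (n_trials, n_channels, n_times) arrays of
    epoched, baseline-corrected EEG time-locked to stimulus onset.
    """
    erp_attended = epochs_attended_obj.mean(axis=0)  # average over trials
    erp_other = epochs_other_obj.mean(axis=0)
    return erp_attended - erp_other  # positive values = selective enhancement

# Toy usage with simulated data (64 channels, 250 samples assumed)
rng = np.random.default_rng(0)
attended = rng.normal(size=(200, 64, 250)) + 0.3  # small offset as a stand-in "effect"
other = rng.normal(size=(200, 64, 250))
print(erp_attention_effect(attended, other).shape)  # (64, 250)
```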

  • ABSTRACT: Cues that involve a number of sensory modalities are processed in the brain in an interactive multimodal manner rather than independently for each modality. We studied multimodal integration in a natural, yet fully controlled scene, implemented as an interactive game in an auditory-haptic-visual virtual environment. In this imitation of a natural scene, the targets of perception were ecologically valid uni-, bi- and tri-modal manifestations of a simple event - a ball hitting a wall. Subjects were engaged in the game while their behavioral and early cortical electrophysiological responses were measured. Behavioral results confirmed that tri-modal cues were detected faster and more accurately than bi-modal cues, which, likewise, showed advantages over unimodal responses. Event-related potentials (ERPs) were recorded, and the first 200 milliseconds following stimulus onset were analyzed to reveal the latencies of cortical multimodal interactions as estimated by sLORETA. These electrophysiological findings indicated bi-modal as well as tri-modal interactions beginning very early (~30 milliseconds), uniquely for each multimodal combination. The results suggest that early cortical multimodal integration accelerates cortical activity and, in turn, enhances performance measures. This acceleration registers on the scalp as sub-additive cortical activation.
    International Journal of Psychophysiology (Impact Factor: 3.05). 12/2013. (The additive-model comparison behind 'sub-additive' activation is sketched after this list.)
  • ABSTRACT: Background: Niemann-Pick type C (NPC) is an autosomal recessive disease in which cholesterol and glycosphingolipids accumulate in lysosomes due to aberrant cell-transport mechanisms. It is characterized by progressive and ultimately terminal neurological disease, but both pre-clinical studies and direct human trials are underway to test the safety and efficacy of cholesterol-clearing compounds, with good success already observed in animal models. Key to assessing the effectiveness of interventions in patients, however, is the development of objective neurobiological outcome measures. Multisensory integration mechanisms present an excellent candidate, since they necessarily rely on the fidelity of long-range neural connections between the respective sensory cortices (e.g. the auditory and visual systems). Methods: A simple way to test the integrity of the multisensory system is to ask whether individuals respond faster to the occurrence of a bisensory event than they do to the occurrence of either of the unisensory constituents alone. Here, we presented simple auditory, visual, and audio-visual stimuli in random sequence. Participants responded as fast as possible with a button push. One 11-year-old and two 14-year-old boys with NPC participated in the experiment and their results were compared to those of 35 age-matched neurotypical boys. Results: Reaction times (RTs) to the stimuli when presented simultaneously were significantly faster than when they were presented alone in the neurotypical children, a facilitation that could not be accounted for by probability summation, as evidenced by violation of the so-called 'race' model. In stark contrast, the NPC boys showed no such speeding, despite the fact that their unisensory RTs fell within the distribution of RTs observed in the neurotypicals. Conclusions: These results uncover a previously undescribed deficit in multisensory integrative abilities in NPC, with implications for ongoing treatment of the clinical symptoms of these children. They also suggest that multisensory processes may represent a good candidate biomarker against which to test the efficacy of therapeutic interventions.
    Orphanet Journal of Rare Diseases (Impact Factor: 4.32). 09/2014; 9(1):149. (The race-model test used here is sketched after this list.)
  • ABSTRACT: We assessed the role of alpha-band oscillatory activity during a task-switching design that required participants to switch between an auditory and a visual task, while task-relevant audiovisual inputs were simultaneously presented. Instructional cues informed participants which task to perform on a given trial, and we assessed alpha-band power in the short 1.35-s period intervening between the cue and the task-imperative stimuli, on the premise that attentional biasing mechanisms would be deployed to resolve competition between the auditory and visual inputs. Prior work had shown that alpha-band activity was differentially deployed depending on the modality of the cued task. Here, we asked whether this activity would, in turn, be differentially deployed depending on whether participants had just made a switch of task or were being asked to simply repeat the task. It is well established that performance speed and accuracy are poorer on switch than on repeat trials. Here, however, the use of instructional cues completely mitigated these classic switch-costs. Measures of alpha-band synchronisation and desynchronisation showed that there was indeed greater and earlier differential deployment of alpha-band activity on switch vs. repeat trials. Contrary to our hypothesis, this differential effect was entirely due to changes in the amount of desynchronisation observed during switch and repeat trials of the visual task, with more desynchronisation over both posterior and frontal scalp regions during switch-visual trials. These data imply that particularly vigorous, and essentially fully effective, anticipatory biasing mechanisms resolved the competition between auditory and visual inputs when a rapid switch of task was required.
    European Journal of Neuroscience (Impact Factor: 3.75). 04/2014. (The alpha-band power measure is sketched after this list.)
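On the 'sub-additive cortical activation' reported in the first study above: the standard way to quantify a multisensory interaction in ERPs is the additive model, which compares the response to the multisensory event with the sum of the responses to its unisensory constituents. A minimal sketch of that comparison, assuming ERPs stored as (channels x timepoints) arrays on a common baseline-corrected time axis (illustrative only, not the authors' pipeline):

```python
import numpy as np

def additive_model_residual(erp_multi, erp_unisensory_list):
    """Multisensory interaction term: ERP(multisensory) minus the sum of the
    unisensory ERPs. Negative values indicate sub-additivity, positive values
    super-additivity. All inputs are assumed (n_channels, n_times) arrays.
    """
    return erp_multi - np.sum(erp_unisensory_list, axis=0)

# e.g. for an audio-visual pair: residual = additive_model_residual(erp_av, [erp_a, erp_v])
# and for the tri-modal case:    residual = additive_model_residual(erp_avh, [erp_a, erp_v, erp_h])
```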
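The race-model test in the Niemann-Pick study above asks whether bisensory reaction times are faster than probability summation of the two unisensory channels can explain, i.e. whether P(RT_AV <= t) exceeds P(RT_A <= t) + P(RT_V <= t) at any latency (Miller's inequality). A sketch of that comparison using empirical CDFs (a minimal illustration; the function and variable names are assumptions, not the study's code):

```python
import numpy as np

def ecdf(rts, t_grid):
    """Empirical cumulative distribution of reaction times on a common time grid."""
    rts = np.sort(np.asarray(rts, dtype=float))
    return np.searchsorted(rts, t_grid, side="right") / rts.size

def race_model_violation(rt_bisensory, rt_a, rt_v, t_grid):
    """Positive values mark latencies where the bisensory CDF exceeds the
    race-model bound min(1, CDF_A + CDF_V), i.e. facilitation beyond
    probability summation."""
    bound = np.minimum(1.0, ecdf(rt_a, t_grid) + ecdf(rt_v, t_grid))
    return ecdf(rt_bisensory, t_grid) - bound

# Toy usage (reaction times in ms; real data would replace these lists)
t = np.arange(150, 601, 10)
violation = race_model_violation([220, 240, 250, 260, 280],
                                 [280, 300, 320, 340, 360],
                                 [290, 310, 330, 350, 370], t)
print(violation.max() > 0)  # True if the race model is violated at any latency
```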
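The alpha-band synchronisation/desynchronisation measures in the task-switching study above are commonly computed by band-pass filtering around 8-14 Hz, taking the Hilbert envelope, and expressing power in the cue-to-target interval relative to a pre-cue baseline. A minimal sketch under those assumed parameters, not the authors' exact analysis:

```python
import numpy as np
from scipy.signal import butter, filtfilt, hilbert

def alpha_power_change(signal, fs, baseline_idx, interval_idx, band=(8.0, 14.0)):
    """Percent change of alpha-band power in a post-cue interval relative to a
    pre-cue baseline. Negative values = desynchronisation (ERD), positive
    values = synchronisation (ERS). `signal` is a 1-D single-channel trace,
    `fs` the sampling rate in Hz, and the *_idx arguments are index slices.
    """
    b, a = butter(4, [band[0] / (fs / 2), band[1] / (fs / 2)], btype="band")
    power = np.abs(hilbert(filtfilt(b, a, signal))) ** 2
    base = power[baseline_idx].mean()
    return 100.0 * (power[interval_idx].mean() - base) / base

# Toy usage: 3 s at 500 Hz, cue assumed at t = 1 s, 1.35-s cue-target interval
fs = 500
t = np.arange(0, 3, 1 / fs)
eeg = np.sin(2 * np.pi * 10 * t) * np.linspace(1.0, 0.5, t.size)  # fading 10-Hz "alpha"
print(alpha_power_change(eeg, fs, slice(0, fs), slice(fs, int(2.35 * fs))))
```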
