Object-based attention is multisensory: Co-activation of an object's representations in ignored sensory modalities

The Cognitive Neurophysiology Laboratory, Program in Cognitive Neuroscience and Schizophrenia, Nathan Kline Institute for Psychiatric Research, 140 Old Orangeburg Road, Orangeburg, NY 10962, USA.
European Journal of Neuroscience (Impact Factor: 3.18). 08/2007; 26(2):499-509. DOI: 10.1111/j.1460-9568.2007.05668.x
Source: PubMed


Within the visual modality, it has been shown that attention to a single visual feature of an object, such as speed of motion, results in an automatic transfer of attention to other task-irrelevant features (e.g. colour). An extension of this logic might lead one to predict that such mechanisms also operate across sensory systems. However, connectivity patterns between feature modules across sensory systems are thought to be sparser than those within a given sensory system, where interareal connectivity is extensive. It is therefore not clear that transfer of attention between sensory systems will operate as it does within a sensory system. Using high-density electrical mapping of the event-related potential (ERP) in humans, we tested whether attending to objects in one sensory modality resulted in the preferential processing of that object's features within another task-irrelevant sensory modality. Clear evidence for cross-sensory attention effects was seen, such that, for multisensory stimuli, responses to ignored task-irrelevant information in the auditory and visual domains were selectively enhanced when they were features of the explicitly attended object presented in the attended sensory modality. We conclude that attending to an object within one sensory modality results in coactivation of that object's representations in ignored sensory modalities. The data further suggest that transfer of attention from visual-to-auditory features operates in a fundamentally different manner than transfer from auditory-to-visual features, and indicate that visual-object representations have a greater influence on their auditory counterparts than vice versa. These data are discussed in terms of 'priming' vs. 'spreading' accounts of attentional transfer.

  • Source
    • "Indeed, given that there were small but significant performance differences between our ASD and TD groups, we cannot rule out that cognitive factors such as attention may have contributed to the observed differences in multisensory processing. The precise role of attention in the invocation of multisensory processing is currently a matter of significant research interest (e.g., Molholm et al. 2007; Talsma et al. 2007; Senkowski et al. 2008; Zimmer et al. 2010), and clearly the role of attention and its impact on multisensory processing in ASD needs to be directly tested using a design that explicitly manipulates attention. "
    ABSTRACT: Successful integration of auditory and visual inputs is crucial for both basic perceptual functions and for higher-order processes related to social cognition. Autism spectrum disorders (ASD) are characterized by impairments in social cognition and are associated with abnormalities in sensory and perceptual processes. Several groups have reported that individuals with ASD are impaired in their ability to integrate socially relevant audiovisual (AV) information, and it has been suggested that this contributes to the higher-order social and cognitive deficits observed in ASD. However, successful integration of auditory and visual inputs also influences detection and perception of nonsocial stimuli, and integration deficits may impair earlier stages of information processing, with cascading downstream effects. To assess the integrity of basic AV integration, we recorded high-density electrophysiology from a cohort of high-functioning children with ASD (7-16 years) while they performed a simple AV reaction time task. Children with ASD showed considerably less behavioral facilitation to multisensory inputs, deficits that were paralleled by less effective neural integration. Evidence for processing differences relative to typically developing children was seen as early as 100 ms poststimulation, and topographic analysis suggested that children with ASD relied on different cortical networks during this early multisensory processing stage.
    Cerebral Cortex 06/2013; 23(6):1329-41. DOI:10.1093/cercor/bhs109 · 8.67 Impact Factor
  • Source
    • "Conversely, sounds may affect perception of visual objects [23] and help select relevant events in an environment containing multiple competing visual objects [24]. Recent studies also suggest that conflicting auditory objects may modulate the spread and capture of visual object-related attention across multisensory objects [25], and that attending to either a visual or an auditory object results in a co-activation of the attended stimulus representation in the other modality [26]. Further studies are, thus, needed to elucidate multisensory aspects of spatial vs. object-specific attention. "
    ABSTRACT: Given that both auditory and visual systems have anatomically separate object identification ("what") and spatial ("where") pathways, it is of interest whether attention-driven cross-sensory modulations occur separately within these feature domains. Here, we investigated how auditory "what" vs. "where" attention tasks modulate activity in visual pathways using cortically constrained source estimates of magnetoencephalographic (MEG) oscillatory activity. In the absence of visual stimuli or tasks, subjects were presented with a sequence of auditory-stimulus pairs and instructed to selectively attend to phonetic ("what") vs. spatial ("where") aspects of these sounds, or to listen passively. To investigate sustained modulatory effects, oscillatory power was estimated from time periods between sound-pair presentations. In comparison to attention to sound locations, phonetic auditory attention was associated with stronger alpha (7-13 Hz) power in several visual areas (primary visual cortex; lingual, fusiform, and inferior temporal gyri, lateral occipital cortex), as well as in higher-order visual/multisensory areas including lateral/medial parietal and retrosplenial cortices. Region-of-interest (ROI) analyses of dynamic changes, from which the sustained effects had been removed, suggested further power increases during Attend Phoneme vs. Location centered at the alpha range 400-600 ms after the onset of the second sound of each stimulus pair. These results suggest distinct modulations of visual system oscillatory activity during auditory attention to sound object identity ("what") vs. sound location ("where"). The alpha modulations could be interpreted to reflect enhanced crossmodal inhibition of feature-specific visual pathways and adjacent audiovisual association areas during "what" vs. "where" auditory attention.
    PLoS ONE 06/2012; 7(6):e38511. DOI:10.1371/journal.pone.0038511 · 3.23 Impact Factor
  • Source
    • "Because equivalent behavioral performance might emerge from non-equivalent underlying neural processes, we also examined ERPs. Previous studies in normative adults have demonstrated object-based selection negativities [i.e., ERP components that track visual selective attention (Hansen and Hillyard, 1980)] that begin at approximately 200 msec poststimulus (e.g., Molholm et al., 2004, 2007). These ERP effects, which occur in response to visual targets presented in a stimulus stream that also includes visual non-targets, are most evident at electrode sites positioned over the lateral occipital complex, a cluster of brain regions known to contribute to object processing (e.g., Doniger et al., 2000; Lucan et al., 2010; Sehatpour et al., 2006, 2008). "
    ABSTRACT: Behavioral evidence for an impaired ability to group objects based on similar physical or semantic properties in autism spectrum disorders (ASD) has been mixed. Here, we recorded brain activity from high-functioning children with ASD as they completed a visual-target detection task. We then assessed the extent to which object-based selective attention automatically generalized from targets to non-target exemplars from the same well-known object class (e.g., dogs). Our results provide clear electrophysiological evidence that children with ASD (N = 17, aged 8-13 years) process the similarity between targets (e.g., a specific dog) and same-category non-targets (SCNT) (e.g., another dog) to a lesser extent than do their typically developing (TD) peers (N = 21). A closer examination of the data revealed striking hemispheric asymmetries that were specific to the ASD group. These findings align with mounting evidence in the autism literature of anatomic underconnectivity between the cerebral hemispheres. Years of research in individuals with TD have demonstrated that the left hemisphere (LH) is specialized toward processing local (or featural) stimulus properties and the right hemisphere (RH) toward processing global (or configural) stimulus properties. We therefore propose a model where a lack of communication between the hemispheres in ASD, combined with typical hemispheric specialization, is a root cause for impaired categorization and the oft-observed bias to process local over global stimulus properties.
    Cortex 05/2012; 49(5). DOI:10.1016/j.cortex.2012.04.007 · 5.13 Impact Factor