Object-based attention is multisensory: Co-activation of an object's representations in ignored sensory modalities

The Cognitive Neurophysiology Laboratory, Program in Cognitive Neuroscience and Schizophrenia, Nathan Kline Institute for Psychiatric Research, 140 Old Orangeburg Road, Orangeburg, NY 10962, USA.
European Journal of Neuroscience (Impact Factor: 3.18). 08/2007; 26(2):499-509. DOI: 10.1111/j.1460-9568.2007.05668.x
Source: PubMed


Within the visual modality, it has been shown that attention to a single visual feature of an object, such as speed of motion, results in an automatic transfer of attention to other task-irrelevant features (e.g. colour). An extension of this logic might lead one to predict that such mechanisms also operate across sensory systems. However, connectivity patterns between feature modules across sensory systems are thought to be sparser than those within a given sensory system, where interareal connectivity is extensive. It is therefore not clear that transfer of attention between sensory systems will operate as it does within a sensory system. Using high-density electrical mapping of the event-related potential (ERP) in humans, we tested whether attending to objects in one sensory modality resulted in the preferential processing of that object's features within another, task-irrelevant sensory modality. Clear evidence for cross-sensory attention effects was seen, such that, for multisensory stimuli, responses to ignored task-irrelevant information in the auditory and visual domains were selectively enhanced when they were features of the explicitly attended object presented in the attended sensory modality. We conclude that attending to an object within one sensory modality results in co-activation of that object's representations in ignored sensory modalities. The data further suggest that transfer of attention from visual-to-auditory features operates in a fundamentally different manner than transfer from auditory-to-visual features, and indicate that visual-object representations have a greater influence on their auditory counterparts than vice versa. These data are discussed in terms of 'priming' vs. 'spreading' accounts of attentional transfer.

    • "One such top-down factor, which has been prominent in the research on multisensory integration, is the expectation of co-occurrence of sensory events. Multiple studies have shown that multimodal recognition is modulated by prior expectations about whether sensory signals should be combined or segregated (Molholm, Martinez, Shpaner, & Foxe, 2007; Murray et al., 2004; Naci, Taylor, Cusack, & Tyler, 2012). As an example of such a study, Lee and Noppeney (2014) explored top-down effects in multisensory integration. "
    ABSTRACT: This article reviews recent literature on the role of top-down feedback processes in semantic representations in the brain. Empirical studies on perception and theoretical models of semantic cognition show that sensory input is filtered and interpreted based on predictions from higher-order cognitive areas. Here, we review the present evidence for the proposal that linguistic constructs, in particular words, could serve as effective priors, facilitating perception and integration of sensory information. We address a number of theoretical questions arising from this assumption. The focus here is whether linguistic categories have a direct top-down effect on early stages of perception, or rather interact with later processing stages such as semantic analysis. We discuss experimental approaches that could discriminate between these possibilities. Taken together, this article provides a review of the interaction between language and perception from the predictive perspective, and suggests avenues to investigate the underlying mechanisms from this perspective.
    Full-text · Article · Sep 2015
    • "Motor planning requires the integration of different stimuli (visual, vestibular, tactile and proprioceptive inputs) in order to carry out accurate motor tasks (Meredith and Stein, 1983). Important stimuli for hand dexterity include input from skin receptors, muscle spindles and visual feedback (Shimojo and Shams, 2001; Seitz et al., 2006; Molholm et al., 2007; Lappe et al., 2008; Shams and Seitz, 2008), but it is possible that auditory stimuli produced by active music-making could also play an important role. Other rehabilitation approaches focusing on multisensory stimuli have been developed in recent years, including virtual reality (Laver et al., 2011) and action observation (Ertelt et al., 2007; Buccino et al., 2011), and to the best of our knowledge, comparisons of the efficacy of these approaches have never been performed. "
    ABSTRACT: Background and Purpose: Playing an instrument implies neuroplasticity in different cerebral regions. This phenomenon has been described in subjects with stroke, suggesting that it could play a role in hand rehabilitation. The aim of this study is to analyse the effectiveness of playing a musical keyboard in improving hand function in subjects with multiple sclerosis. Methods: Nineteen hospitalized subjects were randomized into two groups: nine played a turned-on musical keyboard using sequences of finger movements (audio feedback present) and 10 performed the same exercises on a turned-off musical keyboard (audio feedback absent). Training duration was half an hour per day for 15 days. The primary outcome was perceived hand functional use, measured by the ABILHAND Questionnaire. Secondary outcomes were hand dexterity, measured by the Nine-Hole Peg Test, and hand strength, measured by Jamar and Pinch dynamometers. Two-way analysis of variance was used for data analysis. Results: The time × group interaction was significant (p = 0.003) for the ABILHAND Questionnaire in favour of the experimental group (mean between-group difference 0.99 logit [95% CI: 0.44, 1.54]). The two groups showed a significant time effect for all outcomes except the Jamar measure. Discussion: Playing a musical keyboard seems a valid method to train the functional use of the hands in subjects with multiple sclerosis. Copyright © 2014 John Wiley & Sons, Ltd.
    No preview · Article · Jul 2014 · Physiotherapy Research International
    • "Indeed, given that there were small but significant performance differences between our ASD and TD groups, we cannot rule out that cognitive factors such as attention may have contributed to the observed differences in multisensory processing. The precise role of attention in the invocation of multisensory processing is currently a matter of significant research interest (e.g., Molholm et al. 2007; Talsma et al. 2007; Senkowski et al. 2008; Zimmer et al. 2010), and clearly the role of attention and its impact on multisensory processing in ASD need to be tested directly using a design that explicitly manipulates attention. "
    ABSTRACT: Successful integration of auditory and visual inputs is crucial for both basic perceptual functions and for higher-order processes related to social cognition. Autism spectrum disorders (ASD) are characterized by impairments in social cognition and are associated with abnormalities in sensory and perceptual processes. Several groups have reported that individuals with ASD are impaired in their ability to integrate socially relevant audiovisual (AV) information, and it has been suggested that this contributes to the higher-order social and cognitive deficits observed in ASD. However, successful integration of auditory and visual inputs also influences detection and perception of nonsocial stimuli, and integration deficits may impair earlier stages of information processing, with cascading downstream effects. To assess the integrity of basic AV integration, we recorded high-density electrophysiology from a cohort of high-functioning children with ASD (7-16 years) while they performed a simple AV reaction time task. Children with ASD showed considerably less behavioral facilitation to multisensory inputs, deficits that were paralleled by less effective neural integration. Evidence for processing differences relative to typically developing children was seen as early as 100 ms poststimulation, and topographic analysis suggested that children with ASD relied on different cortical networks during this early multisensory processing stage.
    Full-text · Article · Jun 2013 · Cerebral Cortex