The multifaceted interplay between attention and multisensory integration. Trends in Cognitive Sciences, 14, 400-410

Department of Cognitive Psychology and Ergonomics, University of Twente, P.O. Box 215, 7500 AE Enschede, The Netherlands.
Trends in Cognitive Sciences (Impact Factor: 21.97). 09/2010; 14(9):400-10. DOI: 10.1016/j.tics.2010.06.008
Source: PubMed


Multisensory integration has often been characterized as an automatic process. Recent findings indicate that multisensory integration can occur across various stages of stimulus processing that are linked to, and can be modulated by, attention. Stimulus-driven, bottom-up mechanisms induced by crossmodal interactions can automatically capture attention towards multisensory events, particularly when competition to focus elsewhere is relatively low. Conversely, top-down attention can facilitate the integration of multisensory inputs and lead to a spread of attention across sensory modalities. These findings point to a more intimate and multifaceted interplay between attention and multisensory integration than was previously thought. We review developments in the current understanding of the interactions between attention and multisensory processing, and propose a framework that unifies previous, apparently discordant, findings.

    • "While the proposal that goal-dependence is a dimension of multisensory processing has helped to reconcile a long-standing debate in the area (Talsma et al. 2010), it fails to explain other contradictory findings that continue to accumulate. For example, context (e.g. "
    ABSTRACT: Our understanding of how perception operates in real-world environments has been substantially advanced by studying both multisensory processes and “top-down” control processes influencing sensory processing via activity from higher-order brain areas, such as attention, memory, and expectations. As the two topics have been traditionally studied separately, the mechanisms orchestrating real-world multisensory processing remain unclear. Past work has revealed that the observer’s goals gate the influence of many multisensory processes on brain and behavioural responses, whereas some other multisensory processes might occur independently of these goals. Consequently, other forms of top-down control beyond goal-dependence are necessary to explain the full range of multisensory effects currently reported at the brain and the cognitive level. These forms of control include sensitivity to stimulus context as well as the detection of matches (or lack thereof) between a multisensory stimulus and categorical attributes of naturalistic objects (e.g. tools, animals). In this review we discuss and integrate the existing findings that demonstrate the importance of such goal-, object- and context-based top-down control over multisensory processing. We then put forward a few principles emerging from this literature review with respect to the mechanisms underlying multisensory processing, and discuss their possible broader implications.
    Article · Jan 2016 · Experimental Brain Research
    • "For example, when actively attending to an auditory sequence, gamma activity is enhanced during very early sensory processing when compared to passive listening (Gilley and Sharma 2010). Increased gamma power has also been observed when responding to congruent versus incongruent auditory-visual stimuli (Talsma et al. 2010), suggesting that early selection mechanisms are modulated by higher level, task related functions. The significant difference in the gamma-1 response suggests that at least some attention is allocated to processing the auditory sequences, even if at a low level of processing. "
    ABSTRACT: Objective: We sought to examine whether oscillatory EEG responses to a speech stimulus in both quiet and noise differ between children with listening problems and children with normal hearing. Methods: We employed a high-resolution spectral-temporal analysis of the cortical auditory evoked potential in response to a 150 ms speech sound /da/ in quiet and at 3 dB SNR in 21 typically developing children (mean age = 10.7 years, standard deviation = 1.7) and 44 children with reported listening problems (LP) in the absence of hearing loss (mean age = 10.3 years, standard deviation = 1.6). Children with LP were assessed for auditory processing disorder (APD); 24 of these children had APD and 20 did not. Peak latencies, magnitudes, and frequencies were compared between these groups. Results: Children with LP showed frequency shifts in the theta and alpha bands (p < 0.05), and children with LP + APD showed additional frequency (p < 0.01) and latency shifts (p < 0.05) in the upper beta and lower gamma bands. Conclusions: These results provide evidence for differences in higher-level modulatory processing in children with LP, and suggest that APD is driven by differences in early auditory encoding. Significance: These findings may guide future research toward improving the differential diagnosis and treatment of listening problems in this population of children.
    Article · Dec 2015 · Clinical Neurophysiology
    • "While task-relevance is one frequently studied form of top-down control over sensory processing, within (reviewed in Nobre and Kastner, 2014) and across the senses (e.g., Matusz et al., 2011, 2013; reviewed in Talsma et al., 2010; De Meo et al., 2015; Ten Oever et al., in revisions), an increasing number of studies points to similar importance of context-based influences. As demonstrated by traditional, unisensory studies, context influences range from predictions (Summerfield and Egner, 2009), through external and internal states (e.g., remembering something better in a place where one had learnt it), to fine-grained differences in stimulus features (e.g., the object's colour; Bar, 2004; Baddeley et al., 2009). "
    ABSTRACT: Real-world environments are nearly always multisensory in nature. Processing in such situations confers perceptual advantages, but its automaticity remains poorly understood. Automaticity has been invoked to explain the activation of visual cortices by laterally presented sounds. This has been observed even when the sounds were task-irrelevant and spatially uninformative about subsequent targets. An auditory-evoked contralateral occipital positivity (ACOP) at ~250 ms post-sound onset has been postulated as the event-related potential (ERP) correlate of this cross-modal effect. However, the spatial dimension of the stimuli was nevertheless relevant in all prior studies where the ACOP was observed. By manipulating the implicit predictability of the location of lateralised sounds in a passive auditory paradigm, we tested the automaticity of cross-modal activations of visual cortices. 128-channel ERP data from healthy participants were analysed within an electrical neuroimaging framework. The timing, topography, and localisation resembled previous characterisations of the ACOP. However, the cross-modal activations of visual cortices by sounds were critically dependent on whether the sound location was (un)predictable. Our results are the first direct evidence that this particular cross-modal process is not (fully) automatic; instead, it is context-contingent. More generally, the present findings provide novel insights into the importance of context-related factors in controlling information processing across the senses, and call for a revision of current models of automaticity in cognitive sciences.
    Article · Nov 2015 · NeuroImage