Selective Attention Increases Both Gain and Feature Selectivity of the Human Auditory Cortex

Laboratory of Computational Engineering, Helsinki University of Technology, Espoo, Finland.
PLoS ONE 09/2007; 2(9):e909. DOI: 10.1371/journal.pone.0000909
Source: PubMed


An experienced car mechanic can often deduce what's wrong with a car by carefully listening to the sound of the ailing engine, despite the presence of multiple sources of noise. Indeed, the ability to select task-relevant sounds for awareness, whilst ignoring irrelevant ones, constitutes one of the most fundamental of human faculties, but the underlying neural mechanisms have remained elusive. While most of the literature explains the neural basis of selective attention by means of an increase in neural gain, a number of papers propose enhancement in neural selectivity as an alternative or a complementary mechanism.
Here, to address the question of whether a pure gain increase alone can explain auditory selective attention in humans, we quantified auditory cortex frequency selectivity in 20 healthy subjects by masking 1000-Hz tones with a continuous noise masker containing parametrically varied frequency notches around the tone frequency (i.e., a notched-noise masker). In different conditions, the subjects either selectively attended to occasionally occurring slight increments in tone frequency (to 1020 Hz), attended to tones of slightly longer duration, or ignored the sounds. In line with previous studies, in the ignore condition the global field power (GFP) of event-related brain responses to the 1000-Hz tones at 100 ms from stimulus onset was suppressed as a function of the narrowing of the notch width. During the selective attention conditions, the suppressive effect of the noise notch width on GFP was reduced, but in a manner significantly different from the multiplicative change expected under a simple gain model of selective attention.
Our results suggest that auditory selective attention in humans cannot be explained by a pure gain model, in which only the neural activity level is increased, but rather that selective attention additionally enhances auditory cortex frequency selectivity.
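The logic of the test above can be illustrated numerically: a pure gain model predicts that the attend-condition response equals the ignore-condition response scaled by a single constant, so the attend/ignore ratio should be flat across notch widths. The sketch below uses made-up GFP amplitudes (not the published data) to show how a constant-gain fit and its residuals would expose a deviation from multiplicative scaling.

```python
import numpy as np

# Hypothetical N1 GFP amplitudes (arbitrary units) at five masker notch
# widths, narrowest to widest. Values are illustrative only.
notch_widths_hz = np.array([0, 100, 200, 400, 800])
gfp_ignore = np.array([0.4, 0.7, 1.0, 1.3, 1.5])
gfp_attend = np.array([0.9, 1.3, 1.6, 1.8, 1.9])

# A pure gain model predicts gfp_attend = g * gfp_ignore for one scalar g,
# i.e. a constant attend/ignore ratio at every notch width.
ratio = gfp_attend / gfp_ignore

# Best-fitting single multiplicative gain (least squares through the origin).
g = np.dot(gfp_ignore, gfp_attend) / np.dot(gfp_ignore, gfp_ignore)
residual = gfp_attend - g * gfp_ignore

print("attend/ignore ratio per notch width:", np.round(ratio, 2))
print("best single gain g =", round(g, 2))
print("residual from pure-gain fit:", np.round(residual, 2))
```

In this illustrative data the ratio is largest at the narrowest notch and falls as the notch widens, so no single gain fits all notch widths; that pattern of systematic residuals, rather than the gain value itself, is what would indicate sharpened frequency tuning on top of any gain change.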


Available from: Mikko Sams, Oct 07, 2015
  • Source
    • "Indeed, the sensory expectation generated by the forward prediction can be understood as increased gain for processing, or reshaping of neuronal receptive fields to be more selective to, the attended/expected auditory features (Hickok et al., 2011). The mechanism for sensorimotor integration could thus, similarly to selective attention, induce short-term plasticity effects on the AC (for a review, see Jääskeläinen and Ahveninen, 2014), and therefore enhance behavioral performance, such as sound discrimination (Kauramäki et al., 2007; Ahveninen et al., 2011). Relatedly, a recent study demonstrated, by using TMS and MEG, that when speech sounds were attended, the articulatory-motor cortex contributed to the auditory processing of the sounds already at 60–100 ms after sound onset, whereas when unattended, the contributing effect started considerably later, at ∼170 ms after sound onset (Möttönen et al., 2014). "
    ABSTRACT: The cortical dorsal auditory stream has been proposed to mediate mapping between auditory and articulatory-motor representations in speech processing. Whether this sensorimotor integration contributes to speech perception remains an open question. Here, magnetoencephalography was used to examine connectivity between auditory and motor areas while subjects were performing a sensorimotor task involving speech sound identification and overt repetition. Functional connectivity was estimated with inter-areal phase synchrony of electromagnetic oscillations. Structural equation modeling was applied to determine the direction of information flow. Compared to passive listening, engagement in the sensorimotor task enhanced connectivity within 200 ms after sound onset bilaterally between the temporoparietal junction (TPJ) and ventral premotor cortex (vPMC), with the left-hemisphere connection showing directionality from vPMC to TPJ. Passive listening to noisy speech elicited stronger connectivity than clear speech between left auditory cortex (AC) and vPMC at ~100 ms, and between left TPJ and dorsal premotor cortex (dPMC) at ~200 ms. Information flow was estimated from AC to vPMC and from dPMC to TPJ. Connectivity strength among the left AC, vPMC, and TPJ correlated positively with the identification of speech sounds within 150 ms after sound onset, with information flowing from AC to TPJ, from AC to vPMC, and from vPMC to TPJ. Taken together, these findings suggest that sensorimotor integration mediates the categorization of incoming speech sounds through reciprocal auditory-to-motor and motor-to-auditory projections.
    Frontiers in Psychology 05/2014; 5:394. DOI:10.3389/fpsyg.2014.00394
  • Source
    • "This may result from changes in the selectivity of neurons in the sensory cortex (Chawla et al., 1999; Kastner et al., 1999; Ahveninen et al., 2006). Specifically, research showing that the auditory N1 response is modulated by task demands and notched-noise masking suggests that the spectrotemporal receptive fields of neurons are tuned according to attentional manipulations (Kauramäki et al., 2007), as attention excites neurons responsive to attended features and inhibits neurons responsive to unattended features (Fritz et al., 2003, 2007, 2008; Jääskeläinen et al., 2007). Neurocomputational studies demonstrated that attention may function via optimizing the synaptic gain to represent the precision of sensory information during hierarchical inference (Feldman and Friston, 2010). "
    ABSTRACT: The brain as a proactive system processes sensory information under the top-down influence of attention and prediction. However, the relation between attention and prediction remains undetermined given the conflation of these two mechanisms in the literature. To evaluate whether attention and prediction are dependent on each other, and if so, how these two top-down mechanisms may interact in sensory processing, we orthogonally manipulated attention and prediction in a target detection task. Participants were instructed to pay attention to one of two interleaved stimulus streams of predictable/unpredictable tone frequency. We found that attention and prediction interacted on the amplitude of the N1 ERP component. The N1 amplitude in the attended/predictable condition was larger than that in any of the other conditions. Dipole source localization analysis showed that the effect arose from activation in bilateral auditory areas. No significant effect was found in the P2 time window. Our results suggest that attention and prediction are dependent on each other. While attention might determine the overall cortical responsiveness to stimuli when prediction is involved, prediction might provide an anchor for the attention-based modulation of synaptic input strengths.
    Frontiers in Human Neuroscience 03/2014; 8:152. DOI:10.3389/fnhum.2014.00152
  • Source
    • "The increased gain in such estimates is then expected to show up as a multiplicative increase in response strength as a function of increasing distance in feature space between the adaptor and the test sounds in the selective-attention condition as compared with the ignore condition. Significant deviation from this expected effect could then be interpreted as indicating reshaping of the underlying neuronal receptive fields [91]. "
    ABSTRACT: The ability to concentrate on relevant sounds in the acoustic environment is crucial for everyday function and communication. Converging lines of evidence suggest that transient functional changes in auditory-cortex neurons, "short-term plasticity", might explain this fundamental function. Under conditions of strongly focused attention, enhanced processing of attended sounds can take place at very early latencies (~50 ms from sound onset) in primary auditory cortex and possibly even earlier in subcortical structures. More robust selective-attention short-term plasticity is manifested as modulation of responses peaking at ~100 ms from sound onset in functionally specialized nonprimary auditory-cortical areas, by way of stimulus-specific reshaping of neuronal receptive fields that supports filtering of selectively attended sound features from task-irrelevant ones. Such effects have been shown to emerge within seconds of shifting the attentional focus. There are findings suggesting that the reshaping of neuronal receptive fields is even stronger at longer auditory-cortex response latencies (~300 ms from sound onset). These longer-latency short-term plasticity effects seem to build up more gradually, within tens of seconds after shifting the focus of attention. Importantly, some of the auditory-cortical short-term plasticity effects observed during selective attention predict enhancements in behaviorally measured sound discrimination performance.
    Neural Plasticity 01/2014; 2014(1):216731. DOI:10.1155/2014/216731