Heschl's Gyrus, Posterior Superior Temporal Gyrus, and Mid-Ventrolateral Prefrontal Cortex Have Different Roles in the Detection of Acoustic Changes

Cognitive Brain Research Unit, Department of Psychology, University of Helsinki, Helsinki, Finland.
Journal of Neurophysiology (Impact Factor: 2.89). 04/2007; 97(3):2075-82. DOI: 10.1152/jn.01083.2006
Source: PubMed


A part of the auditory system automatically detects changes in the acoustic environment. This preattentional process has been studied extensively, yet its cerebral origins have not been determined with sufficient accuracy to allow comparison to established anatomical and functional parcellations. Here we used event-related functional MRI and EEG in a parametric experimental design to determine the cortical areas in individual brains that participate in the detection of acoustic changes. Our results suggest that automatic change processing consists of at least three stages: initial detection in the primary auditory cortex, detailed analysis in the posterior superior temporal gyrus and planum temporale, and judgment of sufficient novelty for the allocation of attentional resources in the mid-ventrolateral prefrontal cortex.

    • "The ROI analysis aimed to increase sensitivity in detecting repetition suppression effects in brain areas that have been reported to process acoustic changes. The ROIs included Heschl's gyri (HGs), STGs, and IFGs bilaterally (Schönwiesner et al. 2007). We also chose to include the left IC based on the findings by Chandrasekaran et al. (2012), and the medial geniculate thalamic nuclei (MGB) since they relay acoustic information from the IC to cortical auditory areas (Javad et al. 2014). "
    ABSTRACT: Do individuals differ in how efficiently they process non-native sounds? To what extent do these differences relate to individual variability in sound-learning aptitude? We addressed these questions by assessing the sound-learning abilities of Dutch native speakers as they were trained on non-native tone contrasts. We used fMRI repetition suppression to the non-native tones to measure participants' neuronal processing efficiency before and after training. Although all participants improved in tone identification with training, there was large individual variability in learning performance. A repetition suppression effect to tone was found in the bilateral inferior frontal gyri (IFGs) before training. No whole-brain effect was found after training; a region-of-interest analysis, however, showed that, after training, repetition suppression to tone in the left IFG correlated positively with learning. That is, individuals who were better in learning the non-native tones showed larger repetition suppression in this area. Crucially, this was true even before training. These findings add to existing evidence that the left IFG plays an important role in sound learning and indicate that individual differences in learning aptitude stem from differences in the neuronal efficiency with which non-native sounds are processed.
    Cerebral Cortex 06/2015; DOI:10.1093/cercor/bhv126 · 8.67 Impact Factor
    • "Our results suggest that while different anterior regions participate in the encoding of features at the local level, posterior and hierarchically superior regions may be engaged in the encoding of more complex or global patterns. Although converging evidence shows the existence of MMN generators in the frontal lobe [Doeller et al., 2003; Schönwiesner et al., 2007], no frontal areas were observed in this study. Previous studies described the involvement of frontal regions when using listening tasks and by recording EEG or intracranial activity during global–local paradigms [Bekinschtein et al., 2009; Chennu et al., 2013]. "
    ABSTRACT: Our auditory system is able to encode acoustic regularity of growing levels of complexity to model and predict incoming events. Recent evidence suggests that early indices of deviance detection in the time range of the middle-latency responses (MLR) precede the mismatch negativity (MMN), a well-established error response associated with deviance detection. While studies suggest that only the MMN, but not early deviance-related MLR, underlie complex regularity levels, it is not clear whether these two mechanisms interplay during scene analysis by encoding nested levels of acoustic regularity, and whether neuronal sources underlying local and global deviations are hierarchically organized. We registered magnetoencephalographic evoked fields to rapidly presented four-tone local sequences containing a frequency change. Temporally integrated local events, in turn, defined global regularities, which were infrequently violated by a tone repetition. A global magnetic mismatch negativity (MMNm) was obtained at 140-220 ms when breaking the global regularity, but no deviance-related effects were shown in early latencies. Conversely, Nbm (45-55 ms) and Pbm (60-75 ms) deflections of the MLR, and an earlier MMNm response at 120-160 ms, responded to local violations. Distinct neuronal generators in the auditory cortex underlay the processing of local and global regularity violations, suggesting that nested levels of complexity of auditory object representations are represented in separated cortical areas. Our results suggest that the different processing stages and anatomical areas involved in the encoding of auditory representations, and the subsequent detection of its violations, are hierarchically organized in the human auditory cortex. Hum Brain Mapp, 2014. © 2014 Wiley Periodicals, Inc.
    Human Brain Mapping 11/2014; 35(11). DOI:10.1002/hbm.22582 · 5.97 Impact Factor
    • "The brain sources of the MMN have been located not only in primary and secondary auditory areas [12], [33], [63], [67] but also in the inferior frontal gyrus [68], [69], [70]. There is evidence that the frontal component of the MMN occurs later than the temporal one and might reflect a switch of attention to the deviant sound [68], [71]. "
    ABSTRACT: How does the human brain extract regularities from its environment? There is evidence that short range or ‘local’ regularities (within seconds) are automatically detected by the brain while long range or ‘global’ regularities (over tens of seconds or more) require conscious awareness. In the present experiment, we asked whether participants’ attention was needed to acquire such auditory regularities, to detect their violation or both. We designed a paradigm in which participants listened to predictable sounds. Subjects could be distracted by a visual task at two moments: when they were first exposed to a regularity or when they detected violations of this regularity. MEG recordings revealed that early brain responses (100-130 ms) to violations of short range regularities were unaffected by visual distraction and driven essentially by local transitional probabilities. Based on global workspace theory and prior results, we expected that visual distraction would eliminate the long range global effect, but unexpectedly, we found the contrary, i.e. late brain responses (300-600 ms) to violations of long range regularities on audio-visual trials but not on auditory-only trials. Further analyses showed that, in fact, visual distraction was incomplete and that auditory and visual stimuli interfered in both directions. Our results show that conscious, attentive subjects can learn the long range dependencies present in auditory stimuli even while performing a visual task on synchronous visual stimuli. Furthermore, they acquire a complex regularity and end up making different predictions for the very same stimulus depending on the context (i.e. absence or presence of visual stimuli). These results suggest that while short-range regularity detection is driven by local transitional probabilities between stimuli, the human brain detects and stores long-range regularities in a highly flexible, context-dependent manner.
    PLoS ONE 09/2014; In Press(9). DOI:10.1371/journal.pone.0107227 · 3.23 Impact Factor