The left inferior frontal lobe has traditionally been viewed as a "language area," although its involvement in the discrimination of rapid nonverbal frequency changes has also been shown. Using functional MRI, we studied seven healthy adults during discrimination of relatively slow (200 ms) tonal frequency glides. Compared to a control task, in which subjects responded indiscriminately to white noise bursts, tonal discrimination was associated with bilateral superior and middle temporal and medial frontal activations. Inferior frontal activations were bilateral but stronger on the left. Contrary to previous studies comparing discrimination of slow frequency changes to rest, our results suggest that such discriminations, when compared to an auditory control task, activate the left inferior frontal gyrus. Our findings are consistent with the participation of Broca's area in nonlinguistic processes, beyond its known roles in semantic, syntactic, and phonological functions.
"For example, several imaging studies have described activations in DLPFC (superior frontal gyrus, superior frontal sulcus) during auditory spatial localization (Griffiths et al., 1998; Martinkauppi et al., 2000; Weeks et al., 2000; Lipschutz et al., 2002; Lutzenberger et al., 2002; Zatorre et al., 2002; Gaab et al., 2003; Leiberg et al., 2006). Conversely, VLPFC activation (IFG; BA 45, 47) has been noted during auditory non-spatial processes, such as listening to melodies, attending to pitch/rhythm, determining sound length, word/voice discrimination, and auditory working memory (Zatorre et al., 1994, 1998; Platel et al., 1997; Linden et al., 1999; Pedersen et al., 2000; Alain et al., 2001; Kiehl et al., 2001; Muller et al., 2001; Kaiser et al., 2003; Maddock et al., 2003; Arnott et al., 2004; Rämä et al., 2004; Rämä and Courtney, 2005; Kaiser et al., 2009; Koelsch et al., 2009)."
ABSTRACT: The functional auditory system extends from the ears to the frontal lobes with successively more complex functions occurring as one ascends the hierarchy of the nervous system. Several areas of the frontal lobe receive afferents from both early and late auditory processing regions within the temporal lobe. Afferents from the early part of the cortical auditory system, the auditory belt cortex, which are presumed to carry information regarding auditory features of sounds, project to only a few prefrontal regions and are most dense in the ventrolateral prefrontal cortex (VLPFC). In contrast, projections from the parabelt and the rostral superior temporal gyrus (STG) most likely convey more complex information and target a larger, widespread region of the prefrontal cortex. Neuronal responses reflect these anatomical projections as some prefrontal neurons exhibit responses to features in acoustic stimuli, while other neurons display task-related responses. For example, recording studies in non-human primates indicate that VLPFC is responsive to complex sounds including vocalizations and that VLPFC neurons in area 12/47 respond to sounds with similar acoustic morphology. In contrast, neuronal responses during auditory working memory involve a wider region of the prefrontal cortex. In humans, the frontal lobe is involved in auditory detection, discrimination, and working memory. Past research suggests that dorsal and ventral subregions of the prefrontal cortex process different types of information with dorsal cortex processing spatial/visual information and ventral cortex processing non-spatial/auditory information. While this is apparent in the non-human primate and in some neuroimaging studies, most research in humans indicates that specific task conditions, stimuli or previous experience may bias the recruitment of specific prefrontal regions, suggesting a more flexible role for the frontal lobe during auditory cognition.
Frontiers in Neuroscience 07/2014; 8(8):199. DOI:10.3389/fnins.2014.00199
"These studies have compared, in active or passive listening conditions, activations elicited by a sequence of tones or noise bursts of varying pitch with activations elicited by a sequence of tones or noise bursts with a fixed pitch (e.g., Barrett and Hall, 2006; Hyde, 2008; Opitz et al., 2005; Patterson et al., 2002; Warren and Griffiths, 2003), activations to frequency-modulated (FM) tones with activations to non-FM tones (e.g., Hall et al., 2002; Hart et al., 2004), and activations to tones or noise bursts that produce a salient perception of pitch with activations to sounds that produce weak or absent pitch (e.g., Barker et al., 2011; Binder et al., 2000; Hall et al., 2005; Penagos et al., 2004). Other studies have compared activations during active pitch processing, for example, pitch discrimination tasks or pitch working-memory tasks with activations during other auditory tasks (e.g., Alain et al., 2001; Brechmann and Scheich, 2005; Müller et al., 2001). Likewise, fMRI studies of spatial auditory processing in active or passive listening conditions have compared activations to moving sounds or to sounds with varying azimuthal locations with activations to stationary sounds (e.g., Barrett and Hall, 2006; Brunetti et al., 2005; Griffiths et al., 2000; Hart et al., 2004; Krumbholz et al., 2005; Pavani et al., 2002; Warren and Griffiths, 2003; Warren et al., 2002), activations to sounds from a well-defined location with activations to sounds with a diffuse source (e.g., Budd et al., 2003; Hall et al., 2005), and analyzed activations during spatial discrimination or spatial working-memory tasks (e.g., Bidet-Caulet et al., 2005; Martinkauppi et al., 2000)."
ABSTRACT: We meta-analyzed 115 functional magnetic resonance imaging (fMRI) studies reporting auditory-cortex (AC) coordinates for activations related to the active and passive processing of the pitch and spatial location of non-speech sounds, as well as to the active and passive speech and voice processing. We aimed at revealing any systematic differences between AC surface locations of these activations by analyzing the activation loci statistically using the open-source Matlab toolbox VAMCA (Visualization and Meta-analysis on Cortical Anatomy). AC activations associated with pitch processing (e.g., active or passive listening to tones with a varying vs. fixed pitch) had median loci in the middle superior temporal gyrus (STG), lateral to Heschl's gyrus. However, median loci of activations due to the processing of infrequent pitch changes in a tone stream were centered in the STG or planum temporale (PT), significantly posterior to the median loci for other types of pitch processing. Median loci of attention-related modulations due to focused attention to pitch (e.g., attending selectively to low or high tones delivered in concurrent sequences) were, in turn, centered in the STG or superior temporal sulcus (STS), posterior to median loci for passive pitch processing. Activations due to spatial processing were centered in the posterior STG or PT, significantly posterior to pitch processing loci (processing of infrequent pitch changes excluded). In the right-hemisphere AC, the median locus of spatial attention-related modulations was in the STS, significantly inferior to the median locus for passive spatial processing. Activations associated with speech processing and those associated with voice processing had indistinguishable median loci at the border of mid-STG and mid-STS. Median loci of attention-related modulations due to attention to speech were in the same mid-STG/STS region.
Thus, while attention to the pitch or location of non-speech sounds seems to recruit AC areas not involved in passive pitch or location processing, focused attention to speech predominantly enhances activations in regions that already respond to human vocalizations during passive listening. This suggests that distinct attention mechanisms might be engaged by attention to speech and attention to more elemental auditory features such as tone pitch or location.
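The meta-analytic logic described above (pooling activation peak coordinates across studies and comparing condition-wise median loci along the cortical surface) can be sketched in a few lines. This is a toy illustration with hypothetical MNI-style coordinates, not data from the study, and it stands in for the actual analysis, which used the VAMCA Matlab toolbox:

```python
import numpy as np

# Hypothetical peak coordinates (x, y, z); real analyses pool hundreds of
# peaks from dozens of studies. Values here are illustrative only.
pitch_peaks = np.array([
    [-58, -18, 4], [-60, -22, 2], [62, -16, 6], [58, -20, 4],
])
spatial_peaks = np.array([
    [-60, -34, 10], [-58, -38, 12], [62, -32, 8], [60, -36, 10],
])

def median_locus(peaks):
    """Condition-wise median coordinate, the summary statistic compared
    between conditions in this kind of coordinate meta-analysis."""
    return np.median(peaks, axis=0)

pitch_med = median_locus(pitch_peaks)
spatial_med = median_locus(spatial_peaks)

# In MNI space, more negative y means more posterior: the toy "spatial"
# median locus lies posterior to the "pitch" one, mirroring the abstract.
print("pitch median locus:  ", pitch_med)
print("spatial median locus:", spatial_med)
print("spatial posterior to pitch:", spatial_med[1] < pitch_med[1])
```

The real analysis additionally tests whether such locus differences are statistically significant rather than simply comparing medians.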
Hearing research 08/2013; 307. DOI:10.1016/j.heares.2013.08.001
"We chose this categorization task because it is well-established that this task is mainly processed in the right AC in different species: gerbils (Wetzel et al., 1998a,b, 2008), rats (Rybalko et al., 2006), and human subjects (Poeppel et al., 2004; Behne et al., 2005; Brechmann and Scheich, 2005). Müller et al. (2001) showed bilateral temporal activity during the detection of one rising FM (no categorization). In all these earlier studies, the decision with regard to the direction of the FM had to be made on individual tones and did not require comparison with previous tones. "
ABSTRACT: Evaluating series of complex sounds like those in speech and music requires sequential comparisons to extract task-relevant relations between subsequent sounds. With the present functional magnetic resonance imaging (fMRI) study, we investigated whether sequential comparison of a specific acoustic feature within pairs of tones leads to a change in lateralized processing in the auditory cortex (AC) of humans. For this we used the active categorization of the direction (up vs. down) of slow frequency modulated (FM) tones. Several studies suggest that this task is mainly processed in the right AC. These studies, however, tested only the categorization of the FM direction of each individual tone. In the present study we ask the question whether the right lateralized processing changes when, in addition, the FM direction is compared within pairs of successive tones. For this we use an experimental approach involving contralateral noise presentation in order to explore the contributions made by the left and right AC in the completion of the auditory task. This method has already been applied to confirm the right-lateralized processing of the FM direction of individual tones. In the present study, the subjects were required to perform, in addition, a sequential comparison of the FM direction in pairs of tones. The results suggest a division of labor between the two hemispheres such that the FM direction of each individual tone is mainly processed in the right AC whereas the sequential comparison of this feature between tones in a pair is probably performed in the left AC.
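Hemispheric divisions of labor like the one reported above are commonly quantified with a simple lateralization index over left- and right-AC responses. A minimal sketch, using hypothetical response magnitudes rather than values from the study:

```python
def lateralization_index(left, right):
    """(L - R) / (L + R): positive values indicate left-lateralized
    processing, negative values right-lateralized processing."""
    return (left - right) / (left + right)

# Hypothetical mean response magnitudes (arbitrary units) for the two
# task components discussed in the abstract; values are illustrative.
single_tone_fm = lateralization_index(left=1.0, right=1.6)   # per-tone FM direction
pairwise_compare = lateralization_index(left=1.5, right=1.1)  # comparison across a pair

print(f"single-tone FM direction: {single_tone_fm:+.2f}")   # negative: right AC
print(f"pair-wise comparison:     {pairwise_compare:+.2f}")  # positive: left AC
```

Note that the study itself inferred hemispheric contributions indirectly, via contralateral noise presentation, rather than by computing such an index directly.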
Frontiers in Neuroscience 07/2013; 7(7):115. DOI:10.3389/fnins.2013.00115