Maps and streams in the auditory cortex: Nonhuman primates illuminate human speech processing. Nature Neuroscience, 12(6), 718-724

Laboratory of Integrative Neuroscience and Cognition, Georgetown University Medical Center, Washington, DC, USA.
Nature Neuroscience 06/2009; 12(6):718-724. DOI: 10.1038/nn.2331
Source: PubMed


Speech and language are considered uniquely human abilities: animals have communication systems, but they do not match human linguistic skills in terms of recursive structure and combinatorial power. Yet, in evolution, spoken language must have emerged from neural mechanisms at least partially available in animals. In this paper, we will demonstrate how our understanding of speech perception, one important facet of language, has profited from findings and theory in nonhuman primate studies. Chief among these are physiological and anatomical studies showing that primate auditory cortex, across species, shows patterns of hierarchical structure, topographic mapping and streams of functional processing. We will identify roles for different cortical areas in the perceptual processing of speech and review functional imaging work in humans that bears on our understanding of how the brain decodes and monitors speech. A new model connects structures in the temporal, frontal and parietal lobes linking speech perception and production.

Available from: Josef P Rauschecker, Oct 06, 2015
    • "In contrast, the "what" pathway was proposed to originate from anterior lateral belt areas and to project toward the temporal pole. Human functional imaging studies support the "where" part of the hypothesis in that auditory spatial tasks tend to activate parietal and prefrontal areas that also are activated during visual spatial tasks (Rauschecker and Scott, 2009; Recanzone and Cohen, 2010)."
    ABSTRACT: The auditory system derives locations of sound sources from spatial cues provided by the interaction of sound with the head and external ears. Those cues are analyzed in specific brainstem pathways and then integrated as cortical representation of locations. The principal cues for horizontal localization are interaural time differences (ITDs) and interaural differences in sound level (ILDs). Vertical and front/back localization rely on spectral-shape cues derived from direction-dependent filtering properties of the external ears. The likely first sites of analysis of these cues are the medial superior olive (MSO) for ITDs, lateral superior olive (LSO) for ILDs, and dorsal cochlear nucleus (DCN) for spectral-shape cues. Localization in distance is much less accurate than that in horizontal and vertical dimensions, and interpretation of the basic cues is influenced by additional factors, including acoustics of the surroundings and familiarity of source spectra and levels. Listeners are quite sensitive to sound motion, but it remains unclear whether that reflects specific motion detection mechanisms or simply detection of changes in static location. Intact auditory cortex is essential for normal sound localization. Cortical representation of sound locations is highly distributed, with no evidence for point-to-point topography. Spatial representation is strictly contralateral in laboratory animals that have been studied, whereas humans show a prominent right-hemisphere dominance.
    Handbook of Clinical Neurology 12/2015; 129C:99-116. DOI:10.1016/B978-0-444-62630-1.00006-8
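The two principal binaural cues described in this abstract, ITD and ILD, can be illustrated with a short sketch. Everything below is hypothetical (a seeded noise burst stands in for a sound source, and the delay and attenuation values are arbitrary): the ITD is estimated as the peak lag of the interaural cross-correlation, conceptually the computation attributed to MSO coincidence detectors, and the ILD as the interaural energy ratio in decibels, the level comparison attributed to the LSO.

```python
import numpy as np

# Hypothetical two-ear model: a broadband noise burst arrives at the left
# ear slightly earlier, and slightly louder, than at the right ear.
fs = 44100                                  # sample rate (Hz)
rng = np.random.default_rng(0)
source = rng.standard_normal(fs // 100)     # ~10 ms noise burst

true_itd_samples = 20                       # ~0.45 ms interaural time difference
left = np.concatenate([source, np.zeros(true_itd_samples)])
right = 0.5 * np.concatenate([np.zeros(true_itd_samples), source])  # delayed, attenuated

# ITD estimate: lag of the peak of the interaural cross-correlation.
# Under numpy's correlate convention, left leading right by k samples
# puts the peak at lag -k.
corr = np.correlate(left, right, mode="full")
lags = np.arange(-(len(right) - 1), len(left))
lag = lags[np.argmax(corr)]                 # equals -true_itd_samples here

# ILD estimate: interaural energy difference in decibels.
ild_db = 10 * np.log10(np.sum(left**2) / np.sum(right**2))

print(lag, round(ild_db, 2))
```

With the 0.5 amplitude factor above, the energy ratio is exactly 4, i.e. an ILD of about 6 dB; real ILDs are frequency-dependent and arise from head shadowing rather than a single scalar gain.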
    • "According to the hierarchical model of sensory information processing, sensory inputs are transmitted to cortical areas, which are crucial for complex auditory and speech processing, only after being processed in subcortical areas (Hickok and Poeppel, 2007; Rauschecker and Scott, 2009). However, studies using electroencephalography (EEG) indicate that distinguishing simultaneous auditory inputs involves a widely distributed neural network, including the medial temporal lobe, which is essential for declarative memory, and posterior association cortices (Alain et al., 2001; Squire et al., 2004). "
    Frontiers in Psychology 08/2015; 6:1166. DOI:10.3389/fpsyg.2015.01166
    • "Parallel processing is most prominently known from the vertebrate visual system (Livingstone and Hubel, 1988), where color and shape of a stimulus are analyzed in parallel with a possible motion of the stimulus. A similar distribution of stimulus features on different pathways has been described in the auditory (Rauschecker and Scott, 2009) and the somatosensory systems (Gasser and Erlanger, 1929; Reed et al., 2005). In insects, parallel pathways were described both in vision (Ribi and Scheel, 1981; Fischbach and Dittrich, 1989; Strausfeld et al., 2006; Paulk et al., 2009, 2008) and audition (Helversen and Helversen, 1995)."
    ABSTRACT: To rapidly process biologically relevant stimuli, sensory systems have developed a broad variety of coding mechanisms, such as parallel processing and coincidence detection. Parallel processing (e.g., in the visual system) increases both computational capacity and processing speed by simultaneously coding different aspects of the same stimulus. Coincidence detection is an efficient way to integrate information from different sources, and has been shown to promote associative learning and memory or stimulus feature detection (e.g., in auditory delay lines). Within the dual olfactory pathway of the honeybee, both of these mechanisms might be implemented by uniglomerular projection neurons (PNs) that transfer information from the primary olfactory centers, the antennal lobe (AL), to a multimodal integration center, the mushroom body (MB). PNs from anatomically distinct tracts respond to the same stimulus space but have different physiological properties, characteristics that are prerequisites for parallel processing of different stimulus aspects. However, the PN pathways also display mirror-image-like anatomical trajectories that resemble the neuronal coincidence detectors known from auditory delay lines. To investigate temporal processing of olfactory information, we recorded PN odor responses simultaneously from both tracts and measured coincident activity of PNs within and between tracts. Our results show that coincidence levels differ within each of the two tracts. Coincidence also occurs between tracts, but to a minor extent compared with coincidence within tracts. Taken together, our findings support the relevance of spike timing in the coding of olfactory information (a temporal code).
    Frontiers in Physiology 07/2015; 6(208). DOI:10.3389/fphys.2015.00208
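The coincidence measure at the heart of this abstract can be sketched in a few lines. The spike trains, neuron names, and the 2 ms window below are all hypothetical (the paper's actual analysis pipeline is not reproduced here); the sketch only shows the generic idea of scoring spike pairs from two neurons that fall within a short time window of each other.

```python
import numpy as np

def coincidence_count(train_a, train_b, window_ms=2.0):
    """Count spikes in train_a (times in ms) that have at least one
    partner spike in train_b within +/- window_ms."""
    train_b = np.asarray(train_b)
    return int(sum(np.any(np.abs(train_b - t) <= window_ms) for t in train_a))

# Hypothetical spike times (ms) from three projection neurons.
pn_same_tract_1 = [10.0, 25.3, 40.1, 55.0]
pn_same_tract_2 = [10.8, 26.0, 41.9, 70.2]   # mostly coincident with train 1
pn_other_tract  = [5.0, 33.0, 62.5]          # largely independent timing

print(coincidence_count(pn_same_tract_1, pn_same_tract_2))  # -> 3
print(coincidence_count(pn_same_tract_1, pn_other_tract))   # -> 0
```

In practice such raw counts are compared against a chance level (e.g., from jittered or shuffled trains), since two neurons with high firing rates will coincide often even without any shared drive.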