Article

Auditory-motor interaction revealed by fMRI: speech, music, and working memory in area Spt.

University of California, Irvine, CA 92697, USA.
Journal of Cognitive Neuroscience (Impact Factor: 4.69). 08/2003; 15(5):673-82. DOI: 10.1162/089892903322307393
Source: PubMed

ABSTRACT The concept of auditory-motor interaction pervades speech science research, yet the cortical systems supporting this interface have not been elucidated. Drawing on experimental designs used in recent work in sensory-motor integration in the cortical visual system, we used fMRI in an effort to identify human auditory regions with both sensory and motor response properties, analogous to single-unit responses in known visuomotor integration areas. The sensory phase of the task involved listening to speech (nonsense sentences) or music (novel piano melodies); the "motor" phase of the task involved covert rehearsal/humming of the auditory stimuli. A small set of areas in the superior temporal and temporal-parietal cortex responded both during the listening phase and the rehearsal/humming phase. A left lateralized region in the posterior Sylvian fissure at the parietal-temporal boundary, area Spt, showed particularly robust responses to both phases of the task. Frontal areas also showed combined auditory + rehearsal responsivity consistent with the claim that the posterior activations are part of a larger auditory-motor integration circuit. We hypothesize that this circuit plays an important role in speech development as part of the network that enables acoustic-phonetic input to guide the acquisition of language-specific articulatory-phonetic gestures; this circuit may play a role in analogous musical abilities. In the adult, this system continues to support aspects of speech production, and, we suggest, supports verbal working memory.

  •
    ABSTRACT: Considerable evidence has shown that unexpected alterations in auditory feedback elicit fast compensatory adjustments in vocal production. Although these adjustments are generally thought to be involuntary, whether they can be influenced by attention remains unknown. The present event-related potential (ERP) study examined whether the neurobehavioral processing of auditory-vocal integration can be affected by attention. While sustaining a vowel phonation and hearing pitch-shifted feedback, participants were required either to ignore the pitch perturbations or to attend to them under low (counting the number of perturbations) or high attentional load (counting the type of perturbations). Behavioral results revealed no systematic change in the vocal response to pitch perturbations, irrespective of whether they were attended. At the cortical level, the P2 response to attended pitch perturbations was enhanced in the low-load condition relative to when they were ignored. In the high-load condition, however, the P2 response did not differ from that in the ignored condition. These findings provide the first neurophysiological evidence that auditory-motor integration in voice control can be modulated by attention at the cortical level. Furthermore, this modulatory effect does not produce a general enhancement but is subject to attentional load.
    Scientific Reports 01/2015; 5:7812. · 5.08 Impact Factor
  •
    ABSTRACT (highlights):
    • We independently manipulate memory load and rehearsal rate in a working memory task.
    • Prefrontal (MFG) and parietal (SPL) regions exhibit load effects but not rate effects.
    • IFG, premotor cortex, and area Spt exhibit both load and rate effects.
    • Memory load fMRI effects are most prominent at the start of the delay period.
    • Rehearsal rate fMRI effects are constant through the long delay period.
    NeuroImage 01/2015; 105. · 6.13 Impact Factor
  •
    ABSTRACT: This paper examines two questions: what levels of speech can be perceived visually, and how is visual speech represented by the brain? A review of the literature leads to two conclusions: every level of psycholinguistic speech structure (i.e., phonetic features, phonemes, syllables, words, and prosody) can be perceived visually, although individuals differ in their abilities to do so; and there are visual modality-specific representations of speech qua speech in higher-level visual brain areas. That is, the visual system represents the modal patterns of visual speech. The suggestion that the auditory speech pathway receives and represents visual speech is examined in light of neuroimaging evidence on the auditory speech pathways. We outline the generally agreed-upon organization of the visual ventral and dorsal pathways and examine several types of visual processing that might be related to speech through those pathways, specifically face and body, orthography, and sign language processing. In this context, we examine the visual speech processing literature, which reveals widespread, diverse patterns of activity in posterior temporal cortices in response to visual speech stimuli. We outline a model of the visual and auditory speech pathways and make several suggestions: (1) the visual perception of speech relies on visual pathway representations of speech qua speech; (2) a proposed site of these representations, the temporal visual speech area (TVSA), has been demonstrated in posterior temporal cortex, ventral and posterior to the multisensory posterior superior temporal sulcus (pSTS); and (3) given that visual speech has dynamic and configural features, its representations in feedforward visual pathways are expected to integrate these features, possibly in TVSA.
    Frontiers in Neuroscience 12/2014; 8:386.
