Article

Hemispheric roles in the perception of speech prosody.

Department of Audiology and Speech Sciences, Purdue University, West Lafayette, IN 47907-2038, USA.
NeuroImage (Impact Factor: 6.13). 10/2004; 23(1):344-57. DOI: 10.1016/j.neuroimage.2004.06.004
Source: PubMed

ABSTRACT Speech prosody is processed in neither a single region nor a specific hemisphere, but engages multiple areas comprising a large-scale spatially distributed network in both hemispheres. It remains to be elucidated whether hemispheric lateralization is based on higher-level prosodic representations or lower-level encoding of acoustic cues, or both. A cross-language (Chinese, English) fMRI study was conducted to examine brain activity elicited by selective attention to Chinese intonation (I) and tone (T) presented in three-syllable (I3, T3) and one-syllable (I1, T1) utterance pairs in a speeded-response discrimination paradigm. The Chinese group exhibited greater activity than the English group in a left inferior parietal region across tasks (I1, I3, T1, T3). Only the Chinese group exhibited a leftward asymmetry in inferior parietal and posterior superior temporal (I1, I3, T1, T3), anterior temporal (I1, I3, T1, T3), and frontopolar (I1, I3) regions. Both language groups shared a rightward asymmetry in the mid portions of the superior temporal sulcus and middle frontal gyrus irrespective of prosodic unit or temporal interval. Hemispheric laterality effects enable us to distinguish brain activity associated with higher-order prosodic representations in the Chinese group from that associated with lower-level acoustic/auditory processes that are shared among listeners regardless of language experience. Lateralization is influenced by language experience, which shapes the internal prosodic representation of an external auditory signal. We propose that speech prosody perception is mediated primarily by the right hemisphere (RH), but is left-lateralized to task-dependent regions when language processing is required beyond the auditory analysis of the complex sound.

  • ABSTRACT: The basal ganglia (BG) have been functionally linked to emotional processing [Pell, M.D., Leonard, C.L., 2003. Processing emotional tone from speech in Parkinson's disease: a role for the basal ganglia. Cogn. Affect. Behav. Neurosci. 3, 275-288; Pell, M.D., 2006. Cerebral mechanisms for understanding emotional prosody in speech. Brain Lang. 97 (2), 221-234]. However, few studies have tried to specify the precise role of the BG during emotional prosodic processing. Therefore, the current study examined deviance detection in healthy listeners and patients with left focal BG lesions during implicit emotional prosodic processing in an event-related brain potential (ERP) experiment. In order to compare these ERP responses with explicit judgments of emotional prosody, the same participants were tested in a follow-up recognition task. As previously reported [Kotz, S.A., Paulmann, S., 2007. When emotional prosody and semantics dance cheek to cheek: ERP evidence. Brain Res. 1151, 107-118; Paulmann, S., Kotz, S.A., 2008. An ERP investigation on the temporal dynamics of emotional prosody and emotional semantics in pseudo- and lexical sentence context. Brain Lang. 105, 59-69], deviance from prosodic expectancy elicits a right-lateralized positive ERP component in healthy listeners. Here we report a similar positive ERP correlate in BG patients and healthy controls. In contrast, BG patients are significantly impaired in explicit recognition of emotional prosody when compared to healthy controls. The current data serve as first evidence that focal lesions in the left BG do not necessarily affect implicit emotional prosodic processing, but rather evaluative emotional prosodic processes, as demonstrated in the recognition task. The results suggest that the BG may not play a mandatory role in implicit emotional prosodic processing. Rather, executive processes underlying the recognition task may be dysfunctional during emotional prosodic processing.
    Brain Research 06/2008; 1217:171-8. DOI:10.1016/j.brainres.2008.04.032 · 2.83 Impact Factor
  • ABSTRACT: The present study investigates the neural correlates of rhythm processing in speech perception. German pseudosentences spoken with an exaggerated (isochronous) or a conversational (nonisochronous) rhythm were compared in an auditory functional magnetic resonance imaging experiment. The subjects had to perform either a rhythm task (explicit rhythm processing) or a prosody task (implicit rhythm processing). The study revealed bilateral activation in the supplementary motor area (SMA), extending into the cingulate gyrus, and in the insulae, extending into the right basal ganglia (neostriatum), as well as activity in the right inferior frontal gyrus (IFG), related to the performance of the rhythm task. A direct contrast between isochronous and nonisochronous sentences revealed differences in the lateralization of activation for isochronous processing as a function of the explicit and implicit tasks. Explicit processing revealed activation in the right posterior superior temporal gyrus (pSTG), the right supramarginal gyrus, and the right parietal operculum. Implicit processing showed activation in the left supramarginal gyrus, the left pSTG, and the left parietal operculum. The present results indicate a function of the SMA and the insula beyond motor timing and point to a role for these brain areas in the perception of acoustic temporal intervals. Second, the data indicate a specific task-related function of the right IFG in the processing of accent patterns. Finally, the data support the assumption that the right secondary auditory cortex is involved in the explicit perception of auditory suprasegmental cues and, moreover, that activity in the right secondary auditory cortex can be modulated by top-down processing mechanisms.
    Journal of Cognitive Neuroscience 04/2008; 20(3):541-52. DOI:10.1162/jocn.2008.20029 · 4.69 Impact Factor
  • ABSTRACT: Although there is a strong link between the right hemisphere and understanding emotional prosody in speech, there are few data on how the right hemisphere is implicated in understanding the emotive "attitudes" of a speaker from prosody. This report describes two experiments that compared how listeners with and without focal right hemisphere damage (RHD) rate speaker attitudes of "confidence" and "politeness", which are signalled in large part by prosodic features of an utterance. The RHD listeners displayed abnormal sensitivity to both the expressed confidence and politeness of speakers, underscoring a major role for the right hemisphere in the processing of emotions and speaker attitudes from prosody, although the source of these deficits may sometimes vary.
    Brain and Language 05/2007; 101(1):64-79. DOI:10.1016/j.bandl.2006.10.003 · 3.31 Impact Factor