Seeing and hearing others and oneself talk.

Laboratory of Computational Engineering, Helsinki University of Technology, PO Box 9203, FIN-02015 HUT, Finland.
Cognitive Brain Research (Impact Factor: 3.77). 06/2005; 23(2-3):429-35. DOI: 10.1016/j.cogbrainres.2004.11.006
Source: PubMed

ABSTRACT We studied the modification of auditory perception under three different conditions in twenty subjects. Observing another person's discordant articulatory gestures deteriorated identification of acoustic speech stimuli and modified the auditory percept, causing a strong McGurk effect. A similar effect was found when the subjects watched their own silent articulation in a mirror while acoustic stimuli were simultaneously presented to their ears. Interestingly, a smaller but significant effect was obtained even when the subjects merely articulated the syllables silently, without visual feedback. Conversely, observing another person's or one's own concordant articulation, as well as silently articulating a concordant syllable, improved identification of the acoustic stimuli. We suggest that both the visual observation of speech and its silent articulation modify auditory percepts by altering activity in the auditory cortex. Our findings support the idea of a close relationship between speech perception and production.

  • ABSTRACT: Knowledge of the complexity of human communication comes from three main sources: (i) studies of the linguistics and neuropsychology of dysfunction after brain injury; (ii) studies of the development of social communication in infancy and its dysfunction in developmental psychopathologies; and (iii) the evolutionary history of human communicative interaction. Together, these suggest the need for a broad, integrated theory of communication, of which language forms a small but critical component.
    Behavioral and Brain Sciences 06/2013; · 18.57 Impact Factor
  • ABSTRACT: Visual speech influences the perception of heard speech. A classic example of this is the McGurk effect, whereby an auditory /pa/ overlaid onto a visual /ka/ induces the fusion percept /ta/. Recent behavioral and neuroimaging research has highlighted the importance of both articulatory representations and motor speech regions of the brain, particularly Broca's area, in audiovisual (AV) speech integration. Alternatively, AV speech integration may be accomplished by the sensory system through multisensory integration in the posterior STS. We assessed the claims regarding the involvement of the motor system in AV integration in two experiments: (i) examining the effect of articulatory suppression on the McGurk effect and (ii) determining whether motor speech regions show an AV integration profile. The hypothesis for experiment (i) was that if the motor system plays a role in McGurk fusion, distracting the motor system through articulatory suppression should reduce McGurk fusion. The results of experiment (i) showed no such reduction, suggesting that the motor system is not responsible for the McGurk effect. The hypothesis for experiment (ii) was that if brain activation to AV speech in motor regions (such as Broca's area) reflects AV integration, the profile of activity should show an integration pattern: AV > AO (auditory only) and AV > VO (visual only). The results of experiment (ii) demonstrated that motor speech regions do not show this integration profile, whereas the posterior STS does; instead, activity in motor regions is task dependent. The combined results suggest that AV speech integration does not rely on the motor system.
    Journal of Cognitive Neuroscience 11/2013; · 4.49 Impact Factor
  • ABSTRACT: Speech production, both overt and covert, down-regulates the activation of auditory cortex. This is thought to be due to forward prediction of the sensory consequences of speech, contributing to a feedback control mechanism for speech production. Critically, however, these regulatory effects should be specific to speech content to enable accurate speech monitoring. To determine the extent to which such forward prediction is content-specific, we recorded the brain's neuromagnetic responses to heard multisyllabic pseudowords during covert rehearsal in working memory, contrasted with a control task. The cortical auditory processing of target syllables was significantly suppressed during rehearsal compared with control, but only when they matched the rehearsed items. This critical specificity to speech content enables accurate speech monitoring by forward prediction, as proposed by current models of speech production. The one-to-one phonological motor-to-auditory mappings also appear to serve the maintenance of information in phonological working memory. Further findings of right-hemispheric suppression in the case of whole-item matches and left-hemispheric enhancement for last-syllable mismatches suggest that speech production is monitored by two auditory-motor circuits operating on different timescales: finer grain in the left versus coarser grain in the right hemisphere. Taken together, our findings provide hemisphere-specific evidence of the interface between inner and heard speech.
    Cerebral Cortex 01/2014; · 6.83 Impact Factor
