Seeing and hearing others and oneself talk.

Laboratory of Computational Engineering, Helsinki University of Technology, PO Box 9203, FIN-02015 HUT, Finland.
Cognitive Brain Research (Impact Factor: 3.77). 06/2005; 23(2-3):429-35. DOI: 10.1016/j.cogbrainres.2004.11.006
Source: PubMed

ABSTRACT: We studied the modification of auditory perception under three different conditions in twenty subjects. Observing another person's discordant articulatory gestures impaired identification of acoustic speech stimuli and modified the auditory percept, causing a strong McGurk effect. A similar effect was found when the subjects watched their own silent articulation in a mirror while the acoustic stimuli were presented to their ears. Interestingly, a smaller but still significant effect was obtained even when the subjects merely articulated the syllables silently, without visual feedback. Conversely, observing another person's or one's own concordant articulation, and silently articulating a concordant syllable, improved identification of the acoustic stimuli. The modifications of auditory percepts caused by visually observing speech and by silently articulating it are both suggested to arise from altered activity in the auditory cortex. Our findings support the idea of a close relationship between speech perception and production.

  • The New Handbook of Multisensory Processes, edited by Barry E. Stein. MIT Press, 01/2012; chapter "Multisensory interactions in speech perception", pages 435-452.
  • ABSTRACT: Over the course of the first 2 years of life, infants learn a great deal about the sound system of their native language. Acquiring the sound system requires the infant to learn about sounds and their distributions, sound combinations, and prosodic information such as syllables, rhythm, and stress. These aspects of the phonological system are learned simultaneously as the infant experiences the surrounding language. What binds all of the phonological units is the context in which they occur, namely words. In this review, we explore the development of phonetics and phonology by showcasing the interactive nature of the developing lexicon and sound system, with a focus on perception. We first review seminal research on the foundations of phonological development. We then discuss early word recognition and learning, followed by a discussion of phonological and lexical representations. We conclude by discussing the interactive nature of lexical and phonological representations and highlight further directions for exploring the developing sound system.
    Wiley Interdisciplinary Reviews: Cognitive Science, 08/2014; 5(5). · 0.79 Impact Factor
  • ABSTRACT: Interaction between covert and overt orofacial gestures has been poorly studied, apart from old and rather qualitative experiments. The question is of special interest in the context of the debate between auditory and motor theories of speech perception, where dual tasks may be of great value. It is shown here that dynamic mandible and lip movements produced by a participant result in strong and stable perturbations of an inner-speech counting task performed at the same time, whereas static orofacial configurations and static or dynamic manual actions produce no perturbation. This enables the authors to discuss how such orofacial perturbations could be introduced into dual-task paradigms to assess the role of motor processes in speech perception.
    The Journal of the Acoustical Society of America 10/2014; 136(4):1869. · 1.65 Impact Factor
