Seeing facial motion affects auditory processing in noise.
ABSTRACT Speech perception, especially in noise, may be maximized if the perceiver observes the naturally occurring visual-plus-auditory cues inherent in the production of spoken language. Evidence is conflicting, however, about which aspects of visual information mediate enhanced speech perception in noise. For this reason, we investigated the relative contributions of audibility and the type of visual cue in three experiments in young adults with normal hearing and vision. Relative to static visual cues, access to the talker's phonetic gestures in speech production, especially in noise, was associated with (a) faster response times and greater sensitivity for speech understanding in noise, and (b) shorter latencies and reduced amplitudes of auditory N1 event-related potentials. Dynamic chewing facial motion also decreased the N1 latency, but only meaningful linguistic motions reduced the N1 amplitude. The hypothesis that auditory-visual facilitation is specific to properties of natural, dynamic speech gestures was partially supported.
Thank you for your interest in this article! The full text is available through the publisher via the citation below.
Jaimie Gilbert, Ph.D., CCC-A
School of Communicative Disorders
University of Wisconsin Stevens Point
1901 Fourth Avenue
Stevens Point, WI 54481
Gilbert, J. L., Lansing, C. R., & Garnsey, S. M. (2012). Seeing facial motion affects auditory
processing in noise. Attention, Perception, & Psychophysics. DOI:10.3758/s13414-012-