What Iconic Gesture Fragments Reveal about Gesture–Speech Integration: When Synchrony Is Lost, Memory Can Help

Max-Planck-Institute for Human Cognitive and Brain Sciences, Leipzig, Germany.
Journal of Cognitive Neuroscience (Impact Factor: 4.09). 03/2010; 23(7):1648-63. DOI: 10.1162/jocn.2010.21498
Source: PubMed


The present series of experiments explores several issues related to gesture-speech integration and synchrony during sentence processing. To manipulate gesture-speech synchrony more precisely, we used gesture fragments instead of complete gestures, thereby avoiding the usual long temporal overlap of gestures with their coexpressive speech. In a pretest, we therefore identified the minimal duration of an iconic gesture fragment needed to disambiguate a homonym (i.e., the disambiguation point). In three subsequent ERP experiments, we then investigated whether the gesture information available at the disambiguation point has immediate as well as delayed consequences for the processing of a temporarily ambiguous spoken sentence, and whether these gesture-speech integration processes are susceptible to temporal synchrony. Experiment 1, which used asynchronous stimuli and an explicit task, showed clear N400 effects at the homonym as well as at the target word presented further downstream, suggesting that asynchrony does not prevent integration under explicit task conditions. No such effects were found when asynchronous stimuli were presented with a shallower task (Experiment 2). Finally, when gesture fragment and homonym were synchronous, results similar to those of Experiment 1 were found, even under shallow task conditions (Experiment 3). We conclude that when iconic gesture fragments and speech are in synchrony, their interaction is more or less automatic. When they are not, more controlled, active memory processes are necessary to combine the gesture fragment and speech context in such a way that the homonym is disambiguated correctly.
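N400 effects such as those described above are commonly quantified as the difference in mean ERP amplitude between conditions within a post-stimulus time window of roughly 300-500 msec. The Python sketch below illustrates that kind of window-based analysis on synthetic data; it is not the authors' analysis pipeline, and the sampling rate, window boundaries, electrode, and simulated amplitudes are illustrative assumptions.

```python
# Minimal sketch (not the authors' pipeline): quantify an N400-style effect as the
# mean ERP amplitude difference between conditions in a 300-500 msec window and
# test it across participants with a paired t-test. All values are synthetic.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

n_subjects = 24
sfreq = 250                                   # sampling rate in Hz (assumed)
times = np.arange(-0.2, 1.0, 1 / sfreq)       # epoch from -200 to 1000 msec

# Simulated subject-averaged ERPs at one centro-parietal electrode (e.g., Cz):
# rows = participants, columns = time points.
erp_congruent = rng.normal(0.0, 1.0, (n_subjects, times.size))
erp_incongruent = rng.normal(0.0, 1.0, (n_subjects, times.size))

# Add a small negativity to the incongruent condition inside the N400 window.
n400_window = (times >= 0.3) & (times <= 0.5)
erp_incongruent[:, n400_window] -= 1.5

# Mean amplitude per participant in the N400 window, then a paired t-test.
mean_congruent = erp_congruent[:, n400_window].mean(axis=1)
mean_incongruent = erp_incongruent[:, n400_window].mean(axis=1)
t_val, p_val = stats.ttest_rel(mean_incongruent, mean_congruent)

print(f"N400 effect (incongruent - congruent): "
      f"{(mean_incongruent - mean_congruent).mean():.2f} uV, "
      f"t({n_subjects - 1}) = {t_val:.2f}, p = {p_val:.4f}")
```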

Available from: Christian Obermeier, Dec 13, 2013
  • "With respect to the latter, it has been found that – contrary to some initial negative findings (Krauss et al., 1991, 1995) – listeners are sensitive to the additional information provided by gesture. For instance, several groups have found that semantically incongruent gesture–speech pairings interfere with language comprehension, and we have reported evidence that gestures can disambiguate lexically ambiguous words (Holle and Gunter, 2007; Obermeier et al., 2011). Thus, the semantic information provided by gestures interacts with the semantic information of speech, and recent brain imaging studies have indicated that the left inferior frontal gyrus (Willems et al., 2007, 2009) and the left posterior temporal lobe (Holle et al., 2008, 2010) are crucially involved in this interaction."
    ABSTRACT: Recent research suggests that the brain routinely binds together information from gesture and speech. However, most of this research has focused on the integration of representational gestures with the semantic content of speech. Much less is known about how other aspects of gesture, such as emphasis, influence the interpretation of the syntactic relations in a spoken message. Here, we investigated whether beat gestures alter which syntactic structure is assigned to ambiguous spoken German sentences. The P600 component of the event-related brain potential (ERP) indicated that the more complex syntactic structure is easier to process when the speaker emphasizes the subject of a sentence with a beat. Thus, a simple flick of the hand can change our interpretation of who has been doing what to whom in a spoken sentence. We conclude that gestures and speech are integrated systems. Unlike previous studies, which have shown that the brain effortlessly integrates semantic information from gesture and speech, our study is the first to demonstrate that this integration also occurs for syntactic information. Moreover, the effect appears to be gesture-specific and was not found for other stimuli that draw attention to certain parts of speech, such as prosodic emphasis or a moving visual stimulus with the same trajectory as the gesture. This suggests that only visual emphasis produced with a communicative intention in mind (that is, beat gestures) influences language comprehension, but not a simple visual movement lacking such an intention.
    Frontiers in Psychology 03/2012; 3:74. DOI:10.3389/fpsyg.2012.00074 · 2.80 Impact Factor
  • ABSTRACT: There is no doubt that gestures are communicative and can be integrated online with speech. Little is known, however, about the nature of this process, for example, its automaticity and how our own communicative abilities and our environment influence the integration of gesture and speech. In two event-related potential (ERP) experiments, the effects of gestures during speech comprehension were explored. In both experiments, participants performed a shallow task, thereby avoiding explicit gesture-speech integration. In the first experiment, participants with normal hearing viewed videos in which a gesturing actress uttered sentences that were either embedded in multi-speaker babble noise or not. The sentences contained a homonym that was disambiguated by the information in a gesture, which was presented asynchronously to speech (1000 msec earlier). Downstream, the sentence contained a target word that was related to either the dominant or the subordinate meaning of the homonym and was used to indicate the success of the disambiguation. Both the homonym and the target word position showed clear ERP evidence of gesture-speech integration and disambiguation only under babble noise. Thus, during noise, gestures were taken into account as an important communicative cue. In Experiment 2, the same asynchronous stimuli were presented to a group of hearing-impaired students and age-matched controls. Only the hearing-impaired individuals showed significant speech-gesture integration and successful disambiguation at the target word. The age-matched controls did not show any effect. Thus, individuals who chronically experience suboptimal communicative situations in daily life automatically take gestures into account. The data from both experiments indicate that gestures are beneficial in countering difficult communication conditions, independent of whether the difficulties are due to external (babble noise) or internal (hearing impairment) factors.
    Cortex 02/2011; 48(7):857-70. DOI:10.1016/j.cortex.2011.02.007 · 5.13 Impact Factor