Article

What iconic gesture fragments reveal about gesture-speech integration: when synchrony is lost, memory can help.

Max Planck Institute for Human Cognitive and Brain Sciences, Leipzig, Germany.
Journal of Cognitive Neuroscience (Impact Factor: 4.69). 03/2010; 23(7):1648-63. DOI: 10.1162/jocn.2010.21498
Source: PubMed

ABSTRACT The present series of experiments explores several issues related to gesture-speech integration and synchrony during sentence processing. To manipulate gesture-speech synchrony more precisely, we used gesture fragments instead of complete gestures, thereby avoiding the usual long temporal overlap between gestures and their coexpressive speech. A pretest therefore identified the minimal duration of an iconic gesture fragment needed to disambiguate a homonym (i.e., the disambiguation point). In three subsequent ERP experiments, we then investigated whether the gesture information available at the disambiguation point has immediate as well as delayed consequences for the processing of a temporarily ambiguous spoken sentence, and whether these gesture-speech integration processes are susceptible to temporal synchrony. Experiment 1, which used asynchronous stimuli as well as an explicit task, showed clear N400 effects both at the homonym and at the target word presented further downstream, suggesting that asynchrony does not prevent integration under explicit task conditions. No such effects were found when asynchronous stimuli were presented with a shallower task (Experiment 2). Finally, when gesture fragment and homonym were synchronous, results similar to those of Experiment 1 were found even under shallow task conditions (Experiment 3). We conclude that when iconic gesture fragments and speech are in synchrony, their interaction is more or less automatic. When they are not, more controlled, active memory processes are necessary to combine the gesture fragment and speech context in such a way that the homonym is disambiguated correctly.
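To make the pretest concrete: the sketch below is a hypothetical gating-style procedure in Python, assuming 40 msec duration increments and a 75% rater-accuracy criterion. The step size, criterion, function name, and ratings are illustrative assumptions; the abstract states only that the minimal disambiguating fragment duration was identified.

```python
# Hypothetical sketch of a gating-style pretest for locating the
# "disambiguation point": the shortest gesture-fragment duration from
# which raters reliably recover the intended meaning of the homonym.
# The 40 msec gate step and 75% criterion are assumptions for
# illustration, not values reported in the paper.

GATE_STEP_MS = 40   # assumed increment between fragment durations
CRITERION = 0.75    # assumed accuracy criterion for "disambiguated"

def disambiguation_point(accuracy_per_gate):
    """Return the duration (msec) of the first gate from which rater
    accuracy stays at or above the criterion, or None if never reached."""
    for i in range(len(accuracy_per_gate)):
        if all(a >= CRITERION for a in accuracy_per_gate[i:]):
            return (i + 1) * GATE_STEP_MS
    return None

# Example: rater accuracy for one gesture across ever-longer fragments.
ratings = [0.48, 0.55, 0.71, 0.80, 0.83, 0.90]
print(disambiguation_point(ratings))  # -> 160 (the fourth 40 msec gate)
```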

  • ABSTRACT: The existence of a highly significant linear relationship between the natural logarithm (ln) of the all-cause mortality rate and age at the population level is firmly established (r>0.99). The slope and intercept of the equation, however, vary markedly between populations. Whether this relationship also applies to specific disease entities has not been established. Mortality rates for all-cause, total cardiovascular, total cancer and residual diseases at the midpoints of 5-year age classes between the ages of 35 and 84 years, obtained for both sexes, were analysed. The mean of the three latest available years, from the period 1997-1999, was used. The relationship also holds, to a slightly lesser degree, between the total cardiovascular mortality rate (consisting predominantly of ischemic heart disease and stroke) and age (r>0.99). Marginally better fits are obtained using a second-degree polynomial equation with ln all-cause mortality rate as the dependent variable and age and age² as independent variables. Total ln cancer mortality rate, however, behaves differently, with a significant negative deviation of the mortality rate from linearity at older ages. Residual (non-cancer, non-cardiovascular) mortality shows a mirror pattern to cancer mortality. This residual mortality expressed as a percentage of all-cause mortality varies markedly between populations. The level of some major constituents of the residual mortality rates (respiratory diseases, pneumonia, ill-defined causes and senility) also varies markedly. The magnitude of the variation suggests misclassification or misdiagnosis of several important disease entities, for example, between senility and stroke or between pneumonia and lung cancer. This questions the validity of disease-specific mortality rates, especially at older ages, making their comparison between countries less reliable.
    European Journal of Cardiovascular Prevention and Rehabilitation 04/2005; 12(2):175-81. DOI:10.1097/00149831-200504000-00014 · 3.69 Impact Factor
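The two fits described in the abstract above can be illustrated with a short Python sketch: a linear (Gompertz-type) model, ln m(x) = a + b·x, and a second-degree polynomial, ln m(x) = a + b·x + c·x², where x is the midpoint of a 5-year age class. The mortality values below are invented for illustration and are not data from the study.

```python
# Illustrative fit of ln(mortality rate) against age-class midpoints,
# linear and quadratic, as described in the abstract above. The rates
# are hypothetical, chosen to mimic the near-log-linear pattern the
# study reports (r > 0.99); they are not the study's data.
import numpy as np

ages = np.arange(37.5, 85.0, 5.0)               # midpoints of 5-year classes, 35-84
mort = np.array([1.5, 2.3, 3.6, 5.8, 9.2,       # deaths per 1,000 per year
                 14.7, 23.1, 36.5, 57.2, 90.1]) # (hypothetical values)
ln_m = np.log(mort)

b, a = np.polyfit(ages, ln_m, 1)                # linear: ln m = a + b*age
c2, b2, a2 = np.polyfit(ages, ln_m, 2)          # quadratic: ln m = a2 + b2*age + c2*age^2

r = np.corrcoef(ages, ln_m)[0, 1]
print(f"linear:    ln m = {a:.3f} + {b:.4f}*age   (r = {r:.4f})")
print(f"quadratic: ln m = {a2:.3f} + {b2:.4f}*age + {c2:.6f}*age^2")
```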
  • ABSTRACT: There is no doubt that gestures are communicative and can be integrated online with speech. Little is known, however, about the nature of this process, for example, its automaticity and how our own communicative abilities as well as our environment influence the integration of gesture and speech. In two event-related potential (ERP) experiments, the effects of gestures during speech comprehension were explored. In both experiments, participants performed a shallow task, thereby avoiding explicit gesture-speech integration. In the first experiment, participants with normal hearing viewed videos in which a gesturing actress uttered sentences that were either embedded in multi-speaker babble noise or not. The sentences contained a homonym that was disambiguated by the information in a gesture, which was presented asynchronously to speech (1000 msec earlier). Downstream, the sentence contained a target word that was related to either the dominant or the subordinate meaning of the homonym and was used to indicate the success of the disambiguation. Both the homonym and the target word position showed clear ERP evidence of gesture-speech integration and disambiguation only under babble noise. Thus, during noise, gestures were taken into account as an important communicative cue. In Experiment 2, the same asynchronous stimuli were presented to a group of hearing-impaired students and age-matched controls. Only the hearing-impaired individuals showed significant speech-gesture integration and successful disambiguation at the target word; the age-matched controls showed no effect. Thus, individuals who chronically experience suboptimal communicative situations in daily life automatically take gestures into account. The data from both experiments indicate that gestures are beneficial in countering difficult communication conditions, independent of whether the difficulties are due to external (babble noise) or internal (hearing impairment) factors.
    Cortex 02/2011; 48(7):857-70. DOI:10.1016/j.cortex.2011.02.007 · 6.04 Impact Factor
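As a generic illustration of the dependent measure behind such findings (not the authors' actual analysis pipeline), the sketch below quantifies an N400 effect as the mean amplitude difference between conditions in an assumed 300-500 msec window, using synthetic single-trial data.

```python
# Generic sketch of quantifying an N400 effect: mean amplitude in a
# fixed post-stimulus window, compared between conditions. All numbers
# (sampling rate, window, trial counts, injected effect) are assumptions
# for illustration; the data are synthetic, not from the study.
import numpy as np

FS = 1000                                      # assumed sampling rate (Hz)
WIN = slice(int(0.300 * FS), int(0.500 * FS))  # 300-500 msec window in samples

rng = np.random.default_rng(0)
# epochs: trials x samples of baseline-corrected EEG (microvolts), synthetic
congruent = rng.normal(0.0, 2.0, size=(40, 800))
incongruent = rng.normal(0.0, 2.0, size=(40, 800))
incongruent[:, WIN] -= 2.5                     # inject a more negative N400

effect = incongruent[:, WIN].mean() - congruent[:, WIN].mean()
print(f"N400 effect (incongruent - congruent): {effect:.2f} microvolts")
```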
  • ABSTRACT: Language and action systems are functionally coupled in the brain, as demonstrated by converging evidence from functional magnetic resonance imaging (fMRI), electroencephalography (EEG), transcranial magnetic stimulation (TMS), and lesion studies. In particular, this coupling has been demonstrated using the action-sentence compatibility effect (ACE), in which motor activity and language interact. The ACE task requires participants to listen to sentences that describe actions typically performed with an open hand (e.g., clapping), a closed hand (e.g., hammering), or without any hand action (neutral), and to press a large button with either an open or a closed hand position immediately upon comprehending each sentence. The ACE is defined as a longer reaction time (RT) in the action-sentence incompatible conditions than in the compatible conditions. Here we investigated direct motor-language coupling in two novel and uniquely informative ways: first, by measuring the behavioural ACE in patients with motor impairment (early Parkinson's disease, EPD), and second, in epileptic patients with direct electrocorticography (ECoG) recordings. In Experiment 1, EPD participants with a preserved general cognitive repertoire showed a much diminished ACE relative to non-EPD volunteers. Moreover, a correlation between ACE performance and action-verb processing (kissing and dancing test, KDT) was observed. Direct cortical recordings (ECoG) in motor and language areas (Experiment 2) demonstrated simultaneous bidirectional effects: motor preparation affected language processing (N400 at the left inferior frontal gyrus and middle/superior temporal gyrus), and language processing affected activity in movement-related areas (motor potential at premotor cortex and M1). Our findings show that the ACE paradigm requires ongoing integration of preserved motor-language coupling (abolished in EPD) and engages motor-temporal cortices in a bidirectional way. In addition, both experiments suggest the presence of a motor-language network that is not restricted to somatotopically defined brain areas. These results open new pathways in the fields of motor diseases, theoretical approaches to language understanding, and models of action-perception coupling.
    Cortex 03/2012; 49(4). DOI:10.1016/j.cortex.2012.02.014 · 6.04 Impact Factor
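Because the ACE is defined above as a reaction-time difference, the computation itself is simple; the sketch below makes it explicit. The RT values and function name are hypothetical.

```python
# The ACE as defined in the abstract above: mean RT in the
# action-sentence incompatible conditions minus mean RT in the
# compatible conditions, so a positive value reflects interference.
# The RTs below are invented for illustration.
from statistics import mean

def ace_ms(rt_compatible, rt_incompatible):
    """Action-sentence compatibility effect in milliseconds."""
    return mean(rt_incompatible) - mean(rt_compatible)

compatible = [612, 587, 634, 601, 598]    # e.g., open-hand sentence, open-hand response
incompatible = [668, 702, 655, 689, 671]  # e.g., open-hand sentence, closed-hand response
print(f"ACE = {ace_ms(compatible, incompatible):.0f} ms")  # -> ACE = 71 ms
```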