Article

What iconic gesture fragments reveal about gesture-speech integration: when synchrony is lost, memory can help.

Max Planck Institute for Human Cognitive and Brain Sciences, Leipzig, Germany.
Journal of Cognitive Neuroscience 03/2010; 23(7):1648-63. DOI: 10.1162/jocn.2010.21498
Source: PubMed

ABSTRACT: The present series of experiments explores several issues related to gesture-speech integration and synchrony during sentence processing. To manipulate gesture-speech synchrony more precisely, we used gesture fragments instead of complete gestures, thereby avoiding the long temporal overlap that gestures usually have with their coexpressive speech. In a pretest, we identified the minimal duration of an iconic gesture fragment needed to disambiguate a homonym (i.e., the disambiguation point). In three subsequent ERP experiments, we then investigated whether the gesture information available at the disambiguation point has immediate as well as delayed consequences for the processing of a temporarily ambiguous spoken sentence, and whether these gesture-speech integration processes are susceptible to temporal synchrony. Experiment 1, which used asynchronous stimuli as well as an explicit task, showed clear N400 effects at the homonym and at a target word presented further downstream, suggesting that asynchrony does not prevent integration under explicit task conditions. No such effects were found when asynchronous stimuli were presented with a shallower task (Experiment 2). Finally, when gesture fragment and homonym were synchronous, results similar to those of Experiment 1 were obtained even under shallow task conditions (Experiment 3). We conclude that when iconic gesture fragments and speech are in synchrony, their interaction is more or less automatic; when they are not, more controlled, active memory processes are needed to combine the gesture fragment and the speech context in such a way that the homonym is disambiguated correctly.
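For readers unfamiliar with the measure: an N400 effect such as the ones reported above is conventionally quantified as the difference in mean EEG voltage between conditions in a window around 300-500 ms after word onset. The sketch below illustrates that computation on simulated data; it is a generic illustration, not the authors' analysis pipeline, and all names and parameters (sampling rate, time window, array shapes) are assumptions.

    import numpy as np

    SFREQ = 500        # sampling rate in Hz (assumed)
    BASELINE_S = 0.2   # epochs assumed to start 200 ms before word onset

    def mean_amplitude(epochs, t_start=0.3, t_end=0.5):
        """Mean voltage in a post-onset window, averaged over trials and electrodes.

        epochs: array of shape (n_trials, n_electrodes, n_samples),
                baseline-corrected, in microvolts, time-locked to word onset.
        """
        i0 = int((BASELINE_S + t_start) * SFREQ)
        i1 = int((BASELINE_S + t_end) * SFREQ)
        return epochs[:, :, i0:i1].mean()

    rng = np.random.default_rng(0)
    congruent = rng.normal(0.0, 5.0, size=(40, 10, 500))     # simulated epochs
    incongruent = rng.normal(-2.0, 5.0, size=(40, 10, 500))  # 2 µV more negative

    n400_effect = mean_amplitude(incongruent) - mean_amplitude(congruent)
    print(f"N400 effect (incongruent - congruent): {n400_effect:.2f} µV")

A more negative voltage for the harder-to-integrate condition is the classic N400 signature; the study above reports such effects both at the homonym and at a downstream target word.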

Related publications:

  •
    ABSTRACT: The existence of a highly significant linear relationship between the natural logarithm (ln) of the all-cause mortality rate and age at the population level is firmly established (r>0.99). The slope and intercept of the equation, however, vary markedly between populations. Whether this relationship also applies to specific disease entities has not been established. Mortality rates for all-cause, total cardiovascular, total cancer and residual diseases were analysed at the midpoints of 5-year age classes between the ages of 35 and 84 years, for both sexes, using the means of the three latest available years from the period 1997-1999. The relationship also applies, to a slightly lesser degree, to total cardiovascular mortality (consisting predominantly of ischemic heart disease and stroke) as a function of age (r>0.99). Marginally better fits for ln all-cause mortality rate are obtained with a second-degree polynomial, using age and age² as independent variables (see the model sketch after this list). Total ln cancer mortality rate, however, behaves differently, with a significant negative deviation from linearity at older ages. Residual (non-cancer, non-cardiovascular) mortality shows a mirror pattern to cancer mortality. This residual mortality, expressed as a percentage of all-cause mortality, varies markedly between populations, as does the level of some of its major constituents (respiratory diseases, pneumonia, ill-defined causes and senility). The magnitude of the variation suggests misclassification or misdiagnosis of several important disease entities, for example between senility and stroke or between pneumonia and lung cancer. This calls into question the validity of disease-specific mortality rates, especially at older ages, making comparisons between countries less reliable.
    European Journal of Cardiovascular Prevention and Rehabilitation 04/2005; 12(2):175-81.
  •
    ABSTRACT: Language and action systems are functionally coupled in the brain, as demonstrated by converging evidence from functional magnetic resonance imaging (fMRI), electroencephalography (EEG), transcranial magnetic stimulation (TMS), and lesion studies. In particular, this coupling has been demonstrated using the action-sentence compatibility effect (ACE), in which motor activity and language interact. The ACE task requires participants to listen to sentences describing actions typically performed with an open hand (e.g., clapping), a closed hand (e.g., hammering), or without any hand action (neutral), and to press a large button with either an open or a closed hand position immediately upon comprehending each sentence. The ACE is defined as a longer reaction time (RT) in the action-sentence incompatible conditions than in the compatible conditions (formalized in the sketch after this list). Here we investigated direct motor-language coupling in two novel and uniquely informative ways: first, by measuring the behavioural ACE in patients with motor impairment (early Parkinson's disease, EPD), and second, in epileptic patients with direct electrocorticography (ECoG) recordings. In Experiment 1, EPD participants with a preserved general cognitive repertoire showed a much diminished ACE relative to non-EPD volunteers. Moreover, a correlation between ACE performance and action-verb processing (kissing and dancing test, KDT) was observed. Direct cortical recordings (ECoG) in motor and language areas (Experiment 2) demonstrated simultaneous bidirectional effects: motor preparation affected language processing (N400 at the left inferior frontal gyrus and middle/superior temporal gyrus), and language processing affected activity in movement-related areas (motor potential at premotor cortex and M1). Our findings show that the ACE paradigm requires ongoing integration of preserved motor and language coupling (abolished in EPD) and engages motor-temporal cortices bidirectionally. In addition, both experiments suggest the presence of a motor-language network that is not restricted to somatotopically defined brain areas. These results open new pathways in the fields of motor diseases, theoretical approaches to language understanding, and models of action-perception coupling.
    Cortex 03/2012.
  •
    ABSTRACT: As we speak, we use not only the arbitrary form-meaning mappings of the speech channel but also motivated form-meaning correspondences, i.e., iconic gestures that accompany speech (e.g., an inverted-V-shaped hand wiggling across gesture space to depict walking). This article reviews what we know about the processing of semantic information from speech and iconic gestures in spoken languages during comprehension of such composite utterances. Several studies have shown that comprehension of iconic gestures engages brain responses known to be involved in semantic processing of speech: modulation of the electrophysiological component N400, which is sensitive to the ease of semantic integration of a word with its previous context, and recruitment of the left-lateralized frontal-posterior temporal network (left inferior frontal gyrus (IFG), middle temporal gyrus (MTG) and superior temporal gyrus/sulcus (STG/S)). Furthermore, we integrate the information coming from both channels, recruiting brain areas such as the left IFG, posterior superior temporal sulcus (STS)/MTG and even motor cortex. Finally, this integration is flexible: the temporal synchrony between the iconic gesture and the speech segment, as well as the perceived communicative intent of the speaker, modulate the integration process. Whether these findings are specific to gestures or are shared with actions, other visual accompaniments to speech (e.g., lip movements) or other visual symbols such as pictures is discussed, as well as the implications for a multimodal view of language.
    Philosophical Transactions of the Royal Society of London. Series B, Biological Sciences 09/2014; 369(1651).
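As a compact restatement of the regression models in the first related abstract above (a sketch: m(x) denotes the mortality rate at age x, and the coefficient names a, b, c are generic, not taken from the paper):

    \ln m(x) = a + bx              (linear model, the classical Gompertz form; r > 0.99)
    \ln m(x) = a + bx + cx^{2}     (second-degree polynomial model, marginally better fit)

The paper's central observations are that a and b vary markedly between populations, and that total cancer mortality deviates negatively from the first equation at older ages.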

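Similarly, the action-sentence compatibility effect in the second related abstract reduces to a single contrast (our notation, not the authors'):

    \mathrm{ACE} = \overline{\mathrm{RT}}_{\mathrm{incompatible}} - \overline{\mathrm{RT}}_{\mathrm{compatible}}

where the overlines denote mean reaction times. A positive value indicates interference when the responding hand posture mismatches the described action; the abstract reports that this value is much diminished in early Parkinson's disease.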