Author's personal copy
Short communication
EMG activity in response to static and dynamic facial expressions
Krystyna Rymarczyk a,b, Cezary Biele a, Anna Grabowska a, Henryk Majczynski a
a Nencki Institute of Experimental Biology, Department of Neurophysiology, Warsaw, Poland
b Warsaw School of Social Sciences and Humanities, Department of Experimental Neuropsychology, Warsaw, Poland
Article info
Article history:
Received 14 April 2010
Received in revised form 4 November 2010
Accepted 5 November 2010
Available online 11 November 2010
Keywords:
EMG activity
Dynamic expression
Static expression
Abstract
The EMG activity associated with static and dynamic facial expressions
(morphs with happy or angry emotions) was compared. We hypothesized that
dynamic faces would (a) enhance facial muscular reactions and (b) evoke
higher intensity ratings.
Our analysis showed that dynamic expressions were rated as more intense
than static ones. Subjects reacted spontaneously and rapidly to happy
faces with increased zygomaticus major EMG activity and decreased
corrugator supercilii EMG activity, showing greater changes in response
to dynamic than to static stimuli in both muscles. In contrast, angry
faces evoked no alteration of EMG activity in the zygomaticus muscle and
only small changes in corrugator EMG activity, and there was no
difference between the responses to static and dynamic stimuli. It may
be concluded that the dynamic property facilitates processing of facial
expressions of emotions.
© 2010 Elsevier B.V. All rights reserved.
1. Introduction
Understanding the emotions of others is crucial for appropriate
social communication. The face is believed to be the most important
channel of emotional expression in humans (Mehrabian, 1981). Most
research on the importance of facial expressions for social interaction
has been conducted using static faces as stimuli. However, the
importance of dynamic properties of facial expression has recently
been emphasized in the psychological literature. Some studies have
shown that dynamic presentation of facial expressions improves the
recognition of emotional content (Harwood et al., 1999; Kamachi
et al., 2001; Sato and Yoshikawa, 2004; Wehrle et al., 2000), and also
enhances emotional arousal (Biele and Grabowska, 2006; Yoshikawa
and Sato, 2006). Despite using different kinds of stimuli, such as the
point-light technique (Bassili, 1979), computer-generated schematic
movies (Wehrle et al., 2000; Weyers et al., 2006), computer-
generated morphed animations (Biele and Grabowska, 2006; Sato
et al., 2008) and natural movies (Fujimura et al., 2010; Harwood et al.,
1999), the majority of studies have highlighted the importance of
dynamics in the evaluation of emotional expression.
It is well known that humans react to emotional facial expressions
with specific, congruent facial muscle activity (facial mimicry), which
can be reliably measured by electromyography (EMG; e.g. Dimberg,
1982; Larsen et al., 2003). For example, pictures of angry facial
expressions evoke increased M. corrugator supercilii activity, while
pictures of happy facial expressions increase M. zygomaticus major
activity and decrease M. corrugator supercilii activity. These facial
muscular reactions appear to be spontaneous and automatic (Dimberg and
Thunberg, 1998).
Two recent studies have shown that presentations of dynamic
facial expressions evoke stronger EMG responses than static ones,
although the results are inconsistent. For example, Weyers et al.
(2006) saw a stronger facial reaction (EMG) to stimulation by dy-
namic rather than static happy facial expressions of avatars (computer
synthesized faces), with increased activity of M. zygomaticus major
and decreased activity of M. corrugator supercilii. For angry facial
expressions, they found no significant differences in the level of
corrugator supercilii activity evoked by dynamic and static presenta-
tions. In the study of Sato et al. (2008), which employed stimuli
selected from a video database of facial expressions of emotion
prepared by computer-morphing techniques, the dynamic presenta-
tion of happy facial expressions also evoked stronger EMG responses
in M. zygomaticus major. But contrary to the results of Weyers
and coworkers, dynamic angry facial expressions evoked stronger
EMG responses in M. corrugator supercilii than static ones, but no
differences were found in the responses to static and dynamic happy
facial expressions.
In view of the discrepancies in the literature, we designed a study
which utilized a computer-morphing technique to present dynamic
expressions of anger and happiness. This technique was able to
increase the dynamicity of the presented facial emotions by showing a
series of pictures starting from a neutral (0%) facial expression and
progressing to 100% emotional expression within a shorter time
period than in previous studies (Weyers et al., 2006; Sato et al., 2008).
International Journal of Psychophysiology 79 (2011) 330–333
Corresponding author. Nencki Institute of Experimental Biology, Department of Neurophysiology, Laboratory of Psychophysiology, 3 Pasteur St., 02-093 Warsaw, Poland. Tel.: +48 22 5892 393; fax: +48 22 822 53 42.
E-mail address: k.rymarczyk@nencki.gov.pl (K. Rymarczyk).
0167-8760/$ – see front matter © 2010 Elsevier B.V. All rights reserved.
doi:10.1016/j.ijpsycho.2010.11.001
We hypothesized that abrupt changes from neutral to full expression
in the presented faces would result in (a) more pronounced facial
muscular reactions and (b) higher intensity ratings compared to static
faces.
2. Method
2.1. Participants
Thirty right-handed subjects participated in the study. All partici-
pants were told that the thermal responses of skin to human faces were
to be measured. The subjects had normal or corrected-to-normal
eyesight. Each participant was paid 20 PLN (~6 EUR) and their informed
consent was obtained twice: before the start of testing and after the
testing, when they were informed of the true goal of the experiment.
During the experiment, the participants were video-taped using a
hidden camera (PANASONIC NV-MX500) for off-line analysis. The video
tapes were inspected together with the EMG records and 3 subjects
were rejected from the data analysis because of body movement, teeth
grinding, or excessive blinking, which caused artifacts in the EMG
signals. This left twenty-seven subjects (15 female and 12 male, age
23–26 years) for data analyses. The study was conducted in accordance with
the guidelines for ethical research and the experimental procedures
were approved by the Local Ethics Committee.
2.2. Materials
The stimuli were black and white pictures and animations of two
facial emotional expressions, anger and happiness, performed by four
actors (two female and two male), taken from the Montreal Set of
Facial Displays of Emotion (MSFDE; Beaupré and Hess, 2005). For
dynamic expressions, 1.5 s movies were produced from six pictures
representing different states of emotional intensity, i.e. increasing
valence (neutral, 20, 40, 60, 80 and 100%), prepared using computer-
morphing techniques (Hess et al., 1997). The first five pictures in each
set (neutral to 80% intensity) were presented for 100 ms each, while
the full expression picture (100% of the emotion) was displayed for
1 s, creating the compelling illusion of a short movie clip displaying a
dynamic facial expression of either anger or happiness (Biele and
Grabowska, 2006). The dynamic presentation was more abrupt than
in other studies in which changes from neutral to full expression
lasted 1500 ms (Sato et al., 2008) or 1000 ms (Weyers et al., 2006).
According to Achaibou et al. (2008), displaying the apex of the
emotional expression for 1 s made the dynamic presentation more
natural. Static pictures of the full emotion were also presented for
1.5 s.
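The frame timing described above can be written out as a small schedule helper. This is an illustrative sketch only (the function name and defaults are our assumptions, not the authors' presentation code): five morph frames at 100 ms each, then the 100% apex frame held for 1 s, giving a 1.5 s clip.

```python
def morph_schedule(frame_ms=100, apex_ms=1000,
                   levels=(0, 20, 40, 60, 80, 100)):
    """Return (intensity_percent, duration_ms) pairs for one dynamic stimulus."""
    schedule = [(level, frame_ms) for level in levels[:-1]]  # neutral .. 80%
    schedule.append((levels[-1], apex_ms))                   # apex held for 1 s
    return schedule

total_ms = sum(duration for _, duration in morph_schedule())  # 5*100 + 1000 = 1500
```

The static condition corresponds to showing only the apex frame for the full 1.5 s.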
Following the experiment with EMG recording, a new set of stimuli was
presented and the participants were asked to rate the emotional
intensity of the expressions.
2.3. EMG recordings
Facial EMG activity was measured using bipolar Ag/AgCl miniature
electrodes (4 mm in diameter). Electrodes filled with electrode paste
(Brain Products GmbH, Munich, Germany) were positioned over the
M. zygomaticus major and M. corrugator supercilii on the left side of
the face (Cacioppo et al., 1986). The reference electrode (10 mm in
diameter) was attached to the forehead. Before the electrodes were
attached, the skin was cleaned with alcohol, and a thin coating of
electrode paste was applied. This procedure reduced the electrode
site resistance to below 5 kΩ. EMG recordings were made using a BrainAmp
MR amplifier (Brain Products), filtered with 30 Hz high-pass, 500 Hz
low-pass and 50 Hz notch filters, digitized using a 24-bit A/D
converter with a sampling rate of 1 kHz, and finally stored on a
personal computer (PC). Off-line, the signals were rectified and
integrated with a moving-average filter over 50 ms. The EMG response
was measured as the difference between the recorded EMG signal and the
mean rectified EMG activity during the last second before stimulus
presentation.
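The off-line processing chain (full-wave rectification, 50 ms moving-average smoothing, subtraction of the pre-stimulus baseline) can be sketched as follows. This is a minimal NumPy illustration under the stated parameters, not the authors' actual analysis code; it assumes stimulus onset one second into the record and omits the band-pass and notch filtering applied at acquisition.

```python
import numpy as np

def emg_response(raw, fs=1000, smooth_ms=50, baseline_ms=1000):
    """Rectify raw EMG, smooth with a moving average, and express the
    post-stimulus trace as the difference from mean rectified activity
    in the last second before stimulus onset (assumed onset at sample
    baseline_ms * fs / 1000)."""
    rectified = np.abs(raw)                            # full-wave rectification
    win = int(smooth_ms * fs / 1000)
    smoothed = np.convolve(rectified, np.ones(win) / win, mode="same")
    n_base = int(baseline_ms * fs / 1000)
    baseline = smoothed[:n_base].mean()                # pre-stimulus mean
    return smoothed[n_base:] - baseline                # response re baseline
```

For the statistics, this baseline-corrected trace would then be averaged within the post-stimulus analysis windows.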
2.4. Procedure
Participants were individually seated in a sound-attenuated and
electromagnetically-shielded room. The subjects were encouraged to
relax and feel comfortable. Images of 16 ×16 cm in size on a gray
background were presented on a 19-inch LCD computer monitor
positioned 1.5 m in front of the participant. Each stimulus was
preceded by a black fixation dot (15 mm in diameter) shown in the
middle of the screen for 0.5 s. Morphs of two actors of opposite sex
were randomly chosen from a group composed of two males and two
females. Following the procedure of Dimberg (1982), the stimuli were
presented in blocks of 6 of the same kind (i.e. 3 static angry female
and 3 static angry male facial expressions, presented in random order),
each separated by a random interval of 15 to 25 s.
The order of the blocks was randomized, but for each subject, the
blocks containing static or dynamic stimuli were presented alternately,
as were the blocks containing happy or angry stimuli. In total, each
subject was presented with 24 stimuli: 6 stimuli × 2 emotions (anger vs.
happiness) × 2 presentation modes (static vs. dynamic). The presentation
of stimuli was controlled by a PC running Inquisit 2 software
(Millisecond Software, Seattle, USA). The first part of the experiment
with EMG recording
lasted about 10 min. After a 20 min break, subjects rated the intensity
of static and dynamic facial expressions. In this second part of the
experiment, a total of 32 stimuli were presented in random order in
two blocks of 16 (4 actors × anger vs. happiness × static vs. dynamic)
as used in our previous study (Biele and Grabowska, 2006). Each
stimulus was presented once in each block. The subjects rated the
pictures and animations for emotional intensity according to a 4-point
Likert-type scale (1 = "low intensity", 4 = "high intensity") (for
details see Biele and Grabowska, 2006). Rating was performed after
each presentation using a four button response pad (Cedrus RB-620,
Cedrus Corporation, San Pedro, USA). The second part of the
experiment with ratings lasted 15 min.
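The rating-phase design (4 actors × 2 emotions × 2 presentation modes = 16 stimuli, each shown once per block in random order, in two blocks) can be sketched as below. Names and actor IDs are hypothetical; this is not the authors' Inquisit script.

```python
import random

def rating_trials(seed=0):
    """Build the 32-trial rating sequence: two blocks of 16, each a
    random permutation of all actor x emotion x mode combinations."""
    rng = random.Random(seed)
    stimuli = [(actor, emotion, mode)
               for actor in ("F1", "F2", "M1", "M2")   # hypothetical actor IDs
               for emotion in ("happy", "angry")
               for mode in ("static", "dynamic")]
    trials = []
    for _ in range(2):                                 # two blocks of 16
        block = stimuli[:]
        rng.shuffle(block)                             # random order within block
        trials.extend(block)
    return trials
```

Each combination thus appears exactly twice, once per block, matching the 32 presentations described above.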
3. Results
3.1. Facial EMG
The data were analyzed using a 2 (Emotion: happy vs. angry) × 2
(Stimulus dynamics: static vs. dynamic) × 3 (Period: means for
0–500 ms, 500–1000 ms, and 1000–1500 ms after stimulus onset)
factorial ANOVA. The dependent variable was the EMG activity
difference (in μV) between the mean activity within a given period and
the mean activity within the 1000 ms period before stimulus
presentation. The analysis was performed separately for M. zygomaticus
major and for M. corrugator supercilii. Since we expected higher
EMG activity for happy than for angry stimuli and for dynamic than for
static stimuli in M. zygomaticus major, and higher EMG activity for
angry than for happy stimuli and for dynamic than for static stimuli in
M. corrugator supercilii, a one-tailed comparison of Emotion and
Stimulus dynamics was performed. Because the ANOVA showed no
significant main or interaction effects for participants' gender, this
factor was not considered further, and data recorded from the men
and women were combined for each muscle.
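The dependent variable described above can be stated numerically as follows. This is an illustrative helper with assumed names, taking a rectified-and-smoothed trace sampled at 1 kHz with stimulus onset at sample 1000.

```python
import numpy as np

def period_means(smoothed_emg, fs=1000, onset_s=1.0, n_periods=3, period_s=0.5):
    """Mean EMG in each 500 ms post-stimulus period, minus the mean
    over the 1000 ms pre-stimulus baseline."""
    onset = int(onset_s * fs)
    baseline = smoothed_emg[:onset].mean()             # pre-stimulus mean
    step = int(period_s * fs)
    return [smoothed_emg[onset + i * step : onset + (i + 1) * step].mean() - baseline
            for i in range(n_periods)]
```

The three returned values are the per-period entries that enter the 2 × 2 × 3 ANOVA for one trial and muscle.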
The analysis revealed significant effects of Emotion (F(1,26) =
5.17; p < 0.05) and Period (F(1,26) = 8.15; p < 0.001), and significant
interactions of Period × Emotion (F(2,52) = 4.80; p < 0.05) and
Emotion × Stimulus dynamics (F(1,26) = 5.00; p < 0.05) (Fig. 1).
Follow-up ANOVAs showed that for angry facial expressions the
EMG response differed between the first (0–500 ms) and third
(1000–1500 ms) period (F(1,26) = 9.52, p < 0.005), but not between
the first and second or the second and third period. For happy
stimuli, the EMG response was smaller in the first than in the second
period (F(1,26) = 10.7, p < 0.005) and smaller in the first than in the
third period (F(1,26) = 8.13, p < 0.05). This analysis also revealed
that the EMG response was greater for angry than for happy stimuli
in the second (F(1,26) = 6.24, p < 0.01) and the third (F(1,26) = 5.05,
p < 0.01) period, but not in the first period.
Further analysis of the interaction between Emotion and Stimulus
dynamics showed that for happy facial expressions, EMG activity was
greater after the presentation of dynamic stimuli (F(1,26) = 2.92,
p < 0.05). However, for angry facial expressions, there was no
difference between dynamic and static stimuli (F(1,26) = 2.01,
p = 0.08). Moreover, the EMG activity in response to dynamic happy
facial expressions was greater than that evoked by dynamic angry
facial expressions (F(1,26) = 6.80, p < 0.01). In contrast, the EMG
activity in response to static happy and angry stimuli was not
significantly different (F(1,26) = 0.4, p = 0.128).
One-sample analysis showed that zygomaticus major activity
significantly higher than the baseline occurred for the happy dynamic
stimuli in all three periods after stimulus presentation (t(26) = 2.46,
p < 0.05, t(26) = 2.78, p < 0.005 and t(26) = 2.38, p < 0.05, respectively),
and for the happy static stimuli in the second and third period (t(26) =
2.66, p < 0.01 and t(26) = 3.08, p < 0.005, respectively). EMG activity for
this muscle did not differ from the baseline in any of the three periods
after the presentation of angry static or dynamic stimuli.
In contrast to the findings for the M. zygomaticus major, the
ANOVA of M. corrugator supercilii activity demonstrated significant
effects only of Emotion (F(1,26) = 4.31; p < 0.05), and a significant
interaction of Period × Stimulus dynamics (F(2,56) = 4.89; p < 0.01)
(Fig. 2).
Follow-up analysis of the Period × Stimulus dynamics interaction
revealed that there was no difference between EMG activity in
response to static and dynamic stimuli in the first (F(1,26) = 0.96,
p = 0.16) and second (F(1,26) = 0.37, p = 0.27) period after stimulus
presentation, but in the third period, the EMG activity in response to
dynamic stimuli decreased significantly in comparison with the
response to static stimuli (F(1,26) = 3.53, p < 0.05). To investigate
the effect of stimulus dynamics, we analyzed the muscle activity in the
three periods after stimulus presentation. For static stimuli there was
no difference between any pair of periods. For dynamic stimuli there
was a difference between the first and third period (F(1,26) = 10.61,
p < 0.005), but not between the first and second, nor between the
second and third.
One-sample t-test analysis showed a significant drop in corrugator
supercilii activity in response to happy dynamic facial expressions
in all three periods after stimulus presentation (t(26) = 2.46, p < 0.05,
t(26) = 2.78, p < 0.01, and t(26) = 2.39, p < 0.05, respectively), while a
drop in activity compared to the baseline after happy static stimuli
occurred only in the second period (t(26) = 2.0, p < 0.05). No changes
in the EMG activity of this muscle were observed in any of the
analyzed periods in response to the presentation of static or dynamic
angry stimuli, apart from an increase of EMG activity in the first
period in response to angry dynamic stimuli (t(26) = 1.82, p < 0.05).
3.2. Psychological ratings
For the ratings (4-point Likert-type scale), a 2 (Emotion: happy vs.
angry) × 2 (Stimulus dynamics: static vs. dynamic) factorial ANOVA
was conducted. This analysis revealed a significant main effect of
Stimulus dynamics (F(1,26) = 12.02; p < 0.01): dynamic stimuli were
rated as more intense than static stimuli. The mean rating (±SEM) for
happy dynamic facial expressions was 2.85 ± 0.04, while for static
happy facial expressions it was 2.68 ± 0.045. For angry facial
expressions the mean ratings were 2.95 ± 0.05 and 2.7 ± 0.035,
respectively. Neither the other main effect (Emotion) nor any of the
factorial interactions reached statistical significance.
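The mean ± SEM values reported above follow the standard formula (sample standard deviation divided by the square root of n). A minimal helper, for illustration only; the numbers in the test are made up and are not the study's data:

```python
import math

def mean_sem(ratings):
    """Mean and standard error of the mean for one condition's ratings."""
    n = len(ratings)
    mean = sum(ratings) / n
    var = sum((r - mean) ** 2 for r in ratings) / (n - 1)  # sample variance
    return mean, math.sqrt(var) / math.sqrt(n)
```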
4. Discussion
In this study we compared the EMG response of two facial muscles
to dynamic expressions of emotion (morphs in which expressions
changed abruptly from neutral to happy or angry) with that to static
expressions (pictures of faces with apex emotional expressions from
dynamic presentations). The recorded data showed that the subjects
reacted spontaneously and rapidly to happy facial expressions with
increased M. zygomaticus major EMG activity and decreased M.
corrugator supercilii EMG activity. In M. zygomaticus major, the
change in the activity was greater in response to dynamic than to
static stimuli, but this was not the case in M. corrugator supercilii. As
expected, angry facial expressions evoked no alterations in the EMG
activity in the M. zygomaticus major, irrespective of the presentation
mode (static or dynamic), but surprisingly this emotion produced
only a minor increase in the mean EMG activity in M. corrugator
supercilii, with no significant difference between the responses to
static and dynamic stimuli. Thus, increased dynamicity of the
presented stimuli did not evoke more pronounced facial muscle
reaction, at least in M. corrugator supercilii response to angry facial
expressions. However, the subjects rated dynamic facial expressions
as more intense than static ones for both types of emotion.
Fig. 1. Mean change in EMG activity for M. zygomaticus major in response to static and dynamic emotional facial expressions.
Fig. 2. Mean change in EMG activity for M. corrugator supercilii in response to static and dynamic emotional facial expressions.
In general, our results concerning M. zygomaticus major are in
agreement with those of previous EMG studies in which the
presentation of happy facial expressions elicited automatic facial
muscular activity, interpretable as facial mimicry (e.g. Dimberg and
Thunberg, 1998), and the muscle response was more pronounced
when dynamic stimuli were presented (Sato et al., 2008; Weyers et al.,
2006).
The results obtained for mean EMG activity of M. corrugator
supercilii in response to happy or angry stimuli are less consistent
with those of earlier studies. We found no difference in the responses
to static or dynamic angry facial expressions. Moreover, the response
to angry facial expressions was generally negligible, since in most of
the analyzed periods after stimulus presentation the evoked EMG
activity was not different from the baseline. The only significant
increase in EMG activity was observed in response to dynamic angry
facial expressions and occurred in the first period (0–500 ms).
Our results concerning the EMG activity of M. corrugator supercilii
are similar to those of Weyers et al. (2006), who failed to observe any
significant difference in the mean EMG activity of this muscle in
response to static or dynamic angry facial expressions. The lack of an
EMG response to angry facial expressions is puzzling, but the stronger
response to happy than to angry stimuli supports the notion that
angry behavior might be disadvantageous in social interactions
(Weyers et al., 2006) and is generally disapproved of in Western
cultures (Hess and Bourgeois, 2010). It is also likely that the small or
absent EMG response to angry facial expressions is related to the
artificial situation of the laboratory environment, where negative
stimuli lose their valence (Larsen et al., 2003).
Increased EMG activity in the first period (0–500 ms) was also
observed by Dimberg and Thunberg (1998) as a reaction to static
presentations of both angry and happy facial expressions, and other
types of visual stimuli (Dimberg et al., 1998). Such a rapid increase
during the first three 100-ms intervals after stimulus onset may be
interpreted as a blink reflex, or as an orienting response (Dimberg and
Petterson, 2000). In our experiments, such a response occurred in reaction to
dynamic angry facial expressions, but was not observed after the
presentation of the three other stimulus types (static and dynamic
happy, and static angry facial expressions). It has been suggested that
a more rapid response to angry facial expressions may reflect faster
processing of negative, threat-related signals (Achaibou et al., 2008).
The obtained response pattern may also be an effect of the specificity
of the stimuli, since an angry facial expression has a stronger
movement signal than other facial expressions (Horstmann and
Ansorge, 2009).
The present study was also designed to investigate the effect of
dynamic facial expression on intensity ratings. Participants rated
dynamic stimuli as more intense than static stimuli for both happy
and angry facial expressions. This might indicate that dynamically-
presented facial expressions were perceived as more realistic than
static ones, and were thus rated as more intense. It is also feasible that
the perceived higher intensity of dynamic stimuli could be evoked by
elevated arousal and attention induced by motion (Simons et al.,
1999; Weyers et al., 2006).
In conclusion, the findings of our study confirm the importance of
dynamic stimuli in some emotional processing. It appears that in
specific social situations the property of dynamicity facilitates
processing of facial expressions of emotion.
In future studies it would be informative to examine other static
and dynamic facial expressions. Previous studies have indicated that
the intensity of facial mimicry could be related to emotionality or
individual personality (Achaibou et al., 2008; Sonnby-Borgström,
2002; Vrana and Gross, 2004), so dividing subjects into different
groups could make the results more pronounced. It would also be
interesting to study EMG activity in response to videos of real facial
emotions.
Acknowledgements
This study was supported by grant no. H01F 043 29 from the Polish
Ministry of Science and Higher Education to the first author. The
comments of the reviewers are also acknowledged.
References
Achaibou, A., Pourtois, G., Schwartz, S., Vuilleumier, P., 2008. Simultaneous recording of EEG and facial muscle reactions during spontaneous emotional mimicry. Neuropsychologia 46, 1104–1113.
Bassili, J.N., 1979. Emotion recognition: the role of facial movement and the relative importance of upper and lower areas of the face. J. Pers. Soc. Psychol. 37, 2049–2058.
Beaupré, M.G., Hess, U., 2005. Cross-cultural emotion recognition among Canadian ethnic groups. J. Cross Cult. Psychol. 26, 355–370.
Biele, C., Grabowska, A., 2006. Sex differences in perception of emotion intensity in dynamic and static facial expressions. Exp. Brain Res. 171, 1–6.
Cacioppo, J.T., Petty, E.P., Losch, M.E., Kim, H.S., 1986. Electromyographic activity over facial muscle regions can differentiate the valence and intensity of affective reactions. J. Pers. Soc. Psychol. 50, 260–268.
Dimberg, U., 1982. Facial reactions to facial expressions. Psychophysiology 19, 643–647.
Dimberg, U., Petterson, M., 2000. Facial reactions to happy and angry facial expressions: evidence for right hemisphere dominance. Psychophysiology 37, 693–696.
Dimberg, U., Thunberg, M., 1998. Rapid facial reactions to emotional facial expressions. Scand. J. Psychol. 39, 39–45.
Dimberg, U., Hansson, G., Thunberg, M., 1998. Fear of snakes and facial reactions: a case of rapid emotional responding. Scand. J. Psychol. 39, 75–80.
Fujimura, T., Sato, W., Suzuki, N., 2010. Facial expression arousal level modulates facial mimicry. Int. J. Psychophysiol. 76, 88–92.
Harwood, N.K., Hall, L.J., Shinkfield, A.J., 1999. Recognition of facial emotional expressions from moving and static displays by individuals with mental retardation. Am. J. Ment. Retard. 104, 270–278.
Hess, U., Blairy, S., Kleck, R., 1997. The intensity of emotional facial expressions and decoding accuracy. J. Nonverbal Behav. 21, 241–257.
Hess, U., Bourgeois, P., 2010. You smile–I smile: emotion expression in social interaction. Biol. Psychol. 84, 514–520.
Horstmann, G., Ansorge, U., 2009. Visual search for facial expressions of emotions: a comparison of dynamic and static faces. Emotion 9, 29–38.
Kamachi, M., Bruce, V., Mukaida, S., Gyoba, J., Yoshikawa, S., Akamatsu, S., 2001. Dynamic properties influence the perception of facial expressions. Perception 30, 875–887.
Larsen, J.T., Norris, C.J., Cacioppo, J.T., 2003. Effects of positive and negative affect on electromyographic activity over zygomaticus major and corrugator supercilii. Psychophysiology 40, 776–785.
Mehrabian, A., 1981. Silent messages: implicit communication of emotions and attitudes. Wadsworth, Belmont, CA.
Sato, W., Fujimura, T., Suzuki, N., 2008. Enhanced facial EMG activity in response to dynamic facial expressions. Int. J. Psychophysiol. 70, 70–74.
Sato, W., Yoshikawa, S., 2004. The dynamic aspects of emotional facial expressions. Cogn. Emot. 18, 701–710.
Simons, R.F., Detenber, B.H., Roedema, T.M., Reiss, J.E., 1999. Emotion processing in three systems: the medium and the message. Psychophysiology 36, 619–627.
Sonnby-Borgström, M., 2002. Automatic mimicry reactions as related to differences in emotional empathy. Scand. J. Psychol. 43, 433–443.
Vrana, S.R., Gross, D., 2004. Reactions to facial expressions: effects of social context and speech anxiety on responses to neutral, anger, and joy expressions. Biol. Psychol. 66, 63–78.
Wehrle, T., Kaiser, S., Schmidt, S., Scherer, K.R., 2000. Studying dynamic models of facial expression of emotion using synthetic animated faces. J. Pers. Soc. Psychol. 78, 105–119.
Weyers, P., Mühlberger, A., Hefele, C., Pauli, P., 2006. Electromyographic responses to static and dynamic avatar emotional facial expressions. Psychophysiology 43, 450–453.
Yoshikawa, S., Sato, W., 2006. Enhanced perceptual, emotional, and motor processing in response to dynamic facial expressions of emotion. Jpn. Psychol. Res. 48, 213–222.
ResearchGate has not been able to resolve any citations for this publication.
Article
Full-text available
In this study, we presented computer‐morphing animations of the facial expressions of six emotions to 43 subjects and asked them to evaluate the naturalness of the rate of change of each expression. The results showed that the naturalness of the expressions depended on the velocity of change, and the patterns for the four velocities differed with the emotions. Principal component analysis of the data extracted the structures that underlie the evaluation of dynamic facial expressions, which differed from previously reported structures for static expressions in some aspects. These results suggest that the representations of facial expressions include not only static but also dynamic properties.
Article
Full-text available
The influence of the physical intensity of emotional facial expressions on perceived intensity and emotion category decoding accuracy was assessed for expressions of anger, disgust, sadness, and happiness. The facial expressions of two men and two women posing each of the four emotions were used as stimuli. Six different levels of intensity of expression were created for each pose using a graphics morphing program. Twelve men and 12 women rated each of the 96 stimuli for perceived intensity of the underlying emotion and for the qualitative nature of the emotion expressed. The results revealed that perceived intensity varied linearly with the manipulated physical intensity of the expression. Emotion category decoding accuracy varied largely linearly with the manipulated physical intensity of the expression for expressions of anger, disgust, and sadness. For the happiness expressions only, the findings were consistent with a categorical judgment process. Sex of encoder produced significant effects for both dependent measures. These effects remained even after possible gender differences in encoding were controlled for, suggesting a perceptual bias on the part of the decoders.
Article
Full-text available
This study aims to investigate cultural differences in recognition accuracy as well as the in-group advantage hypothesis for emotion recognition among sub-Saharan African, Chinese, and French Canadian individuals living in Canada. The participants viewed expressions of happiness, anger, sadness, fear, disgust, and shame selected from the Montreal Set of Facial Displays of Emotion. These data did not support the in-group advantage hypothesis under the condition of stimulus equivalence. However, both encoder and decoder effects were found. Specifically, French Canadians were more accurate for the decoding of expressions of shame and sadness. Moreover, fear expressions were best recognized when shown by sub-Saharan Africans, suggesting an effect of salience of expressive cues due to morphological features of the face.
Article
A number of past studies have used the visual search paradigm to examine whether certain aspects of emotional faces are processed preattentively and can thus be used to guide attention. All these studies presented static depictions of facial prototypes. Emotional expressions conveyed by the movement patterns of the face have never been examined for their preattentive effect. The present study presented for the first time dynamic facial expressions in a visual search paradigm. Experiment 1 revealed efficient search for a dynamic angry face among dynamic friendly faces, but inefficient search in a control condition with static faces. Experiments 2 to 4 suggested that this pattern of results is due to a stronger movement signal in the angry than in the friendly face: No (strong) advantage of dynamic over static faces is revealed when the degree of movement is controlled. These results show that dynamic information can be efficiently utilized in visual search for facial expressions. However, these results do not generally support the hypothesis that emotion-specific movement patterns are always preattentively discriminated.
Article
The aim of the present study was to explore how rapid emotional responses are manifested as facial electromyographic (EMG) reactions when people with explicit fear of snakes are exposed to their fear-relevant stimuli. Fifty-six subjects, high or low in fear of snakes, were exposed to pictures of snakes and flowers while facial EMG activity from the corrugator supercilii and the zygomatic major muscle regions was recorded. Measures of autonomic activity and ratings of the stimuli were also collected. Pictures of snakes evoked a rapid corrugator supercilii muscle reaction, which was larger in the High fear group as early as 500 ms after stimulus onset. The High fear group also rated snakes as more unpleasant and displayed larger skin conductance responses (SCRs) and increased heart rate (HR) when exposed to snakes. Pictures of flowers tended to evoke increased zygomatic major muscle activity, which did not differ among the groups. The present results demonstrate that the facial EMG technique is sensitive enough to detect rapidly evoked negative emotional reactions. The results support the hypothesis that people high in fear of snakes are disposed to react very rapidly with a negative emotional response to their fear-relevant stimuli.
Article
Dynamic facial expressions of emotion constitute natural and powerful media compared with static ones. However, little is known about the processing of dynamic facial expressions of emotion. In this paper, we describe the results of our recent neuroimaging and psychological studies on this issue. A neuroimaging study was conducted to investigate brain activity while viewing dynamic facial expressions. The results revealed that a broad region of the visual cortices, the amygdala, and the right inferior frontal gyrus were more activated in response to dynamic facial expressions than to control stimuli, such as static facial expressions and dynamic mosaics. Consistent with the characteristics of these brain activities, the results of three psychological studies indicated that the dynamic presentation: (a) intensified the perceptual image of the facial expression (perceptual enhancement); (b) enhanced the emotional feeling; and (c) elicited spontaneous and rapid facial mimicry. These results revealed that the dynamic property facilitates the perceptual, emotional, and motor processing of facial expressions of emotion.
Article
We investigated the effect of facial expression arousal level and mode of presentation on facial mimicry. High- and low-arousal facial expressions indicating pleasant and unpleasant emotions were presented both statically and dynamically. Participants' facial electromyographic (EMG) reactions were recorded from the zygomatic major and corrugator supercilii muscles. Stronger zygomatic major muscle activity was evoked for high- compared to low-arousal pleasant expressions. Comparable activity was induced in the corrugator supercilii muscle in response to both high- and low-arousal unpleasant expressions, and this was true for both dynamic and static presentations. These results suggest that the arousal levels of pleasant, but not unpleasant, facial expressions can enhance facial mimicry.
Article
Two studies were conducted to assess the influence of emotional context and social context, in terms of gender and status, on speaker expressivity and observer mimicry in a dyadic interactive setting. For Study 1, 96 same-sex dyads and, for Study 2, 72 mixed-sex dyads participated in a social sharing paradigm. The results showed that in both same-sex and mixed-sex dyads women smile more than men, and members of both sexes use Duchenne smiles rather than non-Duchenne smiles to signal social intent. In same-sex dyads, facial expressivity and facial mimicry were determined by both the emotional and the social context of the situation. However, whereas emotional context effects were maintained, social context effects were absent in mixed-sex dyads. The study is the first to show evidence for facial mimicry in an interactional setting and supports the notion that mimicry is dependent on social context.
Article
In order to investigate the role of facial movement in the recognition of emotions, faces were covered with black makeup and white spots. Video recordings of such faces were played back so that only the white spots were visible. The results demonstrated that moving displays of happiness, sadness, fear, surprise, anger, and disgust were recognized more accurately than static displays of the white spots at the apex of the expressions. This indicated that facial motion, in the absence of information about the shape and position of facial features, is informative about these basic emotions. Normally illuminated dynamic displays of these expressions, however, were recognized more accurately than displays of moving spots. The relative effectiveness of upper and lower facial areas for the recognition of these six emotions was also investigated using normally illuminated and spots-only displays. In both instances the results indicated that different facial regions are more informative for different emotions. The movement patterns characterizing the various emotional expressions, as well as common confusions between emotions, are also discussed.