EMG activity in response to static and dynamic facial expressions

K. Rymarczyk ⁎, Cezary Biele, Anna Grabowska, Henryk Majczynski

Nencki Institute of Experimental Biology, Department of Neurophysiology, Warsaw, Poland
Warsaw School of Social Sciences and Humanities, Department of Experimental Neuropsychology, Warsaw, Poland
Received 14 April 2010
Received in revised form 4 November 2010
Accepted 5 November 2010
Available online 11 November 2010
The EMG activity associated with static and dynamic facial expressions (morphs with happy or angry emotions) was compared. We hypothesized that dynamic faces would (a) enhance facial muscular reactions and (b) evoke higher intensity ratings.
Our analysis showed that dynamic expressions were rated as more intense than static ones. Subjects reacted spontaneously and rapidly to happy faces with increased zygomaticus major EMG activity and decreased corrugator supercilii EMG activity, showing greater changes in response to dynamic than to static stimuli in both muscles. In contrast, angry faces evoked no alteration of EMG activity in the zygomaticus muscle and only small changes in the corrugator muscle EMG, and there was no difference between the responses to static and dynamic stimuli. It may be concluded that the dynamic property facilitates processing of facial expressions of emotion.
© 2010 Elsevier B.V. All rights reserved.
1. Introduction
Understanding the emotions of others is crucial for appropriate
social communication. The face is believed to be the most important
channel of emotional expression in humans (Mehrabian, 1981). Most
research on the importance of facial expressions for social interaction
has been conducted using static faces as stimuli. However, the
importance of dynamic properties of facial expression has recently
been emphasized in the psychological literature. Some studies have
shown that dynamic presentation of facial expressions improves the
recognition of emotional content (Harwood et al., 1999; Kamachi
et al., 2001; Sato and Yoshikawa, 2004; Wehrle et al., 2000), and also
enhances emotional arousal (Biele and Grabowska, 2006; Yoshikawa
and Sato, 2006). Despite the use of different kinds of stimuli, such as the
point-light technique (Bassili, 1979), computer-generated schematic
movies (Wehrle et al., 2000; Weyers et al., 2006), computer-generated morphed animations (Biele and Grabowska, 2006; Sato
et al., 2008) and natural movies (Fujimura et al., 2010; Harwood et al.,
1999), the majority of studies have highlighted the importance of
dynamics in the evaluation of emotional expression.
It is well known that humans react to emotional facial expressions
with specific, congruent facial muscle activity (facial mimicry), which
can be reliably measured by electromyography (EMG; e.g. Dimberg,
1982; Larsen et al., 2003). For example, pictures of angry facial
expressions evoke increased M. corrugator supercilii activity, while
pictures of happy facial expressions increase M. zygomaticus major
activity and decrease M. corrugator supercilii activity. These facial
muscular reactions appear to be spontaneous and automatic (Dimberg and Thunberg, 1998).
Two recent studies have shown that presentations of dynamic facial expressions evoke stronger EMG responses than static ones, although the results are inconsistent. For example, Weyers et al. (2006) saw a stronger facial reaction (EMG) to stimulation by dynamic rather than static happy facial expressions of avatars (computer-synthesized faces), with increased activity of M. zygomaticus major and decreased activity of M. corrugator supercilii. For angry facial expressions, they found no significant differences in the level of corrugator supercilii activity evoked by dynamic and static presentations. In the study of Sato et al. (2008), which employed stimuli selected from a video database of facial expressions of emotion prepared by computer-morphing techniques, the dynamic presentation of happy facial expressions also evoked stronger EMG responses in M. zygomaticus major. In contrast to the results of Weyers and coworkers, dynamic angry facial expressions evoked stronger EMG responses in M. corrugator supercilii than static ones, while no differences were found in the responses to static and dynamic happy facial expressions.
In view of the discrepancies in the literature, we designed a study
which utilized a computer-morphing technique to present dynamic
expressions of anger and happiness. This technique was able to
increase the dynamicity of the presented facial emotions by showing a
series of pictures starting from a neutral (0%) facial expression and
progressing to 100% emotional expression within a shorter time
period than in previous studies (Weyers et al., 2006; Sato et al., 2008).
We hypothesized that abrupt changes from neutral to full expression in the presented faces would result in (a) more pronounced facial muscular reactions and (b) higher intensity ratings compared to static ones.

International Journal of Psychophysiology 79 (2011) 330–333

⁎ Corresponding author. Nencki Institute of Experimental Biology, Department of Neurophysiology, Laboratory of Psychophysiology, 3 Pasteur St., 02-093 Warsaw, Poland. Tel.: +48 22 5892 393; fax: +48 22 822 53 42. E-mail address: firstname.lastname@example.org (K. Rymarczyk).
Thirty right-handed subjects participated in the study. All partici-
pants were told that the thermal responses of skin to human faces were
to be measured. The subjects had normal or corrected-to-normal
eyesight. Each participant was paid 20 PLN (~6 EUR) and their informed consent was obtained twice: before the start of testing and after the testing, when they were informed of the true goal of the experiment. During the experiment, the participants were video-taped with a hidden camera (Panasonic NV-MX500) for off-line analysis. The video
tapes were inspected together with the EMG records and 3 subjects
were rejected from the data analysis because of body movement, teeth
grinding, or excessive blinking, which caused artifacts in the EMG
signals. This left twenty-seven subjects (15 female and 12 male, age 23–
26 years) for data analyses. The study was conducted in accordance with
the guidelines for ethical research and the experimental procedures
were approved by the Local Ethics Committee.
2.2. Stimuli
The stimuli were black and white pictures and animations of two
facial emotional expressions, anger and happiness, performed by four
actors (two female and two male), taken from the Montreal Set of
Facial Displays of Emotion (MSFDE; Beaupré and Hess, 2005). For
dynamic expressions, 1.5 s movies were produced from six pictures
representing different states of emotional intensity, i.e. increasing
valence (neutral, 20, 40, 60, 80 and 100%), prepared using computer-
morphing techniques (Hess et al., 1997). The first five pictures in each set (neutral to 80% intensity) were presented for 100 ms each, while the full expression picture (100% of the emotion) was displayed for 1 s, creating the compelling illusion of a short movie clip displaying a dynamic facial expression of either anger or happiness (Biele and Grabowska, 2006). The dynamic presentation was more abrupt than in other studies, in which changes from neutral to full expression lasted 1500 ms (Sato et al., 2008) or 1000 ms (Weyers et al., 2006). According to Achaibou et al. (2008), displaying the apex of the emotional expression for 1 s made the dynamic presentation more natural. Static pictures of the full emotion were also presented for 1.5 s.
Following the experiment with EMG recording, a new set of stimuli was presented and the participants were asked to rate their intensity.
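As a concrete illustration, the dynamic-stimulus frame timing described above can be sketched as follows (a minimal sketch; the function and variable names are our own, not from the study):

```python
def morph_schedule(frame_ms=100, apex_ms=1000):
    """Return (intensity_percent, duration_ms) pairs for one dynamic morph:
    five 100 ms frames from neutral to 80%, then the 100% apex for 1 s."""
    intensities = [0, 20, 40, 60, 80, 100]      # neutral -> full expression
    durations = [frame_ms] * 5 + [apex_ms]
    return list(zip(intensities, durations))

schedule = morph_schedule()
total_ms = sum(d for _, d in schedule)          # 1500 ms per dynamic stimulus
```

The five short frames plus the 1 s apex give each dynamic stimulus the same 1.5 s total duration as a static picture.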
2.3. EMG recordings
Facial EMG activity was measured using bipolar Ag/AgCl miniature electrodes (4 mm in diameter). Electrodes filled with electrode paste (Brain Products GmbH, Munich, Germany) were positioned over the M. zygomaticus major and M. corrugator supercilii on the left side of the face (Cacioppo et al., 1986). The reference electrode (10 mm in diameter) was attached to the forehead. Before the electrodes were attached, the skin was cleaned with alcohol, and a thin coating of electrode paste was applied. This procedure reduced the electrode site resistance to below 5 kΩ. EMG recordings were made using a BrainAmp MR amplifier (Brain Products), filtered with 30 Hz high-pass, 500 Hz low-pass and 50 Hz notch filters, digitized using a 24-bit A/D converter with a sampling rate of 1 kHz, and finally stored on a personal computer (PC). Off-line, the signals were rectified and integrated with a moving average filter with a 50 ms window. The EMG response was measured as the difference between the recorded EMG signal and the mean rectified EMG activity during the last second before stimulus presentation.
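The rectify-smooth-baseline pipeline described above can be sketched as follows (a minimal NumPy sketch; the function name and the synthetic demo signal are our own assumptions, not the authors' code):

```python
import numpy as np

def emg_response(raw_uv, fs=1000, win_ms=50, baseline_ms=1000):
    """Rectify the raw EMG, smooth it with a 50 ms moving average, and
    express the post-stimulus signal relative to the mean rectified
    activity over the 1000 ms pre-stimulus baseline (values in uV)."""
    rect = np.abs(raw_uv)                             # full-wave rectification
    win = int(fs * win_ms / 1000)
    smooth = np.convolve(rect, np.ones(win) / win, mode="same")
    n_base = int(fs * baseline_ms / 1000)
    baseline = rect[:n_base].mean()                   # last second before stimulus
    return smooth[n_base:] - baseline

# Synthetic demo: 1 s of baseline at 1 uV followed by 1.5 s at 3 uV
sig = np.concatenate([np.ones(1000), 3.0 * np.ones(1500)])
resp = emg_response(sig)                              # ~2 uV change mid-response
```

Expressing the response as a change from the pre-stimulus baseline removes between-subject differences in tonic muscle activity.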
2.4. Procedure

Participants were individually seated in a sound-attenuated and electromagnetically-shielded room. The subjects were encouraged to relax and feel comfortable. Images (16 × 16 cm) on a gray
background were presented on a 19-inch LCD computer monitor
positioned 1.5 m in front of the participant. Each stimulus was
preceded by a black fixation dot (15 mm in diameter) shown in the middle of the screen for 0.5 s. Morphs of two actors of opposite sex
were randomly chosen from a group composed of two males and two
females. Following Dimberg (1982), the stimuli were presented in blocks of 6 of the same kind (e.g. 3 static angry female facial expressions and 3 static angry male facial expressions, presented randomly), each one separated by a random interval of 15 to 25 s.
The order of the blocks was randomized, but for each subject, the blocks containing static or dynamic stimuli were presented alternately, as were the blocks containing happy or angry stimuli. In total,
each subject was presented with 24 stimuli: 6 stimuli × anger vs. happiness × static vs. dynamic. The presentation of stimuli was controlled by a PC running Inquisit 2 software (Millisecond Software, Seattle, USA). The first part of the experiment with EMG recording lasted about 10 min. After a 20 min break, subjects rated the intensity
of static and dynamic facial expressions. In this second part of the
experiment, a total of 32 stimuli were presented in random order in
two blocks of 16 (4 actors ×anger vs. happiness ×static vs. dynamic)
as used in our previous study (Biele and Grabowska, 2006). Each
stimulus was presented once in each block. The subjects rated the
pictures and animations for emotional intensity on a 4-point Likert-type scale from 1 (“low intensity”) to 4 (“high intensity”) (for
details see Biele and Grabowska, 2006). Rating was performed after
each presentation using a four button response pad (Cedrus RB-620,
Cedrus Corporation, San Pedro, USA). The second part of the
experiment with ratings lasted 15 min.
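The block structure of the EMG part can be sketched as follows (illustrative only; the condition labels and helper names are our own, and the sketch randomizes block order rather than enforcing the exact alternation described in the text):

```python
import random

def build_emg_session(seed=0):
    """Four blocks of six same-kind stimuli (2 emotions x 2 dynamics = 24
    trials), each stimulus followed by a random 15-25 s interval."""
    rng = random.Random(seed)
    conditions = [(e, d) for e in ("happy", "angry")
                         for d in ("static", "dynamic")]
    rng.shuffle(conditions)                     # block order varies per subject
    trials = []
    for emotion, dynamics in conditions:
        block = [(emotion, dynamics, sex)       # 3 female + 3 male morphs
                 for sex in ("female", "male") for _ in range(3)]
        rng.shuffle(block)                      # presented in random order
        for stim in block:
            isi = rng.uniform(15.0, 25.0)       # inter-stimulus interval (s)
            trials.append((stim, isi))
    return trials

session = build_emg_session()
```

Seeding the generator per subject would make each participant's randomization reproducible while still varying across participants.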
3. Results

3.1. Facial EMG
The data were analyzed using a 2 (Emotion: happy vs. angry) × 2 (Stimulus dynamics: static vs. dynamic) × 3 (Period: means for 0–500 ms, 500–1000 ms, and 1000–1500 ms after stimulus onset) factorial ANOVA. The dependent variable was the EMG activity difference (in μV) between the mean activity within a given period and the mean activity within the 1000 ms period before stimulus presentation. The analysis was performed separately for M. zygomaticus major and for M. corrugator supercilii. Since we expected higher EMG activity for happy than for angry stimuli and for dynamic than for static stimuli in M. zygomaticus major, and higher EMG activity for angry than for happy stimuli and for dynamic than for static stimuli in M. corrugator supercilii, one-tailed comparisons of Emotion and Stimulus dynamics were performed. Because the ANOVA showed no significant main or interaction effects of participants' gender, this factor was not considered further, and data recorded from the men and women were combined for each muscle.
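The dependent variable described above can be sketched as follows (a minimal NumPy sketch; the array layout and names are our own assumptions):

```python
import numpy as np

def period_changes(rect_emg, fs=1000):
    """Given rectified EMG laid out as 1 s of pre-stimulus baseline followed
    by 1.5 s of post-stimulus signal, return the mean change (in uV) for the
    0-500, 500-1000 and 1000-1500 ms periods relative to the baseline mean."""
    baseline = rect_emg[:fs].mean()             # mean over the 1000 ms baseline
    half = fs // 2                              # 500 ms of samples
    return [rect_emg[fs + i * half : fs + (i + 1) * half].mean() - baseline
            for i in range(3)]

# Synthetic demo: flat 2 uV baseline, then 2, 4 and 6 uV in the three periods
demo = np.concatenate([np.full(1000, 2.0),
                       np.full(500, 2.0), np.full(500, 4.0), np.full(500, 6.0)])
changes = period_changes(demo)
```

These three per-trial change scores, averaged per condition, would then enter the 2 × 2 × 3 ANOVA as the Period factor.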
The analysis revealed significant effects of Emotion (F(1,26) = 5.17; p < 0.05) and Period (F(1,26) = 8.15; p < 0.001), and significant interactions of Period × Emotion (F(2,52) = 4.80; p < 0.05) and Emotion × Stimulus dynamics (F(1,26) = 5.00; p < 0.05) (Fig. 1).
Follow-up ANOVAs showed that for angry facial expressions the EMG response was different between the first (0–500 ms) and third (1000–1500 ms) period (F(1,26) = 9.52, p < 0.005), but not between the first and second or the second and third period. For happy stimuli, the EMG response was smaller in the first than in the second period (F(1,26) = 10.7, p < 0.005) and smaller in the first than in the third period (F(1,26) = 8.13, p < 0.05). This analysis also revealed that the EMG response was greater for angry than for happy stimuli in the second (F(1,26) = 6.24, p < 0.01) and the third (F(1,26) = 5.05, p < 0.01) period, but not in the first period.
Further analysis of the interaction between Emotion and Stimulus dynamics showed that for happy facial expressions, EMG activity was greater after the presentation of dynamic stimuli (F(1,26) = 2.92, p < 0.05). However, for angry facial expressions, there was no difference between dynamic and static stimuli (F(1,26) = 2.01, p = 0.08). Moreover, the EMG activity in response to dynamic happy facial expressions was greater than that evoked by dynamic angry facial expressions (F(1,26) = 6.80, p < 0.01). In contrast, the EMG activity in response to static happy and angry stimuli was not significantly different (F(1,26) = 0.4, p = 0.128).
One-sample analysis showed that zygomaticus major activity significantly higher than the baseline occurred for the happy dynamic stimuli in all three periods after stimulus presentation (t(26) = 2.46, p < 0.05, t(26) = 2.78, p < 0.005 and t(26) = 2.38, p < 0.05, respectively), and for the happy static stimuli in the second and third period (t(26) = 2.66, p < 0.01 and t(26) = 3.08, p < 0.005, respectively). EMG activity for this muscle was not different from the baseline in any of the three periods after the presentation of angry static or dynamic stimuli.
In contrast to the findings for the M. zygomaticus major, the ANOVA of M. corrugator supercilii activity demonstrated a significant effect only of Emotion (F(1,26) = 4.31; p < 0.05), and a significant interaction of Period × Stimulus dynamics (F(2,56) = 4.89; p < 0.01) (Fig. 2).
Follow-up analysis of the Period × Stimulus dynamics interaction revealed that there was no difference between EMG activity in response to static and dynamic stimuli in the first (F(1,26) = 0.96, p = 0.16) and second (F(1,26) = 0.37, p = 0.27) period after stimulus presentation, but in the third period, the EMG activity in response to dynamic stimuli decreased significantly in comparison with the response to static stimuli (F(1,26) = 3.53, p < 0.05). To investigate the effect of stimulus dynamics, we analyzed the muscle activity in the three periods after stimulus presentation. For static stimuli there was no difference between any pair of periods. For dynamic stimuli there was a difference between the first and third period (F(1,26) = 10.61, p < 0.005), but not between the first and second, nor between the second and third.
One-sample t-tests showed a significant drop in corrugator supercilii activity in response to happy dynamic facial expressions in all three periods after stimulus presentation (t(26) = 2.46, p < 0.05, t(26) = 2.78, p < 0.01, and t(26) = 2.39, p < 0.05, respectively), while a drop in activity compared to the baseline after happy static stimuli occurred only in the second period (t(26) = 2.0, p < 0.05). No changes in the EMG activity of this muscle were observed in any of the analyzed periods in response to the presentation of static or dynamic angry stimuli, apart from an increase in EMG activity in the first period in response to angry dynamic stimuli (t(26) = 1.82, p < 0.05).
3.2. Psychological ratings
For the ratings (4-point Likert-type scale), a 2 (Emotion: happy vs. angry) × 2 (Stimulus dynamics: static vs. dynamic) factorial ANOVA was conducted. This analysis revealed a significant main effect of Stimulus dynamics (F(1,26) = 12.02; p < 0.01): dynamic stimuli were rated as more intense than static stimuli. The mean rating (±SEM) for happy dynamic facial expressions was 2.85 ± 0.04, while for static happy facial expressions it was 2.68 ± 0.045. For angry facial expressions the mean ratings were 2.95 ± 0.05 and 2.7 ± 0.035, respectively. Neither the other main effect (Emotion) nor any of the interactions reached statistical significance.
4. Discussion
In this study we compared the EMG response of two facial muscles
to dynamic expressions of emotion (morphs in which expressions
changed abruptly from neutral to happy or angry) with that to static
expressions (pictures of faces with apex emotional expressions from
dynamic presentations). The recorded data showed that the subjects
reacted spontaneously and rapidly to happy facial expressions with
increased M. zygomaticus major EMG activity and decreased M.
corrugator supercilii EMG activity. In M. zygomaticus major, the
change in the activity was greater in response to dynamic than to
static stimuli, but this was not the case in M. corrugator supercilii. As
expected, angry facial expressions evoked no alterations in the EMG
activity in the M. zygomaticus major, irrespective of the presentation
mode (static or dynamic), but surprisingly this emotion produced
only a minor increase in the mean EMG activity in M. corrugator
supercilii, with no significant difference between the responses to
static and dynamic stimuli. Thus, increased dynamicity of the
presented stimuli did not evoke more pronounced facial muscle
reaction, at least in M. corrugator supercilii response to angry facial
expressions. However, the subjects rated dynamic facial expressions
as more intense than static ones for both types of emotion.
In general, our results concerning M. zygomaticus major are in
agreement with those of previous EMG studies in which the
presentation of happy facial expressions elicited automatic facial
muscular activity, interpretable as facial mimicry (e.g. Dimberg and Thunberg, 1998), and the muscle response was more pronounced when dynamic stimuli were presented (Sato et al., 2008; Weyers et al., 2006).

Fig. 1. Mean change in EMG activity for M. zygomaticus major in response to static and dynamic emotional facial expressions.

Fig. 2. Mean change in EMG activity for M. corrugator supercilii in response to static and dynamic emotional facial expressions.
The results obtained for mean EMG activity of M. corrugator
supercilii in response to happy or angry stimuli are less consistent
with those of earlier studies. We found no difference in the responses
to static or dynamic angry facial expressions. Moreover, the response
to angry facial expressions was generally negligible, since in most of
the analyzed periods after stimulus presentation the evoked EMG
activity was not different from the baseline. The only significant increase in EMG activity was observed in response to dynamic angry facial expressions and occurred in the first period (0–500 ms).
Our results concerning the EMG activity of M. corrugator supercilii are similar to those of Weyers et al. (2006), who failed to observe any significant difference in the mean EMG activity of this muscle in response to static or dynamic angry facial expressions. The lack of an EMG response to angry facial expressions is puzzling, but the stronger response to happy than to angry stimuli supports the notion that angry behavior might be disadvantageous in social interactions (Weyers et al., 2006) and is generally disapproved of in Western cultures (Hess and Bourgeois, 2010). It is also likely that the small or absent EMG response to angry facial expressions is related to the artificial situation of the laboratory environment, where negative stimuli lose their valence (Larsen et al., 2003).
Increased EMG activity in the ﬁrst period (0–500 ms) was also
observed by Dimberg and Thunberg (1998) as a reaction to static
presentations of both angry and happy facial expressions, and other
types of visual stimuli (Dimberg et al., 1998). Such a rapid increase
during the first three 100-ms intervals after stimulus onset may be interpreted as a blink reflex, or as an orienting response (Dimberg and Petterson, 2000). In our experiments, such a response occurred in reaction to dynamic angry facial expressions, but was not observed after the presentation of the three other stimulus types (static and dynamic happy, and static angry facial expressions). It has been suggested that a more rapid response to angry facial expressions may reflect faster processing of negative, threat-related signals (Achaibou et al., 2008). The obtained response pattern may also be an effect of the specificity of the stimuli, since an angry facial expression has a stronger movement signal than other facial expressions (Horstmann and Ansorge, 2009).
The present study was also designed to investigate the effect of
dynamic facial expression on intensity ratings. Participants rated
dynamic stimuli as more intense than static stimuli for both happy
and angry facial expressions. This might indicate that dynamically-
presented facial expressions were perceived as more realistic than
static ones, and were thus rated as more intense. It is also feasible that
the perceived higher intensity of dynamic stimuli could be evoked by
elevated arousal and attention induced by motion (Simons et al.,
1999; Weyers et al., 2006).
In conclusion, the findings of our study confirm the importance of dynamic stimuli in emotional processing. It appears that in specific social situations the property of dynamicity facilitates the processing of facial expressions of emotion.
In future studies it would be informative to examine other static
and dynamic facial expressions. Previous studies have indicated that
the intensity of facial mimicry could be related to emotionality or
individual personality (Achaibou et al., 2008; Sonnby-Borgström,
2002; Vrana and Gross, 2004), so dividing subjects into different groups could make the results more pronounced. It would also be interesting to study EMG activity in response to videos of real facial expressions.
Acknowledgments

This study was supported by grant no. H01F 043 29 from the Polish Ministry of Science and Higher Education to the first author. The comments of the reviewers are also acknowledged.
References
Achaibou, A., Pourtois, G., Schwartz, S., Vuilleumier, P., 2008. Simultaneous recording of
EEG and facial muscle reactions during spontaneous emotional mimicry.
Neuropsychologia 46, 1104–1113.
Bassili, J.N., 1979. Emotion recognition: the role of facial movement and the relative
importance of upper and lower areas of the face. J. Pers. Soc. Psychol. 37,
Beaupré, M.G., Hess, U., 2005. Cross-cultural emotion recognition among Canadian
ethnic groups. J. Cross Cult. Psychol. 26, 355–370.
Biele, C., Grabowska, A., 2006. Sex differences in perception of emotion intensity in
dynamic and static facial expressions. Exp. Brain Res. 171, 1–6.
Cacioppo, J.T., Petty, E.P., Losch, M.E., Kim, H.S., 1986. Electromyographic activity over
facial muscle regions can differentiate the valence and intensity of affective
reactions. J. Pers. Soc. Psychol. 50, 260–268.
Dimberg, U., 1982. Facial reactions to facial expressions. Psychophysiology 19, 643–647.
Dimberg, U., Petterson, M., 2000. Facial reactions to happy and angry facial expressions:
evidence for right hemisphere dominance. Psychophysiology 37, 693–696.
Dimberg, U., Thunberg, M., 1998. Rapid facial reactions to emotional facial expressions.
Scand. J. Psychol. 39, 39–45.
Dimberg, U., Hansson, G., Thunberg, M., 1998. Fear of snakes and facial reactions: a case
of rapid emotional responding. Scand. J. Psychol. 39, 75–80.
Fujimura, T., Sato, W., Suzuki, N., 2010. Facial expression arousal level modulates facial
mimicry. Int. J. Psychophysiol. 76, 88–92.
Harwood, N.K., Hall, L.J., Shinkfield, A.J., 1999. Recognition of facial emotional
expressions from moving and static displays by individuals with mental
retardation. Am. J. Ment. Retard. 104, 270–278.
Hess, U., Blairy, S., Kleck, R., 1997. The intensity of emotional facial expressions and
decoding accuracy. J. Nonverbal Behav. 21, 241–257.
Hess, U., Bourgeois, P., 2010. You smile–I smile: emotion expression in social
interaction. Biol. Psychol. 84, 514–520.
Horstmann, G., Ansorge, U., 2009. Visual search for facial expressions of emotions: a
comparison of dynamic and static faces. Emotion 9, 29–38.
Kamachi, M., Bruce, V., Mukaida, S., Gyoba, J., Yoshikawa, S., Akamatsu, S., 2001.
Dynamic properties inﬂuence the perception of facial expressions. Perception 30,
Larsen, J.T., Norris, C.J., Cacioppo, J.T., 2003. Effects of positive and negative affect on
electromyographic activity over zygomaticus major and corrugator supercilii.
Psychophysiology 40, 776–785.
Mehrabian, A., 1981. Silent messages: implicit communication of emotions and
attitudes. Wadsworth, Belmont, CA.
Sato, W., Fujimura, T., Suzuki, N., 2008. Enhanced facial EMG activity in response to
dynamic facial expressions. Int. J. Psychophysiol. 70, 70–74.
Sato, W., Yoshikawa, S., 2004. The dynamic aspects of emotional facial expressions.
Cogn. Emot. 18, 701–710.
Simons, R.F., Detenber, B.H., Roedema, T.M., Reiss, J.E., 1999. Emotion processing in
three systems: the medium and the message. Psychophysiology 36, 619–627.
Sonnby-Borgström, M., 2002. Automatic mimicry reactions as related to differences in
emotional empathy. Scand. J. Psychol. 43, 433–443.
Vrana, S.R., Gross, D., 2004. Reactions to facial expressions: effects of social context and
speech anxiety on responses to neutral, anger, and joy expressions. Biol. Psychol.
Wehrle, T., Kaiser, S., Schmidt, S., Scherer, K.R., 2000. Studying dynamic models of facial
expression of emotion using synthetic animated faces. J. Pers. Soc. Psychol. 78,
Weyers, P., Mühlberger, A., Hefele, C., Pauli, P., 2006. Electromyographic responses to
static and dynamic avatar emotional facial expressions. Psychophysiology 43,
Yoshikawa, S., Sato, W., 2006. Enhanced perceptual, emotional, and motor processing in
response to dynamic facial expressions of emotion. Jpn Psychol. Res. 48, 213–222.