From the Eye to the Heart: Eye Contact Triggers
Emotion Simulation
Magdalena Rychlowska
Department of Psychology
Clermont Université, France
34 av Carnot
63037 Clermont-Ferrand
rychlowska@wisc.edu
Leah Zinner
Department of Psychology
Oglethorpe University,
4484 Peachtree Rd NE, Atlanta,
GA, 30319
lzinner@oglethorpe.edu
Serban C. Musca
Université Rennes 2, France
CRPCC
Place Recteur Henri Le Moal
35043 Rennes Cedex
serbanclaudiu.musca@uhb.fr
Paula M. Niedenthal
Department of Psychology
University of Wisconsin-Madison
1202 W Johnson St
Madison, WI, 53706-1611
niedenthal@wisc.edu
ABSTRACT
Smiles are complex facial expressions that carry multiple
meanings. Recent literature suggests that deep processing of
smiles via embodied simulation can be triggered by achieved
eye contact. Three studies supported this prediction. In Study
1, participants rated the emotional impact of portraits, which
varied in eye contact and smiling. Smiling portraits that
achieved eye contact were more emotionally impactful than
smiling portraits that did not achieve eye contact. In Study 2,
participants saw photographs of smiles in which eye contact
was manipulated. The same smile of the same individual
caused more positive emotion and higher ratings of authenticity
when eye contact was achieved than when it was not. In Study
3, participants’ facial EMG was recorded. Activity over the
zygomatic major (i.e. smile) muscle was greater when
participants observed smiles that achieved eye contact
compared to smiles that did not. These results support the role
of eye contact as a trigger of embodied simulation. Implications
for human-machine interactions are discussed.
ACM Classification Keywords
H.1.2 Models and Principles: User/Machine Systems—Human
factors; H.5.2 Information Systems: Information Interfaces and
Presentation: User-centered design; H.5.3 Information Systems:
Information Interfaces and Presentation—Synchronous
interaction
General Terms
Experimentation, Human Factors
Keywords
Eye contact, smile, facial expression, embodied simulation
1. INTRODUCTION
There is a road from the eye to the heart that does not go
through the intellect.
-G. K. Chesterton
Understanding the subtle meaning of facial expression is a daily
challenge, and the smile might be the most challenging of
expressions. While it is true that prototypical smiles are
universally recognized as signs of joy [11, 15, 22], suggesting
that this expression is easily interpreted, other research [1, 13]
attests to its complexity.
How do people understand a smile? This question is addressed
in the Simulation of Smiles Model (SIMS), recently proposed
by Niedenthal, Mermillod, Maringer, and Hess [30]. The
present research was conducted in order to test a specific
hypothesis generated by the SIMS, namely that eye contact is a
sufficient trigger for embodied simulation of smiles.
1.1 The Simulation of Smiles (SIMS) Model
The SIMS model integrates social psychological research with
recent findings in neuroscience in order to propose how the
specific meaning of a smile is arrived at. According to the
SIMS, three operations can be used to process smiles:
perceptual analysis (matching the smile to representations of
prototypical smiles), top-down application of beliefs and
stereotypes, and embodied simulation.
Embodied simulation refers to the partial reenactment of a corresponding state in the motor, somatosensory, affective, and reward systems. This reenactment represents the meaning of the expression to the perceiver [17, 10, 29], as if he or she were in the place of the smiling person. The perception of a smile is therefore accompanied by the bodily and affective states associated with the production of this facial expression. In addition to affective state, an important part of the embodied simulation of a smile is facial mimicry. We define facial mimicry as the visible or non-visible use of facial musculature by an observer to imitate another person's facial expression [30].
The important role of facial mimicry was suggested by the findings of Stel and van Knippenberg [37], who showed that inhibiting facial mimicry decreased the speed of judging facial displays as expressing positive or negative emotion. In another study, Maringer et al. [26] showed that inhibition of facial mimicry impaired the distinction between genuine and non-genuine smiles. A recent study by Neal and Chartrand [28] further bolsters this conclusion, showing that amplifying facial mimicry improves one's ability to read others' facial emotions.
Although parts of embodied simulation, such as facial mimicry,
appear to be helpful in forming an accurate understanding of
facial expression, what is less clear are the conditions under
which embodied simulation occurs. According to the SIMS
model, a sufficient though not necessary trigger for embodied
simulation is the achievement of eye contact with the individual
displaying the expression.
1.2 Eye Contact as a Trigger to Simulation
Both developmental research [14, 19, 25] and work on intimacy [21, 34] provide hints about the role of eye contact in embodied simulation of emotion. This role is more explicitly
indicated by the findings of Bavelas, Black, Lemery, and
Mullett [6] on the perception of pain expressions. There, a
confederate faked the experience of pain and expressed the pain
facially. Further, he made eye contact with some of the
participants but not others. Eye contact significantly affected
participants’ reactions: they mimicked the confederate’s
expressions most clearly when eye contact with the confederate
was made. Relatedly, Schrammel and colleagues [35] showed
that participants’ zygomatic major muscle activity was stronger
when viewing happy faces than neutral faces, and, most
importantly, facial expression had an effect only under
conditions of eye contact. These results suggest a close link
between eye contact and facial mimicry.
In the present three studies, our aim was to test the SIMS
model’s specific hypothesis that eye contact is a trigger of
embodied simulation of the smile. The first study relied on
existing portraiture paintings. We selected portraits of subjects
who achieved different degrees of eye contact with the viewer,
and who expressed smiles. Participants saw each portrait twice.
On one exposure the participant viewed the full portrait; on the
other exposure the eyes of the portrait subject were obscured.
The indicator of embodied simulation was the participant’s
rating of the emotional impact of the painting. Since embodied simulation is related to affective change, the more a smile is embodied in the self, the stronger the emotional response the viewer should report to the portrait. If the eye-contact-as-trigger
hypothesis is correct, then the emotional impact of the portrait
should be significantly greater when the eyes are unmasked
versus masked, and this should be particularly true if the viewer
achieves eye contact with the portrait on the unmasked trial. In
contrast, if participants were using a perceptual analysis for
decoding the smile, then seeing the eyes per se would be
important, but level of eye contact would be irrelevant to
personal feelings of emotion.
2. STUDY 1
2.1 Method
2.1.1 Participants
Undergraduates (101 female, 13 male) from two medium-sized universities participated in exchange for course credit. Data from 6 participants were discarded because they were incomplete or because the participants failed to follow instructions.
2.1.2 Stimuli
Paintings were selected from art archive internet sites by a
research assistant who was blind to the hypotheses. Criteria that
guided the selection of potential target portraits included that
the portrait showed a frontal and not profile view, and that the
eyes were clearly visible. Neither portraits of celebrities nor
very famous portraits were included in the final set. The 16
target portraits were selected based on a pilot study involving
39 undergraduate students (27 female, 12 male) from a
medium-sized university. Participants saw 32 smiling portraits
and rated the extent to which they were certain that the subject
of the portrait was actually smiling. Responses were made on
scales from 0 (not at all sure) to 100 (very sure). The 16
portraits selected as targets were those for which the average
ratings of certainty that the displayed expression was a smile
were the highest (M = 73.22, SD = 13.07). Among the 16
targets, the level of eye contact varied substantially (see
examples in Figure 1).
Seventy-two paintings from the 16th through 20th centuries, comprising 56 distractors and 16 target portraits, constituted the final stimulus set (stimuli are available online at https://www.dropbox.com/s/q48il7ti6cse7ui/Study%201.zip). The distractors (portraits, landscapes, and still life works) were included to minimize demand characteristics.
A mask (pattern: small checkerboard, colors: 98, 92, 56 and
181, 188, 146 RGB) obscured the eyes for one presentation of
all 32 portraits (i.e., both target and distractor portraits; Fig. 1,
bottom panel). Four mask sizes (128 by 22 pixels, 158 by 22
pixels, 189 by 45 pixels and 242 by 60 pixels) were used,
depending on the face area proportions. Masks did not
systematically cover any particular portion of the eye area but
always obscured eye gaze, and they were applied randomly to
the landscape and still life paintings.
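For illustration, the following is a minimal sketch (in Python, using the Pillow imaging library) of how a checkerboard mask in the two colors listed above could be generated and pasted over the eye region of a portrait. The file name, checker cell size, and paste coordinates are hypothetical and not taken from the original materials.

from PIL import Image

def checkerboard(width, height, cell=4,
                 color_a=(98, 92, 56), color_b=(181, 188, 146)):
    # Build a small checkerboard image in the two RGB colors used for the masks.
    img = Image.new("RGB", (width, height))
    px = img.load()
    for y in range(height):
        for x in range(width):
            px[x, y] = color_a if ((x // cell) + (y // cell)) % 2 == 0 else color_b
    return img

portrait = Image.open("portrait_01.jpg")   # hypothetical portrait file
mask = checkerboard(128, 22)               # smallest of the four mask sizes (pixels)
portrait.paste(mask, (300, 180))           # assumed eye-region coordinates
portrait.save("portrait_01_masked.jpg")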
2.1.3 Procedure
Participants were tested in pairs, but worked independently at
individual computer stations. They were seated approximately
0.5 m from the screen (20", display resolution: 1280 x 768).
The experiment was programmed in E-Prime Version 1.2
(1996-2006 Psychology Software Tools).
Each of the 72 paintings was presented twice (once masked and
once unmasked) in a random order, with the constraint that one
exposure occurred in the first, and the other in the second half
of the trials. Stimuli were displayed on a black background.
The inter-trial interval was 800 ms, during which participants
saw a black screen.
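The ordering constraint described above can be expressed compactly in code. The sketch below (in Python, rather than the E-Prime software that ran the actual experiment) assigns each painting one masked and one unmasked exposure, one per half-session, and shuffles within halves; the stimulus identifiers are placeholders.

import random

def build_trial_order(painting_ids, seed=None):
    # Each painting appears twice: masked in one half of the session and
    # unmasked in the other, with the order within each half randomized.
    rng = random.Random(seed)
    first_half, second_half = [], []
    for pid in painting_ids:
        masked_first = rng.random() < 0.5
        first_half.append((pid, "masked" if masked_first else "unmasked"))
        second_half.append((pid, "unmasked" if masked_first else "masked"))
    rng.shuffle(first_half)
    rng.shuffle(second_half)
    return first_half + second_half

trials = build_trial_order([f"painting_{i:02d}" for i in range(1, 73)], seed=1)
assert len(trials) == 144   # 72 paintings x 2 exposures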
Figure 1. Portraits achieving eye contact (left) and not achieving eye contact (right), in unmasked (top row) and masked (bottom row) conditions.
For masked and unmasked presentations, target portraits were
accompanied by the question, presented simultaneously at the
bottom of the screen, “How emotional is the impact of the
painting?” Participants responded by positioning a cursor on a
bar ranging from 0 (no emotion) to 100 (a lot of emotion).
Positive emotion was not mentioned in the question in order to
minimize demand characteristics. For half of the distractors, a
filler question appeared and the other half was presented
without a question.
In the second part of the experiment, participants saw the 16
target portraits again. This time they rated the amount of
perceived eye contact (“How much eye contact does the subject
establish with you as the viewer?”) using the scale described
above (cursor bar ranging from 0, no eye contact to 100, a lot of
eye contact). At the end of the session the experimenter
debriefed the participants and probed for suspicion.
2.1.4 Results
We first divided the target portraits into two groups, based on a
median split of the eye contact ratings averaged across subjects:
portraits achieving eye contact and portraits not achieving eye
contact.
Ratings of emotional impact were then submitted to a 2 (mask:
masked vs. unmasked) x 2 (eye contact: achieved or not
achieved) repeated-measures ANOVA. Unsurprisingly, there
was a main effect of mask, F(1,107) = 92.05, p < .001, such that
emotional impact was higher for unmasked (M = 54.02, SD =
16.83) than for masked portraits (M = 42.97, SD = 15.64, d =
0.93). Emotional impact also varied as a function of eye
contact, F(1,107) = 117.80, p < .001, such that portraits that
achieved eye contact had more emotional impact on the
observer than portraits that did not achieve eye contact (M =
53.63, SD = 15.84, M = 43.36, SD = 15.93, d = 1.04).
However, as predicted, mask interacted with eye contact, F(1,107) = 17.76, p < .001, such that the difference between the
emotional impact of masked and unmasked trials was higher for
portraits achieving eye contact (M = 13.09, SD = 12.57) than for
smiles that did not achieve eye contact (M = 9.00, SD = 13.39, d
= 0.41).
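For readers who wish to reproduce this type of analysis, a minimal sketch of the 2 x 2 repeated-measures ANOVA follows, written in Python with statsmodels; the input file and column names are hypothetical, and the original analysis was not run with this software.

import pandas as pd
from statsmodels.stats.anova import AnovaRM

# Expected (hypothetical) columns: participant, mask ("masked"/"unmasked"),
# eye_contact ("achieved"/"not_achieved"), impact (0-100 emotional impact rating).
df = pd.read_csv("study1_impact_ratings.csv")

# One mean rating per participant and design cell, as required by AnovaRM.
cell_means = (df.groupby(["participant", "mask", "eye_contact"], as_index=False)
                ["impact"].mean())

anova = AnovaRM(cell_means, depvar="impact", subject="participant",
                within=["mask", "eye_contact"]).fit()
print(anova)   # F tests for mask, eye contact, and their interaction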
The dichotomization of continuous variables is a controversial practice, which decreases statistical power [7]. We therefore reanalyzed the data using eye contact as a continuous variable. Since participants rated the emotional impact of each of the 16 target portraits twice, impact ratings could not be considered independent. Therefore, we used hierarchical linear modeling (HLM software, version 6.06) [33], with portraits as the level-1 units and participants as the level-2 units. There were a total of 1728 observations. The intercept was allowed to vary randomly. Mask and eye contact were specified as predictors.
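The original multilevel analysis was run in HLM 6.06; an equivalent random-intercept model can be sketched with statsmodels as below. File and column names are hypothetical, and the estimation details differ from those of the HLM software.

import pandas as pd
import statsmodels.formula.api as smf

# Expected (hypothetical) columns: participant, impact (0-100 rating),
# mask (0 = masked, 1 = unmasked), eye_contact (mean perceived eye contact, 0-100).
df = pd.read_csv("study1_impact_long.csv")

# Random intercept per participant; mask, eye contact, and their interaction
# as fixed effects, mirroring the predictors described above.
model = smf.mixedlm("impact ~ mask * eye_contact", data=df, groups=df["participant"])
result = model.fit()
print(result.summary())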
Analysis of the main effects revealed the expected effect of
mask, t(107) = 9.93, p < .001, such that the emotional impact of
unmasked portraits was higher than the impact of masked
portraits. Also, emotional impact significantly increased with
eye contact, t(1726) = 11.18, p < .001. Most importantly, mask
interacted with eye contact, t(1726) = 4.43, p < .001, such that
the difference between masked and unmasked trials was
greatest for portraits achieving high levels of eye contact.
2.1.5 Discussion
Our results are consistent with the hypothesis that eye contact triggers embodied simulation of smiles, as estimated by the reported emotional impact of portraiture paintings. This impact was greater when the subject's eyes were visible than when they were masked. More importantly, the difference was significantly greater when eye contact was achieved. Facial mimicry and the production of a corresponding emotional state are two components of embodied simulation. Our finding complements other results in the literature demonstrating that eye contact is associated with greater facial mimicry [6, 35].
A limitation of Study 1 was that although we experimentally
manipulated whether or not the eyes were visible, we did not
manipulate eye contact. Further, we used one indicator of
simulation – emotional impact. In Study 2 we tried to address
these limitations by manipulating eye contact and using a
different measure of embodied simulation, namely, ratings of
positivity and genuineness of smiles. We were inspired by past
research showing that smiles judged as genuine are related to
greater facial mimicry and positive feelings in the perceiver [12,
36]. If eye contact is a trigger of embodied simulation, ratings
of positivity and genuineness of smiles should be higher under
conditions of achieved eye contact.
3. STUDY 2
3.1 Method
3.1.1 Participants
Forty-one undergraduates (40 female, 1 male) from a medium-sized university took part in exchange for course credit. Data from 4 participants were discarded from further analyses due to failure to follow instructions.
3.1.2 Materials
Seventy-two photographs of smiles were developed for the study. Twelve models (6 female, 6 male) were photographed by a professional photographer in the presence of an expert on facial expressions of emotion. The expert used standard instructions [12] for eliciting Duchenne and non-Duchenne smiles. Each model was photographed smiling with three levels of eye contact: direct gaze (high eye contact), left-averted gaze, and right-averted gaze (see Figure 2).
3.1.3 Procedure
Participants were tested in pairs, but worked independently. They were exposed to each of the 72 photographs (screen size: 20", display resolution: 1280 x 768, picture size: 380 by 475 pixels; stimuli are available online at https://www.dropbox.com/s/wvoead207bhljc9/Study%202.zip) for 1500 ms. Their task was to rate the degree to which they perceived the smile to be genuine, on a scale ranging from 0 (not genuine at all) to 100 (very genuine), and the degree to which they perceived the smile to be positive, on a scale ranging from 0 (not at all positive) to 100 (very positive).
Figure 2. Smile with achieved eye contact and gaze averted to the left/right.
3.1.4 Results
Two one-way ANOVAs were conducted with gaze (eye contact
or averted) as the independent variable, and genuineness and
positivity as the dependent variables. There was a main effect
of gaze on ratings of genuineness such that smiles with eye
contact were judged as more genuine (M = 60.99, SD = 11.21)
than smiles with averted gaze (M = 58.93, SD = 10.08), t(36) =
2.47, p = .018, d = 0.42. This was also true for positivity: smiles
that achieved eye contact were rated as significantly more
positive (M = 64.29, SD = 11.68) than smiles with averted gaze
(M = 60.54, SD = 10.31), t(36) = 4.76, p < .001, d = 0.81.
Mediational analyses indicated that the effect of eye contact on genuineness disappeared when controlling for positivity, F(1,34) = 1.73, p > .1. However, the effect of eye contact on positivity remained significant over and above the differences in ratings of genuineness, F(1,34) = 16.19, p < .001. This pattern is consistent with complete mediation: the increased perceived genuineness of smiles that achieve eye contact was largely driven by the increased feelings of positive emotion generated by such smiles.
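The mediational logic reported above can be illustrated with a simple regression sketch in the spirit of Baron and Kenny. The code below uses hypothetical photograph-level mean ratings; it is not the analysis pipeline used in the study.

import pandas as pd
import statsmodels.formula.api as smf

# Expected (hypothetical) columns: eye_contact (1 = direct gaze, 0 = averted),
# genuineness and positivity (mean 0-100 ratings per photograph).
df = pd.read_csv("study2_ratings.csv")

total = smf.ols("genuineness ~ eye_contact", data=df).fit()
direct = smf.ols("genuineness ~ eye_contact + positivity", data=df).fit()
reverse = smf.ols("positivity ~ eye_contact + genuineness", data=df).fit()

print(total.params["eye_contact"])     # total effect of eye contact on genuineness
print(direct.pvalues["eye_contact"])   # should be non-significant under full mediation
print(reverse.pvalues["eye_contact"])  # effect on positivity over and above genuineness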
3.1.5 Discussion
The present study used an experimental manipulation of eye
contact and found that eye contact was related to higher ratings
of both positivity and genuineness, for both Duchenne and non-
Duchenne smiles. In light of past findings on the extent to
which “genuine” smiles produce physiological, bodily, and
experiential signs of positive affect, we suggest that the present
positivity ratings can be one valid indicator of emotional
simulation. In our experiment ratings of positivity fully
mediated the relationship between eye contact and perceived
genuineness. This result suggests that judgments of the
genuineness of smiles may not be based only on perceptual
features of the smile, but also on the affective experience of the
perceiver.
A limitation of these two studies is that only self-reported indicators of embodied simulation (emotional impact and ratings of positivity) were used. The aim of Study 3 was to address this limitation by adding a measure of facial mimicry.
Participants’ EMG activity was recorded while they were
observing smiles in which eye contact was manipulated. If eye
contact is a sufficient trigger of embodied simulation, smiles
should be mimicked more when eye contact is achieved than
when it is not.
4. STUDY 3
4.1 Method
4.1.1 Participants
A total of 27 female undergraduate students from a medium-sized university participated in the experiment. They were recruited on campus and received €10 as compensation.
4.1.2 Materials
Experimental stimuli were prepared according to the parameters
described in Study 2. This time, participants saw photographs of
6 models (3 female, 3 male) displaying facial expressions
(neutral or smiling) and two levels of eye contact (eye contact
achieved, and averted gaze – no eye contact) for a total of 24
facial stimuli (available online at https://www.dropbox.com/s/he3m6el1mv5lyfe/Study%203.zip).
4.1.3 Procedure
Participants were tested individually. Facial stimuli were
presented on a computer screen (screen size: 17", display
resolution: 1024 x 768, picture size: 760 by 950 pixels) for 8 s.
Each stimulus appeared three times in a random order, with the
constraint that two photographs of the same face never occurred
in succession. The inter-trial interval was 500 ms. Presentations
began with a screen prompting participants to press the space
bar when ready. Participants were told to imagine real interactions with the models shown in the photographs.
Activity of the zygomatic major (ZM) muscle was recorded on the left side of the face, according to established guidelines [16] and using bipolar 10 mm Ag/AgCl surface electrodes filled with SignaGel (Parker Laboratories Inc.). As a pretext for the placement of the electrodes used to record ZM activity, participants were told that their brain waves would be recorded, and a dummy electrode was also placed in the center of the forehead.
The EMG raw signal was measured with the 16 Channel Bio
Amp amplifier (ADInstruments, Inc.), digitized by a 16 bit
analogue-to-digital converter (PowerLab 16/30, ADInstruments,
Inc.), and stored with a sampling rate of 1000 Hz. Data were
filtered with a 10-Hz high-pass filter, a 400-Hz low-pass filter,
and a 50-Hz notch filter.
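A minimal sketch of this preprocessing chain, written with SciPy, is given below: a 10-400 Hz band-pass followed by a 50 Hz notch, applied to a signal sampled at 1000 Hz. The filter order and notch quality factor are assumptions; the paper does not report them.

import numpy as np
from scipy.signal import butter, filtfilt, iirnotch

FS = 1000.0  # sampling rate in Hz

def preprocess_emg(raw):
    # 10-400 Hz band-pass (assumed 4th-order Butterworth), zero-phase filtering.
    b_bp, a_bp = butter(4, [10.0, 400.0], btype="bandpass", fs=FS)
    filtered = filtfilt(b_bp, a_bp, raw)
    # 50 Hz notch to remove mains interference (assumed Q = 30).
    b_n, a_n = iirnotch(50.0, Q=30.0, fs=FS)
    return filtfilt(b_n, a_n, filtered)

emg = preprocess_emg(np.random.randn(8000))   # 8 s of dummy signal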
Next, participants saw the 24 photographs once again and rated
the degree to which they perceived the facial expression to be
positive on a scale ranging from 0 (not at all positive) to 100
(very positive), identical to the procedure used in Study 2. At
the end of the session participants completed a questionnaire
that tested their understanding of the task and probed for
suspicion. These post-experiment responses indicated that the
cover story was persuasive.
4.1.4 Results
4.1.4.1 EMG Activity
The scores of interest were expressed as the difference between the mean activity during the last 500 ms before stimulus onset and the mean activity in the 500-1500 ms window after stimulus onset. EMG data were subjected to a 2 (facial expression: neutral, smile) x 2 (gaze: direct vs. averted) analysis of variance (ANOVA), with both expression and gaze as within-subject factors.
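The scoring and the within-subject ANOVA can be sketched as follows, assuming the change score is the post-onset window mean minus the pre-onset baseline mean and one epoch per trial sampled at 1000 Hz; variable names and the input file are hypothetical.

import numpy as np
import pandas as pd
from statsmodels.stats.anova import AnovaRM

def zm_change_score(epoch, onset):
    # Mean ZM activity 500-1500 ms after onset minus the 500 ms pre-onset baseline.
    baseline = epoch[onset - 500:onset].mean()
    window = epoch[onset + 500:onset + 1500].mean()
    return window - baseline

change = zm_change_score(np.random.randn(9000), onset=1000)  # dummy epoch

# Long-format scores (hypothetical file): participant, expression ("neutral"/"smile"),
# gaze ("direct"/"averted"), zm_change; repeated trials are averaged per cell.
scores = pd.read_csv("study3_zm_scores.csv")
anova = AnovaRM(scores, depvar="zm_change", subject="participant",
                within=["expression", "gaze"], aggregate_func="mean").fit()
print(anova)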
Analysis of the main effects showed a significant main effect of expression, such that ZM activity was higher for smiles than for neutral expressions, F(1,26) = 11.89, p = .002. The interaction between expression and gaze was not significant, F(1,26) = 2.32, p > .1, but post-hoc comparisons showed that smiling photographs achieving eye contact elicited higher ZM activity (M = 49.89 mV, SD = 64.78) than smiling photographs with averted gaze (M = 32.11 mV, SD = 52.50), t(26) = 2.54, p = .017, d = 0.52 (see Figure 3). This difference was not significant for neutral photographs (eye contact: M = 6.04 mV, SD = 33.28; averted gaze: M = 3.63 mV, SD = 42.46), t(26) = 0.47, p > .5, d = 0.10.
4.1.4.2 Ratings of positivity
Positivity scores were subjected to a 2 x 2 analysis of variance with facial expression and gaze as within-subject factors. A significant main effect of facial expression was found, F(1,26) = 547.47, p < .001. Not surprisingly, smiles (M = 83.43, SD = 9.30) were rated as significantly more positive than neutral facial expressions (M = 24.61, SD = 12.84), t(26) = 23.40, p < .001, d = 4.62. Again, the expression-gaze interaction was not significant, F(1,26) = 0.36, p > .5, but post-hoc comparisons showed that ratings of positivity were significantly higher for smiling photographs achieving eye contact (M = 84.93, SD = 8.48) than for smiling photographs with averted gaze (M = 81.93, SD = 11.03), p = .020, d = 0.51. This difference was not significant for neutral photographs (eye contact: M = 25.52, SD = 12.80; averted gaze: M = 23.70, SD = 13.76), t(26) = 1.38, p > .1, d = 0.27.
4.1.5 Discussion
This study used a psychophysiological indicator of embodied
simulation to supplement the self-reported measures used in
Studies 1 and 2. We found that smiles provoked greater zygomatic major activity under conditions of eye contact than under averted gaze. These results are in line with the findings of Bavelas et al. [6], where facial expressions of pain elicited greater mimicry when eye contact was achieved than when it was not. Also, Schrammel et al. [35] showed that smiles of animated virtual characters had an effect on participants' zygomatic activity only if the character turned directly towards the observer (and thus, when eye contact was achieved). At first pass, these results seem to contradict those obtained by Mojzisch, Schilbach, Helmert, Pannasch, Velichkovsky, and Vogeley [27], where participants smiled both in response to characters who made eye contact and to those who were turned away. Note, however, that in that study mean zygomatic activity was (not significantly) higher when virtual characters gazed directly at participants than when they were turned away. It should also be mentioned that only males participated in the research of Mojzisch et al. [27], whereas earlier EMG findings [9] suggest that females show a more pronounced facial mimicry effect than males.
In Study 3, the effect of gaze was not qualified by an interaction with facial expression, as it was in Schrammel et al. [35]. This may be due to the type of stimuli used in the two studies: Schrammel and colleagues used dynamic sequences presenting virtual characters, whereas our participants observed photographs of real persons. Moreover, we specifically manipulated eye contact, while Schrammel et al. [35] varied the character's body orientation. The lack of a significant interaction may also be due to insufficient statistical power. The impact of eye contact on facial mimicry, and its possible moderators, should be investigated in further studies involving more participants.
5. GENERAL DISCUSSION
The present studies were motivated by the prediction [30] that eye contact is a sufficient trigger of embodied simulation of smiles. We used two types of stimuli, portraiture paintings and portrait photographs, and three measures of embodied simulation: emotional impact, smile positivity, and facial EMG. In the first study, portraits that achieved eye contact elicited more emotion than portraits that did not. The second study showed that eye contact increased the perceived positivity and genuineness of smiles. Finally, the third study demonstrated that eye contact is associated with greater imitation of smiles than averted gaze. Although our dependent measures capture only parts of the complex phenomenon of embodied simulation, the findings from these three studies support our prediction and highlight the importance of eye contact in the judgment of smiles. Moreover, these effects of mutual gaze can extend to other facial and bodily expressions [39].
Achieved eye contact is a powerful social signal. When
perceiving direct gaze, people allocate their attentional
resources to the interaction and engage in intensive processing
of their interaction partners’ faces [18]. Eye contact has also
been proposed to be a signal of approach motivation. For
example, Adams and Kleck [2, 3] found that eye contact
increased the recognition accuracy and perceived intensity of
so-called approach-oriented emotions (i.e., anger and
happiness). Such findings are neither completely consistent with, nor contradictory to, the present account. We argue, however, that the effects of eye contact extend beyond mere attention and information, involving emotional experience along with imitation of the interaction partner.
We believe that a deeper understanding of eye contact can inform the design of trustworthy and persuasive robots, helping to solve one of the fundamental questions in building social robots: when is imitation appropriate [8]? Existing research indicates that mimicry can act like "social glue", fostering prosocial attitudes and cooperation [5, 38, 20]. Consequently, the results of the three studies reported here suggest that a robot producing or imitating human facial expressions under conditions of eye contact should elicit stronger emotional responses than a robot that does not achieve eye contact. This is indeed possible, but the situation is more complex than it seems: recent studies have shown not only that people tend to mimic likable interaction partners more [23], but also that being imitated by an outgroup member can have negative consequences and decrease liking [24]. Thus, gaze behavior should vary as a function of the type of robot, with more likable robots achieving more eye contact. On the other hand, referential gaze and head alignment with the object of interest would be more effective in educational contexts [4].
Another important question is whether eye contact from strongly humanlike robots, combined with displays of smiles, will elicit mimicry and positive emotion, or rather feelings of eeriness and discomfort. These questions deserve experimental investigation. We believe that the present research can help in designing robots and agents that "invite" motivated, personal processing of facial expressions [31, 32]. This embodied processing of smiles, frowns, or other grimaces can make their impact more visceral and more persuasive.
Figure 3. Mean change of zygomatic activity as a function of facial expression and gaze.

6. ACKNOWLEDGEMENTS
The authors would like to thank Pierre Chausse and Cyril Bernard for their competent programming, and Sophie Monceau, Alexandra Buonanotte, Elena Dujour, and Marie Dejardin for their work as experimenters.
7. REFERENCES
[1] Abe, J.A., Beetham, M., and Izard, C. 2002. What do smiles
mean? An analysis in terms of differential emotions theory.
In An empirical reflection on the smile, M.H. Abel, Ed.
Edwin Mellen Press, Lewiston, NY, 83-110.
[2] Adams, R. and Kleck, R. E. 2003. Perceived gaze direction
and the processing of facial displays of emotion. Psychol.
Sci. 14, 644-647.
DOI= 10.1046/j.0956-7976.2003.psci_1479.x
[3] Adams, R. and Kleck, R. E. 2005. The effects of direct and
averted gaze on the perception of facially communicated
emotion. Emotion. 5, 3-11. DOI= 10.1037/1528-3542.5.1.3
[4] Andrist, S., Pejsa, T., Mutlu B., and Gleicher, M. 2012.
Designing Effective Gaze Mechanisms for Virtual Agents.
In Proceedings of the 30th ACM/SigCHI Conference on
Human Factors in Computing Systems (Austin, TX). CHI
’12. ACM, New York, NY, 705-714.
DOI=10.1145/2207676.2207777
[5] Bailenson, J. N. and Yee, N. 2005. Digital chameleons.
Psychol. Sci. 16, 814-819. DOI=10.1111/j.1467-
9280.2005.01619.x
[6] Bavelas, J. B., Black, A., Lemery, C. R., and Mullett, J.
1986. "I show how you feel": Motor mimicry as a
communicative act. J. Pers. Soc. Psychol. 50, 322-329.
DOI=10.1037/0022-3514.50.2.322
[7] Brauer, M. 2002. L'analyse des variables indépendantes continues et catégorielles: alternatives à la dichotomisation. An. Psychol. 102(3), 449-484. DOI=10.3406/psy.2002.29602
[8] Breazeal, C., and Scassellati, B. 2002. Robots that imitate
humans. Trends Cogn. Sci. 6, 481-487.
DOI=10.1016/S1364-6613(02)02016-8
[9] Dimberg, U. and Lundqvist, L.-O. 1990. Gender differences in facial reactions to facial expressions. Biol. Psychol. 30, 151-159. DOI=10.1016/0301-0511(90)90024-Q
[10] Decety, J. and Sommerville, J. A. 2003. Shared
representations between self and other: A social cognitive
neuroscience view. Trends Cogn. Sci. 7, 527-533.
DOI=10.1016/j.tics.2003.10.004
[11] Ekman, P. 1994. Strong evidence for universals in facial
expression: A reply to Russell's mistaken critique. Psychol.
Bull. 115, 268-287. DOI=10.1037/0033-2909.115.2.268
[12] Ekman, P. and Davidson, R. 1993. Voluntary smiling
changes regional brain activity. Psychol. Sci. 4, 342-345.
DOI=10.1111/j.1467-9280.1993.tb00576.x
[13] Ekman, P. and Friesen, W. V. 1982. Felt, false and
miserable smiles. J. Nonverbal Behav. 6, 238-252.
[14] Farroni T., Csibra G., Simion, F., and Johnson, M. H. 2002.
Eye contact detection in humans from birth. P. Natl. Acad.
Sci. USA. 99, 9602-9605.
DOI=10.1073/pnas.152159999
[15] Frank, M. and Stennett, J. 2001. The forced-choice
paradigm and the perception of facial expression of
emotion. J. Pers. Soc. Psychol. 80, 75-85.
DOI=10.1037/0022-3514.80.1.75
[16] Fridlund, A. J. and Cacioppo, J. T. 1986. Guidelines for
human electromyographic research. Psychophysiology. 23,
567-89. DOI=10.1111/j.1469-8986.1986.tb00676.x
[17] Gallese, V. 2003. The roots of empathy: The shared
manifold hypothesis and the neural basis of
intersubjectivity. Psychopathology. 36, 171-180.
DOI=10.1159/000072786
[18] George, N. and Conty, L. 2008. Facing the gaze of others.
Neurophysiol. Clin. 38, 197-207.
DOI=10.1016/j.neucli.2008.03.001
[19] Hains, S. M. J. and Muir, D. W. 1996. Infant sensitivity to
adult eye direction. Child Dev. 67, 1940–1951.
DOI=10.1111/j.1467-8624.1996.tb01836.x
[20] Heyes, C. in press. What can imitation do for cooperation? In Signalling, Commitment and Emotion, B. Calcott, R. Joyce, and K. Sterelny, Eds. MIT Press, Cambridge, MA.
[21] Iizuka, Y. 1992. Eye contact in dating couples and
unacquainted couples. Percept. Motor Skills. 75, 457-461.
DOI=10.2466/pms.1992.75.2.457
[22] Izard, C. 1971. The face of emotion. Appleton-Century-
Crofts, New York, NY.
[23] Likowski, K.U., Mühlberger, A., Seibt, B., Pauli, P., and
Weyers, P. 2008. Modulation of facial mimicry by attitudes.
J. Exp. Soc. Psychol. 44, 1065-1072. DOI=
10.1016/j.jesp.2007.10.007
[24] Likowski, K.U., Schubert, T.W., Fleischmann, B.,
Landgraf, J., and Volk, A. Submitted. Positive effects of
mimicry are limited to the ingroup.
[25] Lohaus, A., Keller, H., and Voelker, S. 2001. Relationships
between eye contact, maternal sensitivity, and infant crying.
Int. J. Behav. Dev. 25, 542-548.
DOI=10.1080/01650250042000528
[26] Maringer, M., Krumhuber, E. G., Fischer, A. H., and Niedenthal, P. M. 2011. Beyond smile dynamics: mimicry and beliefs in judgments of smiles. Emotion. 11, 181-187. DOI=10.1037/a0022596
[27] Mojzisch, A., Schilbach, L., Helmert, J., Pannasch, S.,
Velichkovsky, B. M., and Vogeley, K. 2006. The effects of
self-involvement on attention, arousal, and facial expression
during social interaction with virtual others: A
psychophysiological study. Soc. Neurosci. 1, 184-195.
DOI= 10.1080/17470910600985621
[28] Neal, D. and Chartrand, T. 2011. Embodied Emotion
Perception: Amplifying and Dampening Facial Feedback
Modulates Emotion Perception Accuracy. Soc. Psychol.
Person. Sci. DOI=10.1177/1948550611406138
[29] Niedenthal, P.M. 2007. Embodying Emotion. Science. 316,
1002-1005. DOI=10.1126/science.1136930
[30] Niedenthal, P. M., Mermillod, M., Maringer, M., and Hess,
U. 2010. The Simulation of Smiles (SIMS) Model:
Embodied simulation and the meaning of facial expression.
Behav. Brain Sci. 33, 417-480.
DOI=10.1017/S0140525X10000865
[31] Pitcher, D., Garrido, L., Walsh, V., and Duchaine, B. 2008.
TMS disrupts the perception and embodiment of facial
expressions. J. Neurosci. 28, 8929-8933.
DOI=10.1523/JNEUROSCI.1450-08.2008
[32] Pourtois, G., Sander, D., Andres, M., Grandjean, D.,
Reveret, L., Olivier, E., and Vuilleumier, P. 2004.
Dissociable roles of the human somatosensory and superior
temporal cortices for processing social face signals. Eur. J.
Neurosci. 20, 3507-3515. DOI= 10.1111/j.1460-
9568.2004.03794.x
[33] Raudenbush, S. W., Bryk, A., Cheong, Y. F., and Congdon,
R. 2004. HLM 6: Hierarchical linear and nonlinear
modeling. Scientific Software International, Chicago.
[34] Russo, N. 1975. Eye contact, interpersonal distance, and the
equilibrium theory. J. Pers. Soc. Psychol. 31, 497-502.
DOI=10.1037/h0076476
[35] Schrammel, F., Pannasch, S., Graupner, S.-T., Mojzisch, A., and Velichkovsky, B. M. 2009. Virtual friend or threat? The effects of facial expression and gaze interaction on psychophysiological responses and emotional experience. Psychophysiology. 46, 922-931. DOI=10.1111/j.1469-8986.2009.00831.x
[36] Soussignan, R. 2002. Duchenne smile, emotional
experience, and autonomic reactivity: A test of the facial
feedback hypothesis. Emotion. 2, 52-74.
DOI= 10.1037/1528-3542.2.1.52
[37] Stel, M. and van Knippenberg, A. 2008. The role of facial mimicry in the recognition of affect. Psychol. Sci. 19, 984-985. DOI=10.1111/j.1467-9280.2008.02188.x
[38] van Baaren, R., Janssen, L., Chartrand, T.L., and
Dijksterhuis, A. 2009. Where is the love? The social aspects
of mimicry. Phil. Trans. R. Soc. B. 364, 2381-2389. DOI=
10.1098/rstb.2009.0057
[39] Wang, Y., Newport, R., and Hamilton, A. F. 2011. Eye
contact enhances mimicry of intransitive hand movements.
Biol. Lett. 23, 7-10. DOI=10.1098/rsbl.2010.0279