Gender effects in pain detection: Speed and accuracy in decoding female
and male pain expressions
Paolo Riva *, Simona Sacchi, Lorenzo Montali, Alessandra Frigerio
University of Milano-Bicocca, Italy
* Corresponding author. Address: Department of Psychology, University of Milano-Bicocca, 1, Piazza Ateneo Nuovo, 20126 Milano, Italy. Tel.: +39 02 6448 3775; fax: +39 02 6448 3706. E-mail address: paolo.riva1@unimib.it (P. Riva).
article info
Article history:
Received 17 July 2010
Received in revised form 4 February 2011
Accepted 24 February 2011
Available online 23 March 2011
Keywords:
Pain expressions
Emotional expressions
Gender differences
Face perception
Implicit measures
abstract
The ability to detect facial expressions of pain is crucial in eliciting prosocial behaviors towards the
individual experiencing pain. Previous studies have shown that the sufferers’ gender can affect the
observers’ explicit judgment of the pain face, thus suggesting its possible influence on pain decoding.
The present study investigates whether the sufferer’s gender affects the observer’s reflexive or implicit
detection of facial expression of pain. More specifically, we used implicit measures to test whether
observers detect pained expression more quickly or accurately on male or female faces. In three
experimental studies, we devised a set of stimuli using computer-generated faces. In Experiment 1,
prototypical female and male avatars with different facial expressions (pain, anger, disgust, and neu-
tral) were displayed, while subjects’ (N= 34) accuracy and speed at identifying the expressions were
recorded. In Experiment 2, participants (N= 56) watched videos of the avatars displaying dynamic
expressions and had to quickly and accurately identify each expression. In Experiment 3, participants
(N= 38) were shown an androgynous avatar face showing different expressions and were asked to
identify the face as either female or male. Overall, we found that the target’s gender affected the
observer’s reflexive decoding of the facial expression of pain. Specifically, the results showed that par-
ticipants, regardless of their gender, were slower and less accurate in recognizing pain expressions
(but not other expressions) on female faces. Furthermore, androgynous faces displaying pained
expressions were more likely to be categorized as male than female. Several potential explanations
are discussed.
© 2011 European Federation of International Association for the Study of Pain Chapters. Published by Elsevier Ltd. All rights reserved.
1. Introduction
Since Darwin (1872), the pain face has been considered primar-
ily for its communicative function (Craig, 1992; Hadjistavropoulos
and Craig, 2004; Prkachin, 2009). Indeed, it often seems advanta-
geous for the sufferer to have the chance to translate a distressing
internal state into a message that can be perceived by someone in
the environment (Prkachin et al., 1994; Simon et al., 2008). Re-
search has found that the specific pain-related facial movements
(Prkachin and Solomon, 2008) are consistent across genders (Kunz
et al., 2006), ages (Hadjistavropoulos, 2005) and cognitive abilities
(Kunz et al., 2007).
To be effective in its communicative function, the pain face,
once encoded, needs to be detected by someone in the environ-
ment. The facial expression of pain is highly salient to observers
(Craig, 1992); it prompts them to provide support for the suf-
ferer or to protect themselves from a threat (Williams, 2002;
Yamada and Decety, 2009). Furthermore, observers need to
identify the pain face correctly and quickly in order to provide
rapid support or to receive immediate signals of a possible
alarm. Researchers have examined observers’ ability to discrim-
inate the facial expression of pain from expressions of other ba-
sic emotions (Simon et al., 2008). However, past research has
also shown variance and errors in the observer’s ability to de-
tect pain (Kappesser and Williams, 2002; Prkachin and Craig,
1994), suggesting that further investigation of the variables that
can bias the ability to detect the facial pain expression is
needed.
Several studies indicate that the sufferer’s social characteristics,
including gender, ethnicity and age, can affect the observers’
perception of pain. For instance, pain may be under-recognized
in older people and members of ethnically disadvantaged groups
(Horgas and Elliott, 2004; Green et al., 2003). Gender may also af-
fect the process. For instance, Robinson and Wise (2004) found that
observers judged that women undergoing a cold pressor task expe-
rienced more intense pain than men. Hirsh et al. (2008) showed
that computer representations of female faces were rated as suffer-
ing more intense pain than representations of male faces. These
findings related to observers’ social judgment are consistent with
pain-related gender role stereotypes (Hoffmann and Tarzian,
2001), and they suggest that gender can bias the observer’s decod-
ing of a target face.
However, studies have shown that explicit and reflective social
judgment might diverge from implicit, or reflexive, cognitive pro-
cesses and associations (Greenwald and Nosek, 2008; Sloman,
1996). These latter are faster, they take place automatically and
unintentionally, and they are less influenced by social desirability.
In real life, facial pain cues are often received accidentally, and
behavioral responses can occur under uncertainty or time pres-
sure; under these conditions, observers are more likely to rely on
automatic and implicit processes than controlled reasoning (Tait
et al., 2009). Therefore, implicit methods can assess the involun-
tary effects of the target’s gender on pain detection while bypass-
ing explicit reasoning and social desirability concerns. Consistent
with these premises, the present study aimed to examine, using
implicit measures, whether the sufferer’s gender affects the obser-
ver’s speed and accuracy in detecting otherwise identical facial
expressions of pain.
2. Experiment 1
In Experiment 1, we investigated whether participants de-
tected facial expression more quickly and/or accurately in female
or male faces. Furthermore, we tested the distinctness of the
interaction between gender and the pain face versus other
threat-relevant facial expressions. Anger and disgust were cho-
sen as control stimuli because of their negative valence and their
threat-relevant nature (Kappesser and Williams, 2002; Williams,
2002).
As already mentioned, implicit methodology was used in this
and the following experiments. Implicit methods are appropriate
in cases in which some sort of knowledge or schema influences
an observer’s cognition or behavior, but the observer might not
be aware of that knowledge and its influence. This methodological
choice was preferred because of the known role of automatic cog-
nitive processes in pain decoding. Tait and colleagues (2009) ar-
gued that observers often perceive sufferers’ pain under
conditions of uncertainty. This phenomenon fosters the use of an intuitive/heuristic cognitive system (also known as "System 1"),
which is fast, effortless and automatic, and which can ultimately
bias the observer’s ability to decode pain expressions. Considering
the role of automatic cognitive processes in pain judgment, we re-
corded observers’ reaction times and accuracy to determine
whether an association existed between the targets’ gender and
the observers’ perceptions of pain expressions. Through the accu-
racy scores, we examined the interaction between gender and
the observer’s confusion of the pain face with the other negative
emotional expressions (i.e., misinterpreting pain as disgust or dis-
gust as pain).
2.1. Method
2.1.1. Participants
Participants were 34 undergraduate students (24 females, 10
males; mean age = 25 ± 5.6) at the University of Milano-Bicocca.
They received class credit for participating. All participants were
Caucasian with normal or corrected-to-normal vision.
2.1.2. Stimuli and design
We adopted computer-generated avatar faces to present iden-
tical pain expressions and precisely control the expressive inten-
sity of each face, in line with previous studies (Hirsh et al., 2008,
2009). A series of 400 × 400 pixel computer avatars was gener-
ated. We used FaceGen 3.1, a computer software program that
allows the user to manipulate the facial movements of an avatar
along with other features, such as its gender, age and ethnic
group. Using this tool, we generated Caucasian faces that were
identical to each other except for their gender and their facial
expressions, thus controlling all the other possible confounding
variables (e.g., the degree of facial attractiveness, the gaze direc-
tion, facial asymmetry, ethnicity and general facial morphology).
Moreover, using computer-generated faces provided us with a
unique way to objectively match the faces’ intensity of expres-
sion; the exact same expression (in terms of the actions involved
and intensity of each movement) could be displayed on a female
or a male avatar face. FaceGen 3.1 includes a series of pre-pro-
grammed slides that allow the user to generate expressions of
anger, disgust, fear, happiness, sadness and surprise. A computer
programmer was hired to add a pain slide (see Fig. S1, see the
online version at 10.1016/j.ejpain.2011.02.006) that included
the facial movements, or action units (AUs), involved in the
pain face (AUs: 4, 6, 7, 10, and 43). The exclusive presence of
these AUs and the specificity of the resulting pain face compared
with other basic emotions were confirmed by three trained
judges, who were blind to the purpose of the research, using
the Facial Action Coding System (FACS) (Ekman and Friesen,
1978). A subsequent study was conducted to determine whether
naïve observers perceived the new slide’s expression as pained.
Thirty-one participants (58% females, mean age = 23 ± 2.43) were
presented with a series of eight avatar faces expressing pain, an-
ger, disgust and neutrality. Participants had to identify each
expression by choosing one out of eight options: fear, anger, dis-
gust, sadness, shame, happiness, pain and neutrality. The results
showed that participants correctly identified the pain expression
generated with FaceGen 3.1 in 80.6% of the cases (50 times out
of 62 cases). This pattern was similar to that for disgust (79% of
correct hits). Anger was correctly identified in 87% of cases,
whereas the neutral expression was correctly identified in 93.5% of the total cases.
The resulting experimental design considered the following factors: 2 (face gender: male vs. female) × 4 (expression: anger, disgust, neutral or pain) × 2 (participant's gender: male vs. female), with the first two factors varied within-subjects and the last one between subjects.
2.1.3. Procedure
We employed a speeded dichotomous decision task in which
individuals were presented with a series of avatar faces on a com-
puter monitor one at a time. Stimuli presentation and response
measurements in this and the subsequent experimental proce-
dures were controlled by the software package e-Prime™
(Schneider et al., 2002). The images were displayed at the center
of a 17-in. color monitor screen on a uniform, black background
with a 75-Hz refresh rate. The level of brightness was consistent
for every image. Participants observed the monitor at a viewing
distance of about 50 cm without head restraint. They were in-
structed that for each trial, they should quickly and accurately
classify avatar facial expressions as pained or "other" by pressing the "D" or "K" key on the keyboard with their left and right
forefinger, respectively. During each trial, a fixation cross ap-
peared on the screen; approximately 500 ms later, an avatar face
was presented. As soon as the participant provided his or her an-
swer, the avatar face disappeared and another fixation cross ap-
peared. Participants completed a training phase composed of 25
trials and then judged 80 trials (10 stimuli for each condition).
After each trial, participants received feedback regarding the cor-
rectness of their responses. The stimuli were presented
randomly.
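To make the trial structure concrete, a minimal sketch of one such classification trial is given below in PsychoPy-style Python. The original experiment was programmed in E-Prime, so this is an illustrative reconstruction rather than the authors' script; the window settings, stimulus file name and helper function are assumptions.

```python
# Illustrative sketch only: the original study used E-Prime, not PsychoPy.
# Window settings, stimulus file names and any timings beyond those reported
# in the text are assumptions.
from psychopy import core, event, visual

win = visual.Window(size=(1024, 768), color="black", units="pix")
fixation = visual.TextStim(win, text="+", color="white")
clock = core.Clock()

def run_trial(image_path):
    """Show a fixation cross for ~500 ms, then an avatar face until the
    participant presses 'd' (pain) or 'k' (other); return response and RT."""
    fixation.draw()
    win.flip()
    core.wait(0.5)                               # approximately 500 ms fixation
    face = visual.ImageStim(win, image=image_path, size=(400, 400))
    face.draw()
    win.flip()
    clock.reset()
    event.clearEvents()
    key, rt = event.waitKeys(keyList=["d", "k"], timeStamped=clock)[0]
    win.flip()                                   # the face disappears on response
    return ("pain" if key == "d" else "other"), rt
```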
2.2. Results
2.2.1. Accuracy
Participants’ answers were classified as either correct (1) or
incorrect (0). We assigned a score of 1 when the pain expression
was correctly detected and when expressions of anger, disgust or
neutrality were perceived as "other"; conversely, we assigned a score of 0 when other expressions were identified as pain or pain was identified as another expression. Overall, participants made 289 errors
(11.89%) (M= 8.50, SD = 4.51). An accuracy index was computed
as the sum of the correct answers.
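As a rough illustration of the scoring just described, the following sketch computes the per-condition accuracy index from trial-level data; the column names and the toy data frame are assumptions, not the authors' files.

```python
# Hedged sketch of the accuracy scoring; column names are assumed.
import pandas as pd

trials = pd.DataFrame({
    "participant": [1, 1, 1, 1],
    "face_gender": ["female", "male", "female", "male"],
    "expression":  ["pain", "pain", "disgust", "neutral"],
    "response":    ["other", "pain", "other", "pain"],    # 'pain' or 'other'
})

# A response is correct when a pain face is called 'pain' and any other
# expression is called 'other'.
is_pain = trials["expression"].eq("pain")
trials["correct"] = ((is_pain & trials["response"].eq("pain"))
                     | (~is_pain & trials["response"].eq("other"))).astype(int)

# Accuracy index: sum of correct answers per participant, face gender and
# expression (0-10 in the actual design, with 10 trials per cell).
accuracy = (trials.groupby(["participant", "face_gender", "expression"])
            ["correct"].sum().reset_index(name="accuracy_index"))
print(accuracy)
```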
Crucially, the 2 (face gender: male vs. female) × 4 (expression: anger, disgust, neutral or pain) × 2 (participant's gender: male vs. female) ANOVA revealed an interaction effect between facial expression and face gender, F(1, 32) = 24.15, p < .001, ηp² = .42. More specifically, the results showed that accuracy in detecting pain expressions significantly increased when pain was shown by a male face compared with a female one; face gender made no difference to observers' accuracy in identifying the other three expressions (see Table 1).
The analysis also yielded a main effect for expression, F(1, 32) = 38.26, p < .001, ηp² = .54. As the post hoc analyses showed, the expression of anger was misperceived (M = 9.65, SD = 0.74) to the same extent as the neutral expression (M = 9.85, SD = 0.55). The participants were less accurate in identifying disgust (M = 8.49, SD = 1.55), which is more similar to pain than the other two expressions (Simon et al., 2008). Participants scored lowest on identifying the expression of pain (M = 7.76, SD = 1.46). In this regard, data on the type of errors participants made revealed that 29.61% of pain expressions were classified as non-pain expressions, while 8.47% of non-pain expressions were classified as pain expressions. These results might arise from the nature of the task, which required that pain be identified specifically, while the expressions of anger, disgust and neutrality could be classified roughly into a "not-pain" category.
The ANOVA did not reveal a main effect for participant’s gender,
F(1, 32) = 1.11, p= .30 or interaction effects between participants’
gender and the other factors, Fs(1, 32) < 2.50, ps > .12.
2.2.2. Latency
After the analysis on the level of answer accuracy, we consid-
ered the latency time between the stimuli presentation and the
participants’ answer, for correct answers only. Reaction times
(RT) were filtered at ±2 standard deviations (calculated for each condition). This procedure eliminated 5.51% of the total responses.
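A minimal sketch of this trimming procedure, assuming trial-level data with the column names shown (these are not the authors' variable names), could look as follows.

```python
# Hedged sketch: keep correct trials only and drop reaction times more than
# 2 standard deviations away from their condition (face gender x expression)
# mean. Column names are assumptions.
import pandas as pd

def trim_rts(trials: pd.DataFrame, sd_cutoff: float = 2.0) -> pd.DataFrame:
    correct = trials[trials["correct"] == 1].copy()
    grouped = correct.groupby(["face_gender", "expression"])["rt"]
    z = (correct["rt"] - grouped.transform("mean")) / grouped.transform("std")
    return correct[z.abs() <= sd_cutoff]
```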
Consistent with the analysis on accuracy, statistics revealed a
significant interaction effect between expression and face gender,
F(1, 26) = 3.29, p < .03, ηp² = .11. As Table 2 shows, the face gender
did not affect the participants’ RT in classifying anger and disgust;
however, when presented with a neutral expression, participants
were faster in answering when the avatar had a female appearance
than when it was male; conversely, when presented with a pain
expression, participants were faster in identifying the stimulus
when the avatar had a male than female appearance (LSD test:
ps < .05).
The statistical analysis did not reveal a main effect of partici-
pants’ gender, F(1, 26) = 3.26, p= .08: in decoding facial expres-
sions, female participants (M= 871.01, SD = 208.09) were as fast
as males (M = 1004.04, SD = 269.39). Moreover, participants' gender interacted with face gender, F(1, 26) = 5.72, p = .02, ηp² = .18: fe-
male participants (M = 899.32, SD = 225.36) were as fast as male
participants (M= 978.83, SD = 224.15) when they had to identify
male expressions but male participants (M = 1029.25,
SD = 283.67) were slower than female participants (M = 842.69,
SD = 191.33) when they were asked to decode expressions showed
by female faces. The type of expression shown by the faces significantly affected the participants' RT, F(1, 26) = 14.16, p < .001, ηp² = .35: participants were relatively faster in classifying neutral
expressions (M= 849.21, SD = 202.69) and relatively slower in clas-
sifying disgust expressions (M = 1054.93, SD = 285.15). The analysis did not yield a main effect of avatar face gender, F(1, 26) = .02, p = .89.
The three-way interaction effect among participants’ gender,
avatar gender and expression was not significant, F(1, 26) = 1.85,
p> .18.
Overall, Experiment 1 provided some preliminary evidence that
pain was more salient in male faces at the reflexive, implicit level.
Participants detected pain on male faces more accurately than on
female faces. Furthermore, they identified the pain face more
quickly when the avatar had a male face than when it had a female
one. These results were specific to the pain face, as shown by the
analyses of the other expressions.
3. Experiment 2
In Experiment 2, we aimed to replicate the findings of Experi-
ment 1 using dynamic stimuli instead of static facial displays. Spe-
cifically, Experiment 2 was meant to investigate the threshold at
which observers could correctly identify a pain expression as a
function of the gender of the face displaying it. Based on the find-
ings of Experiment 1, we expected that female faces needed to dis-
play higher magnitudes of pain to be detected by observers.
3.1. Method
3.1.1. Participants
Fifty-six Milano-Bicocca university students (33 females; mean
age = 23 ± 2.8) were recruited to participate in a study on the
perception of facial expressions. They received class credit for
participating. All subjects were Caucasian and had normal or cor-
rected-to-normal vision.
3.1.2. Stimuli and design
The participants were shown a series of 16-s dynamic visual stimuli (videos) of male and female avatar faces displaying anger, disgust or pain.
Table 1
Mean (standard deviation) accuracy in decoding each expression when shown by female or male faces. Experiment 1.

Expression   Female face     Male face       Total
Neutral      9.91a (0.38)    9.79a (0.73)    9.85 (0.55)
Anger        9.74a (0.67)    9.56a (0.82)    9.65 (0.74)
Disgust      8.79a (1.47)    8.18a (1.62)    8.49 (1.55)
Pain         6.88a (1.49)    8.65b (1.43)    7.76 (1.46)

Note: Within each row, different letters indicate a statistically significant difference between the male and female conditions, according to the LSD test (p < .001).
Table 2
Mean (standard deviation) latency (in milliseconds) between expression presentation and the answer, for correct answers only. Experiment 1.

Expression   Female face        Male face          Total
Neutral      785.18a (211.02)   848.69b (194.36)   816.94 (202.69)
Anger        925.22a (297.04)   905.91a (205.50)   915.57 (251.27)
Disgust      969.88a (281.03)   1068.23a (289.27)  1019.06 (285.15)
Pain         937.81a (156.44)   866.49b (189.27)   902.15 (212.86)

Note: Within each row, different letters indicate a statistically significant difference between the male and female conditions, according to the LSD test (p < .05).
Each video was created by pooling 100 frames rep-
resenting facial expressions with increasing intensity (the degree
of increase was controlled and matched for each expression using
the FaceGen 3.1 controls). Each video started with a neutral face
and moved progressively through the sequence of 100 frames,
reaching the maximum intensity of the expression after 16 s.
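Assuming the ramp was approximately linear (the text states only that the increase was controlled and matched across expressions), a response latency can be mapped onto the expression intensity visible at that moment, as in the sketch below; the helper function and the example values are illustrative assumptions.

```python
# Hedged sketch: map a response latency onto the proportion of the full
# expression displayed at that moment, assuming a linear 0-to-1 ramp over 16 s.
def intensity_at(latency_s: float, duration_s: float = 16.0) -> float:
    """Approximate FaceGen expression intensity (0-1) shown after latency_s."""
    return min(max(latency_s / duration_s, 0.0), 1.0)

# Example using the mean correct pain latencies later reported in Table 3:
# ~9.83 s for male faces (~61% of full intensity) versus ~11.32 s for female
# faces (~71% of full intensity).
print(round(intensity_at(9.83), 2), round(intensity_at(11.32), 2))
```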
The experimental design considered the following factors: 2 (face gender: male vs. female) × 3 (expression: anger, disgust or pain) × 2 (participant's gender: male vs. female), with the first two factors varied within-subjects and the last one between subjects.
3.1.3. Procedure
The 16-s dynamic visual stimuli were presented on a computer
monitor one at a time in a randomized order. The images were dis-
played at the center of the screen of a 17-in. monitor, with a uni-
form black background and a 75-Hz refresh rate. The level of
brightness was consistent for every image. Participants viewed
the monitor at a distance of about 50 cm without head restraint.
Participants were asked to watch each video and identify the facial
expressions as quickly and accurately as possible. They were in-
structed to press the space bar on the computer keyboard as soon
as they thought they had identified the expression displayed by the
avatar face. The face disappeared when the space bar was pressed,
and the participant had to indicate which of the three expressions
s/he had identified by pressing a designated key. Participants com-
pleted a training phase composed of two trials, then judged 12 tri-
als (two stimuli for each condition). Participants were given
feedback about the correctness of their response after each trial.
3.2. Results
The scores for reaction times and accuracy were subjected to a 2 (face gender: male vs. female) × 3 (expression: anger, disgust or pain) × 2 (participant's gender: male vs. female) ANOVA. Reaction
time was the crucial dependent variable (because it directly re-
flects the detection threshold), and accuracy was not independent
of reaction times (e.g., higher latency corresponds with higher
expression intensity and, consequently, an easier task); however,
we report the analyses of reaction times and accuracy separately
for improved clarity.
3.2.1. Latency
We recorded the time between the beginning of the video and
when the participants pressed the space bar and selected cases
in which participants correctly identified each expression. Reaction
times (RT) were filtered at ±2 standard deviations (calculated for
each condition). This procedure eliminated 2.73% of the total
responses.
Facial gender interacted with the type of expression, F(1, 60) = 10.058, p < .001, ηp² = .251. As shown in Table 3, three
patterns of results were found. The face’s gender did not affect par-
ticipants’ RT for accurately decoding an expression of anger. Partic-
ipants decoded disgust more quickly when it was displayed by a
female face compared with a male face. In contrast, and consistent
with the findings on static facial displays in Experiment 1, partici-
pants identified dynamic pain expressions more quickly on male
avatar faces than female ones.
Neither the face's gender (F(1, 30) = 1.68, p = .204) nor the type of expression (F(1, 60) = .27, p = .762) had a significant main effect on participants' RT.
The statistical analysis did not reveal a main effect for partici-
pants’ gender, F(1, 30) = 1.16, p= .289. No interaction effects were
found between participants’ gender and the other factors,
Fs(1, 30) < 1.52, ps > .22.
3.2.2. Accuracy
Participants’ answers were then classified as either correct (1)
or incorrect (0). Overall, participants made 222 errors (M= 3.96,
SD = 2.41; 33.03% of the total; Table S1, see the online version at
10.1016/j.ejpain.2011.02.006 for the exact type of errors made).
An accuracy index was computed as the sum of the correct
answers.
In line with the results of Experiment 1, a significant interaction
between face gender and expression was present, F(2, 108) = 7.69,
p < .001, ηp² = .12. As shown in Table 4, participants were more
accurate in decoding the expression of anger and disgust on female
rather than male faces. However, the higher accuracy of detecting
facial expression on a female face dropped when avatars displayed
a pain expression.
There was a significant main effect of the face gender on accu-
racy level, F(1, 54) = 25.04, p < .001, ηp² = .32. Post hoc analyses re-
vealed that accuracy was greater when the expression was
displayed by a female (M= 1.47, SD = 0.57) than a male target
(M= 1.17, SD = 0.67).
The analysis also yielded a main effect of expression,
F(2, 108) = 5.14, p = .008, ηp² = .08. More specifically, participants
were less accurate in decoding the expression of pain (M= 1.18,
SD = 0.70) than anger (M= 1.39, SD = 0.65) and disgust (M= 1.41,
SD = 0.79). No significant differences emerged between the two lat-
ter factors.
The ANOVA revealed neither a main effect of participant’s gen-
der, F(1, 54) = 1.49, p= .23, nor interaction effects between partici-
pants’ gender and the other factors, Fs(1, 54) < .58, ps > .55.
In sum, Experiment 2 confirmed Experiment 1’s findings that
the male pain face is more salient than the female pain face at
the implicit level. Results indicated that dynamic expressions of
pain – unlike other negative emotions – were judged as pained
more quickly if they were displayed by a male face. The absence of a face-gender difference in pain accuracy does not contradict the findings for reaction time. Indeed, at the moment participants identified an emotion
in the dynamic display, the mean pain expression intensity dis-
played by female avatars was likely to be higher than that of male
avatars. Thus, we obtained the same accuracy scores for male and
female faces, but the female pain expressions were easier to detect
because they were watched for a longer time and were therefore
more intense.
Table 3
Mean (standard deviation) latency (in milliseconds) between expression presentation and the answer, for correct answers only. Experiment 2.

Expression   Female face           Male face             Total
Anger        10003.98a (500.99)    10724.702a (470.86)   10523.10 (2767.14)
Disgust      9199.50a (428.03)     11458.09b (500.112)   10361.98 (2913.40)
Pain         11318.68a (436.14)    9829.29b (580.06)     10771.91 (2637.84)

Note: Within each row, different letters indicate a statistically significant difference between the male and female conditions, according to the LSD test (p < .001).
Table 4
Mean (standard deviation) accuracy in decoding each expression when shown by female or male faces. Experiment 2.

Expression   Female face    Male face      Total
Anger        1.59a (0.72)   1.18b (0.95)   1.40 (0.65)
Disgust      1.68a (0.74)   1.14b (0.69)   1.41 (0.71)
Pain         1.15a (0.93)   1.20a (0.93)   1.19 (0.68)

Note: Within each row, different letters indicate a statistically significant difference between the male and female conditions, according to the LSD test (p < .001).
4. Experiment 3
Experiment 3 was designed to further test the prediction that
the facial expression of pain and the target’s gender interact at
the implicit level. The findings from Experiments 1 and 2 revealed
that perceivers detect pain on male faces more quickly and accu-
rately than on female faces. Reversing the perspective of these pre-
vious experiments, we predicted that detecting a facial expression
of pain would lead observers to perceive an androgynous target as
more masculine.
4.1. Method
4.1.1. Participants
Participants were 39 undergraduate students (29 females, 10
males; mean age = 25 ± 5.31) at the University of Milano-Bicocca.
They received class credit for participating. All participants were
Caucasian with normal or corrected-to-normal vision.
4.1.2. Stimuli and design
Ten versions of an androgynous avatar face were generated using
FaceGen 3.1. The ambiguous stimuli were created by setting the
gender bar at the midpoint between the male and female poles. A pi-
lot study with 21 participants (56% females, mean age = 22 ± 2.10)
was conducted to test that the adopted avatar face conveyed similar
degrees of perceived masculinity and femininity. Participants rated
slightly different versions of the androgynous face on a scale from 0
(completely masculine) to 10 (completely feminine). For the study,
we selected the avatar faces whose mean scores were not statisti-
cally different from the midpoint of the scale, t(20) = 1.60, p = .13.
The avatars’ facial movements were then manipulated to gener-
ate four different facial expressions: neutrality, anger, disgust and
pain (see Fig. S2, see the online version at 10.1016/j.ej-
pain.2011.02.006). For an in-depth investigation of the hypothesis,
we also manipulated the intensity of each facial expression. The
stimuli differed according to the intensity level displayed: intensity
was categorized as low (corresponding levels of the built-in Face-
Gen 3.1 controls: 0.1, 0.2, 0.3), medium (0.4, 0.5, 0.6, 0.7) or high
(0.8, 0.9, 1). The crucial stimuli set consisted of three images per
condition, resulting in a total of 36 trials. We also included fillers
(12 male avatars and 12 female avatars, varied by expression and
intensity level) to increase the variability of the stimuli shown to
participants.
The number of "female" answers and the latency time were subjected to a 3 (expression: pain, anger or disgust) × 3 (intensity level: low, medium or high) × 2 (participant's gender: male vs. female) ANOVA, with the first two factors varied within-subjects and
the last one between subjects. The factorial design was incomplete
because the neutral expression’s intensity was not manipulable
(the nine neutral faces did not differ by degree of expression
intensity).
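For illustration, the sketch below assembles a trial list matching the numbers described above (27 expressive androgynous images, 9 neutral ones and 24 gendered fillers); the identifiers are illustrative placeholders, not the authors' stimulus names.

```python
# Hedged sketch of the Experiment 3 trial list: 36 crucial androgynous trials
# plus 24 gendered fillers, presented in random order.
import itertools
import random

expressions = ["pain", "anger", "disgust"]
intensities = ["low", "medium", "high"]

crucial = [{"face": "androgynous", "expression": e, "intensity": i, "version": v}
           for e, i, v in itertools.product(expressions, intensities, range(3))]
crucial += [{"face": "androgynous", "expression": "neutral",
             "intensity": None, "version": v} for v in range(9)]

fillers = [{"face": gender, "expression": None, "intensity": None, "version": v}
           for gender in ("male", "female") for v in range(12)]

trials = crucial + fillers      # 36 crucial + 24 filler = 60 trials
random.shuffle(trials)          # random presentation order
```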
4.1.3. Procedure
We employed a speeded dichotomous decision task. Respon-
dents were presented with a series of computer-generated avatar
faces displayed at the center of the screen of a 17-in. monitor, on
a uniform, black background with a 75-Hz refresh rate. The level
of brightness was consistent for every image. Participants viewed
the monitor from a distance of about 50 cm without head restraint.
They were asked to categorize each target as female or male by
pressing two corresponding keys on the computer keyboard using
their left and right forefingers, respectively. The stimuli were pre-
sented randomly, and participants were instructed to respond as
quickly and accurately as possible.
4.2. Results
4.2.1. Gender classification
The gender categorization of androgynous stimuli was ana-
lyzed. We assigned a score of 0 to male answers and 1 to female
answers.
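A brief sketch of how this dependent measure could be tallied from trial-level data is given below; the column names and toy responses are assumptions.

```python
# Hedged sketch: count "female" categorizations per expression x intensity
# cell (0-3, since each cell contained three androgynous images).
import pandas as pd

responses = pd.DataFrame({
    "participant": [1, 1, 1],
    "expression":  ["pain", "pain", "pain"],
    "intensity":   ["low", "low", "low"],
    "response":    ["male", "female", "male"],
})

responses["female"] = (responses["response"] == "female").astype(int)
female_counts = (responses.groupby(["participant", "expression", "intensity"])
                 ["female"].sum().reset_index(name="n_female"))
print(female_counts)   # here: one "female" answer out of three pain/low trials
```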
The analysis yielded a significant main effect for expression,
F(1, 37) = 58.77, p < .001, ηp² = .61 (see Table 5). Supporting our pre-
diction, the lowest number of female answers was associated with
pain expressions (M= 0.37, SD = 0.71), followed by disgust
(M= 0.99, SD = 1.05) and anger expressions (M= 1.84, SD = 1.07).
The analysis also yielded a main effect for expression level of inten-
sity, F(1, 37) = 9.54, p < .01, ηp² = .20. Participants classified the
ambiguous stimuli as female more frequently at low levels of inten-
sity (M= 1.30, SD = 1.02) than at medium (M= 1.02, SD = 0.94) and
high levels of intensity (M= 0.89, SD = 0.88).
In the previous analysis, we did not consider the neutral expres-
sions because their intensity could not be manipulated. However,
we compared the total number of "female" answers by expression
type independent of intensity level using a 4-level within-subject
ANOVA (expression: neutral, anger, disgust or pain). The analysis
confirmed a significant effect for expression type, F(1, 38) = 88.99,
p < .001, ηp² = .70. The ambiguous stimuli showing neutral expres-
sions obtained the highest number of "female" responses (M = 7.22, SD = 2.45) compared with the other three expressions, p < .001 (see Table 5). The numbers of "female" responses to the other three expressions also differed significantly from one another.
The ANOVA did not reveal a main effect for participant gender,
F(1, 37) = 0.02, p= .87 or interaction effects between participant
gender and the other factors, Fs(1, 37) < 2.58, ps > .12.
These results replicate those of Experiments 1 and 2 and show
that decisions about a face’s gender and its emotional expressions
of pain are not independent. Therefore, we found that androgynous faces displaying pain were categorized as masculine rather than feminine more often than identical faces displaying the other negative emotions.
5. General discussion
The detection of pain in others is a highly salient event for
onlookers (Craig, 1992; Williams, 2002). However, this strong sal-
ience can be moderated by the target’s social characteristics. In-
deed, consistent with pain-related gender role stereotypes
(Hoffmann and Tarzian, 2001), early studies suggested that gender can actually affect the observers' explicit judgment of pain on female and male targets (see Hirsh et al., 2008). Yet, past research
has also shown that explicit social judgment might diverge from
implicit, or reflexive, cognitive processes and associations (Green-
wald and Nosek, 2008; Sloman, 1996). Whereas explicit judgments
can be biased by a wide range of factors, such as response biases,
faking, motivation and opportunity (e.g., the motivation to control
prejudice), implicit processes tend to be freer of response factors such as social desirability and faking tendencies (Greenwald et al., 2002). Starting from these premises, in the present study we found that the target's gender affected the observer's reflexive decoding of the facial expression of pain.
Table 5
Mean (standard deviation) number of "female" categorizations by type of expression and intensity level. Experiment 3.

Expression   Sum            Low            Medium         High
Anger        5.52a (3.04)   1.95a (1.05)   1.75a (1.08)   1.82a (1.09)
Disgust      2.98b (2.68)   1.44b (1.16)   0.98b (1.00)   0.56b (0.99)
Pain         1.12c (1.93)   0.51c (0.85)   0.33c (0.72)   0.28c (0.56)
Neutral      7.22d (2.45)   -              -              -

Note: Within each column, different letters indicate statistically significant differences among expressions, according to the LSD test (p < .05).
More specifically, we
found that observers’ ability to detect pain in a female face was
lower than their ability to detect pain in male faces. This tendency
was consistent across three experiments and represented a unique
response to pain expressions compared with a baseline (e.g., neu-
tral expression) or other negative facial expressions (e.g., anger and
disgust).
Overall, the results showed that male pain faces were more eas-
ily processed at the reflexive level. In Experiment 1, participants
confused the pain expression with other expressions to a higher
degree in female faces than male faces. Experiment 2 showed that
participants took longer to correctly identify a dynamic pain
expression on a female face compared with a male face. Because
the dynamic stimuli showed facial expressions of increasing inten-
sity, a more intense pain expression was needed for participants to
correctly identify pain on a female face, whereas lower-intensity
pain expressions were sufficient to detect suffering on male faces.
Finally, Experiment 3 showed that participants tended to perceive
an androgynous face as less feminine when it displayed pain than
other negative emotions.
Although our experimental design did not allow us to identify
the underlying mechanism, several parallel and convergent expla-
nations might be pointed out. Further research should directly
investigate the processes that determine the higher salience of
the male pain face at the implicit level.
First, we can speculate that the observers’ ability to decode pain
faces might depend upon gender differences in pain behavior,
which implies different exposures to male and female pain faces.
Indeed, research has reported significant gender differences in
the perception, experience and expression of pain. A large amount
of clinical literature indicates that women more frequently experi-
ence and are more intensely affected by a large number of pain
syndromes – both chronic and acute – and they describe the pain
as more intense, widespread, and of a greater duration than men
do (Dao and LeResche, 2000; Heitkemper and Jarrett, 2001; Morin
et al., 2000; Robinson et al., 1998; Rollman and Lautenbacher,
2001; Unruh, 1996). Experimental research has found that women
exhibit significantly greater sensitivity to pain and lower pain tol-
erance (Fillingim and Maixner, 1995; Wise et al., 2002). For that
reason, even though men and women do not seem to differ in their
facial configurations in response to pain stimuli (Kunz et al., 2006),
it might be that onlookers are more frequently exposed to facial
pain expressions on female rather than male faces precisely be-
cause women suffer more from a larger variety of pain syndromes.
Studies on pain judgment have reported the systematic tendency
of observers who are more exposed to patients in pain to underes-
timate the pain those patients experience (Marks and Sachar,
1973; Marquié et al., 2003). The commonly referenced socio-cogni-
tive process is thought to result from habituation; that is, the con-
sequence of being repeatedly exposed to others’ suffering might be
a diminished sensitivity to pain in others (Prkachin et al., 2007).
Habituation has received empirical support (Prkachin and Ro-
cha, 2010; Prkachin et al., 2004), although boundary conditions
have been pointed out (i.e., the observer’s accuracy can increase
after spending more time with the person in pain) (Miaskowski
et al., 1997). Taking into account the habituation bias, a greater
exposure to females’ pain expressions might lead to a decreased
sensitivity to pain displayed by women. Indeed, Experiments 1
and 2 showed that participants took longer and needed more in-
tense pain expressions to correctly identify pain on female com-
pared with male faces. Future studies should investigate whether
habituation underpins observers’ impaired ability to detect female
pain compared with male pain. Observers could be experimentally
exposed either to a male or a female face for a specific length of
time, similar to the design used by Prkachin and Rocha (2010), to
determine whether length of exposure produces different out-
comes. However, this possible explanation is inconsistent with
our findings related to anger detection under the control condition.
It is well known that people are more exposed to male expressions
of anger than female; according to the habituation hypothesis, we
would expect greater speed and accuracy in identifying anger on a
female face.
As a potential parallel to the habituation bias (e.g., decreased
sensitivity to female pain faces resulting from greater exposure
to them), top-down inhibition processes might have contributed
to observers’ impaired ability to decode female pain faces. Stereo-
types are known to affect the way observers perceive a target face
(Hugenberg and Bodenhausen, 2004; Hugenberg and Sacco, 2008).
The psychology literature has widely shown that stereotypes bias
the processing of potentially ambiguous information in a stereo-
type-consistent manner across multiple domains (Darley and
Gross, 1983; Duncan, 1976). Consistent with the notion that ex-
pected gender differences in facial expressivity can be prompted
by gender stereotypes, a recent theory of pain underestimation
posits that female gender is an invalidating factor in pain judg-
ment: given the stereotypical views of women as dramatizing,
observers' certainty about a woman's pain might be reduced (Tait
et al., 2009). Accordingly, our study shows higher error rates in
perceiving pain expression in females than in males.
Further evidence of gender influences in observers’ ability to de-
code pain expressions comes from brain research. Simon et al.
(2006) found that the neurological activation of the observer's brain de-
pended upon the gender of the person expressing pain, but not
upon the observer’s gender. In their study, the authors provided
fMRI support for their hypothesis that the "implicit processing of male pain expression triggers an emotional reaction characterized by a threat-related response" (p. 309). Indeed, several activations
induced by male facial displays were significantly decreased when
observers watched female facial displays of pain. Observing male
actors expressing pain activated several areas known for a
threat-related response, like the ventromedial prefrontal cortex,
SII/posterior insula and anterior insula. However, several areas
activated by male facial displays – including somatosensory areas,
the amygdala and the perigenual ACC – registered a significant de-
crease when observers watched female facial displays of pain. The
authors argued that the interaction between the pattern of neural
activation and actor’s gender might be due to the social communi-
cative value of pain conveyed by male and female faces. A male
pain expression may be more strongly linked with potentially
threatening situations than a female pain expression (LeDoux,
2000), activating the fight-flight system in the brain of the obser-
ver. Indeed, previous scholars suggested that pain can foment a
disposition towards aggression (Berkowitz, 1993), and recent
empirical data confirmed that pain increases aggressive tempta-
tions in humans (Riva et al., 2010). Because men are physically lar-
ger and stronger and therefore more dangerous on average than
women, detecting a male in pain might pose a particularly salient threat. Simon et al.'s (2008) results are consistent with our findings that observers are selectively (compared to neutral and anger expressions) more accurate and faster at decoding pain displayed by
male faces. Moreover, Experiment 3 showed that under conditions
of ambiguity (e.g., androgynous faces), participants perceived the
pain expressions as more masculine. Again, this might be the con-
sequence of a pain detection system that is selectively biased to
detect any possibility of a relevant threat in the environment. Thus,
the tendency to perceive a gender-ambiguous pain expression as
masculine may rely upon the higher cost of failing to detect pain
in a male compared with a female. In other words, because the cost
of overdetecting pain in a male face might be lower than the cost of
missing it, the observer's judgment may be biased toward deci-
phering the expression of pain on an androgynous face as mascu-
line (for a discussion of the error management system that might
favor a bias for false alarms over misses, see Haselton and Buss,
2000).
The main advantages of the study were twofold. First, the use of
implicit methods allowed us to assess the involuntary effects of the
target’s gender on pain detection while bypassing social desirabil-
ity issues. Furthermore, in many naturalistic circumstances, pain
cues are unclear and treatment decisions have to be made under
conditions of uncertainty or time pressure. Under these conditions,
decision-makers rely on automatic, heuristic and implicit pro-
cesses rather than on more laborious reasoning (Tait et al., 2009).
Our methodological choice also accounted for seemingly conflict-
ing past results, which were based on explicit methods and indi-
cated that female faces are judged to express greater pain
intensity than male faces (Hirsh et al., 2008). These results are con-
sistent with explicit, stereotypical views that women are more
prone to express their pain (Hoffmann and Tarzian, 2001), yet they
can diverge from the implicit and automatic cognitive processes
involved in pain decoding. Second, the use of computer-generated
avatars allowed us to create stimuli with identical facial expres-
sions but different genders. However, the use of computer-gener-
ated faces might limit the generalizability of the current study’s
findings; although they allowed us to achieve greater experimental
control, their use might have limited the external validity of the
study. Thus, these findings should be interpreted with caution be-
cause we do not yet know the degree to which findings from com-
puterized stimuli are generalizable to human faces. Further
replication with different stimuli and samples is warranted. More-
over, the influence of other social factors, such as age, race and eth-
nicity, should also be investigated.
With regard to observers' gender, neither main nor interaction effects were consistently found across the three experiments. The only analysis that showed an influence of observers' gender revealed that female participants were faster than males in decoding anger, disgust and neutral expressions. These results are in line with the notion that females are generally better than males at basic face perception (McBain et al., 2009; McClure, 2000). However, in line
with previous research (Simon et al., 2006), we did not find any
interaction between observers’ gender and the specific expression
of pain. Nevertheless, failures to find any consistent effect related
to the participants’ sex might be due to a lack of power from an
insufficient number of subjects or to the uneven distribution of
participants’ sex in our samples.
In terms of clinical implications, our findings are in keeping
with evidence that women are less likely to receive treatment for
pain than men (Calderone, 1990; McDonald, 1994; Hoffmann and
Tarzian, 2001). In this sense, observers’ impaired ability to detect
the female pain face could lead them to disregard the sufferer’s
experience and needs and fail to provide adequate care.
6. Conflict of Interest
The authors have no conflict of interest with respect to this
article.
Acknowledgements
We would like to thank Eric D. Wesselmann for comments on
earlier versions of this paper. We also thank Shannon Lapsley
and Shajuan Jackson for proof reading the article.
Appendix A. Supplementary material
Supplementary data associated with this article can be found, in
the online version, at doi:10.1016/j.ejpain.2011.02.006.
References
Berkowitz L. Aggression: its causes, consequences, and control. New York: McGraw-
Hill; 1993.
Calderone JL. The influence of gender on the frequency of pain and sedative
medication administered to postoperative patients. Sex Roles 1990;23:713–25.
Craig KD. The facial expression of pain: better than a thousand words? J Pain
1992;1:153–62.
Dao TT, LeResche L. Gender differences in pain. J Orofacial Pain 2000;14:169–84.
Darley JM, Gross PH. A hypothesis-confirming bias in labeling effects. J Pers Soc
Psychol 1983;44:20–33.
Darwin C. The expression of the emotions in man and animals. New
York: Philosophical Library; 1872.
Duncan BL. Differential social perception and attribution of intergroup violence:
testing the lower limits of stereotyping blacks. J Pers Soc Psychol
1976;34:590–8.
Ekman P, Friesen WV. Facial action coding system: a technique for the
measurement of facial movement. Palo Alto, Calif.: Consulting Psychologists
Press; 1978.
Fillingim RB, Maixner W. Gender differences in the responses to noxious stimuli.
Pain Forum 1995;4:209–21.
Green CR, Anderson KO, Baker TA, Campbell LC, Decker S, Fillingim RB, et al. The
unequal burden of pain: confronting racial and ethnic disparities in pain. Pain
Med 2003;4:277–94.
Greenwald AG, Banaji MR, Rudman LA, Farnham SD, Nosek BA, Mellott DS. A unified
theory of implicit attitudes, stereotypes, self-esteem, and self-concept. Psychol
Rev 2002;109:3–25.
Greenwald AG, Nosek BA. Attitudinal dissociation: what does it mean? In: Petty R,
Fazio RH, Briñol P, editors. Attitudes: insights from the new implicit
measures. Hillsdale, NJ: Lawrence Erlbaum Associates; 2008. p. 65–82.
Hadjistavropoulos T. Assessing pain in older persons with severe limitations in
ability to communicate. In: Gibson S, Weiner D, editors. Pain in older
persons. Seattle: IASP Press; 2005. p. 135–51.
Hadjistavropoulos T, Craig KD. Social influences and the communication of pain. In:
Hadjistavropoulos T, Craig KD, editors. Pain: psychological perspectives. New
York: Erlbaum; 2004. p. 87–112.
Haselton MG, Buss DM. Error management theory: a new perspective on biases in
cross-sex mind reading. J Pers Soc Psychol 2000;78:81–91.
Heitkemper MM, Jarrett M. Gender differences and hormonal modulation in visceral
pain. Curr Pain Headache Rep 2001;5:35–43.
Hirsh AT, Alqudah AF, Stutts LA, Robinson ME. Virtual human technology: capturing
sex, race, and age influences in individual pain decision policies. Pain
2008;140(1):231–8.
Hirsh AT, George SZ, Robinson ME. Pain assessment and treatment disparities: a
virtual human technology investigation. Pain 2009;143(1–2):106–13.
Hoffmann DE, Tarzian AJ. The girl who cried pain: a bias against women in the
treatment of pain. J Law Med Ethics 2001;29:13–27.
Horgas AL, Elliott AF. Pain assessment and management in persons with dementia.
Nurs Clin North Am 2004;39:593–606.
Hugenberg K, Bodenhausen GV. Ambiguity in social categorization. The role of
prejudice and facial affect in race categorization. Psychol Sci 2004;15(5):342–5.
Hugenberg K, Sacco DF. Social categorization and stereotyping: how social
categorization biases person perception and face memory. Soc Pers Psych
Compass 2008;2(2):1052–72.
Kappesser J, Williams ACdC. Pain and negative emotions in the face: judgments by
health care professionals. Pain 2002;99(1–2):197–206.
Kunz M, Gruber A, Lautenbacher S. Sex differences in facial encoding of pain. J Pain
2006;7(12):915–28.
Kunz M, Scharmann S, Hemmeter U, Schepelmann K, Lautenbacher S. The facial
expression of pain in patients with dementia. Pain 2007;133(1–3):221–8.
LeDoux J. Cognitive–emotional interactions: listen to the brain. In: Lane RD, Nadel L,
editors. Cognitive neuroscience of emotion. New York: Oxford University Press;
2000. p. 129–55.
Marks RM, Sachar EJ. Undertreatment of medical inpatients with narcotic
analgesics. Ann Intern Med 1973;78:173–81.
Marquié L, Raufaste E, Lauque D, Mariné C, Ecoiffier M, Sorum P. Pain rating by
patients and physicians: evidence of systematic pain miscalibration. Pain
2003;102:289–96.
McBain R, Norton D, Chen Y. Females excel at basic face perception. Acta Psychol
2009;130(2):168–73.
McClure EB. A meta-analytic review of sex differences in facial expression
processing and their development in infants, children, and adolescents.
Psychol Bull 2000;126:424–53.
McDonald DD. Gender and ethnic stereotyping and narcotic analgesic
administration. Res Nurs Health 1994;14:45–9.
Miaskowski C, Zimmer EF, Barrett KM, Dibble SL, Wallhagen M. Differences in
patients’ and family caregivers’ perceptions of the pain experience influence
patient and caregiver outcomes. Pain 1997;72:217–26.
Morin C, Lund JP, Villarroel T, Clokie CML, Feine JS. Differences between the sexes in
post-surgical pain. Pain 2000;85:79–85.
Prkachin KM. Assessing pain by facial expression: facial expression as nexus. Pain
Res Manage 2009;14(1):53–8.
Prkachin KM, Berzins S, Mercer SR. Encoding and decoding of pain expressions: a
judgment study. Pain 1994;58(2):253–9.
Prkachin KM, Craig KD. Expressing pain: the communication and interpretation of
facial pain signals. J Nonverbal Behav 1994;19:191–205.
Prkachin KM, Mass H, Mercer SR. Effects of exposure on perception of pain
expression. Pain 2004;111(1–2):8–12.
Prkachin KM, Rocha EM. High levels of vicarious exposure bias pain judgments. J
Pain 2010;11(9):904–9.
Prkachin KM, Solomon PE. The structure, reliability and validity of pain
expression: Evidence from patients with shoulder pain. Pain 2008;139(2):
267–74.
Prkachin KM, Solomon PE, Ross J. Underestimation of pain by health-care providers:
towards a model of the process of inferring pain in others. Can J Nurs Res
2007;39:88–106.
Riva P, Wirth JH, Williams KD. The social impact of suffering: physical pain thwarts
social needs. In: Presented at the annual meeting of the midwestern
psychological association, Chicago, IL, May 2010.
Robinson ME, Wise EA. Prior pain experience: influence on the observation of
experimental pain in men and women. J Pain 2004;5:264–9.
Robinson ME, Wise EA, Riley JL, Atchison JA. Sex differences in clinical pain: a multi-
sample study. J Clin Psych Med Settings 1998;5(4):413–24.
Rollman GB, Lautenbacher S. Sex differences in musculoskeletal pain. Clin J Pain
2001;17:20–4.
Simon D, Craig KD, Gosselin F, Belin P, Rainville P. Recognition and
discrimination of prototypical dynamic expressions of pain and emotions.
Pain 2008;135(1–2):55–64.
Simon D, Craig KD, Miltner WHR, Rainville P. Brain responses to dynamic facial
expressions of pain. Pain 2006;126:309–18.
Schneider W, Eschman A, Zuccolotto A. E–Prime reference guide. Pittsburgh,
PA: Psychology Software Tools Inc.; 2002.
Sloman SA. The empirical case for two systems of reasoning. Psychological Bulletin
1996;119:3–22.
Tait RC, Chibnall JT, Kalauokalani D. Provider judgments of patients in pain: seeking
symptom certainty. Pain Med 2009;10:11–34.
Unruh AM. Gender variations in clinical pain experience. Pain 1996;65:123–67.
Williams ACdC. Facial expression of pain: an evolutionary account. Behav Brain Sci
2002;25:439–88.
Wise EA, Price DD, Myers CD, Heft MW, Robinson ME. Gender role expectations of
pain: relationship to experimental pain perception. Pain 2002;96(3):335–42.
Yamada M, Decety J. Unconscious affective processing and empathy: an
investigation of subliminal priming on the detection of painful facial
expressions. Pain 2009;143:71–5.
Table S1
Frequencies and percentages (in parentheses) of hits and errors the participants made
in Experiment 2.
Overall Male target Female target
Pain
Total number of responses 220 111 109
Correct answers 133 (60.4%) 68 (61.2%) 65 (59.6%)
Judged as anger 50 (22.7%) 29 (26.1%) 21 (19.2%)
Judged as disgust 37 (16.8%) 14 (12.6%) 23 (21.1%)
Anger
Total number of responses 223 112 111
Correct answers 184 (82.5%) 94 (83.9%) 90 (81.0%)
Judged as pain 20 (8.9%) 9 (8.0%) 11 (9.9%)
Judged as disgust 19 (8.5%) 9 (8.0%) 10 (9.0%)
Disgust
Total number of responses 220 110 110
Correct answers 133 (60.4%) 67 (60.9%) 66 (60%)
Judged as anger 61 (27.7%) 33 (30.0%) 28 (25.4%)
Judged as pain 26 (11.8%) 10 (9.0%) 16 (14.5%)
Fig. S1. Examples of female and male stimuli used in Experiments 1 and 2 to manipulate each facial expression (from the top: neutral, anger, disgust and pain expression).
Fig. S2. The four androgynous faces adopted in Experiment 3.
... However, such overgeneralization could be particularly relevant in the clinical pain context. Indeed, the pain management literature indicates that pain reported by patients is often met with doubt and skepticism on the part of observers (Blomqvist & Edberg, 2002;Clarke & Iphofen, 2005;Montali et al., 2011), which might lead to an underestimation of the patient's pain (Riva, Rusconi, et al., 2011;Rusconi et al., 2010), especially when displayed by the elderly and women (Blomqvist & Edberg, 2002;Riva, Sacchi, et al., 2011), or ethnic minorities like black individuals (Banaji et al., 2021;Hoffman et al., 2016). In this context, different trust construal features will likely be at stake, ranging from a generalised trust to a more specific trust developed within the patient-carer relationship. ...
... sex). The computer-generated faces were created with FaceGen 3.1 and were extracted from a dataset already used in previous work (Riva, Sacchi, et al., 2011). They were all androgynous, that is, they were meant to equally express male and female visual features. ...
... We added different degrees of expression (i.e., intensity) to increase the variability of the stimulus set and reduce habituation to these artificial faces. The distinctions between levels of emotional intensity were tested in Riva, Rusconi, et al. (2011) and Riva, Sacchi, et al. (2011). However, we did not include this variable as an independent predictor. ...
Article
Full-text available
Past research indicates that patients' reports of pain are often met with skepticism and that observers tend to underestimate patients' pain. The mechanisms behind these biases are not yet fully understood. One relevant domain of inquiry is the interaction between the emotional valence of a stranger's expression and the onlooker's trustworthiness judgment. The emotion overgeneralization hypothesis posits that when facial cues of valence are clear, individuals displaying negative expressions (e.g., disgust) are perceived as less trustworthy than those showing positive facial expressions (e.g., happiness). Accordingly, we hypothesized that facial expressions of pain (like disgust) would be judged more untrustworthy than facial expressions of happiness. In two separate studies, we measured trustworthiness judgments of four different facial expressions (i.e., neutral, happiness, pain, and disgust), displayed by both computer-generated and real faces, via both explicit self-reported ratings (Study 1) and implicit motor trajectories in a trustworthiness categorization task (Study 2). Ratings and categorization findings partly support our hypotheses. Our results reveal for the first time that when judging strangers' facial expressions, both negative expressions were perceived as more untrustworthy than happy expressions. They also indicate that facial expressions of pain are perceived as untrustworthy as disgust expressions, at least for computer-generated faces. These findings are relevant to the clinical setting because they highlight how overgeneralization of emotional facial expressions may subtend an early perceptual bias exerted by the patient's emotional facial cues onto the clinician's cognitive appraisal process.
... In the general field of emotions, it has been proposed that women are better than men at recognizing emotions experienced by others (Hall, 1978; Kret & De Gelder, 2012; McClure, 2000; Sasson et al., 2010; Thayer & Johnsen, 2000; Thompson & Voyer, 2014; Wingenbach et al., 2018; however, see Grimshaw et al., 2004; Palermo & Coltheart, 2004). This has also been suggested for the recognition of facial expressions of pain (Hill & Craig, 2004; Keogh, 2014; Prkachin et al., 2004; however, see Riva et al., 2011; Simon et al., 2006). In addition, women have been shown to be more sensitive to variations in pain expressions than men, which leads to less underestimation bias (Miron-Shatz et al., 2020; Prkachin et al., 2004; Robinson & Wise, 2003). ...
... In addition, one previous study has directly compared the utilization of different kinds of facial stimuli in a Bubbles paradigm and suggests that the results obtained with avatar faces generalize to real faces (Robinson et al., 2014). Also, the use of avatars has been previously validated in different experimental settings and has been shown to give results similar to those obtained with real faces in pain (Hirsh et al., 2009; Lin et al., 2020; Meister et al., 2020; Riva et al., 2011; Tessier et al., 2019; Wandner et al., 2010). It is possible that the use of this type of stimulus, in which action units were varying more systematically, has favored the male strategy, and minimized sex differences in terms of ability in comparison with what would be expected in real life. ...
... Also, motion sensitivity has been found to differ between men and women (Vanston & Strother, 2017). Future research using material featuring human individuals should also consider the impact of the actor's gender profile on sexual differences, because it has been previously demonstrated that the gender of the stimuli could impact the perception of pain (Riva et al., 2011; Simon et al., 2008, 2006). Although most of the results suggest that pain is in general better processed for male faces than for female faces (Coll et al., 2012; Pronina & Rule, 2014; Simon et al., 2006), our analysis suggests that pain expressions were in this case more accurately discriminated in female-looking faces than in male-looking faces (see Supplemental Materials Section 1 for more details). ...
Article
Full-text available
It has been proposed that women are better than men at recognizing emotions and pain experienced by others. They have also been shown to be more sensitive to variations in pain expressions. The objective of the present study was to explore the perceptual basis of these sexual differences by comparing the visual information used by men and women to discriminate between different intensities of pain facial expressions. Using the data-driven Bubbles method, we were able to corroborate the woman advantage in the discrimination of pain intensities that did not appear to be explained by variations in empathic tendencies. In terms of visual strategies, our results do not indicate any qualitative differences in the facial regions used by men and women. However, they suggest that women rely on larger regions of the face that seems to completely mediate their advantage. This utilization of larger clusters could indicate either that women integrate simultaneously and more efficiently information coming from different areas of the face or that they are more flexible in the utilization of the information present in these clusters. Women would then opt for a more holistic or flexible processing of the facial information, while men would rely on a specific yet rigid integration strategy. (PsycInfo Database Record (c) 2022 APA, all rights reserved).
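The Bubbles method mentioned in this abstract reveals, on each trial, only those parts of a face that fall under a set of randomly placed Gaussian apertures; diagnostic facial regions are then identified by relating trial-by-trial accuracy to the apertures' locations. The sketch below shows how such a single-trial mask could be generated; the image size, number of bubbles, and aperture width are illustrative assumptions, not the parameters used in the cited study.

```python
import numpy as np

def bubbles_mask(height=256, width=256, n_bubbles=10, sigma=12.0, rng=None):
    """Rough sketch of a single-trial Bubbles mask: the sum of a few randomly
    placed Gaussian apertures, clipped to [0, 1]. Multiplying a face image by
    this mask reveals only the regions under the apertures."""
    rng = np.random.default_rng() if rng is None else rng
    ys, xs = np.mgrid[0:height, 0:width]
    mask = np.zeros((height, width))
    for _ in range(n_bubbles):
        cy, cx = rng.integers(0, height), rng.integers(0, width)
        mask += np.exp(-((ys - cy) ** 2 + (xs - cx) ** 2) / (2 * sigma ** 2))
    return np.clip(mask, 0.0, 1.0)

# Usage (grayscale image): revealed = face_image * bubbles_mask(*face_image.shape)
# Trial-by-trial accuracy is then regressed onto the mask locations to find
# which facial regions carry the information used for the discrimination.
```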
... Research about recognition of pain revealed that the sender's sex seems to modulate the observer's judgement [8][9][10], with varying patterns of sex differences across studies. To add answers to this question, the sex of the avatars was the individual feature varied in this study, with the exploratory hypothesis that female avatars showing the same intensity and combination of Action Units lead to higher ratings of pain by observers compared to male avatars. ...
... Of note, not all decoding studies looking for sex differences in perception of pain in others have revealed this tendency to see more pain in the face of women (e.g. [9]). ...
... Hirsh et al. [20] and Wandner et al. [23] used virtual human technology to create virtual characters including short videos of computer-generated avatars and brief vignettes for each of them. Riva and his colleagues [9] used static pictures as well as dynamic sequences of avatar faces created with FaceGen that changed progressively from a neutral face to a maximum of the studied emotions (anger, disgust, and pain) to determine the identification threshold for the given states. Our aim was different because we assumed the idea of a universal pain face to be an unrealistic simplification and tried to compare different empirically derived pain faces [7] for their functional equivalence in observers. ...
Article
Full-text available
Objectives The decoding of facial expressions of pain plays a crucial role in pain diagnostics and clinical decision making. For decoding studies, it is necessary to present facial expressions of pain in a flexible and controllable fashion. Computer models (avatars) of human facial expressions of pain allow for systematically manipulating specific facial features. The aim of the present study was to investigate whether avatars can show realistic facial expressions of pain and how the sex of the avatars influences the decoding of pain by human observers.
Methods For that purpose, 40 female (mean age: 23.9 years) and 40 male (mean age: 24.6 years) observers watched 80 short videos showing computer-generated avatars, who presented the five clusters of facial expressions of pain (four active and one stoic cluster) identified by Kunz and Lautenbacher (2014). After each clip, observers were asked to provide ratings for the intensity of pain the avatars seem to experience and the certainty of judgement, i.e. if the shown expression truly represents pain.
Results Results show that three of the four active facial clusters were similarly accepted as valid expressions of pain by the observers whereas only one cluster ("raised eyebrows") was disregarded. The sex of the observed avatars influenced the decoding of pain as indicated by increased intensity and elevated certainty ratings for female avatars.
Conclusions The assumption of different valid facial expressions of pain could be corroborated in avatars, which contradicts the idea of only one uniform pain face. The observers' rating of the avatars' pain was influenced by the avatars' sex, which resembles known observer biases for humans. The use of avatars appeared to be a suitable method in research on the decoding of the facial expression of pain, mirroring closely the known forms of human facial expressions.
... In contrast, the perception of faces showing negative-valence emotions, such as anger and disgust, and of neutral expressions does not seem to be influenced by the sex of the person producing the facial expression. Additionally, error rates are higher when evaluating pain expressions in women than in men (Riva, Sacchi, Montali, & Frigerio, 2011). ...
... The present findings are also consistent with the idea that there is a habituation process towards female expressions of pain, given that women present a greater variety of pain syndromes, and with a top-down inhibition attributed to female stereotypes. In this way, pain faces in men come to be selectively detected as a signal of threat in the environment (Riva et al., 2011). ...
Article
Full-text available
The facial expression of pain can elicit different behavioral reactions. However, it is still unclear whether the pain face evokes slower or faster motor responses than a positive-valence expression, and how this interacts with the sex of the person displaying the facial expression. The aim of this study was to evaluate the pattern of motor responses of women in a task requiring the recognition of facial expressions of happiness and pain on female and male faces. In the experimental task, 32 students classified dynamic facial emotions of men and women as either happiness or pain, while manual reaction times (RT) were recorded. The ANOVA indicated a difference between male and female faces only for the identification of pain (p = 0.001), but not of happiness (p = 0.064). In this case, pain was recognized faster on the male face (RT = 625.1 ms) than on the female face (RT = 668.0 ms). This pattern of motor responses may be related to the detection of potentially threatening situations in the environment and could be further studied in people with chronic pain.
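As a purely illustrative aid, a repeated-measures ANOVA of this kind (reaction time as a function of the expressed emotion and the sex of the face, within subjects) could be run as sketched below; the data file and column names are hypothetical, not the authors' materials.

```python
import pandas as pd
from statsmodels.stats.anova import AnovaRM

# Hypothetical long-format data: one row per participant x condition,
# holding the mean manual reaction time (ms) for that cell.
df = pd.read_csv("rt_long_format.csv")  # columns assumed: participant, face_sex, emotion, rt

anova = AnovaRM(data=df, depvar="rt", subject="participant",
                within=["face_sex", "emotion"]).fit()
print(anova)  # F-tests for face_sex, emotion, and their interaction
```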
... On a different verge, Riva et al. (2011) have instead found that the observers' ability to detect pain in a female face was lower than their ability to detect pain in male faces, i.e., that male pain faces are more easily processed at the reflexive level. Relatedly, Simon et al. (2006) in an fMRI study found that observing male (vs. ...
Article
Full-text available
Background The need to wear surgical masks in everyday life has drawn the attention of psychologists to the negative effects of face covering on social processing. A recent but not homogeneous literature has highlighted large costs in the ability to recognize emotions.
Methods Here it was investigated how mask covering impaired the recognition of facial mimicry in a large group of 220 undergraduate students. Sex differences in emotion recognition were also analyzed in two subgroups of 94 age-matched participants. Subjects were presented with 112 pictures displaying the faces of eight actors (4 women and 4 men) wearing or not wearing real facemasks, and expressing seven emotional states (neutrality, surprise, happiness, sadness, disgust, anger and fear). The task consisted in categorizing facial expressions while indicating the emotion recognizability with a 3-point Likert scale. Scores underwent repeated measures ANOVAs.
Results Overall, face masking reduced emotion recognition by 31%. All emotions were affected by mask covering except for anger. Face covering was most detrimental to sadness and disgust, both relying on mouth and nose expressiveness. Women showed a better performance for subtle expressions such as surprise and sadness, both in masked and natural conditions, and men for fear recognition (in natural but especially masked conditions).
Conclusion Anger display was unaffected by masking, also because corrugated forehead and frowning eyebrows were clearly exposed. Overall, facial masking seems to polarize non-verbal communication toward the happiness/anger dimension, while minimizing emotions that stimulate an empathic response in the observer.
... On a different verge, Riva et al. [34] have instead found that the observers' ability to detect pain in a female face was lower than their ability to detect pain in male faces, i.e., that male pain faces are more easily processed at the reflexive level. ...
Preprint
Full-text available
Background Recently, the need to continuously wear surgical masks in everyday life has drawn the attention of neuroscientists and psychologists to the negative effects of face covering on social processing. A very recent but not very homogeneous literature has highlighted large costs in the ability to recognize emotions.
Methods Here it was investigated how surgical mask covering impaired the recognition of facial mimicry in a large group of 220 undergraduate Italian students. Sex differences in emotion recognition were also observed in 2 subgroups of 94 age-matched participants. Subjects were presented with 112 pictures displaying the faces of 8 actors (4 women and 4 men) wearing or not wearing real facemasks, and expressing 7 emotional states (neutrality, surprise, happiness, sadness, disgust, anger and fear). The task consisted in categorizing the emotion while indicating the recognizability degree with a 3-point Likert scale. Scores underwent repeated measures ANOVAs.
Results Overall, face masking reduced emotion recognition by 31%. All emotions were affected by mask covering except for anger. Face covering was most detrimental for sadness and disgust, both relying on mouth and nose inspection. Women showed a better performance for subtle expressions such as surprise and sadness, both in masked and natural conditions, and men for fear recognition (in natural but especially masked conditions).
Conclusions Anger display was unaffected by masking, since corrugated forehead and frowning eyebrows were clearly exposed. Unlike digitally created masks, real masks were able to show inhalation-related sucking associated with startle reaction (in surprise, and especially fear expressions), thus providing further cues for emotion recognition.
Article
The accurate perception of others’ pain is a prerequisite to provide needed support. However, social pain perception is prone to biases. Multiple characteristics of individuals bias both physical and social pain judgments (e.g., ethnicity and facial structure). The current work extends this research to a chronically stigmatized population: released prisoners (i.e., releasees). Recognizing the large United States releasee rates and the significant role support plays in successful re-integration, we conducted four studies testing whether people have biased judgments of White male releasees’ sensitivity to social pain. Compared with the noncriminally involved, people judged releasees as less sensitive to social pain in otherwise identical situations (Studies 1a–3), an effect that was mediated by perceived life hardship (Study 2). Finally, judging releasees’ as relatively insensitive to social pain undermined perceivers’ social support judgments (Study 3). The downstream consequences of these findings on re-integration success are discussed.
Article
Objectives: Gender has been suggested to play a critical role in how facial expressions of pain are perceived by others. With the present study we aim to further investigate how gender might impact the decoding of facial expressions of pain, (i) by varying both the gender of the observer as well as the gender of the expressor and (ii) by considering two different aspects of the decoding process, namely intensity decoding and pain recognition.
Methods: In two online studies, videos of facial expressions of pain as well as of anger and disgust displayed by male and female avatars were presented to male and female participants. In the first study, valence and arousal ratings were assessed (intensity decoding), and in the second study, participants provided intensity ratings for different affective states, which allowed for assessing intensity decoding as well as pain recognition.
Results: The gender of the avatar significantly affected the intensity decoding of facial expressions of pain, with higher ratings (arousal, valence, pain intensity) for female compared to male avatars. In contrast, the gender of the observer had no significant impact on intensity decoding. With regard to pain recognition (differentiating pain from anger and disgust), neither the gender of the avatar nor the gender of the observer had any effect.
Conclusions: Only the gender of the expressor seems to have a substantial impact on the decoding of facial expressions of pain, whereas the gender of the observer seems of less relevance. Reasons for the tendency to see more pain in female faces might be due to psychosocial factors (e.g., gender stereotypes) and require further research.
Article
This study investigated whether there are gender differences in attention to bodily expressions of pain and core emotions. Three experiments are reported using the attentional dot probe task. Images of men and women displaying bodily expressions, including pain, were presented. The task was used to determine whether participants’ attention was drawn towards or away from target expressions. Inconsistent evidence was found for an attentional bias towards body expressions, including pain. While these biases were affected by gender, patterns varied across the Experiments. Experiment 1, which had a presentation duration of 500 ms, found a relative bias towards the location of male body expressions compared to female expressions. Experiments 2 and 3 varied stimulus exposure times by including both shorter and longer duration conditions (e.g., 100 vs. 500 vs. 1250 ms). In these experiments, a bias towards pain was confirmed. Gender differences were also found, especially in the longer presentation conditions. Expressive body postures captured the attention of women for longer compared to men. These results are discussed in light of their implications for why there are gender differences in attention to pain, and what impact this has on pain behaviour. Perspective: We show that men and women might differ in how they direct their attention towards bodily expressions, including pain. These results have relevance to understanding how carers might attend to the pain of others, as well as highlighting the wider role that social-contextual factors have in pain.
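In a dot probe task of the kind described above, the attentional bias index is typically computed as the mean reaction time when the probe replaces the neutral stimulus minus the mean reaction time when it replaces the emotional (e.g., pain) stimulus, so that positive scores indicate attention drawn towards the emotional expression. A minimal sketch follows; the function and variable names are illustrative, not taken from the cited study.

```python
from statistics import mean

def attentional_bias(rt_incongruent, rt_congruent):
    """Dot probe bias score: mean RT when the probe appears at the location of
    the neutral image (incongruent trials) minus mean RT when it appears at the
    location of the emotional image (congruent trials). Positive values suggest
    attention was drawn towards the emotional (e.g., pain) expression."""
    return mean(rt_incongruent) - mean(rt_congruent)

# Example with made-up reaction times (ms): a positive score indicates vigilance.
print(attentional_bias([512, 530, 498], [495, 505, 488]))
```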
Article
Background Pain assessment and pain care are influenced by the characteristics of both the patient and the caregiver. Some studies suggest that the pain of older persons and of females may be underestimated to a greater extent than the pain of younger and male individuals.
Aims This study investigated the effect of age and sex on prosocial behavior and pain evaluation.
Methods 40 young (18-30 y/o; 20 women) and 40 older adults (55-82 y/o; 20 women) acted as healthcare professionals rating the pain and offering help to patients of both age groups. Trait empathy and social desirability were measured with questionnaires.
Results Linear mixed models showed that older and male patients were offered more help and were perceived as being in more intense pain than younger and female patients.
Conclusion The characteristics of the patients seem to have a greater impact on prosocial behavior and pain assessment compared to those of the observers, which bears significant implications for the treatment of pain in clinical contexts.
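Linear mixed models of the kind mentioned in this abstract could, for instance, include crossed fixed effects of patient age group and patient sex with a random intercept per observer. The sketch below uses statsmodels; the data file, column names, and model specification are assumptions for illustration, not the authors' actual analysis code.

```python
import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical long-format data: one row per observer x patient vignette,
# with the observer's pain-intensity rating as the outcome.
df = pd.read_csv("pain_ratings_long.csv")  # columns assumed: observer_id,
                                           # patient_age_group, patient_sex, rating

model = smf.mixedlm(
    "rating ~ patient_age_group * patient_sex",  # fixed effects and their interaction
    data=df,
    groups=df["observer_id"],                    # random intercept per observer
)
result = model.fit()
print(result.summary())
```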
Article
Full-text available
In a modified 4 * 4 factorial design with race (Black-White) of the harm-doer and race (Black-White) of the victim as the major factors, the phenomenon of differential social perception of intergroup violence was established. In the study, which used a modification of Interaction Process Analysis, 96 White paid undergraduates, observing a videotape of purported ongoing interaction occurring in another room, labeled an act (ambiguous shove) as more violent when it was performed by a Black than when the same act was perpetrated by a White, indicating that the concept of violence was more accessible when viewing a Black, as compared to a White, committing the same act. Causal attributions were also divergent. Situation attributions were preferred when the harm-doer was White, and person (dispositional) attributions were preferred in the Black-protagonist conditions. Results are discussed in terms of perceptual threshold, stereotypy, and attributional biases. (21 ref)
Article
A review of the literature on gender and clinical pain reveals a disproportionate representation of women receiving treatment for many pain conditions and suggests that women report more severe pain, more frequent pain, and pain of longer duration than do men. Gender differences in pain perception have also been extensively studied in the laboratory, and ratings of experimentally induced pain also show some sex disparity, with females generally reporting lower pain thresholds and tolerance than males. However, there is little consensus on whether these apparent differences reflect the way men and women respond to pain, differing social rules for the expression of pain, or biologic differences in the way noxious stimuli are processed. In this paper, our working hypothesis is that the higher prevalence of chronic orofacial pain in women is a result of sex differences in generic pain mechanisms and of as-yet unidentified factors unique to the craniofacial system. We will review the evidence concerning gender differences in the prevalence of pain conditions, with a focus on orofacial pain conditions. Evidence and hypotheses concerning biologic and psychosocial factors that could influence prevalence rates will also be discussed.
Article
Structured interviews of 37 medical inpatients being treated with narcotic analgesics for pain showed that 32% of the patients were continuing to experience severe distress, despite the analgesic regimen, and another 41% were in moderate distress. Chart review suggested significant undertreatment with narcotics: meperidine in doses of 50 mg every 3 to 4 hours or less (if needed) was prescribed for 63% of the 37 patients; a dose of more than 75 mg was prescribed for only 1 patient. The average amount actually received per day by the patients was 90 mg. A questionnaire survey of 102 house staff physicians in two New York teaching hospitals showed considerable misinformation about meperidine. Many physicians underestimated the effective dose range, overestimated the duration of action, and exaggerated the dangers of addiction for medical inpatients receiving meperidine in a therapeutic dosage range. Physicians who exaggerated the dangers of addiction were more likely to prescribe lower doses of drugs, even for patients with terminal malignancy. The authors suggest that such misconceptions probably lead to undertreatment with narcotic analgesics, causing much needless suffering in medical inpatients.
Article
Many recent experiments have used parallel Implicit Association Test (IAT) and self-report measures of attitudes. These measures are sometimes strongly correlated. However, many of these studies find apparent dissociations in the form of (a) weak correlations between the two types of measures, (b) separation of their means on scales that should coincide if they assess the same construct, or (c) differing correlations with other variables. Interpretations of these empirical patterns are of three types: single-representation — the two types of measures assess a single attitude, but under the influence of different extra-attitudinal process influences; dual-representation — the two types of measures assess distinct forms of attitudes (e.g., conscious vs. unconscious; implicit vs. explicit); and person vs. culture — a variant of the dual-representation view in which self-report measures reflect personal attitudes, whereas IAT measures reflect non-attitudinal cultural or semantic knowledge. Proponents sometimes interpret evidence for single versus dual constructs as evidence for single versus dual structural representations. Behavioral evidence can establish the discriminant validity of implicit and explicit attitude phenomena (dual constructs), but cannot choose among single- vs. dual-representation interpretations because the distinct constructs remain susceptible to interpretation in terms of either one or two representations. Selecting among representational accounts must therefore be based on considerations of explanatory power or parsimony.
Article
Patient demographic characteristics and nonverbal communication displays have been found to influence the assessment and treatment of pain. Numerous methodological limitations of these previous investigations constrain the research questions that could be addressed and the conclusions that have been yielded. The current analogue study employed an innovative research design and novel virtual human technology to investigate disparities in clinical decision making for pain assessment and treatment. Fifty-four currently practicing nurses participated in this study delivered via the Internet. Thirty-two vignettes of virtual patients were presented; each vignette contained a video clip of the patient and clinical summary information describing a post-surgical context. Nurses were asked to make decisions regarding the assessment of pain intensity and unpleasantness, in addition to treatment with non-opioid and opioid medications. The patient demographic cues of sex, race, and age, as well as facial expression of pain, were systematically manipulated across vignettes and hypothesized to influence assessment and treatment ratings. Idiographic and nomothetic statistical analyses were conducted to test these hypotheses. Results indicated that at the idiographic level, sex, race, age, and pain expression cues accounted for significant, unique variance in assessment and treatment policies among many nurse participants. Pain expression was the most prominent cue throughout these policy domains. Within-cue differences emerged at the nomothetic level; the size and consistency of these differences varied across policy domains. The current investigation demonstrates the application of novel virtual human technology to the study of disparities in pain-related decision making. These data indicate that the assessment and treatment of acute post-surgical pain often varies based on virtual human demographic characteristics and facial expressions of pain. Implications of the present findings are discussed in the context of the extant literature. Methodological considerations and future research directions are also discussed.