Scientific Reports | 7: 15525 | DOI:10.1038/s41598-017-15091-4
www.nature.com/scientificreports
Dogs and humans respond to emotionally competent stimuli by producing different facial actions
Cátia Caeiro1,2, Kun Guo1 & Daniel Mills2
The commonality of facial expressions of emotion has been studied in different species since Darwin, with most of the research focusing on closely related primate species. However, it is unclear to what extent there exists common facial expression in species more phylogenetically distant, but sharing a need for common interspecific emotional understanding. Here we used the objective, anatomically-based tools, FACS and DogFACS (Facial Action Coding Systems), to quantify and compare human and domestic dog facial expressions in response to emotionally-competent stimuli associated with different categories of emotional arousal. We sought to answer two questions: Firstly, do dogs display specific discriminatory facial movements in response to different categories of emotional stimuli? Secondly, do dogs display similar facial movements to humans when reacting in emotionally comparable contexts? We found that dogs displayed distinctive facial actions depending on the category of stimuli. However, dogs produced different facial movements to humans in comparable states of emotional arousal. These results refute the commonality of emotional expression across mammals, since dogs do not display human-like facial expressions. Given the unique interspecific relationship between dogs and humans, two highly social but evolutionarily distant species sharing a common environment, these findings give new insight into the origin of emotion expression.
The common origin of emotions has long been a subject of scientific interest1, with different emotional responses producing a diverse range of communicative elements, especially through the face. Facial expressions are also correlates of internal state in both humans2 and other animals3–6 and so may be used, in part, to infer emotion alongside other component processes, such as physiological activation and behavioural tendencies7. Many studies (e.g.8,9) use an holistic approach (i.e. categorizing the whole face as angry, happy, etc.) to classify the target facial expressions, which reflects the way the human brain processes faces7,10, but can be problematic when examining the underlying mechanism of emotion perception across species. For instance, there is a diverse range of smiling faces with different visual characteristics and different emotional meanings in humans11,12. As a classic example, the Duchenne smile (felt, true enjoyment) differs by one muscle contraction from the non-Duchenne smile (unfelt, usually produced in formal greetings). Moreover, during laughter and depending on the context, a blend of both Duchenne and non-Duchenne smiles is often observed13. Hence, simply classifying a facial expression as "happy" is too simplistic and less meaningful for cross-species comparison. Furthermore, the same 'holistic' facial morphological configuration could have different functional meanings (i.e. result in distinctly different behavioural consequences) depending on the species14,15. For example, the Play Face (PF) and the Full Play Face (FPF) are variants of the same facial expression, where the former presents an open mouth with lower teeth exposed, and the latter incorporates visible upper teeth. Both the PF and the FPF represent different degrees of playful expression in great apes (humans included)16–18. Conversely, in crested macaques, mandrills and geladas, the FPF is not just a more intense version of the PF, but instead is derived from convergence between the PF and the silent bared-teeth display (SBT), a facial expression observed in affiliative settings such as grooming19. Additionally, the SBT indicates submission and appeasement in Barbary macaques14, signals affinity and benign intentions in humans, and, in chimpanzees, is present in a range of situations from response to aggression to affinity contexts15.
As an alternative to an holistic descriptive approach, the decomposition and objective description of distinct anatomical regions of facial features, such as occurs with the Facial Action Coding System (FACS20), has been the gold standard for studying human facial expressions of emotion across individuals of different races and cultures
1School of Psychology, University of Lincoln, Lincoln, UK. 2School of Life Sciences, University of Lincoln, Lincoln,
UK. Correspondence and requests for materials should be addressed to C.C. (email: ccorreaicaeiro@lincoln.ac.uk)
Received: 25 May 2017
Accepted: 20 October 2017
Published: xx xx xxxx
for several decades21,22. Each of the discrete facial movements identified (Action Units, AUs) is the result of an independent facial muscle contraction that can produce several changes in appearance to the face, which in turn are used to identify which AUs are activated. Thus, FACS codes facial movements on a purely anatomical basis, avoiding circular reasoning or a priori assumptions of emotional meaning. Recently, FACS has been adapted to several non-human species23–30, such as chimpanzees and orangutans, following the original methodology20, and has proven to be a successful tool for objectively investigating and comparing facial expressions of closely related species31,32. For example, chimpanzees and humans share an almost identical facial muscle plan (differing by only one muscle)20,33, but chimpanzees display both homologous (e.g. play face and human laugh) and species-specific expressions (e.g. pant-hoot)32,34.
While the human prototypical facial expressions of emotion are well established, little is known about the quantitative and empirical nature of the emotional facial displays of the domestic dog, an evolutionarily remote but socially complex species which often shares the human social environment and frequently engages in interspecific communication with an emotional content (e.g.35). To date, functional facial expressions in dogs have been largely discussed holistically in relation to their approach-avoidance value, for example, the "threat gape" in fight-flight situations3, and the PF or the Relaxed Open Mouth (ROM) as a social communicative signal for play solicitation and within play bouts3,36,37. With the development of the FACS for the domestic dog24, it becomes possible to apply a bottom-up technique to investigate the composition and meaning of dogs' facial expressions and, more importantly, to establish possible analogies with humans, with whom they socially interact.
Dogs and humans, like other mammals, have a homologous facial anatomy plan38,39 even though they belong to phylogenetically distant groups. Additionally, both species share a common mammalian neuroanatomy for the basic emotions such as fear and happiness40–44, typically live in a common social and physical environment, are very facially expressive (e.g.2,3,20,24), and respond to the same or similar conspecific and heterospecific social cues (e.g.45). Consequently, the facial cues and expression of emotion in home-dwelling pet dogs provide a unique comparative model for the study of phylogenetic inertia (i.e. absence of expected change and/or adaptation to an optimal state given specific selection pressures in the current environment)46–48 versus evolutionary divergence (i.e. a set of changes brought about by selection pressures from a common ancestor resulting in homologies)49,50 versus evolutionary convergence (i.e. a set of changes from selection pressures acting in independent lineages to create similarity in the resulting analogies)47,50.
Here, we investigated the mechanistic basis of facial expressions in humans and dogs, by objectively measuring their video-recorded facial actions during immediate reactions to emotionally-competent stimuli. The FACS20 and the DogFACS24 were applied in a range of contexts associated with four categories of emotional responses: a) happiness, b) positive anticipation, c) fear, and d) frustration (Table 1). Instead of selecting the basic emotions that are known to produce universal facial signals in humans51, we focused on emotions that are defined by evolutionarily and biologically consistent criteria: 1) essential for solving adaptive problems in mammals (e.g. fear of a threat prompts flight, increasing survival)52, 2) arise from corresponding physiological markers (e.g. elevated opioid levels can increase playfulness53), and 3) correlate with specific neuroanatomical regions (e.g. nucleus accumbens neurons activate before a positive event, leading to positive anticipation44,54). This approach reduces anthropomorphic and anthropocentric bias in the selection and comparison of emotions, i.e. instead of trying to identify stereotypically human emotions in dogs, we focused on examining shared underlying mammalian homologies55. Furthermore, for each category of emotion (e.g. fear), we used a range of contexts to generate the emotional response (thunderstorms, specifically avoided objects, etc.). This increased the likelihood of identifying the general facial responses to the emotional category of stimulus (e.g. general facial actions of fear), instead of behavioural motivations (e.g. facial actions displayed for thunderstorms, but not in other fear contexts). We only analysed spontaneous emotional reactions because posed responses could differ from spontaneous ones in duration, intensity, symmetry and form56–58.
Given the common origin of mammalian facial musculature59 and its link to emotion2, and the nature of the long association between humans and dogs, it is plausible that similar emotional reactions might share common facial correlates in these two species (i.e. that the same emotions are closely associated with the same core facial muscle movements). Therefore, we tested two hypotheses: 1) Do dogs produce specific facial actions in response to different categories of emotionally-competent stimuli (i.e. stimuli that produce an emotion cascade resulting in a distinct brain response60,61)? If so, this would provide evidence that dogs possess an adaptive facial behavioural repertoire (sensu Schmidt and Cohn62) with expressive and/or communicative functions associated with the underlying emotional response, as a result of evolutionary pressures. This is a precondition for the main hypothesis: 2) Do dogs use similar facial muscles to humans to express similar emotions? A lack of significant differences between humans and dogs would potentially reflect phylogenetic inertia and be consistent with the shared basic origin of emotional expression as proposed by Darwin1, or reflect convergent evolution. On the other hand, widespread significant differences would indicate that facial expressions of emotion are not highly conserved features across different species.
Results
Human facial actions of emotion. Our study showed convergent validity with previous studies (Table 2) for the core AUs63 associated with each emotion (Table 1). Humans showed a differential use of their facial musculature to produce a higher rate (i.e. in comparison to the relaxed condition representing a neutral facial expression) of specific prototypical facial actions during an emotional response, while using various AUs flexibly between contexts. The comparable core AUs identified in our study included AU5 and AU20 for fear, AU14, AU17 and AU24 for frustration, and AU6 for happiness, confirming our human videos as a suitable baseline to compare with the dog videos. This also allowed us to verify that the naturalistic videos used to extract our data can still produce robust results, with facial signals strongly linked to corresponding emotional responses. Additionally, the human FACS coding created a baseline to compare with the dogs' responses, based on human-dog muscular homologies.
Table 1. Emotion, brain system44, definition of emotion, trigger stimuli and context analysed for both species. Different triggers were selected for humans and dogs to allow functional equivalence between species93,94.

Fear (brain system: Fear/anxiety)
Definition: Aversion/avoidance to a stimulus perceived as a threat, leading to flight, fight, freeze and/or distress response40,44.
Humans — trigger: visualisation of dangerous animal; experiencing high/fast moving vehicles. Context: during presentation of animal or experience in a park ride.
Dogs — trigger: experience of thunderstorms; visualisation of specific objects. Context: during thunderstorms or presentation of objects.

Frustration (brain system: Rage/Anger)
Definition: Result of denial of a resource by presence of a physical or social barrier, omission/delay of an expected reward, a barrier to success or to achieving a goal42,103.
Humans — trigger: possibility of gaining a high monetary reward with its subsequent loss(es). Context: after loss of game/life in the UK TV game show The Cube till start of another attempt/return next to the presenter after game ends.
Dogs — trigger: visualisation of a desired resource (toy, food, space) that is or becomes inaccessible. Context: after first attempt to gain access to the resource and during its subsequent denials.

Positive anticipation (brain system: Seeking/expectancy)
Definition: Expectation of potential gain. Desire of a known and predictable resource/goal. Period from signalling of a reward till moment before receiving reward99,104.
Humans — trigger: visualisation of food; unwrapping a gift. Context: from visual presentation of food item till moment before eating; from visual presentation of gift till revelation of item.
Dogs — trigger: visualisation of food or hearing meal/food related word(s); visualisation of leash, hearing walk related word(s). Context: after trigger presentation till moment before obtaining food or leaving home.

Happiness (brain system: Play/Joy)
Definition: After/during a positive activity/situation that has a positive outcome for the individual and is intrinsically hedonistic/pleasurable. Only observed in the absence of immediate fitness threats and highly dependent on all proximate needs being fulfilled105,106.
Humans — trigger: gain of a high monetary reward. Context: after win in the UK TV game show The Cube till moment that contestant returns next to the presenter.
Dogs — trigger: initiation of a play bout; visualisation of owner. Context: during play with conspecifics or owner; reunion with owner after long period of separation.

Relaxation (brain system: —)
Definition: Absence of any emotionally-linked stimuli and response.
Humans — trigger: —. Context: —. Dogs — trigger: —. Context: —.
Table 2. Unique human facial movements displayed during different emotional contexts. Significantly different from the relaxed context in one context only (p < 0.05, Kruskal-Wallis with post-hoc pairwise comparisons, Dunn-Bonferroni corrected). Unique facial actions found in previous studies are also reported for comparison, with the respective literature source. H: test statistic, SE: standard error.

Fear: AU5 (H = 19.700, SE = 5.285), AU7 (H = 19.900, SE = 5.028), AU20 (H = 24.700, SE = 5.285), AD38 (H = 15.500, SE = 4.161). Previous studies: AU5, AU20 (54).
Frustration: AU14 (H = 21.100, SE = 5.985), AU17 (H = 20.100, SE = 5.920), AU24 (H = 21.900, SE = 5.849), AU28 (H = 19.600, SE = 5.162), AD84 (H = 16.950, SE = 4.555). Previous studies: AU14, AU17, AU24 (20,55,56).
Positive anticipation: NS.
Happiness: AU6 (H = 33.100, SE = 6.308). Previous studies: AU6 (54).
Specifically, our analysis showed that compared to the relaxed condition, humans displayed a significantly higher rate of specific facial actions in response to stimuli associated with happiness, fear and frustration, but not positive anticipation. For the fear context, the rates of AU5, AU7, AU20 and AD38 were significant; for the frustration context, the rates of AU14, AU17, AU24, AU28 and AD84 were significant; for the happiness context, the rate of AU6 was significant. The facial actions that humans displayed during two or more emotions, but not during the relaxed context, are also reported in Supplementary Table S4.
Dog facial actions of emotion. In support of our first hypothesis, we found significant differences between each emotional response and relaxation for particular facial and ear actions: dogs consistently showed a higher rate of AD126 during fear contexts; AD37, AD137 and EAD102 during positive anticipation; and AU27 during happiness. However, frustrated dogs did not display higher rates of any of the facial actions (Fig. 1, Table 3). The higher rates of these specific facial actions are thus strongly linked to the respective emotional contexts. In the Supplementary Table S4, we report the display of significantly higher rates of those facial actions common
Figure 1. Examples of visual representations of unique dog facial actions displayed during emotional reactions. (Individual images composed for illustration of muscular movements only; found on Pixabay 2016, free for commercial use: https://pixabay.com/en/herder-action-dog-animal-dog-plays-1726406/, https://pixabay.com/en/dogs-cute-pet-animal-canine-1181868/, https://pixabay.com/en/animal-canine-cute-dog-friendly-1837003/).
Table 3. Unique dog facial movements displayed during different emotional contexts. Significantly different from the relaxed context in one context only (p < 0.05, Kruskal-Wallis with post-hoc pairwise comparisons, Dunn-Bonferroni corrected). H: test statistic, SE: standard error.

Fear: AD126 (H = 29.300, SE = 7.945).
Frustration: NS.
Positive anticipation: AD37 (H = 25.200, SE = 7.263), AD137 (H = 30.825, SE = 8.716), EAD102 (H = 24.300, SE = 8.281).
Happiness: AU27 (H = 27.600, SE = 7.436).
between emotions (but absent during relaxation); like humans, dogs made a clear, flexible use of their facial musculature.
Comparison of human and dog facial actions of emotion. To investigate our second and main hypothesis, direct comparison between human and dog facial actions for each emotion revealed significant differences in specific actions for all the examined emotions (Table 4), demonstrating the differential use of facial musculature between humans and dogs when facing equivalent emotional situations. When compared with humans, dogs displayed higher rates of AD19 in a fearful situation, AU45 during frustration, AD19 and AD37 during positive anticipation, AD19 during happiness, and AU1 in relaxed situations. We also report in Table 4 the facial movements that were significantly higher for humans compared with dogs in the same context. These results show that, in equivalent contexts, dogs and humans mostly activate different facial actions. Out of all the facial actions entered in the analysis, only two Action Units (AU25, AU26) and two Action Descriptors (AD33 and AD40) showed no significant differences between the two species in any of the contexts. This indicates that the majority of facial actions which both dogs and humans are able to produce were used differently across all emotional contexts. Effect sizes were mostly intermediate to large and are reported in Supplementary Table S5.
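The species differences above are tested with the Mann-Whitney U statistic (Table 4). As an illustration only (not the authors' analysis code), U for two independent samples can be computed from midranks of the pooled data; scipy.stats.mannwhitneyu returns the same statistic together with a p-value.

```python
def mann_whitney_u(sample_a, sample_b):
    """Mann-Whitney U statistic for two independent samples.

    Ranks the pooled data (ties receive midranks), then
    U_a = R_a - n_a*(n_a+1)/2, where R_a is the rank sum of sample_a.
    Returns the smaller of U_a and U_b, the conventional test statistic.
    """
    pooled = [(v, 0) for v in sample_a] + [(v, 1) for v in sample_b]
    pooled.sort(key=lambda t: t[0])
    # assign 1-based midranks so tied values share their average rank
    ranks = [0.0] * len(pooled)
    i = 0
    while i < len(pooled):
        j = i
        while j + 1 < len(pooled) and pooled[j + 1][0] == pooled[i][0]:
            j += 1
        midrank = (i + j) / 2 + 1
        for k in range(i, j + 1):
            ranks[k] = midrank
        i = j + 1
    r_a = sum(r for r, (_, g) in zip(ranks, pooled) if g == 0)
    n_a, n_b = len(sample_a), len(sample_b)
    u_a = r_a - n_a * (n_a + 1) / 2
    u_b = n_a * n_b - u_a
    return min(u_a, u_b)
```

Whether U itself or its complement is reported varies between software packages, so the sketch returns the conventional min(U_a, U_b).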
Control variables. We found no significant differences between different categories of cephalic type, ear morphology and breed for any of the significant facial actions in our main analysis of dogs, i.e. these variables had no significant impact on the production of facial actions in the emotional contexts analysed in this study. For jowl length, the rate of AU10 during happiness was indistinguishable in dogs with short and long jowls, but was higher than in dogs with no jowls (Kruskal-Wallis with Dunn-Bonferroni correction, H2 = 6.736, p = 0.050); furthermore, AU12 had a significantly higher rate in dogs with short jowls than in dogs with long jowls (Kruskal-Wallis with Dunn-Bonferroni correction, H2 = 9.889, p = 0.036). However, this observation might be a coding artefact, as long or absent jowls make these AU movements less conspicuous and thus harder to recognise. Finally, we found significantly higher levels of arousal for all emotions when compared with the relaxed referent (Kruskal-Wallis with Dunn-Bonferroni correction; fear: H4 = 45.20, p = 0.0001; frustration: H4 = 69.650, p = 0.0001; positive anticipation: H4 = 69.857, p = 0.0001; happiness: H4 = 96.500, p = 0.0001) and for happiness when compared with fear (Kruskal-Wallis with Dunn-Bonferroni correction, H4 = 51.300, p = 0.0001). This supports the validity of the relaxed context as a point of reference, since it represents, by definition, absence of/very low arousal.
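The omnibus tests reported throughout are Kruskal-Wallis H statistics. For readers checking such values, a minimal self-contained sketch of the tie-corrected H computation (an illustration, not the authors' analysis code) is:

```python
from collections import Counter
from itertools import chain

def midranks(values):
    """1-based ranks of `values`, with tied values sharing their average rank."""
    order = sorted(range(len(values)), key=lambda i: values[i])
    ranks = [0.0] * len(values)
    i = 0
    while i < len(order):
        j = i
        while j + 1 < len(order) and values[order[j + 1]] == values[order[i]]:
            j += 1
        avg = (i + j) / 2 + 1  # average of 1-based positions i..j
        for k in range(i, j + 1):
            ranks[order[k]] = avg
        i = j + 1
    return ranks

def kruskal_h(*groups):
    """Kruskal-Wallis H statistic (tie-corrected) for k independent groups."""
    data = list(chain.from_iterable(groups))
    n = len(data)
    ranks = midranks(data)
    h, start = 0.0, 0
    for g in groups:
        rank_sum = sum(ranks[start:start + len(g)])
        start += len(g)
        h += rank_sum ** 2 / len(g)  # n_i * rbar_i^2 == R_i^2 / n_i
    h = 12.0 / (n * (n + 1)) * h - 3 * (n + 1)
    # correct for ties: divide by 1 - sum(t^3 - t) / (n^3 - n)
    ties = sum(t ** 3 - t for t in Counter(data).values())
    correction = 1 - ties / (n ** 3 - n)
    return h / correction if correction else h
```

scipy.stats.kruskal returns the same H with a chi-square p-value; the Dunn-Bonferroni post-hoc pairwise comparisons used in the paper then identify which specific contexts differ.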
Discussion
This study provides the first empirical evidence to address two important research questions in comparative emotion and interspecific emotion understanding: dogs display specific discriminatory facial movements in response to different emotional stimuli, but do not display similar facial movements to humans when reacting to emotionally comparable contexts.
Dogs tended to make more frequent use of AD126 in a fearful context; AD37, AD137 and EAD102 during positive anticipation; and AU27 in a happiness context (Fig. 1). This observation is in agreement with the widely held belief that dogs facially react in a common way as a species when confronted with particular challenges or situations in their environment (e.g. when fearful64,65). This also supports Darwin's suggestion that facial expressions are correlates of internal states in animals1, and are based on homologous facial musculature with the potential to produce similar facial information for a given species. Given previous studies mentioning varied facial cues in frustrated dogs (also described as displacement or conflict behaviours35,66), it was surprising that we did not observe distinctive facial actions for frustration in dogs. However, it might be that the frustration signals previously reported are not specific to frustration and are instead used flexibly in association with other behaviours and/or in response to different contexts (e.g. stress67, discomfort68, fear65), and/or that facial expressions of frustration are more context-specific, without a common dynamic anatomical structure. Analysing more videos, or featuring a wider range of contexts to account for potential flexibility, might identify specific facial actions for this emotion in certain common contexts.
Table 4. Facial movements that differ between species within each emotional context (p < 0.05). *: higher for dogs; U: test statistic, SE: standard error.

Fear: AD19* (U = 40.000, SE = 22.425), AD50 (U = 178.000, SE = 20.564), AU10 (U = 172.000, SE = 22.517), AU16 (U = 163.000, SE = 22.680), AU27 (U = 170.000, SE = 20.132), AU43 (U = 150.000, SE = 14.759).
Frustration: AU43 (U = 160.000, SE = 15.884), AU45* (U = 167.000, SE = 22.728), AU10 (U = 156.500, SE = 22.517).
Positive anticipation: AD19* (U = 30.000, SE = 20.941), AD37* (U = 45.000, SE = 19.638), AU10 (U = 168.000, SE = 22.588), AU18 (U = 147.000, SE = 19.638).
Happiness: AD19* (U = 36.000, SE = 21.995), AU10 (U = 184.000, SE = 22.588), AU12 (U = 147.000, SE = 22.728).
Relaxation: AU1* (U = 35.000, SE = 20.564).
Regarding our second hypothesis, our results revealed that humans and dogs produced distinct facial expressions of emotion, and little commonality was found in the facial actions triggered, i.e. dogs did not show human-like facial expressions. This observation refutes the idea of continuity of emotional expression between different species. Given the clear facial muscular homology and comparable anatomical facial plan between humans and dogs38,39, their shared social environment and long history of intensive mutual social interaction69,70, it might have been expected that dogs and humans would at least share some facial responses within a comparable emotional context.
Most of the basic facial muscles are similar between humans and dogs, and share primary functions unrelated to facial expression (e.g. the orbicularis oculi muscle closes the eyelids to protect the eye, and the zygomaticus major draws the lips back to facilitate feeding). However, when it comes to emotional expression, dogs do not seem to use their muscles in the same way that humans do. This might be due to the inability of the dog's muscles to produce particular movements present in humans because of their different facial morphology (e.g. lack of localised fat deposits). This is the case, for example, with AU6 (produced by the orbicularis oculi muscle), which is a fundamental AU present in all Duchenne happy faces in humans, but which was never observed in dogs even though the same muscle is present and functional. The human face has a more complex facial musculature, i.e. a few muscles are not present in dogs, such as the levator palpebrae that produces AU5 or the risorius that produces AU20. Both AU5 and AU20 (i.e. eyes widely open, lips stretched laterally and downwards) are critical facial actions of a fearful face in humans, but it is impossible for dogs to produce the equivalent facial change. Another example of muscular variance that might reflect the reported difference in the displayed facial actions relates to the human-dog frustration comparison. Dogs lack a zygomaticus minor muscle, which produces AU14 in humans. This specific movement is one of the core action units of the human frustration facial expression. Given the lack of these specific muscles in dogs, it is possible that other muscles could be activated to produce reliable fear or frustration cues in this species. This appears to be the case for fear, since AD126 production had unique characteristics in this context. However, we did not identify an equivalent in the case of frustration, as discussed earlier.
Given the low number of specific facial actions produced in association with each emotion, we suggest that dogs do not display a composed facial expression with several facial actions integrated into a stereotypical display, as is observed in humans. Instead, dogs seem to produce isolated actions in response to specific emotionally-competent stimuli.
Due to well-known human perception biases (e.g. anthropomorphic tendencies71), it is essential to determine in an objective and standardized way exactly how dogs produce emotional cues. The results in relation to our second hypothesis illustrate the error of broad descriptions and over-reliance on holistic categorization of facial behaviour, which can lead to biases in the perception of facial expressions of a different species (e.g.14,15). It has been argued that humans perceive non-human animals' emotions as if they were conspecifics' because of evolutionary homologies between the species, based on shared mammalian brain structures and behavioural expression patterns42,71. However, our study refutes this hypothesis, as the homologies between humans and dogs seem to be limited to the underlying musculature, rather than its activation. In other domestic animals, such as horses72 and sheep73, the basic facial musculature also appears to be well conserved26,74, but the emotional displays appear to diverge from what is observed in humans. It is worth noting that in both of these species (and arguably most domestic animals) the ears appear to be widely used as part of emotional signals, while in humans the ear muscles are more vestigial. Close analysis of aural signals may therefore be a fruitful line of enquiry in the future. In this respect, it is also worth noting that greater similarities are seen between different species of primates' (including human) facial expressions75, with them showing a continuity not only at a muscular level but also at the level of expression production. In the case of the dog-human comparison, and unlike the chimpanzee-human comparison, facial expressions seem to be exclusive to each species. Phylogenetic inertia was likely involved in the development of the well-conserved facial musculature of dogs and humans47, but there are clearly differences in the way this musculature is used to produce facial expressions associated with the intentional or unintentional communication of emotion.
Our findings of humans and dogs displaying distinctively different facial expressions have important theoretical and practical implications for human-dog emotion understanding and its underlying cognitive mechanisms. There is little theoretical understanding of interspecific emotional appraisal, but we examine the extension of two competing theories of human-human emotion perception into the field of human-dog emotion perception: "the mental simulation strategy" (i.e. if the same facial movements are produced in response to the same emotional stimuli in both species, individuals could form an understanding of the emotions of others through their own facial actions), and "the implicit social learning strategy" (i.e. if the facial movements produced are different between species, associative learning between external events and motivational states is likely needed for understanding the emotions of other species)76,77. Given that in our study facial movement production differed between humans and dogs, it is unlikely that a mental simulation strategy could be useful or adopted by both species when interpreting heterospecific expressive facial signals. Instead, an implicit social learning strategy would be a more meaningful way to establish a functional human-dog emotion understanding. This would at least partly explain why untrained humans do not seem proficient in reading dogs' facial and body language78–80, particularly subtle cues such as head turning or nose licking81. This is further supported by the neurocognitive evidence that people read dogs' and humans' social cues using overlapping brain areas82,83 and similar cognitive mechanisms76,84. Indeed, humans represent non-human animals' affective space similarly to that of conspecifics71,85 and incorrectly identify emotions in dogs that have been shown to be a direct result of anthropomorphic subjective judgements (e.g. guilt86). In our study, most emotional facial actions produced by dogs were in the mouth area, using the tongue or the opening of the mouth, and none were in the eye area, which is an important difference from humans, who produce eye-region AUs in many of the prototypical facial expressions. Thus this might be another reason why humans seem to find it very hard to read dogs' communication.
The preceding studies strongly suggest that humans read dog communication as they would read other humans (own-species bias87), and our results indicate this is potentially problematic when it comes to evaluating the emotional content of these signals. This leads to another important issue: all studies to date on human perception of dogs lack bias-free empirical validation (e.g.88–90). For example, when showing a "happy dog" static picture, the only "validation" is the agreement of other subjective human judges, without ever specifying why a "happy dog" is identified as happy, leading to subjective anthropomorphic assessments and circular reasoning, i.e. different human dog "experts" agree with each other on what a "happy dog" should look like, instead of examining what a "happy dog" actually looks like. For the first time, our study demonstrates how to validate dogs' facial expressions by applying an anatomical and bias-free method based on the presence of empirical stimuli of relevance.
It remains to be seen to what extent the current findings can be generalised across all dog breeds and other emotional contexts, given the variation in facial morphology, and possibly muscular functionality, in different breeds. Nonetheless, we did establish that features like variation in general head shape may be less important than changes in jowl structure, given the location of the key AUs used to express the emotions studied. Despite this study's focus on facial expressions alone, it is important to mention that emotional expression is clearly multimodal in many species, including those studied here (e.g. vocalisations, body posture and odours may be important moderators of the message sent). While the stimuli were selected for their fundamental emotional equivalence between the two species, the resulting expression might have been moderated by other factors associated with their processing, which were not controlled for (e.g. social vs non-social stimulus source91). Nevertheless, the critical feature is that they should largely share a final common emotional process, whose expression we set out to measure. Thus, future studies might aim to compare dogs of different breeds and morphologies, as well as incorporate the multimodality of emotional signals, and test these hypotheses in different settings (e.g. the more controlled laboratory) to expand on these conclusions.
Methods
Video source and individuals. The videos used in this study were selected from online databases (www.youtube.com and the AM-FED database92). Only one individual per video was coded. Whenever more than one individual present in the same video fulfilled all the necessary criteria (see section b), one individual was randomly selected and coded.
The sample consisted of 150 individuals in total, 50 humans and 100 family dogs, distributed equally between emotional contexts (happiness, positive anticipation, fear, frustration and relaxation, Table 1). For detailed information on individuals, please see Supplementary material S1.
Criteria for video selection. The four categories of emotion were defined according to literature sources (Table 1). As a control, we selected videos of neutral/relaxed individuals, i.e., where any specific emotionally-triggering stimuli or overt behavioural reactions were absent. The videos were chosen on the basis of stimulus quality (e.g. source) and its clear association with an evoked response, with different stimuli selected for humans and dogs to allow functional equivalence93,94 (e.g., most humans are not afraid of thunderstorms, while this is a common fear trigger in dogs95–97). Videos of dog training, clinical/veterinary settings or with music covering the audio were excluded. Only videos with minimal editing, high image quality (at least 720p), good lighting and visibility of faces were selected. The putative emotion-eliciting stimulus had to be apparent and clearly identifiable for at least part of the video. By including homemade/amateur videos with naturalistic and spontaneous behaviour we ensured that the responses were ecologically valid, less constrained, and more robust, especially when compared to laboratory studies on emotions98.
To ensure that the emotionally-competent stimuli were present in the dog videos and were impacting the dogs' behaviour, the first author of this study (CC) selected the videos, then blinded and randomised their order. Another author (DM, a specialist in veterinary behavioural medicine) relabelled all videos according to the emotion likely triggered by the stimulus99. Only videos with full agreement between the two labels were included in the study.
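This agreement-based inclusion step is straightforward to automate. The sketch below is purely illustrative (the function name, seed, and labels are our hypothetical choices, not the authors' procedure or code): it blinds the second rater to any meaningful ordering via a seeded shuffle, then retains only videos whose two independent emotion labels match.

```python
import random

def filter_by_agreement(videos, original_labels, blinded_labels, seed=42):
    """Shuffle video order (so the second rating is done blind to sequence)
    and keep only videos whose two independent emotion labels agree."""
    order = list(range(len(videos)))
    random.Random(seed).shuffle(order)  # deterministic blinding for the example
    return [videos[i] for i in order
            if original_labels[videos[i]] == blinded_labels[videos[i]]]

# hypothetical labels from two independent raters
videos = ["v1", "v2", "v3"]
orig = {"v1": "fear", "v2": "happiness", "v3": "frustration"}
blind = {"v1": "fear", "v2": "frustration", "v3": "frustration"}
print(filter_by_agreement(videos, orig, blind))  # v2 is dropped: labels disagree
```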
FACS, control variables and reliability coding. All human and dog videos were coded with FACS20 and DogFACS24, respectively, by a certified coder (CC) via the open-source software BORIS (Behavioural Observation Research Interactive Software) v2.96100. Each video's duration was determined by the individual's emotional response to the trigger, i.e. from the start of the behavioural display to its end point (return to neutral behaviour). For example, in all fear videos using a thunderstorm trigger, the trigger was present throughout the video, and the video duration was taken as the period over which the individual's response (flight/freeze, etc.) was observed. Before starting the FACS coding, one or more neutral face frames were selected for each individual. The number of facial actions was identified by watching the videos frame-by-frame, logged in BORIS and extracted for analysis. During speech/vocalisations, facial actions were coded only if they were above level A intensity (as per the FACS investigator's guide recommendation101). To ensure unbiased coding, an independent DogFACS-certified coder, unaware of the aim of the study or the emotional categorisation of the videos, re-coded a random 30% of the dog videos. Inter-coder reliability was thus obtained, with 84% mean agreement in the Intraclass Correlation Coefficient (ICC(3,k), CI range of 0.80–0.87).
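The ICC(3,k) statistic (two-way mixed model, consistency, average of k raters) can be reproduced from first principles as (MS_subjects − MS_error) / MS_subjects. A minimal pure-Python sketch with illustrative ratings (not the study's data):

```python
def icc_3k(ratings):
    """ICC(3,k): two-way mixed model, consistency, average of k raters.

    ratings: list of rows, one row per subject, one column per rater.
    ICC(3,k) = (MS_subjects - MS_error) / MS_subjects
    """
    n = len(ratings)      # subjects
    k = len(ratings[0])   # raters
    grand = sum(sum(row) for row in ratings) / (n * k)
    row_means = [sum(row) / k for row in ratings]
    col_means = [sum(row[j] for row in ratings) / n for j in range(k)]
    ss_rows = k * sum((m - grand) ** 2 for m in row_means)
    ss_cols = n * sum((m - grand) ** 2 for m in col_means)
    ss_total = sum((x - grand) ** 2 for row in ratings for x in row)
    ms_rows = ss_rows / (n - 1)
    ms_err = (ss_total - ss_rows - ss_cols) / ((n - 1) * (k - 1))
    return (ms_rows - ms_err) / ms_rows

# perfect agreement between two coders gives ICC = 1
print(icc_3k([[1, 1], [2, 2], [3, 3]]))  # → 1.0
```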
Since morphological differences could potentially impact upon the production of facial actions in dogs, we used cephalic type (brachycephalic, mesaticephalic, dolichocephalic), jowl length (none, short, long), ear morphology (floppy, partially erect, erect) and breed (as per the British Kennel Club) as control variables. We only analysed the effects of these control variables on the facial actions that were significant in our main analysis. As an additional control, we categorised all the videos for the perceived level of arousal (five-point Likert scale: 1 – very low, 2 – low, 3 – moderate, 4 – high, 5 – very high) to ensure arousal was not a confounding variable for emotion. These control variables increase the internal validity of our study, since they exclude alternative
explanations for the causal relationship between independent and dependent variables102. Different blinded coders recoded the whole sample for each of the control variables to assess mean inter-rater reliability agreement (ICC(3,k)), obtaining 94% (CI range of 0.91–0.96) for ears, 85% (CI range of 0.78–0.90) for jowls, and 87% (CI range of 0.81–0.90) for arousal. Human videos were not coded for control or reliability purposes; instead, our results were compared with the respective facial movements reported in the established literature.
Statistical analysis. Because the videos varied in duration (62 ± 45 s, mean ± s.d.) due to differences in the individuals' responses to the triggers and the emotions examined, the number of facial actions (4.59 ± 8.97, mean ± s.d.) was normalised into rates over visible time, i.e., the count of each AU was divided by the duration of the video in which the face was visible. Variables with very low rates (<0.001) were excluded from analysis (Supplementary material S3). Thus, we included 30 human facial actions and 22 dog facial actions in the statistical analysis (Supplementary Table S2). We performed Kolmogorov–Smirnov tests for normality and non-parametric Levene's tests for homoscedasticity. Since all variables violated both assumptions in at least one of the groups, non-parametric tests were used throughout the analysis. To investigate whether dogs produced differential facial actions in response to specific emotional triggers, we compared the facial actions displayed between emotions, and between each emotion and the control relaxation videos, with Kruskal-Wallis tests (adjusted for ties) followed by post-hoc pairwise multiple comparisons with Dunn-Bonferroni correction. For our second hypothesis, assessing whether dogs used the same facial actions as humans in response to an equivalent emotional trigger, we performed Mann-Whitney tests (exact p-values) on the homologous facial actions between species. For both hypotheses, the rate of facial movements was the dependent variable and the emotion category the independent variable. Additionally, we calculated effect sizes to assess and minimise the risk of Type II error in our interpretation of the results.
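As an illustrative, pure-Python sketch of the first two steps of this pipeline (the study itself used SPSS; function names and numbers here are our own), the rate normalisation and an uncorrected Kruskal-Wallis H statistic look like this:

```python
from itertools import chain

def au_rate(n_actions, visible_seconds):
    """Normalise an AU count into a rate over face-visible time."""
    return n_actions / visible_seconds

def kruskal_h(*groups):
    """Kruskal-Wallis H statistic, with average ranks for ties
    (no tie correction, for brevity)."""
    data = sorted(chain.from_iterable(groups))
    rank = {}
    i = 0
    while i < len(data):
        j = i
        while j < len(data) and data[j] == data[i]:
            j += 1
        rank[data[i]] = (i + 1 + j) / 2  # average of ranks i+1 .. j
        i = j
    n = len(data)
    return 12 / (n * (n + 1)) * sum(
        len(g) * (sum(rank[v] for v in g) / len(g) - (n + 1) / 2) ** 2
        for g in groups)

print(au_rate(12, 60))                  # 12 AUs over 60 s of visible face → 0.2
print(kruskal_h([1, 2, 3], [1, 2, 3]))  # identical groups → 0.0
```

In practice the ties-adjusted test and the Dunn-Bonferroni post-hoc comparisons would be run in a statistics package rather than by hand.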
To analyse the potential effect of the control variables cephalic type, ear morphology, jowl length, breed, and arousal level on the facial actions, we used Kruskal-Wallis tests (adjusted for ties) and post-hoc pairwise multiple comparisons with Dunn-Bonferroni correction. Here, we compared the rate of each facial action (dependent variable) between the categories (independent variables) of the control variables, in order to ensure the distribution of facial actions was the same regardless of the dogs' morphological variation. For ear morphology and jowl length, we only performed the analysis for the relevant facial actions (e.g. jowl length was only analysed for lower face actions). All statistical analyses were performed using IBM SPSS v22, except for the effect sizes, which were computed with G*Power v3.1.
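The paper computed its effect sizes in G*Power; as an illustration of the idea only (this is a commonly used approximation we substitute for exposition, not necessarily the statistic the authors reported), an eta-squared effect size can be derived directly from a Kruskal-Wallis H statistic:

```python
def kruskal_eta_squared(h, k_groups, n_total):
    """Common eta-squared approximation for a Kruskal-Wallis H:
    eta2 = (H - k + 1) / (n - k), where k = number of groups,
    n = total sample size."""
    return (h - k_groups + 1) / (n_total - k_groups)

# hypothetical example: H = 12.5 over 5 emotion categories and 100 dogs
print(round(kruskal_eta_squared(12.5, 5, 100), 3))  # → 0.089
```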
Ethics. This study was approved by the School of Psychology Research Ethics Committee and follows the Ethics Guidelines for Internet-mediated Research of the British Psychological Society.
Data availability. The datasets generated and analysed during the current study are available from the corresponding author on request.
References
1. Darwin, C. The expression of the emotions in man and animals. (D. Appleton and Company, 1896).
2. Ekman, P. & Oster, H. Facial expressions of emotion. Annu. Rev. Psychol. 30, 527–554 (1979).
3. Fox, M. W. A comparative study of the development of facial expressions in canids: wolf, coyote and foxes. Behaviour 36, 49–73 (1970).
4. Chevalier-Skolnikoff, S. Facial expression of emotion in nonhuman primates. In Darwin and Facial Expression: A Century of Research In Review (ed. Ekman, P.) (Academic Press, 1973).
5. Tate, A. J., Fischer, H., Leigh, A. E. & Kendrick, K. M. Behavioural and neurophysiological evidence for face identity and face emotion processing in animals. Philos. Trans. R. Soc. B Biol. Sci. 361, 2155–2172 (2006).
6. Leopold, D. A. & Rhodes, G. A comparative view of face perception. J. Comp. Psychol. 124, 233–251 (2010).
7. Scherer, K. R. What are emotions? And how can they be measured? Soc. Sci. Inf. 44, 695–729 (2005).
8. Dimberg, U. & Petterson, M. Facial reactions to happy and angry facial expressions: Evidence for right hemisphere dominance. Psychophysiology 37, 693–696 (2000).
9. Surguladze, S. A. et al. Recognition accuracy and response bias to happy and sad facial expressions in patients with major depression. Neuropsychology 18, 212–218 (2004).
10. Etcoff, N. L. & Magee, J. J. Categorical perception of facial expressions. Cognition 44, 227–240 (1992).
11. Ekman, P. & Friesen, W. V. Felt, false, and miserable smiles. J. Nonverbal Behav. 6, 238–252 (1982).
12. Gervais, M. & Wilson, D. S. The evolution and functions of laughter and humor: A synthetic approach. Q. Rev. Biol. 80, 395–430 (2005).
13. Harris, C. & Alvarado, N. Facial expressions, smile types, and self-report during humour, tickle, and pain. Cogn. Emot. 19, 655–669 (2005).
14. Preuschoft, S. "Laughter" and "Smile" in Barbary Macaques (Macaca sylvanus). Ethology 91, 220–236 (1992).
15. Waller, B. M. & Dunbar, R. I. M. Differential behavioural effects of silent bared teeth display and relaxed open mouth display in chimpanzees (Pan troglodytes). Ethology 111, 129–142 (2005).
16. Waller, B. M. & Cherry, L. Facilitating play through communication: Significance of teeth exposure in the gorilla play face. Am. J. Primatol. 74, 157–164 (2012).
17. Palagi, E. Social play in bonobos (Pan paniscus) and chimpanzees (Pan troglodytes): Implications for natural social systems and interindividual relationships. Am. J. Phys. Anthropol. 129, 418–426 (2006).
18. Palagi, E., Antonacci, D. & Cordoni, G. Fine-tuning of social play in juvenile lowland gorillas (Gorilla gorilla gorilla). Dev. Psychobiol. 49, 433–445 (2007).
19. Palagi, E. & Mancini, G. Playing with the face: Playful facial "chattering" and signal modulation in a monkey species (Theropithecus gelada). J. Comp. Psychol. 125, 11–21 (2011).
20. Ekman, P., Friesen, W. & Hager, J. Facial Action Coding System (FACS): manual. (Research Nexus, 2002).
21. Ekman, P. & Friesen, W. V. A new pan-cultural facial expression of emotion. Motiv. Emot. 10, 159–168 (1986).
22. Ekman, P. & Rosenberg, E. L. What the face reveals: Basic and applied studies of spontaneous expression using the Facial Action Coding System (FACS). (Oxford University Press, 1997).
23. Caeiro, C. C., Waller, B. M., Zimmermann, E., Burrows, A. M. & Davila-Ross, M. OrangFACS: A muscle-based facial movement coding system for orangutans (Pongo spp.). Int. J. Primatol. 34, 115–129 (2013).
24. Waller, B. M. et al. Paedomorphic facial expressions give dogs a selective advantage. PLOS ONE 8, e82686 (2013).
25. Caeiro, C. C., Burrows, A. & Waller, B. M. Development and application of CatFACS: Are human cat adopters influenced by cat facial expressions? Appl. Anim. Behav. Sci. 189, 66–78 (2017).
26. Wathan, J., Burrows, A. M., Waller, B. M. & McComb, K. EquiFACS: The Equine Facial Action Coding System. PLOS ONE 10, e0131738 (2015).
27. Waller, B. M., Lembeck, M., Kuchenbuch, P., Burrows, A. M. & Liebal, K. GibbonFACS: A muscle-based facial movement coding system for hylobatids. Int. J. Primatol. 33, 809–821 (2012).
28. Parr, L. A., Waller, B. M., Burrows, A. M., Gothard, K. M. & Vick, S. J. Brief communication: MaqFACS: A muscle-based facial movement coding system for the rhesus macaque. Am. J. Phys. Anthropol. 143, 625–630 (2010).
29. Julle-Danière, É. et al. MaqFACS (Macaque Facial Action Coding System) can be used to document facial movements in Barbary macaques (Macaca sylvanus). PeerJ 3, e1248 (2015).
30. Vick, S. J., Waller, B. M., Parr, L. A., Pasqualini, M. C. S. & Bard, K. A. A cross-species comparison of facial morphology and movement in humans and chimpanzees using the Facial Action Coding System (FACS). J. Nonverbal Behav. 31, 1–20 (2007).
31. Waller, B. M., Misch, A., Whitehouse, J. & Herrmann, E. Children, but not chimpanzees, have facial correlates of determination. Biol. Lett. 10, 20130974 (2014).
32. Parr, L. A., Waller, B. M. & Vick, S. J. New developments in understanding emotional facial signals in chimpanzees. Curr. Dir. Psychol. Sci. 16, 117–122 (2007).
33. Burrows, A. M., Waller, B. M., Parr, L. A. & Bonar, C. J. Muscles of facial expression in the chimpanzee (Pan troglodytes): descriptive, comparative and phylogenetic contexts. J. Anat. 208, 153–167 (2006).
34. Parr, L. A., Waller, B. M., Vick, S. J. & Bard, K. A. Classifying chimpanzee facial expressions using muscle action. Emotion 7, 172–181 (2007).
35. Kuhne, F., Hößler, J. C. & Struwe, R. Emotions in dogs being petted by a familiar or unfamiliar person: Validating behavioural indicators of emotional states using heart rate variability. Appl. Anim. Behav. Sci. 161, 113–120 (2014).
36. Bekoff, M. Social play in coyotes, wolves, and dogs. BioScience 24, 225–230 (1974).
37. Cordoni, G., Nicotra, V. & Palagi, E. Unveiling the "secret" of play in dogs (Canis lupus familiaris): Asymmetry and signals. J. Comp. Psychol. 130, 278–287 (2016).
38. Bolwig, N. Facial expression in primates with remarks on a parallel development in certain carnivores (a preliminary report on work in progress). Behaviour 22, 167–192 (1964).
39. Diogo, R., Wood, B. A., Aziz, M. A. & Burrows, A. On the origin, homologies and evolution of primate facial muscles, with a particular focus on hominoids and a suggested unifying nomenclature for the facial muscles of the Mammalia. J. Anat. 215, 300–319 (2009).
40. Lang, P. J., Davis, M. & Öhman, A. Fear and anxiety: animal models and human cognitive psychophysiology. J. Affect. Disord. 61, 137–159 (2000).
41. Berridge, K. C. Comparing the emotional brains of humans and other animals. In Handbook of Affective Sciences (eds. Davidson, R. J., Scherer, K. R. & Goldsmith, H. H.) 25–51 (Oxford University Press, 2003).
42. Panksepp, J. Affective consciousness: Core emotional feelings in animals and humans. Conscious. Cogn. 14, 30–80 (2005).
43. Phelps, E. A. & LeDoux, J. E. Contributions of the amygdala to emotion processing: From animal models to human behavior. Neuron 48, 175–187 (2005).
44. Panksepp, J. The basic emotional circuits of mammalian brains: Do animals have affective lives? Neurosci. Biobehav. Rev. 35, 1791–1804 (2011).
45. Guo, K., Meints, K., Hall, C., Hall, S. & Mills, D. Left gaze bias in humans, rhesus monkeys and domestic dogs. Anim. Cogn. 12, 409–418 (2009).
46. Hansen, T. F., Pienaar, J. & Orzack, S. H. A comparative method for studying adaptation to a randomly evolving environment. Evolution 1965–1977, https://doi.org/10.1111/j.1558-5646.2008.00412.x (2008).
47. Shanahan, T. Phylogenetic inertia and Darwin's higher law. Stud. Hist. Philos. Sci. Part C Stud. Hist. Philos. Biol. Biomed. Sci. 42, 60–68 (2011).
48. Blomberg, S. P. & Garland, T. Tempo and mode in evolution: phylogenetic inertia, adaptation and comparative methods. J. Evol. Biol. 15, 899–910 (2002).
49. Darwin, C. On the origin of species. (John Murray, 1859).
50. Hall, B. K. Descent with modification: the unity underlying homology and homoplasy as seen through an analysis of development and evolution. Biol. Rev. 78, 409–433 (2003).
51. Ekman, P. An argument for basic emotions. Cogn. Emot. 6, 169–200 (1992).
52. Panksepp, J., Fuchs, T. & Iacobucci, P. The basic neuroscience of emotional experiences in mammals: The case of subcortical FEAR circuitry and implications for clinical anxiety. Appl. Anim. Behav. Sci. 129, 1–17 (2011).
53. Trezza, V., Baarendse, P. J. J. & Vanderschuren, L. J. M. J. The pleasures of play: pharmacological insights into social reward mechanisms. Trends Pharmacol. Sci. 31, 463–469 (2010).
54. Berridge, K. C. Reward learning: Reinforcement, incentives, and expectations. Psychol. Learn. Motiv. 40, 223–278 (2000).
55. Charland, L. C. The natural kind status of emotion. Br. J. Philos. Sci. 53, 511–537 (2002).
56. Ekman, P., Hager, J. C. & Friesen, W. V. The symmetry of emotional and deliberate facial actions. Psychophysiology 18, 101–106 (1981).
57. Cohn, J. F. & Schmidt, K. L. The timing of facial motion in posed and spontaneous smiles. Int. J. Wavelets Multiresolution Inf. Process. 2, 121–132 (2004).
58. Raheja, J. L. & Gupta, J. Distinguishing Facial Expressions: Genuine Vs Fake. Int. J. Recent Trends Eng. Technol. 3 (2010).
59. Diogo, R. et al. The head and neck muscles of the serval and tiger: Homologies, evolution, and proposal of a mammalian and a veterinary muscle ontology. Anat. Rec. Adv. Integr. Anat. Evol. Biol. 295, 2157–2178 (2012).
60. Damasio, A. Emotions and feelings: A neurobiological perspective. In Feelings and Emotions: The Amsterdam Symposium (eds. Manstead, A. S. R., Frijda, N. & Fischer, A.) 49–57 (Cambridge University Press, 2004).
61. Damasio, A. Fundamental feelings. Nature 413, 781 (2001).
62. Schmidt, K. L. & Cohn, J. F. Human facial expressions as adaptations: Evolutionary questions in facial expression research. Am. J. Phys. Anthropol. 116, 3–24 (2001).
63. Waller, B. M., Cray, J. J. & Burrows, A. M. Selection for universal facial emotion. Emotion 8, 435–439 (2008).
64. Beerda, B., Schilder, M. B., van Hooff, J. A., de Vries, H. W. & Mol, J. A. Behavioural, saliva cortisol and heart rate responses to different types of stimuli in dogs. Appl. Anim. Behav. Sci. 58, 365–381 (1998).
65. Stellato, A. C., Flint, H. E., Widowski, T. M., Serpell, J. A. & Niel, L. Assessment of fear-related behaviours displayed by companion dogs (Canis familiaris) in response to social and non-social stimuli. Appl. Anim. Behav. Sci. 188, 84–90 (2017).
66. Lund, J. D. & Jørgensen, M. C. Behaviour patterns and time course of activity in dogs with separation problems. Appl. Anim. Behav. Sci. 63, 219–236 (1999).
67. Beerda, B., Schilder, M. B., van Hooff, J. A. & de Vries, H. W. Manifestations of chronic and acute stress in dogs. Appl. Anim. Behav. Sci. 52, 307–319 (1997).
68. Hecht, J. & Horowitz, A. Introduction to dog behaviour. In Animal Behavior for Shelter Veterinarians and Staff (eds. Weiss, E., Heather Mohan-Gibbons & Stephen Zawistowski) (Wiley-Blackwell, 2015).
69. Vilà, C. et al. Multiple and ancient origins of the domestic dog. Science 276, 1687–1689 (1997).
70. Hare, B. & Tomasello, M. Human-like social skills in dogs? Trends Cogn. Sci. 9, 439–444 (2005).
71. Konok, V., Nagy, K. & Miklósi, Á. How do humans represent the emotions of dogs? The resemblance between the human representation of the canine and the human affective space. Appl. Anim. Behav. Sci. 162, 37–46 (2015).
72. Hintze, S., Smith, S., Patt, A., Bachmann, I. & Würbel, H. Are eyes a mirror of the soul? What eye wrinkles reveal about a horse's emotional state. PLOS ONE 11, e0164017 (2016).
73. Boissy, A. et al. Cognitive sciences to relate ear postures to emotions in sheep. Anim. Welf. 20, 47 (2011).
74. Swielim, G. E. A. Atlas - Anatomy of sheep. (The Academic Bookshop, Egyptian Joint-Stock Co., 2006).
75. Parr, L. A. & Waller, B. M. Understanding chimpanzee facial expression: insights into the evolution of communication. Soc. Cogn. Affect. Neurosci. 1, 221–228 (2006).
76. Keysers, C. & Perrett, D. I. Demystifying social cognition: a Hebbian perspective. Trends Cogn. Sci. 8, 501–507 (2004).
77. Bruce, V. & Young, A. W. Face perception. (Psychology Press, 2012).
78. Tami, G. & Gallagher, A. Description of the behaviour of domestic dog (Canis familiaris) by experienced and inexperienced people. Appl. Anim. Behav. Sci. 120, 159–169 (2009).
79. Colombo, E. S. & Prato-Previde, E. Empathy and recognition of dogs' (Canis familiaris) emotions: a pilot focusing on vets. J. Vet. Behav. Clin. Appl. Res. 9, e18 (2014).
80. Kerswell, K. J., Bennett, P., Butler, K. L. & Hemsworth, P. H. Self-Reported Comprehension Ratings of Dog Behavior by Puppy Owners. Anthrozoös Multidiscip. J. Interact. People Anim. 22, 183–193 (2009).
81. Mariti, C. et al. Perception of dogs' stress by their owners. J. Vet. Behav. Clin. Appl. Res. 7, 213–219 (2012).
82. Stoeckel, L. E., Palley, L. S., Gollub, R. L., Niemi, S. M. & Evins, A. E. Patterns of brain activation when mothers view their own child and dog: An fMRI study. PLoS ONE 9, e107205 (2014).
83. Törnqvist, H. et al. Comparison of dogs and humans in visual scanning of social interaction. R. Soc. Open Sci. 2, 150341 (2015).
84. Kujala, M. V., Kujala, J., Carlson, S. & Hari, R. Dog Experts' Brains Distinguish Socially Relevant Body Postures Similarly in Dogs and Humans. PLoS ONE 7, e39145 (2012).
85. Morris, P. H., Doe, C. & Godsell, E. Secondary emotions in non-primate species? Behavioural reports and subjective claims by animal owners. Cogn. Emot. 22, 3–20 (2008).
86. Horowitz, A. Disambiguating the "guilty look": Salient prompts to a familiar dog behaviour. Behav. Processes 81, 447–452 (2009).
87. Pascalis, O. & Kelly, D. J. The origins of face processing in humans: Phylogeny and ontogeny. Perspect. Psychol. Sci. 4, 200–209 (2009).
88. Rugaas, T. On Talking Terms with Dogs: Calming Signals. (Dogwise Publishing, 2005).
89. Shepherd, K. Development of behaviour, social behaviour and communication in dogs. In BSAVA Manual of Canine and Feline Behavioural Medicine (eds. Horwitz, D. & Mills, D. S.) 13–16 (2009).
90. Bloom, T. & Friedman, H. Classifying dogs' (Canis familiaris) facial expressions from photographs. Behav. Processes 96, 1–10 (2013).
91. Davidson, J. R. Use of benzodiazepines in social anxiety disorder, generalized anxiety disorder, and posttraumatic stress disorder. J. Clin. Psychiatry 65, 29–33 (2004).
92. McDuff, D. et al. Affectiva-MIT facial expression dataset (AM-FED): Naturalistic and spontaneous facial expressions collected 'in-the-wild'. In 2013 IEEE Conference on Computer Vision and Pattern Recognition Workshops 881–888, https://doi.org/10.1109/CVPRW.2013.130 (2013).
93. Tomasello, M. & Call, J. Assessing the validity of ape-human comparisons: A reply to Boesch (2007). J. Comp. Psychol. 122, 449–452 (2008).
94. Green, G. & Saunders, R. Stimulus equivalence. In Handbook of Research Methods in Human Operant Behavior (eds. Lattal, K. & Perone, M.) (Springer Science & Business Media, 2013).
95. McCobb, E., Brown, E., Damiani, K. & Dodman, N. Thunderstorm phobia in dogs: an Internet survey of 69 cases. J. Am. Anim. Hosp. Assoc. 37, 319–324 (2001).
96. Overall, K. L. & Love, M. Dog bites to humans—demography, epidemiology, injury, and risk. J. Am. Vet. Med. Assoc. 218, 1923–1934 (2001).
97. Blackwell, E. J., Bradshaw, J. W. S. & Casey, R. A. Fear responses to noises in domestic dogs: Prevalence, risk factors and co-occurrence with other fear related behaviour. Appl. Anim. Behav. Sci. 145, 15–25 (2013).
98. Fernández-Dols, J.-M. & Crivelli, C. Emotion and expression: Naturalistic studies. Emot. Rev. 5, 24–29 (2013).
99. Mills, D. S., Dube, M. B. & Zulch, H. Stress and pheromonatherapy in small animal clinical behaviour. (John Wiley & Sons, 2012).
100. Friard, O. & Gamba, M. BORIS: a free, versatile open-source event-logging software for video/audio coding and live observations. Methods Ecol. Evol. 7, 1325–1330 (2016).
101. Ekman, P., Friesen, W. & Hager, J. FACS investigator's guide. (Research Nexus, 2002).
102. Brewer, M. B. & Crano, W. D. Research design and issues of validity. In Handbook of Research Methods in Social and Personality Psychology (eds. Reis, H. T. & Judd, C. M.) 11–26 (Cambridge University Press, 2014).
103. Broom, D. M. & Johnson, K. G. Stress and Animal Welfare. (Springer Science & Business Media, 1993).
104. O'Doherty, J. P., Deichmann, R., Critchley, H. D. & Dolan, R. J. Neural responses during anticipation of a primary taste reward. Neuron 33, 815–826 (2002).
105. Boissy, A. et al. Assessment of positive emotions in animals to improve their welfare. Physiol. Behav. 92, 375–397 (2007).
106. Held, S. D. E. & Špinka, M. Animal play and animal welfare. Anim. Behav. 81, 891–899 (2011).
Acknowledgements
We thank Mónica Costa, Satoko Matsubara, Pearl Gillum and Jane Harris for inter-reliability coding, the School of Psychology technicians for software support, and Daniel Pincheira Donoso for his comments and discussion of points in earlier versions of the manuscript. This study was funded by a University of Lincoln Research Investment Fund studentship, awarded to C.C.C.
Author Contributions
C.C., K.G. and D.M. developed the study concept and design. C.C. and D.M. selected videos. C.C. coded videos and analysed the data. C.C., K.G. and D.M. interpreted the data, and C.C. drafted the initial manuscript. All authors provided critical revisions and approved the final version.
Additional Information
Supplementary information accompanies this paper at https://doi.org/10.1038/s41598-017-15091-4.
Competing Interests: The authors declare that they have no competing interests.
Publisher's note: Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.
Open Access This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons license, and indicate if changes were made. The images or other third party material in this article are included in the article's Creative Commons license, unless indicated otherwise in a credit line to the material. If material is not included in the article's Creative Commons license and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this license, visit http://creativecommons.org/licenses/by/4.0/.
© e Author(s) 2017