Cite this article: Albuquerque N, Guo K,
Wilkinson A, Savalli C, Otta E, Mills D. 2016
Dogs recognize dog and human emotions. Biol.
Lett. 12: 20150883.
Received: 20 October 2015
Accepted: 22 December 2015

Subject Areas: behaviour, cognition

Keywords: Canis familiaris, cross-modal sensory integration, emotion recognition, social cognition

Author for correspondence: Kun Guo

Electronic supplementary material is available online.
Animal behaviour
Dogs recognize dog and human emotions
Natalia Albuquerque1,3, Kun Guo2, Anna Wilkinson1, Carine Savalli4, Emma Otta3 and Daniel Mills1

1School of Life Sciences and 2School of Psychology, University of Lincoln, Lincoln LN6 7DL, UK
3Department of Experimental Psychology, Institute of Psychology, University of São Paulo, São Paulo 05508-030, Brazil
4Department of Public Politics and Public Health, Federal University of São Paulo, Santos 11015-020, Brazil

KG, 0000-0001-6765-1957
The perception of emotional expressions allows animals to evaluate the social intentions and motivations of each other. This usually takes place within species; however, in the case of domestic dogs, it might be advantageous to recognize the emotions of humans as well as other dogs. In this sense, the combination of visual and auditory cues to categorize others' emotions facilitates information processing and indicates high-level cognitive representations. Using a cross-modal preferential looking paradigm, we presented dogs with either human or dog faces with different emotional valences (happy/playful versus angry/aggressive) paired with a single vocalization from the same individual with either a positive or negative valence, or with Brownian noise. Dogs looked significantly longer at the face whose expression was congruent with the valence of the vocalization, for both conspecifics and heterospecifics, an ability previously known only in humans. These results demonstrate that dogs can extract and integrate bimodal sensory emotional information, and discriminate between positive and negative emotions from both humans and dogs.
1. Introduction
The recognition of emotional expressions allows animals to evaluate the social intentions and motivations of others [1]. This provides crucial information about how to behave in different situations involving the establishment and maintenance of long-term relationships [2]. Therefore, reading the emotions of others has enormous adaptive value. The ability to recognize and respond appropriately to these cues has biological fitness benefits for both the signaller and the receiver [1].
During social interactions, individuals use a range of sensory modalities, such as visual and auditory cues, to express emotion, with characteristic changes in both face and vocalization that together produce a more robust percept [3]. Although facial expressions are recognized as a primary channel for the transmission of affective information in a range of species [2], the perception of emotion through cross-modal sensory integration enables faster, more accurate and more reliable recognition [4]. Cross-modal integration of emotional cues has been observed in some primate species with conspecific stimuli, such as matching a specific facial expression with the corresponding vocalization or call [5–7]. However, there is currently no evidence of emotional recognition of heterospecifics in non-human animals. Understanding heterospecific emotions is of particular importance for animals such as domestic dogs, which live most of their lives in mixed-species groups and have developed mechanisms to interact with humans (e.g. [8]). Some work has shown cross-modal capacity in dogs relating to the perception of specific activities (e.g. food-guarding) [9] or individual features (e.g. body size) [10], yet it remains unclear whether this ability extends to the processing of emotional cues, which inform individuals about the internal state of others.
© 2016 The Author(s) Published by the Royal Society. All rights reserved.
Dogs can discriminate human facial expressions and emotional sounds (e.g. [11–18]); however, there is still no evidence of multimodal emotional integration, and these results relating to discrimination could be explained through simple associative processes. They do not demonstrate emotional recognition, which requires categorization rather than mere differentiation. The integration of congruent signals across sensory inputs requires internal categorical representation [19–22] and so provides a means to demonstrate the representation of emotion.
In this study, we used a cross-modal preferential looking paradigm without a familiarization phase to test the hypothesis that dogs can extract and integrate emotional information from visual (facial) and auditory (vocal) inputs. If dogs can cross-modally recognize emotions, they should look longer at facial expressions matching the emotional valence of simultaneously presented vocalizations, as demonstrated in other mammals (e.g. [5–7,21,22]). Owing to previous findings of valence [5], side [22], sex [11,22] and species [12,23] biases in perception studies, we also investigated whether these four main factors would influence the dogs' response.
2. Material and methods
Seventeen healthy, socialized adult family dogs of various breeds were presented simultaneously with two sources of emotional information. Pairs of grey-scale, gamma-corrected human or dog face images from the same individual but depicting different expressions (happy/playful versus angry/aggressive) were projected onto two screens at the same time as a sound was played (figure 1a). The sound was a single vocalization (dog barks or a human voice in an unfamiliar language) of either positive or negative valence from the same individual, or a neutral sound (Brownian noise). Stimuli (figure 1b) featured one female and one male of each species. Unfamiliar individuals and an unfamiliar language (Brazilian Portuguese) were used to rule out the potential influence of previous experience with model identity and human language.
Experiments took place in a quiet, dimly lit test room, and each dog received two 10-trial sessions, separated by two weeks. Dogs stood in front of two screens and a video camera recorded their spontaneous looking behaviour. A trial consisted of the presentation of a combination of the acoustic and visual stimuli and lasted 5 s (see the electronic supplementary material for details). A trial was considered valid for analysis when the dog looked at the images for at least 2.5 s. The 20 trials presented different stimulus combinations: 4 face-pairs (2 human and 2 dog models) × 2 vocalizations (positive and negative valence) × 2 face positions (left and right), in addition to 4 control trials (4 face-pairs with the neutral auditory stimulus). Therefore, each subject saw each possible combination once.
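The 4 × 2 × 2 + 4 design above can be sketched as a simple enumeration. This is an illustration only: the face-pair labels below are hypothetical placeholders, not the authors' stimulus names.

```python
from itertools import product

# Hypothetical labels for the four face-pair models (2 human, 2 dog).
face_pairs = ["human_female", "human_male", "dog_female", "dog_male"]
valences = ["positive", "negative"]        # vocalization valence
positions = ["left", "right"]              # side of the congruent face

# 4 face-pairs x 2 vocalizations x 2 positions = 16 test trials.
test_trials = [
    {"faces": f, "vocalization": v, "congruent_side": p}
    for f, v, p in product(face_pairs, valences, positions)
]

# Plus 4 control trials: each face-pair paired with the neutral
# (Brownian noise) sound.
control_trials = [{"faces": f, "vocalization": "neutral"} for f in face_pairs]

trials = test_trials + control_trials
print(len(trials))  # 20 trials per subject, split over two 10-trial sessions
```

Each subject thus sees every combination exactly once, matching the 20-trial total stated in the Methods.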
To measure the dog's sensitivity to the audio-visual emotional cues delivered simultaneously, we calculated a congruence index = (C − I)/T, where C and I represent the amount of time the dog looked at the congruent face (the facial expression matching the emotional vocalization) and the incongruent face, respectively, and T represents the total looking time (looking left + looking right + looking at the centre) for the given trial. We analysed the congruence index across all trials using a general linear mixed model (GLMM) with individual dog included in the model as a random effect. Only emotion valence, stimulus sex, stimulus species and presentation position (left versus right) were included as fixed effects in the final analysis, because first- and second-order interactions were not significant. The means were compared to zero, and confidence intervals were presented for all the main factors in this model. A backward selection procedure was applied to identify the significant factors. The normality assumption was verified by visually inspecting plots of residuals, with no important deviation from normality identified. To examine a possible interaction between the sex of subjects and of stimuli, we used a separate GLMM taking these factors into account. We also tested whether dogs preferentially looked at a particular valence across trials, and at a particular face in the control trials (see the electronic supplementary material for details of index calculation).
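The index calculation described above can be sketched in a few lines. The function name and arguments here are our own illustration, not the authors' analysis code.

```python
def congruence_index(congruent_s: float, incongruent_s: float,
                     centre_s: float) -> float:
    """Congruence index = (C - I) / T, where T = left + right + centre
    looking time for the trial (all times in seconds)."""
    total = congruent_s + incongruent_s + centre_s
    if total == 0:
        return float("nan")  # no looking recorded in this trial
    return (congruent_s - incongruent_s) / total

# Example: 2.0 s on the congruent face, 1.0 s on the incongruent face,
# 1.0 s at the centre during a 5 s trial.
print(congruence_index(2.0, 1.0, 1.0))  # 0.25
```

Positive values indicate longer looking at the congruent face; the index ranges from −1 to 1.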
3. Results
Dogs showed a clear preference for the congruent face in 67% of the trials (n = 188). The mean congruence index was 0.19 ± 0.03 across all test trials and was significantly greater than zero (t = 5.53; p < 0.0001), indicating that dogs looked significantly longer at the face whose expression matched the valence of the vocalization. Moreover, we found a consistent congruent looking preference regardless of the stimulus species (dog: t = 5.39, p < 0.0001; human: t = 2.48, p = 0.01; figure 2a), emotional valence (negative: t = 5.01, p < 0.0001; positive: t = 2.88, p = 0.005; figure 2b), stimulus sex (female: t = 4.42, p < 0.0001; male: t = 3.45, p < 0.001; figure 2c) and stimulus position (left side: t = 2.74, p < 0.01; right side: t = 5.14, p < 0.0001; figure 2d). When a backward selection procedure was applied to the model with the four main factors, the final model included only stimulus species. The congruence index in this model was significantly higher for viewing dog rather than human faces (dog: 0.26 ± 0.05, human: 0.12 ± 0.05; F = 4.42, p = 0.04; figure 2a), indicating that dogs demonstrated greater sensitivity towards conspecific cues. In a separate model, we observed no significant interaction between subject sex and stimulus sex (F = 1.33, p = 0.25) and no main effects (subject sex: F = 0.17, p = 0.68; stimulus sex: F = 0.56, p = 0.45).

Dogs did not preferentially look at either of the facial expressions in the control condition, when the vocalization was the neutral sound (mean: 0.04 ± 0.07; t = 0.56, p = 0.58). The mean preferential looking index was −0.05 ± 0.03, which was not significantly different from zero (p = 0.13), indicating that there was no difference in the proportion of viewing time between positive and negative facial expressions across trials.
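The headline comparison of the mean congruence index against zero can be illustrated with a plain one-sample t-test on simulated data. Note two loud caveats: the paper itself used a GLMM with dog as a random effect (not a simple t-test), and the numbers below are synthetic draws chosen only to mimic the reported effect size, not the study's data.

```python
import math
import random
import statistics

random.seed(0)
# Simulated per-trial congruence indices (illustration only; drawn to
# resemble the reported mean of ~0.19 across 188 valid trials).
indices = [random.gauss(0.19, 0.3) for _ in range(188)]

n = len(indices)
mean = statistics.fmean(indices)
se = statistics.stdev(indices) / math.sqrt(n)  # standard error of the mean
t_stat = (mean - 0.0) / se                     # one-sample t against zero
print(f"mean = {mean:.2f}, t = {t_stat:.2f}")
```

A mean reliably above zero, as reported, corresponds to a large positive t statistic: the dogs spent more time on the congruent than the incongruent face on average.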
4. Discussion
The findings are, we believe, the first evidence of the integration of heterospecific emotional expressions in a species other than humans, and they extend beyond primates the demonstration of cross-modal integration of conspecific emotional expressions. These results show that domestic dogs can obtain dog and human emotional information from both auditory and visual inputs, and integrate them into a coherent perception of emotion [21]. Therefore, it is likely that dogs possess at least the mental prototypes for emotional categorization (positive versus negative affect) and can recognize the emotional content of these expressions. Moreover, dogs performed in this way without any training or familiarization with the models, suggesting that these emotional signals are intrinsically important. This is consistent with this ability conferring important adaptive advantages [24].
Our study shows that dogs possess an ability similar to that of some non-human primates, in being able to match auditory and visual emotional information [5], but it also demonstrates an important advance. In our study, there was no strict temporal correlation between the recording of the visual and auditory cues (e.g. a relaxed dog face with an open mouth paired with a playful bark), unlike in the earlier research on primates (e.g. [5]). Because the relationship between the modalities was not temporally contiguous, learned associations are less likely to account for the results. This suggests the existence of a robust categorical emotion representation.
Although dogs showed the ability to recognize both conspecific and heterospecific emotional cues, we found that they responded significantly more strongly towards dog stimuli. This could be explained by a more refined mechanism for the categorization of emotional information from conspecifics, which is corroborated by recent findings of dogs showing greater sensitivity to conspecifics' facial expressions [12] and a preference for dog over human images [23]. The ability to recognize emotions through visual and auditory cues may be a particularly advantageous social tool in a highly social species such as the dog, and might have been exapted for the establishment and maintenance of long-term relationships with humans. It is possible that during domestication such features were retained and potentially selected for, albeit unconsciously. Nonetheless, the communicative value of emotion is one of the core components of the process, and even less-social domestic species, such as cats, express affective states such as pain in their faces [25].
Figure 1. (a) Schematic of the apparatus (R1, R2: researchers; C: camera; S: screens; L: loudspeakers; P: projectors; dimensions given in the original figure: 220 cm, 167 cm, 140 cm). (b) Examples of stimuli used in the study: faces (human happy versus angry, dog playful versus aggressive) and their corresponding vocalizations, shown as spectrograms of frequency (kHz) against time (s).
There has been a long-standing debate as to whether dogs can recognize human emotions. Studies using either visual or auditory stimuli have observed that dogs can show differential behavioural responses to single-modality sensory inputs with different emotional valences (e.g. [12,16]). For example, Müller et al. [13] found that dogs could selectively respond to happy or angry human facial expressions; when trained with only the top (or bottom) half of unfamiliar faces, they generalized the learned discrimination to the other half of the face. However, these human-expression-modulated behavioural responses could be attributed solely to the learning of contiguous visual features. In this sense, dogs could be discriminating human facial expressions without recognizing the information being transmitted.
Our subjects needed to be able to extract the emotional information from one modality and activate the corresponding emotion category template for the other modality. This indicates that domestic dogs interpret faces and vocalizations using more than simple discriminative processes; they obtain emotionally significant semantic content from relevant audio and visual stimuli that may aid communication and social interaction. Moreover, the use of unfamiliar Portuguese words controlled for potential artefacts induced by a dog's previous experience with specific words. The ability to form emotional representations that include more than one sensory modality suggests cognitive capacities not previously demonstrated outside of primates. Further, the ability of dogs to extract and integrate such information from an unfamiliar human stimulus demonstrates cognitive abilities not known to exist beyond humans. These abilities may be fundamental to a functional relationship within the mixed-species social groups in which dogs often live. Moreover, our results may indicate a more widespread distribution of the ability to spontaneously integrate multimodal cues among non-human mammals, which may be key to understanding the evolution of social cognition.
Ethics. Ethical approval was granted by the ethics committee of the School of Life Sciences, University of Lincoln. Prior to the study, written informed consent was obtained from the dogs' owners and from the human models whose face images and vocalizations were sampled as stimuli. Both human models agreed, in writing, that their face images and vocalizations may be used for research and related publications.
Data accessibility. The data underlying this study are available from
Authors’ contribution. N.A., K.G., A.W. and D.M. conceived/designed the
study and wrote the paper. E.O. conceived the study. N.A. performed
the experiments. N.A. and C.S. analysed and interpreted the data.
N.A. prepared the figures. All authors gave final approval for publi-
cation and agree to be held accountable for the work performed.
Competing interests. We declare we have no competing interests.
Funding. Financial support for N.A. from Brazil Coordination for the
Improvement of Higher Education Personnel is acknowledged.
Acknowledgements. We thank Fiona Williams and Lucas Albuquerque for assisting with data collection, double coding and figure preparation.
1. Schmidt KL, Cohn JF. 2001 Human expressions as
adaptations: evolutionary questions in facial
expression research. Am. J. Phys. Anthropol.33,
3–24. (doi:10.1002/ajpa.20001)
2. Parr LA, Winslow JT, Hopkins WD, de Waal FBM.
2000 Recognizing facial cues: individual
discrimination by chimpanzees (Pan troglodytes)
and rhesus monkeys (Macaca mulatta). J. Comp.
Psychol.114, 47–60. (doi:10.1037/0735-7036.114.
Figure 2. Dogs' viewing behaviour (calculated as the congruence index; mean ± s.e.) by (a) species of stimulus (dog versus human); (b) valence of stimulus (negative versus positive); (c) sex of stimulus (female versus male); (d) side of stimulus presentation (left versus right). *p < 0.05, **p < 0.01, ***p < 0.001.
3. Campanella S, Belin P. 2007 Integrating face and voice in person perception. Trends Cogn. Sci. 11, 535–543. (doi:10.1016/j.tics.2007.
4. Yuval-Greenberg S, Deouell LY. 2009 The dog’s
meow: asymmetrical interaction in cross-modal
object recognition. Exp. Brain Res.193, 603– 614.
5. Ghazanfar AA, Logothetis NK. 2003 Facial
expressions linked to monkey calls. Nature 423,
937–938. (doi:10.1038/423937a)
6. Izumi A, Kojima S. 2004 Matching vocalizations to
vocalizing faces in a chimpanzee (Pan troglodytes).
Anim. Cogn.7, 179–184. (doi:10.1007/s10071-004-
7. Payne C, Bachevalier J. 2013 Crossmodal integration
of conspecific vocalizations in rhesus macaques.
PLoS ONE 8, e81825. (doi:10.1371/journal.pone.
8. Nagasawa M, Mitsui S, En S, Ohtani N, Ohta M,
Sakuma Y, Onaka T, Mogi K, Kikusui T. 2015
Oxytocin-gaze positive loop and the coevolution of
human-dog bonds. Science 348, 333–336. (doi:10.
9. Faragó T, Pongrácz P, Range F, Virányi Z, Miklósi Á. 2010 ‘The bone is mine’: affective and referential aspects of dog growls. Anim. Behav. 79, 917–925.
10. Taylor AM, Reby D, McComb K. 2011 Cross modal
perception of body size in domestic dogs (Canis
familiaris). PLoS ONE 6, e0017069. (doi:10.1371/
11. Nagasawa M, Murai K, Mogi K, Kikusui T. 2011 Dogs
can discriminate human smiling faces from blank
expressions. Anim. Cogn.14, 525– 533. (doi:10.
12. Racca A, Guo K, Meints K, Mills DS. 2012 Reading
faces: differential lateral gaze bias in processing
canine and human facial expressions in dogs and
4-year-old children. PLoS ONE 7, e36076. (doi:10.
13. Müller CA, Schmitt K, Barber ALA, Huber L. 2015 Dogs can discriminate emotional expressions of human faces. Curr. Biol. 25, 601–605. (doi:10.
14. Buttelmann D, Tomasello M. 2013 Can domestic
dogs (Canis familiaris) use referential
emotional expressions to locate hidden food? Anim.
Cogn.16, 137–145. (doi:10.1007/s10071-012-
15. Flom R, Gartman P. In press. Does affective
information influence domestic dogs’ (Canis lupus
familiaris) point-following behavior? Anim.Cogn.
16. Fukuzawa M, Mills DS, Cooper JJ. 2005 The effect of
human command phonetic characteristics on
auditory cognition in dogs (Canis familiaris).
J. Comp. Psychol.119, 117– 120. (doi:10.1037/
17. Custance D, Mayer J. 2012 Empathic-like responding by domestic dogs (Canis familiaris) to distress in humans: an exploratory study. Anim. Cogn. 15, 851–859. (doi:10.1007/s10071-012-
18. Andics A, Gácsi M, Faragó T, Kis A, Miklósi Á. 2014 Voice-sensitive regions in the dog and human brain are revealed by comparative fMRI. Curr. Biol. 24, 574–578. (doi:10.1016/j.cub.2014.
19. Kondo N, Izawa E-I, Watanabe S. 2012 Crows cross-modally recognize group members but not non-group members. Proc. R. Soc. B 279, 1937–1942.
20. Sliwa J, Duhamel J-R, Pascalis O, Wirth S. 2011 Spontaneous voice–face identity matching by rhesus monkeys for familiar conspecifics and humans. Proc. Natl Acad. Sci. USA 108, 1735–1740.
21. Proops L, McComb K, Reby D. 2009 Cross-modal
individual recognition in domestic horses (Equus
caballus). Proc. Natl Acad. Sci. USA 106, 947– 951.
22. Proops L, McComb K. 2012 Cross-modal individual
recognition in domestic horses (Equus caballus)
extends to familiar humans. Proc. R. Soc. B 282,
3131–3138. (doi:10.1098/rspb.2012.0626)
23. Somppi S, Törnqvist H, Hänninen L, Krause C, Vainio O. 2014 How dogs scan familiar and inverted faces: an eye movement study. Anim. Cogn. 17, 793–803.
24. Guo K, Meints K, Hall C, Hall S, Mills D. 2009 Left
gaze bias in humans, rhesus monkeys and domestic
dogs. Anim. Cogn.12, 409–418. (doi:10.1007/
25. Holden E, Calvo G, Collins M, Bell A, Reid J, Scott EM, Nolan AM. 2014 Evaluation of facial expression in acute pain in cats. J. Small Anim. Pract. 55, 615–621. (doi:10.1111/jsap.12283)
on January 13, 2016 from
... In the past 20 years, an increasing number of studies have explored the cognitive abilities of domestic mammals toward humans, bringing to light sometimes unexpected sociocognitive skills 1 . For example, sheep recognize individual human faces 2 , goats know when humans are attentive to them 3 , and dogs and cats recognize and react to our emotions [4][5][6] . These findings enable us to improve human-animal interactions and animal welfare, as we better understand how our actions and emotions affect domestic mammals. ...
... To take this variation into account, for each horse, we calculated an attention index measuring the propensity to be more attentive to PDS than ADS relative to the total time spent being attentive to the screen. This index was defined as (A PDS -A ADS )/(A PDS + A ADS ), where A PDS is the time spent being attentive to the screen during PDS sections and A ADS is the time spent being attentive during ADS sections (see 4 ). This index varied from -1 to 1, with a negative value indicating that a horse was more attentive during ADS than PDS, and a positive value indicating the opposite. ...
Full-text available
In a recent experiment, we showed that horses are sensitive to pet-directed speech (PDS), a kind of speech used to talk to companion animals that is characterized by high pitch and wide pitch variations. When talked to in PDS rather than adult-directed speech (ADS), horses reacted more favorably during grooming and in a pointing task. However, the mechanism behind their response remains unclear: does PDS draw horses’ attention and arouse them, or does it make their emotional state more positive? In this study, we used an innovative paradigm in which female horses watched videos of humans speaking in PDS or ADS to better understand this phenomenon. Horses reacted diferently to the videos of PDS and ADS: they were signifcantly more attentive and their heart rates increased signifcantly more during PDS than during ADS. We found no diference in the expressions of negative or positive emotional states during PDS and ADS videos. Thus, we confrm that horses’ perception of humans can be studied by means of video projections, and we conclude that PDS attracts attention and has an arousing efect in horses, with consequences on the use of PDS in daily interactions with them.
... Discrimination of human's expression of emotions by animals has been shown to occur through facial cues in several domestic and captive species, such as dogs [15], horses (Equus caballus [16]), sheep (Ovis aries [17]), goats (Capra hircus [18]), giant pandas (Ailuropoda melanoleuca [19]), and chimpanzees (Pan troglodytes [20]). In contrast, evidence for discrimination/perception of human vocal expression of emotions is limited to dogs, horses, and cats [21]. ...
... In contrast, evidence for discrimination/perception of human vocal expression of emotions is limited to dogs, horses, and cats [21]. These three domesticated species display cross-modal recognition (visual, i.e., facial expression, and vocal, i.e., emotional non-verbal vocalizations or speech) of human emotions [15,[22][23][24]. In addition, dog fMRI studies [25,26] and horse behavioral experiments [27] suggest that dogs and horses can discriminate between positive and negative non-speech human vocalizations (e.g., growling and laughter). ...
Full-text available
Background Discrimination and perception of emotion expression regulate interactions between conspecifics and can lead to emotional contagion (state matching between producer and receiver) or to more complex forms of empathy (e.g., sympathetic concern). Empathy processes are enhanced by familiarity and physical similarity between partners. Since heterospecifics can also be familiar with each other to some extent, discrimination/perception of emotions and, as a result, emotional contagion could also occur between species. Results Here, we investigated if four species belonging to two ungulate Families, Equidae (domestic and Przewalski’s horses) and Suidae (pigs and wild boars), can discriminate between vocalizations of opposite emotional valence (positive or negative), produced not only by conspecifics, but also closely related heterospecifics and humans. To this aim, we played back to individuals of these four species, which were all habituated to humans, vocalizations from a unique set of recordings for which the valence associated with vocal production was known. We found that domestic and Przewalski’s horses, as well as pigs, but not wild boars, reacted more strongly when the first vocalization played was negative compared to positive, regardless of the species broadcasted. Conclusions Domestic horses, Przewalski’s horses and pigs thus seem to discriminate between positive and negative vocalizations produced not only by conspecifics, but also by heterospecifics, including humans. In addition, we found an absence of difference between the strength of reaction of the four species to the calls of conspecifics and closely related heterospecifics, which could be related to similarities in the general structure of their vocalization. Overall, our results suggest that phylogeny and domestication have played a role in cross-species discrimination/perception of emotions.
... Dogs have become a part of the human social environment over the course of domestication, and most of them interact with both humans and other dogs on a regular basis, plausibly suggesting that they have become adept at navigating themselves in both con-, and heterospecific vocal interactions. Accordingly, behavioural studies have shown that dogs can match humans' [7,8] and dogs' pictures with their vocalizations [8], as well as dog and human emotional vocalizations with the congruent facial expressions [9]. In recent years, dogs have also become an increasingly popular model species of comparative neuroscience owing to several different factors. ...
Full-text available
Recent advances in the field of canine neuro-cognition allow for the non-invasive research of brain mechanisms in family dogs. Considering the striking similarities between dog's and human (infant)'s socio-cognition at the behavioural level, both similarities and differences in neural background can be of particular relevance. The current study investigates brain responses of n = 17 family dogs to human and conspecific emotional vocalizations using a fully non-invasive event-related potential (ERP) paradigm. We found that similarly to humans, dogs show a differential ERP response depending on the species of the caller, demonstrated by a more positive ERP response to human vocalizations compared to dog vocalizations in a time window between 250 and 650 ms after stimulus onset. A later time window between 800 and 900 ms also revealed a valence-sensitive ERP response in interaction with the species of the caller. Our results are, to our knowledge, the first ERP evidence to show the species sensitivity of vocal neural processing in dogs along with indications of valence sensitive processes in later post-stimulus time periods.
... As all of these instruments are completed by a proxy, the proxy's ability to properly perform the assessment can influence the results of the instrument [28,29]. Although potential limitations may occur, in this study, the instruments were completed by dog handlers. ...
Full-text available
Background: Working dogs are at an increased risk of developing an orthopedic disease compared to companion dogs. This study aimed to evaluate functional and orthopedic index fitness in a Portuguese population of police working dogs. In an observational, prospective study, information on 165 dogs was collected. The age, sex, breed, specific work, and history of previous diagnosis of orthopedic disease were recorded for each patient. A copy of the Canine Orthopedic Index (COI), Hudson Visual Analogue Scale (HVAS), and Functional Assessment (FA) was collected for all dogs. COI, HVAS, and FA scores between breeds, work, age, sex, and history of a previous diagnosis of orthopedic disease were compared. Multiple regression was run to predict COI, HVAS, and FA scores from breeds, work, age, sex, and history of orthopedic disease. Correlations between items were determined with Pearson's correlation. A p < 0.05 was set. Results: The sample was composed of 92 males and 73 females, with a mean age of 5.2 ± 3.2 years. Four main dog breeds were represented, 60 Belgian Malinois Shepherd Dogs, 52 German Shepherd Dogs, 29 Labrador Retrievers, and 14 Dutch Shepherd Dog. A prevalence of diarrhea of 10.6% was determined, with 4% of dogs having liquid diarrhea. German Shepherd Dogs had significantly higher FA scores (p = 0.03). Dogs with a history of previous veterinary assistance due to orthopedic issues had significantly lower HVAS scores and higher scores with all remaining questionnaires (p < 0.01 for all). No differences were found between sexes or specific work. Age and a history of orthopedic disease contributed to the prediction of all scores. FA scores had a good correlation with COI and HVAS. Conclusion: This population of police working dogs has a good to excellent level of physical fitness. There was a relationship between increasing age, history of orthopedic disease, and worse scores with all questionnaires. 
All considered questionnaires could differentiate between animals with a previous history of orthopedic disease and sound dogs.
... It is expected that an animal that depends on humans for food, shelter and protection would learn to communicate with humans and develop sensitivity to human-given cues in order to better exploit available resources. The pet dog's ability to understand human communicative gestures is unrivalled in the animal world, and multiple experiments have shown its proficiency in deciphering human emotional states 19,20 , recognising human faces 21,22 , and understanding vocal cues 23 . In fact, dogs are so uniquely adapted to the human environment that they understand us better than our closest living relative, the chimpanzee, does 24,25 . ...
Rapid urbanisation, leading to habitat loss, is a major problem for biodiversity conservation. While urbanisation negatively affects the survival of many species, some are well adapted to the urban environment, often depending on humans directly or indirectly for food and shelter. Such animals show various behavioural adaptations to anthropogenic disturbance, and some even exploit humans for their own advantage. In this review, we use some of these examples to highlight how the cognitive and physiological underpinnings of urban adaptation in some animals can help us understand how they survive in the human jungle. We propose that more in-depth studies of urban-adapted species are necessary for nurturing biodiversity in the face of urbanisation and for building more sustainable cities for the future.
Abstract: Since the revision of the German Animal Welfare Act (TierSchG) in 1972, ethical animal protection has been the central concern and animals have been protected for their own sake (Deutscher Bundestag, 6. Wahlperiode, 1971). The criminal-law recognition of these protected interests is embodied in § 17 TierSchG, which penalises killing a vertebrate without reasonable cause (§ 17 no. 1 TierSchG) and inflicting substantial pain or suffering out of cruelty (§ 17 no. 2a TierSchG) or of a prolonged or repeated nature (§ 17 no. 2b TierSchG). Sanctioning practice under § 17 TierSchG was examined using prosecution statistics from 2002 to 2018 as well as case files of the public prosecutor's office in Gießen from 2016 and 2018. Particular attention was paid to offending or suspected persons, the type of animal-welfare-relevant act and the animal species affected, the parties filing reports, convictions under general criminal law and the TierSchG by level of fine, and warnings with sentence reserved under § 59 StGB. It was found that persons who commit animal-welfare-relevant acts have a low economic status, are on average older than offenders in general, and that women are involved in these offences more frequently than in offences overall. Furthermore, dogs are the species most frequently affected by animal-welfare-related investigations, and there is a significant association between this species and acts of maltreatment. Reports are most frequently filed by private individuals, but the resulting proceedings are not dismissed significantly more often. These findings are significant not only for the enforcement of animal welfare law but also for the human-animal relationship, which calls for a society-wide view of sanctioning practice against the background of Art. 20 a GG. It remains to be hoped that the constitutional objective of animal protection will be comprehensively implemented in the future and that decisions and balancing processes will be resolved in dubio pro animale (Tierärztliche Vereinigung für Tierschutz e. V. 2009).
Human-canine interactions reduce stress in humans, although less is known about the effects of therapeutic animal-assisted programs on the welfare of the therapy dogs themselves. The current study sought to measure the behavioural effects of a canine interaction program on certified therapy dogs during an on-campus student stress buster event. A total of 25 therapy dog-handler teams and 1155 students participated in events during exam periods in December and April. Each event consisted of two sessions per day, with each session divided into six 15 min interaction periods involving four dogs and 20 students. Sessions were videoed for retrospective analysis of stress-related behaviours. Subsequent to the university event, a 5 min control period at the dog's home was videoed. Behavioural data were analyzed using a repeated measures general linear mixed model. Lip licking (p = 0.0171), interactions between the dogs and their owners (p = 0.0014), and ears in the back position (p = 0.0127) increased during the periods within a session compared to the control period. The frequencies of lip licking (p = 0.0092) and ears back (p = 0.0025) were higher during the December event than the April event. No effect of the length of therapy dog certification (p > 0.19) or the number of therapy sessions the dogs had attended in the previous week (p > 0.09) was observed on the frequency of any behaviour. The results demonstrate subtle behavioural signs of stress in dogs during an on-campus student stress buster event, highlighting the importance of handlers recognizing and mitigating such situations to improve the welfare of therapy dogs in therapeutic settings.
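Because the home control period (5 min) was shorter than the in-session periods (15 min), behaviour counts must be normalised to rates before they can be compared. A minimal sketch of that normalisation, with invented lip-lick counts rather than the study's data:

```python
def rate_per_minute(count, minutes):
    """Normalise a behaviour count by observation length."""
    return count / minutes

# Invented lip-lick counts for one dog (not the study's data):
SESSION_PERIOD_MIN = 15                 # each in-session interaction period
CONTROL_PERIOD_MIN = 5                  # home control period
session_counts = [9, 12, 7, 10, 8, 11]  # six 15 min periods in one session
control_count = 1                       # single 5 min control

session_rates = [rate_per_minute(c, SESSION_PERIOD_MIN) for c in session_counts]
control_rate = rate_per_minute(control_count, CONTROL_PERIOD_MIN)
mean_session_rate = sum(session_rates) / len(session_rates)
print(round(mean_session_rate, 3), round(control_rate, 3))
```

A mean session rate above the control rate is the direction of effect the study reports for lip licking; the actual analysis additionally modelled repeated measures across dogs and sessions with a general linear mixed model, which this sketch does not attempt.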
During two retreats in 2017 and 2020, a group of international scientists convened to explore the Human-Animal Bond. The meetings, hosted by the Wallis Annenberg PetSpace Leadership Institute, took a broad view of the human-dog relationship and how interactions between the two may benefit us medically, psychologically or through their service as working dogs (e.g. guide dogs, explosive detection, search and rescue, cancer detection). This Frontiers’ Special Topic has collated the presentations into a broad collection of 14 theoretical and review papers summarizing the latest research and practice in the historical development of our deepening bond with dogs, the physiological and psychological changes that occur during human-dog interactions (to both humans and dogs) as well as the selection, training and welfare of companion animals and working dogs. The overarching goals of this collection are to contribute to the current standard of understanding of human-animal interaction, suggest future directions in applied research, and to consider the interdisciplinary societal implications of the findings.
The therapeutic benefits of Animal Assisted Therapy (AAT) are well documented, and AAT research often involves dogs. Despite growing research into the therapeutic value of therapy dog visits within health and psychiatric contexts, research specifically into the integration of dogs into psychological therapies, known as Dog Assisted Psychological Therapy (DAPT), remains novel and limited: it predominantly focuses on therapist perspectives and uses quantitative or case study methodologies, and adults' experience of DAPT has received little attention. This research therefore explored the experiences of adults receiving DAPT who self-identified as experiencing mental health difficulties. The specific aims were to gain a broad insight into the opportunities, challenges and factors shaping participants' experience of therapy, in order to inform the clinical implications of DAPT. A qualitative methodology was used, with semi-structured interviews conducted with six participants about their experiences of DAPT. Interpretative Phenomenological Analysis (IPA) enabled exploration of the important issues in participants' accounts. Five themes emerged: 1) Relationship with the dog(s); 2) Providing a safe therapeutic atmosphere; 3) Distraction; 4) Facilitating personal insights; and 5) Concern for the dog's wellbeing. Relationships between the themes are illustrated, and their theoretical relevance to psychological models and clinical applications is discussed. This research demonstrated that, where clinically and ethically appropriate, DAPT can provide therapeutic opportunities that facilitate psychological therapies. The therapist's skill in managing interactions and potential distraction, the client's prior experiences of dogs, and the dog's wellbeing are all important considerations for managing some of the identified challenges within DAPT. Further research is needed to inform practice guidelines, specifically regarding which psychological models and patient groups DAPT might be best suited to, and to explore DAPT practice within different clinical and cultural populations.
Dog-assisted interventions (DAI) are those that include specially trained dogs in human health services. Often, the training methods employed to prepare animals for DAI are passed down between trainers, so the latest scientific research on dog learning and cognition is not always taken into account. The present work aims to evaluate the impact that the main theories of dog evolution have had, both in promoting different training methods and in establishing the relevance of behavior to the evolution of the skills of present-day dogs. An integrative method for the training of dogs is then presented. This method takes into account research on dog learning mechanisms and cognitive processes, and effectively promotes the development of behaviors desirable for DAI during the dog's ontogeny.
Several studies have examined dogs' (Canis lupus familiaris) comprehension and use of human communicative cues. Relatively few studies have, however, examined the effects of human affective behavior (i.e., facial and vocal expressions) on dogs' exploratory and point-following behavior. In two experiments, we examined how often dogs followed an adult's pointing gesture in locating a hidden reward or treat when it occurred silently, or when it was paired with a positive or negative facial and vocal affective expression. Like prior studies, the current results demonstrate that dogs reliably follow human pointing cues. Unlike prior studies, the current results also demonstrate that the addition of a positive affective facial and vocal expression, when paired with a pointing gesture, did not reliably increase dogs' frequency of locating a hidden piece of food compared to pointing alone. In addition, within the negative facial and vocal affect conditions of Experiments 1 and 2, dogs were delayed in their exploration of, or approach toward, a baited or sham-baited bowl. However, in Experiment 2, dogs continued to follow an adult's pointing gesture, even when paired with a negative expression, as long as the attention-directing gesture referenced a baited bowl. Together these results suggest that the addition of affective information does not significantly increase or decrease dogs' point-following behavior. Rather, the presence or absence of affective expressions influences dogs' exploratory behavior, and the presence or absence of a reward affects whether they will follow an unfamiliar adult's attention-directing gesture.
Human-like modes of communication, including mutual gaze, in dogs may have been acquired during domestication with humans. We show that gazing behavior from dogs, but not wolves, increased urinary oxytocin concentrations in owners, which consequently facilitated owners' affiliation and increased oxytocin concentration in dogs. Further, nasally administered oxytocin increased gazing behavior in dogs, which in turn increased urinary oxytocin concentrations in owners. These findings support the existence of an interspecies oxytocin-mediated positive loop facilitated and modulated by gazing, which may have supported the coevolution of human-dog bonding by engaging common modes of communicating social attachment. Copyright © 2015, American Association for the Advancement of Science.
Faces play an important role in communication and identity recognition in social animals. Domestic dogs often respond to human facial cues, but their face processing is poorly understood. In this study, the facial inversion effect (deficits in face processing when the image is turned upside down) and responses to personal familiarity were tested using eye movement tracking. A total of 23 pet dogs and eight kennel dogs were compared to establish the effects of life experience on their scanning behavior. All dogs preferred conspecific faces and showed great interest in the eye area, suggesting that they perceived the images as representing faces. Dogs fixated on upright faces for as long as on inverted faces, but the eye area of upright faces gathered a longer total duration and a greater relative fixation duration than the eye area of inverted stimuli, regardless of the species (dog or human) shown in the image. Personally familiar faces and eyes attracted more fixations than strange ones, suggesting that dogs are likely to recognize conspecific and human faces in photographs. The results imply that face scanning in dogs is guided not only by the physical properties of images but also by semantic factors. In conclusion, in a free-viewing task, dogs seem to target their fixations at naturally salient and familiar items. Facial images were generally more attractive to pet dogs than to kennel dogs, but living environment did not affect conspecific preference or inversion and familiarity responses, suggesting that the basic mechanisms of face processing in dogs could be hardwired or might develop under limited exposure.
Crossmodal integration of audio/visual information is vital for recognition, interpretation and appropriate reaction to social signals. Here we examined how rhesus macaques process bimodal species-specific vocalizations by eye tracking, using an unconstrained preferential looking paradigm. Six adult rhesus monkeys (3M, 3F) were presented two side-by-side videos of unknown male conspecifics emitting different vocalizations, accompanied by the audio signal corresponding to one of the videos. The percentage of time animals looked to each video was used to assess crossmodal integration ability and the percentages of time spent looking at each of the six a priori ROIs (eyes, mouth, and rest of each video) were used to characterize scanning patterns. Animals looked more to the congruent video, confirming reports that rhesus monkeys spontaneously integrate conspecific vocalizations. Scanning patterns showed that monkeys preferentially attended to the eyes and mouth of the stimuli, with subtle differences between males and females such that females showed a tendency to differentiate the eye and mouth regions more than males. These results were similar to studies in humans indicating that when asked to assess emotion-related aspects of visual speech, people preferentially attend to the eyes. Thus, the tendency for female monkeys to show a greater differentiation between the eye and mouth regions than males may indicate that female monkeys were slightly more sensitive to the socio-emotional content of complex signals than male monkeys. The current results emphasize the importance of considering both the sex of the observer and individual variability in passive viewing behavior in nonhuman primate research.
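The dependent measure in such preferential-looking studies reduces to a looking-time index per trial: the proportion of total looking time spent on the audio-congruent video, with 0.5 marking no preference. A minimal sketch with invented looking times (not the study's data):

```python
def congruence_index(congruent_ms, incongruent_ms):
    """Proportion of total looking time spent on the congruent stimulus."""
    total = congruent_ms + incongruent_ms
    return congruent_ms / total if total else 0.5  # no looking: treat as chance

# Invented per-trial looking times in milliseconds (congruent, incongruent):
trials = [(3200, 1800), (2900, 2100), (4000, 1500), (2500, 2400)]
indices = [congruence_index(c, i) for c, i in trials]
mean_index = sum(indices) / len(indices)
print(round(mean_index, 3))
```

A mean index reliably above 0.5 across subjects is what "looked more to the congruent video" amounts to; per-region scanning (eyes, mouth, rest) would use the same proportion logic over regions of interest instead of whole videos.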
The importance of the face in social interaction and social intelligence is widely recognized in anthropology. Yet the adaptive functions of human facial expression remain largely unknown. An evolutionary model of human facial expression as behavioral adaptation can be constructed, given the current knowledge of the phenotypic variation, ecological contexts, and fitness consequences of facial behavior. Studies of facial expression are available, but results are not typically framed in an evolutionary perspective. This review identifies the relevant physical phenomena of facial expression and integrates the study of this behavior with the anthropological study of communication and sociality in general. Anthropological issues with relevance to the evolutionary study of facial expression include: facial expressions as coordinated, stereotyped behavioral phenotypes, the unique contexts and functions of different facial expressions, the relationship of facial expression to speech, the value of facial expressions as signals, and the relationship of facial expression to social intelligence in humans and in nonhuman primates. Human smiling is used as an example of adaptation, and testable hypotheses concerning the human smile, as well as other expressions, are proposed. Yrbk Phys Anthropol 44:3-24, 2001.
Although many studies have investigated domestic dogs' (Canis familiaris) use of human communicative cues, little is known about their use of humans' emotional expressions. We conducted a study following the general paradigm of Repacholi (Dev Psychol 34:1017-1025, 1998) and tested four breeds of dogs in the laboratory and another breed in the open air. In our study, a human reacted emotionally (happy, neutral or disgusted) to the hidden contents of two boxes, after which the dog was allowed to choose one of the boxes. Dogs tested in the laboratory distinguished between the most distinct of the expressed emotions (Happy-Disgust condition) by choosing appropriately, but performed at chance level when the two emotions were less distinct (Happy-Neutral condition). The breed tested in the open air passed both conditions, but this breed's differing testing setup might have been responsible for their success. Although these subjects chose randomly when given a choice without meaningful emotional expressions, their performance did not differ from that in the experimental conditions. Based on the findings revealed in the laboratory, we suggest that some domestic dogs recognize both the directedness and the valence of some human emotional expressions.
Faces are one of the most salient classes of stimuli involved in social communication. Three experiments compared face-recognition abilities in chimpanzees (Pan troglodytes) and rhesus monkeys (Macaca mulatta). In the face-matching task, the chimpanzees matched identical photographs of conspecifics' faces on Trial 1, and the rhesus monkeys did the same after 4 generalization trials. In the individual-recognition task, the chimpanzees matched 2 different photographs of the same individual after 2 trials, and the rhesus monkeys generalized in fewer than 6 trials. The feature-masking task showed that the eyes were the most important cue for individual recognition. Thus, chimpanzees and rhesus monkeys are able to use facial cues to discriminate unfamiliar conspecifics. Although the rhesus monkeys required many trials to learn the tasks, this is not evidence that faces are less important social stimuli for them than for the chimpanzees.
The question of whether animals have emotions and respond to the emotional expressions of others has become a focus of research in the last decade [1-9]. However, to date, no study has convincingly shown that animals discriminate between emotional expressions of heterospecifics, excluding the possibility that they respond to simple cues. Here, we show that dogs use the emotion of a heterospecific as a discriminative cue. After learning to discriminate between happy and angry human faces in 15 picture pairs, whereby for one group only the upper halves of the faces were shown and for the other group only the lower halves of the faces were shown, dogs were tested with four types of probe trials: (1) the same half of the faces as in the training but of novel faces, (2) the other half of the faces used in training, (3) the other half of novel faces, and (4) the left half of the faces used in training. We found that dogs for which the happy faces were rewarded learned the discrimination more quickly than dogs for which the angry faces were rewarded. This would be predicted if the dogs recognized an angry face as an aversive stimulus. Furthermore, the dogs performed significantly above chance level in all four probe conditions and thus transferred the training contingency to novel stimuli that shared with the training set only the emotional expression as a distinguishing feature. We conclude that the dogs used their memories of real emotional human faces to accomplish the discrimination task. Copyright © 2015 Elsevier Ltd. All rights reserved.
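"Significantly above chance level" in two-choice probe trials of this kind is typically assessed with a one-sided exact binomial test against p = 0.5. A self-contained sketch, with the trial counts invented for illustration rather than taken from the study:

```python
from math import comb

def binom_p_above_chance(successes, trials, p=0.5):
    """One-sided exact binomial test: P(X >= successes) under chance level p."""
    return sum(comb(trials, k) * p**k * (1 - p)**(trials - k)
               for k in range(successes, trials + 1))

# e.g. 14 correct choices out of 18 probe trials (invented numbers):
p_value = binom_p_above_chance(14, 18)
print(round(p_value, 4))
```

A p-value below 0.05 would support above-chance discrimination in that probe condition; group-level analyses would aggregate such per-dog performance across subjects.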
During the approximately 18–32 thousand years of domestication [1], dogs and humans have shared a similar social environment [2]. Dog and human vocalizations are thus familiar and relevant to both species [3], although they belong to evolutionarily distant taxa, as their lineages split approximately 90–100 million years ago [4]. In this first comparative neuroimaging study of a nonprimate and a primate species, we made use of this special combination of shared environment and evolutionary distance. We presented dogs and humans with the same set of vocal and nonvocal stimuli to search for functionally analogous voice-sensitive cortical regions. We demonstrate that voice areas exist in dogs and that they show a similar pattern to anterior temporal voice areas in humans. Our findings also reveal that sensitivity to vocal emotional valence cues engages similarly located nonprimary auditory regions in dogs and humans. Although parallel evolution cannot be excluded, our findings suggest that voice areas may have a more ancient evolutionary origin than previously known.