Fig 2 - uploaded by Angelica Lim
Proposed essential features for an emotional contagion system, with related locations in the brain. 

Source publication
Article
Full-text available
Could a robot feel authentic empathy? What exactly is empathy, and why do most humans have it? We present a model which suggests that empathy is an emergent behavior with four main elements: a mirror neuron system, somatosensory cortices, an insula, and infant-directed "baby talk" or motherese. To test our hypothesis, we implemented a robot calle...

Context in source publication

Context 1
... on these findings in neuroscience, we offer an architecture for robot empathy in Fig. 1, and a summary of the features in Fig. 2. A robot with empathy should model at least these three areas of the brain: a) mirror neurons in the premotor cortex, b) the insula, and c) the somatosensory cortex. These correspond to three functional modules in a robot ...
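The three-module mapping described in this excerpt can be sketched in code. All class and method names below are hypothetical placeholders, and the emotion labels and battery threshold are invented for illustration; this is not code from the source publication.

```python
class MirrorNeuronModule:
    """Premotor-cortex analogue: map a perceived expression onto
    the robot's own motor-emotional repertoire."""
    def simulate(self, observed_expression):
        return {"smile": "joy", "frown": "distress"}.get(observed_expression, "neutral")

class SomatosensoryModule:
    """Somatosensory-cortex analogue: read the robot's own body state."""
    def body_state(self, battery):
        return "comfortable" if battery > 0.3 else "uncomfortable"

class InsulaModule:
    """Insula analogue: integrate the simulated other-state with the
    felt body state into a single affective appraisal."""
    def integrate(self, simulated, body):
        if simulated == "distress" and body == "comfortable":
            return "concern"  # the other suffers while the self is fine
        return simulated

mirror, soma, insula = MirrorNeuronModule(), SomatosensoryModule(), InsulaModule()
appraisal = insula.integrate(mirror.simulate("frown"), soma.body_state(0.8))
```

The point of the sketch is the data flow, not the labels: the mirror module simulates the other, the somatosensory module reads the self, and the insula combines both into one appraisal.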

Similar publications

Conference Paper
Full-text available
This paper presents a robot architecture heavily inspired by neuropsychology, developmental psychology and research into "executive functions" (EF) which are responsible for the planning capabilities in humans. This architecture is presented in light of this inspiration, mapping the modules to the different functions in the brain. We emphasize the...
Conference Paper
Full-text available
This paper reports recent progress on modeling the grounded co-acquisition of syntax and semantics of locative spatial language in developmental robots. We show how a learner robot can learn to produce and interpret spatial utterances in guided-learning interactions with a tutor robot (equipped with a system for producing English spatial phrases)...
Article
Full-text available
Using an epigenetic model, in this paper we investigate the importance of sensorimotor experiences and environmental conditions in the emergence of more advanced cognitive abilities in an autonomous robot. We let the robot develop in three environments affording very different (physical and social) sensorimotor experiences: a “normal,” standard env...
Article
Full-text available
This article briefly reviews research in cognitive development concerning the nature of the human self. It then reviews research in developmental robotics that has attempted to retrace parts of the developmental trajectory of the self. This should be of interest to developmental psychologists, and researchers in developmental robotics. As a point o...
Conference Paper
Full-text available
Interdisciplinary research, drawing from robotics, artificial intelligence, neuroscience, psychology, and cognitive science, is a cornerstone to advance the state of the art in multimodal human-robot interaction and neuro-cognitive modeling. Research on neuro-cognitive models benefits from the embodiment of these models into physical, humanoid age...

Citations

... Social robots must exhibit a convincing flow of emotions and empathetic behavior to be accepted as companions by humans [1,2]. Training a robot to mimic the emotion expressed by the human partner poses the challenge of authenticity; i.e., people can be disturbed by robot companions that simply mimic their emotions without feeling them [3,4]. Implementing an empathetic robot thus raises the challenge of authenticity, and with it the issue of building empathetic robots that humans will accept [3]. ...
... Training a robot to mimic the emotion expressed by the human partner poses the challenge of authenticity; i.e., people can be disturbed by robot companions that simply mimic their emotions without feeling them [3,4]. Implementing an empathetic robot thus raises the challenge of authenticity, and with it the issue of building empathetic robots that humans will accept [3]. Various psychological and neurological theories have been proposed to explain empathic behavior in humans, and developmental or epigenetic robotics attempts to reproduce it in agents [5,6]. ...
... Note that when empathy increases between the human and the robot, a social relationship develops. Developmental robotics [3] compares the robot to a six-month-old child and considers empathy an emergent behavior based on four main components: (1) the gut feeling, which senses the body's condition (battery, temperature sensors, etc.), (2) a module to associate stimuli with feelings, (3) a mirroring system, and (4) infant-directed baby talk. In this study, we present a prototype of an affective model based on developmental robotics. ...
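The four components listed in this snippet can be illustrated with a minimal sketch. The function names, the [-1, 1] valence scale, and the mirroring gain are illustrative assumptions, not the cited paper's implementation.

```python
def gut_feeling(battery, temperature):
    # (1) Sense the body's condition (battery level, temperature sensors),
    # normalized so that a comfortable body yields positive valence in [-1, 1].
    comfort = min(battery, 1.0 - temperature)
    return 2.0 * comfort - 1.0

def associate(stimulus, memory):
    # (2) Associate a stimulus with the feeling it was previously paired with.
    return memory.get(stimulus, 0.0)

def mirror(partner_valence, own_valence, gain=0.5):
    # (3) Mirroring system: shift the robot's feeling toward the partner's.
    return own_valence + gain * (partner_valence - own_valence)

# (4) Infant-directed baby talk: a caregiver's exaggerated vocal pairing
# is what lets the robot store stimulus-feeling associations at all.
memory = {}
state = gut_feeling(battery=0.9, temperature=0.2)  # comfortable body
memory["soothing_voice"] = state                   # caregiver pairing
state = mirror(partner_valence=-0.6, own_valence=state)
```

Running the sketch, a comfortable body yields a positive gut feeling, the caregiver's voice is stored against that feeling, and mirroring then pulls the state partway toward a sad partner.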
... Yet she would be clueless when facing a new situation. This is why researchers in moral robots are currently focusing on "empathic" robots, which are able to learn, rather than robots with moral norms (Asada 2015; Lim and Okuno 2015; Paiva, Leite, Boukricha and Wachsmuth 2017). Focusing on moral development, and in line with what we previously said about the epistemological role of empathy in moral judgments, Railton (2016) states that moral learning might require taking others' perspectives through empathy. ...
Article
In this paper we discuss Prinz’s Kantian arguments in “Is Empathy Necessary for Morality?” (2011). They purport to show that empathy is not necessary for morality because it is not part of the capacities required for moral competence and it can bias moral judgment. First, we show that even conceding Prinz his notions of empathy and moral competence, empathy still plays a role in moral competence. Second, we argue that moral competence is not limited to moral judgment. Third, we reject Prinz’s notion of empathy because it is too restrictive, in requiring emotional matching. We conclude that once morality and empathy are properly understood, empathy’s role in morality is vindicated. Morality is not reduced to a form of rational judgment, but it necessarily presupposes pro-social preferences and motivation and sensitivity to inter-subjective demands.
Article
Realistic-looking humanoid love and sex dolls have been available on a somewhat secretive basis for at least three decades. But today the industry has gone mainstream, with North American, European, and Asian producers using mass customization and competing on features, realism, price, and depth of product lines. As a result, realistic life-size artificial companions are becoming more affordable to purchase and more feasible to patronize on a service basis. Sexual relations may be without equal when it comes to emotional intimacy. Yet the increasingly vocal and interactive robotic versions of these dolls feel nothing. They may nevertheless induce emotions in users that potentially surpass the pleasure of human-human sexual experiences. The most technologically advanced love and sex robots are forecast to sense human emotions and gear their performances of empathy, conversation, and sexual activity accordingly. I offer a model of how this might be done to provide a better service experience. I compare the nature of the resulting “artificial emotions” of robots to the natural emotions of humans. I explore the ethical issues entailed in offering love and sex robot services with artificial emotions, and offer a conclusion and recommendations for service management and for further research.
Article
The aim of this work is to design an artificial empathetic system and to implement it in an EMotional RESpondent robot, called EMRES. Rather than mimicking the expression detected in the human partner, the proposed system follows a coherent and consistent emotional trajectory, resulting in a more credible human-agent interaction. Inspired by developmental robotics theory, EMRES has an internal state and a mood, which contribute to the evolution of the flow of emotions; at every episode, the next emotional state of the agent is affected by its internal state, mood, current emotion, and the expression read in the human partner. As a result, EMRES does not imitate, but synchronizes with, the emotion expressed by the human companion. The agent has been trained to recognize expressive faces from the FER2013 database and achieves 78.3% performance on in-the-wild images. Our first prototype has been implemented in a robot created for this purpose. An empirical study run with university students judged the newly proposed artificial empathetic system positively.
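The update rule this abstract describes can be sketched as a blend of the four influences it names. The weights and the single valence axis are illustrative assumptions, not the published EMRES model.

```python
def next_emotion(internal, mood, current, partner, w=(0.2, 0.2, 0.4, 0.2)):
    # Weighted blend clamped to [-1, 1]: the partner's expression pulls
    # the state without dictating it, so the robot synchronizes with the
    # human rather than imitating them outright.
    wi, wm, wc, wp = w
    value = wi * internal + wm * mood + wc * current + wp * partner
    return max(-1.0, min(1.0, value))

# A persistently sad partner (-0.8) gradually lowers a content robot's
# state across episodes instead of flipping it instantly.
state = 0.6
trajectory = []
for _ in range(3):
    state = next_emotion(internal=0.1, mood=0.3, current=state, partner=-0.8)
    trajectory.append(state)
```

Because the current emotion carries the largest weight in this sketch, the trajectory descends smoothly toward the partner's valence rather than jumping to it in one step.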
Article
Full-text available
Social robots are gradually entering children’s lives in a period when children learn about social relationships and exercise prosocial behaviors with parents, peers, and teachers. Designed for long-term emotional engagement and to take the roles of friends, teachers, and babysitters, such robots have the potential to influence how children develop empathy. This article presents a review of the literature (2010–2020) in the fields of human–robot interaction (HRI), psychology, neuropsychology, and roboethics, discussing the potential impact of communication with social robots on children’s social and emotional development. The critical analysis of evidence behind these discussions shows that, although robots theoretically have high chances of influencing the development of empathy in children, depending on their design, intensity, and context of use, there is no certainty about the kind of effect they might have. Most of the analyzed studies, which showed the ability of robots to improve empathy levels in children, were not longitudinal, while the studies observing and arguing for the negative effect of robots on children’s empathy were either purely theoretical or dependent on the specific design of the robot and the situation. Therefore, there is a need for studies investigating the effects on children’s social and emotional development of long-term regular and consistent communication with robots of various designs and in different situations.
Conference Paper
Full-text available
Abstract—Processing human affective behavior is important for developing intelligent agents that interact with humans in complex interaction scenarios. A large number of current approaches that address this problem focus on classifying emotion expressions by grouping them into known categories. Such strategies neglect, among other aspects, the impact of the affective responses from an individual on their interaction partner thus ignoring how people empathize towards each other. This is also reflected in the datasets used to train models for affective processing tasks. Most of the recent datasets, in particular, the ones which capture natural interactions (“in-the-wild” datasets), are designed, collected, and annotated based on the recognition of displayed affective reactions, ignoring how these displayed or expressed emotions are perceived. In this paper, we propose a novel dataset composed of dyadic interactions designed, collected and annotated with a focus on measuring the affective impact that eight different stories have on the listener. Each video of the dataset contains around 5 minutes of interaction where a speaker tells a story to a listener. After each interaction, the listener annotated, using a valence scale, how the story impacted their affective state, reflecting how they empathized with the speaker as well as the story. We also propose different evaluation protocols and a baseline that encourages participation in the advancement of the field of artificial empathy and emotion contagion. Index Terms—Empathy, Dyadic Interactions, Affective Behaviour
Article
Full-text available
Service robots and artificial intelligence promise to increase productivity and reduce costs, prompting substantial growth in sales of service robots and research dedicated to understanding their implications. Nevertheless, marketing research on this phenomenon is scarce. To establish some fundamental insights related to this research domain, the current article seeks to complement research on robots’ human-likeness with investigations of the factors that service managers must choose for the service robots implemented in their service setting. A three-part framework, comprised of robot design, customer features, and service encounter characteristics, specifies key factors within each category that need to be analyzed together to determine their optimal adaptation to different service components. Definitions and overlapping concepts are clarified, together with previous knowledge on each variable and research gaps that need to be solved. This framework and the final research questions provide a research agenda to guide scholars and help practitioners implement service robots successfully.