Conference Paper

Using Bio-electrical Signals to Influence the Social Behaviours of Domesticated Robots

DOI: 10.1145/1514095.1514167 Conference: Proceedings of the 4th ACM/IEEE International Conference on Human Robot Interaction, HRI 2009, La Jolla, California, USA, March 9-13, 2009
Source: DBLP


Several emerging computer devices read bio-electrical signals (e.g., electro-corticographic signals, skin biopotential or facial muscle tension) and translate them into computer-understandable input. We investigated how one low-cost commercially-available device could be used to control a domestic robot. First, we used the device to issue direct motion commands; while we could control the robot somewhat, it proved difficult to do reliably. Second, we interpreted one class of signals as suggestive of emotional stress, and used that as an emotional parameter to influence (but not directly control) robot behaviour. In this case, the robot would react to human stress by staying out of the person's way. Our work suggests that affecting behaviour may be a reasonable way to leverage such devices.
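The idea of using a noisy stress estimate to influence, rather than directly command, a robot can be sketched as follows. This is a minimal illustration, not the authors' implementation; the class, parameter names, and the exponential-smoothing choice are all assumptions.

```python
class StressInfluencedRobot:
    """Sketch: a robot whose social behaviour is parameterised by
    a smoothed estimate of the operator's stress (hypothetical design)."""

    def __init__(self, base_distance=0.5, max_distance=2.0):
        self.base_distance = base_distance  # metres kept when the person is calm
        self.max_distance = max_distance    # metres kept when fully stressed
        self.stress = 0.0                   # smoothed stress estimate in [0, 1]

    def update_stress(self, raw_signal, alpha=0.1):
        """Exponentially smooth a noisy bio-electrical reading in [0, 1],
        so momentary spikes do not jerk the robot around."""
        raw_signal = min(max(raw_signal, 0.0), 1.0)
        self.stress = (1 - alpha) * self.stress + alpha * raw_signal
        return self.stress

    def preferred_distance(self):
        """Higher stress -> the robot keeps further out of the person's way."""
        span = self.max_distance - self.base_distance
        return self.base_distance + self.stress * span
```

The key design point echoed from the abstract: the signal never issues motion commands; it only shifts a behavioural parameter (here, standoff distance), which is far more tolerant of unreliable input.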



  • ABSTRACT: This article describes experiments in human-robot communication based on brain bio-electrical signals. Available technologies and interfaces were used to read the user's brain bio-electrical signals and associate them with explicit commands, allowing biped and mobile robots to be controlled through adapted communication devices. The work presents an engineering solution: the technological foundations, a high- and low-level communication framework, a description of the experiments, and a discussion of the results achieved in field tests.
  • ABSTRACT: Emotional intelligence is the ability to process information about one's own emotions and the emotions of others. It involves perceiving emotions, understanding emotions, managing emotions and using emotions in thought processes and in other activities. Emotion understanding is the cognitive activity of using emotions to infer why an agent is in an emotional state and which actions are associated with the emotional state. For humans, knowledge about emotions includes, in part, emotional experiences (episodic memory) and abstract knowledge about emotions (semantic memory). In accordance with the need for more sophisticated agents, the current research aims to increase the emotional intelligence of software agents by introducing and evaluating an emotion understanding framework for intelligent agents. The framework organizes the knowledge about emotions using episodic memory and semantic memory. Its episodic memory learns by storing specific details of emotional events experienced firsthand or observed. Its semantic memory is a lookup table of emotion-related facts combined with semantic graphs that learn through abstraction of additional relationships among emotions and actions from episodic memory. The framework is simulated in a multi-agent system in which agents attempt to elicit target emotions in other agents. They learn what events elicit emotions in other agents through interaction and observation. To evaluate the importance of different memory components, we run simulations with components "lesioned". We show that our framework outperformed Q-learning, a standard method for machine learning.
    Autonomous Agents and Multi-Agent Systems 28(1), January 2014. DOI: 10.1007/s10458-012-9214-9
  • ABSTRACT: The widespread adoption of personal service robots will likely depend on how well they interact with users. This chapter was motivated by a desire to facilitate the design of usable personal service robots. Toward that end, this chapter reviews the literature concerning people interacting with personal service robots. First, ongoing research related to the design of personal service robots is discussed. This material is organized around generic activities that would take place when a user initiates interaction with a future personal service robot, for example, understanding the robot's affordances or its cognitive capabilities, as well as when a personal service robot initiates interaction with a user, for example, understanding the user's intent or engaging and communicating with the user. Second, research areas that deserve more attention from the human-robot interaction community are discussed, for example, understanding when people do and do not treat robots as if they were people. Throughout the chapter, recommendations for the design of future personal service robots are offered along with recommendations for future research.
    Reviews of Human Factors and Ergonomics 7(1):100-148, September 2011. DOI: 10.1177/1557234X11410388
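The episodic/semantic split described in the emotion-understanding abstract above can be sketched minimally. This is a hypothetical illustration of that architecture, not the paper's implementation: the class, method names, and the most-frequent-emotion inference rule are all assumptions.

```python
from collections import Counter, defaultdict

class EmotionMemory:
    """Sketch of an episodic/semantic emotion memory (hypothetical design):
    episodic memory stores concrete (event, emotion) episodes; semantic
    memory abstracts them into event -> emotion-frequency facts."""

    def __init__(self):
        self.episodes = []                    # episodic: raw experiences, in order
        self.semantic = defaultdict(Counter)  # semantic: event -> emotion counts

    def observe(self, event, emotion):
        """Store an episode experienced firsthand or observed in another
        agent, and update the semantic abstraction from it."""
        self.episodes.append((event, emotion))
        self.semantic[event][emotion] += 1

    def infer_emotion(self, event):
        """Infer which emotion an event most likely elicits; None if the
        event has never been experienced or observed."""
        counts = self.semantic.get(event)
        if not counts:
            return None
        return counts.most_common(1)[0][0]
```

Under this sketch, "lesioning" a component as in the abstract's evaluation would amount to disabling `episodes` or `semantic` and measuring how inference degrades.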