Conference Paper

GazeRoboard: Gaze-communicative Guide System in Daily Life on Stuffed-toy Robot with Interactive Display Board

ATR Intelligent Robotics & Communication Laboratories, Kyoto, Japan
Conference: 2008 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS 2008)
DOI: 10.1109/IROS.2008.4650692
Source: IEEE Xplore


In this paper, we propose a guide system for daily life in semipublic spaces by adopting a gaze-communicative stuffed-toy robot and a gaze-interactive display board. The system provides naturally anthropomorphic guidance through a) gaze-communicative behaviors of the stuffed-toy robot ("joint attention" and "eye-contact reactions") that virtually express its internal mind, b) voice guidance, and c) projection on the board corresponding to the user's gaze orientation. The user's gaze is estimated by our remote gaze-tracking method. The results from both subjective/objective evaluations and demonstration experiments in a semipublic space show i) the holistic operation of the system and ii) the inherent effectiveness of the gaze-communicative guide.
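As a rough illustration of the behavior selection described in this abstract, the following Python fragment maps an estimated gaze target to the robot's gaze-communicative behaviors, the board projection, and the voice guidance. It is a minimal sketch, not the authors' implementation: the names (GazeTarget, GazeEstimate, the robot/board/speech methods) are assumed for illustration only.

    # Minimal sketch (assumed names, not the paper's implementation) of
    # mapping the estimated gaze target to guide behaviors.
    from dataclasses import dataclass
    from enum import Enum, auto
    from typing import Optional


    class GazeTarget(Enum):
        ROBOT = auto()       # the user is looking at the stuffed-toy robot
        BOARD_ITEM = auto()  # the user is looking at an item on the display board
        ELSEWHERE = auto()   # the gaze is off both the robot and the board


    @dataclass
    class GazeEstimate:
        target: GazeTarget
        item_id: Optional[str] = None  # which board item, if any


    def select_guidance(gaze: GazeEstimate, robot, board, speech) -> None:
        """Choose guide behaviors from the remotely estimated gaze."""
        if gaze.target is GazeTarget.ROBOT:
            # Eye-contact reaction: the robot returns the user's gaze.
            robot.look_at_user()
            robot.nod()
        elif gaze.target is GazeTarget.BOARD_ITEM:
            # Joint attention: the robot turns toward the gazed-at item,
            # the projection highlights it, and voice guidance plays.
            robot.look_at(board.position_of(gaze.item_id))
            board.highlight(gaze.item_id)
            speech.play(board.description_of(gaze.item_id))
        else:
            # No shared focus: an idle motion invites attention back.
            robot.idle_motion()

In the actual system, the gaze target would come from the paper's remote gaze-tracking method, and the selected behaviors would drive the stuffed-toy robot and the projection on the board.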

  • ABSTRACT: This paper proposes a videophone conversation support system that uses the behaviors of a companion robot and the switching of camera images in coordination with the user's conversational attitude toward the communication. To maintain a conversation and achieve comfortable communication, the system must understand the user's conversational states, namely whether the user is talking (taking the initiative) and whether the user is concentrating on the conversation. First, a) the system estimates the user's conversational state with a machine learning method. Next, b-1) the robot appropriately expresses active listening behaviors, such as nodding and gaze turns, to compensate for the listener's attitude when she/he is not really listening to the other user's speech, b-2) the robot shows communication-evoking behaviors (topic provision) to compensate for the lack of a topic, and b-3) the system switches the camera images to create an illusion of eye contact corresponding to the current context of the user's attitude. Empirical studies, a detailed experiment, and a demonstration experiment show that i) both the robot's active listening behaviors and the switching of the camera image compensate for the other person's attitude, ii) the topic provision function is effective for awkward silences, and iii) elderly people prefer long intervals between the robot's behaviors. (A rough code sketch of this state-to-behavior mapping appears after this list.)
  • ABSTRACT: In this paper, we propose and evaluate a video communication system that compensates for users' uncongenial attitudes by coordinating the robot's behaviors and the media control of the video. The system facilitates comfortable video communication between elderly or disabled people through an assistant robot for each user that expresses (a) active listening behaviors to compensate for the listener's attitude when he/she is not really listening to the other user's talking, and (b) a cover-up behavior (gaze turned to the user) to divert attention from the other user's uncongenial attitude when that person is not looking at the talking user but toward the robot at her/his side; this behavior is coordinated with the automatic switching of cameras to give the impression that that person is still looking at the user. The results of the system evaluation show the significant effectiveness of this design approach, which uses the robot's behavior and media control of the video to compensate for the problems in video communication that we aimed to overcome.
    2010 IEEE/RSJ International Conference on Intelligent Robots and Systems, October 18-22, 2010, Taipei, Taiwan; 01/2010
  • ABSTRACT: Context data are updated frequently owing to dynamic changes in sensor values and in the situations of application entities. Without proper management, the stored contexts diverge from those of the real world. Such invalid contexts cause context inconsistency problems and should therefore be eliminated at the right time and in an appropriate manner. In this paper, we propose a context inconsistency management scheme based on context elimination rules that describe the semantics of context invalidity to solve context inconsistency problems. The proposed rule-based scheme enables users to easily specify elimination conditions for inconsistent contexts. Our performance evaluation shows that the rule-processing overhead is compensated for by the well-maintained repository of stored contexts. (A rough sketch of such elimination rules appears below.)
    IEEE/IFIP 8th International Conference on Embedded and Ubiquitous Computing, EUC 2010, Hong Kong, China, 11-13 December 2010; 01/2010
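The two video-communication abstracts above share one pattern: estimate the user's conversational state, then choose a compensating robot behavior or camera switch. The Python sketch below only illustrates that pattern under assumed names (ConversationalState, robot.active_listening, camera_switcher.keep_eye_contact); the papers themselves use a learned classifier, for which a trivial threshold rule stands in here.

    # Illustrative only: a stand-in for the learned conversational-state
    # classifier, followed by a state-to-behavior mapping.
    from dataclasses import dataclass


    @dataclass
    class ConversationalState:
        talking: bool        # is the user taking the initiative?
        concentrating: bool  # is the user attending to the conversation?


    def estimate_state(features: dict) -> ConversationalState:
        # The papers estimate this with machine learning; simple thresholds
        # on assumed features stand in for the classifier here.
        return ConversationalState(
            talking=features.get("speech_power", 0.0) > 0.5,
            concentrating=features.get("gaze_on_screen_ratio", 0.0) > 0.6,
        )


    def support_behavior(user: ConversationalState,
                         partner: ConversationalState,
                         robot, camera_switcher) -> None:
        if partner.talking and not user.concentrating:
            # Compensate for an inattentive listener: active listening by the
            # robot plus a camera switch that preserves apparent eye contact.
            robot.active_listening()
            camera_switcher.keep_eye_contact()
        elif not user.talking and not partner.talking:
            # Awkward silence: the robot provides a topic.
            robot.provide_topic()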
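For the context-elimination abstract above, the following sketch shows one way such elimination rules could be expressed: each rule is a predicate that marks a stored context as invalid, and the repository drops matching entries whenever it is updated. All names (Context, ContextRepository, stale_after_60s) are assumptions for illustration, not the authors' scheme.

    # Illustration only (assumed names, not the authors' scheme): elimination
    # rules flag invalid contexts, and the repository removes them on update.
    import time
    from dataclasses import dataclass, field
    from typing import Callable, Dict, List, Tuple


    @dataclass
    class Context:
        entity: str      # e.g. "room_101"
        attribute: str   # e.g. "temperature"
        value: object
        timestamp: float = field(default_factory=time.time)


    # A rule returns True when a stored context is no longer valid.
    EliminationRule = Callable[[Context], bool]


    class ContextRepository:
        def __init__(self, rules: List[EliminationRule]) -> None:
            self.rules = rules
            self.store: Dict[Tuple[str, str], Context] = {}

        def update(self, ctx: Context) -> None:
            # A newer value for the same (entity, attribute) replaces the old one.
            self.store[(ctx.entity, ctx.attribute)] = ctx
            self.eliminate_invalid()

        def eliminate_invalid(self) -> None:
            # Drop every stored context that any rule marks as invalid.
            for key, ctx in list(self.store.items()):
                if any(rule(ctx) for rule in self.rules):
                    del self.store[key]


    # Example rule: sensor readings older than 60 seconds are considered stale.
    def stale_after_60s(ctx: Context) -> bool:
        return time.time() - ctx.timestamp > 60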