GazeRoboard: Gaze-communicative guide system in daily life on stuffed-toy robot with interactive display board
ABSTRACT: In this paper, we propose a guide system for daily life in semipublic spaces by adopting a gaze-communicative stuffed-toy robot and a gaze-interactive display board. The system provides naturally anthropomorphic guidance through a) gaze-communicative behaviors of the stuffed-toy robot ("joint attention" and "eye-contact reactions") that virtually express its internal mind, b) voice guidance, and c) projection on the board corresponding to the user's gaze orientation. The user's gaze is estimated by our remote gaze-tracking method. The results from both subjective/objective evaluations and demonstration experiments in a semipublic space show i) the holistic operation of the system and ii) the inherent effectiveness of the gaze-communicative guide.
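The guidance logic the abstract describes can be sketched as a simple dispatch from the user's estimated gaze target to a robot behavior. This is a minimal illustrative sketch, not the paper's implementation; the function and label names are assumptions.

```python
def select_behavior(gaze_target: str) -> str:
    """Map the user's estimated gaze target to a robot behavior.

    Illustrative only: the targets ("robot", "board") and behavior labels
    are hypothetical names, not the system's actual API.
    """
    if gaze_target == "robot":
        # The user looks at the robot: return its gaze (eye-contact reaction).
        return "eye_contact"
    elif gaze_target == "board":
        # The user looks at the display board: look there too (joint
        # attention) and let the board projection follow the gaze region.
        return "joint_attention"
    else:
        # Gaze is elsewhere: fall back to voice guidance to draw attention.
        return "voice_guidance"
```

Keeping this mapping explicit makes it easy to pair each gaze state with the matching projection and voice output.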
Conference Proceeding: User impressions of a stuffed doll robot's facing direction in animation systems.
ABSTRACT: This paper investigates the effect on user impressions of the body direction of a stuffed doll robot in an animation system. Many systems that combine a computer display with a robot have been developed, and one of their applications is entertainment, for example, an animation system. In these systems, the robot, as a 3D agent, can be more effective than a 2D agent in helping the user enjoy the animation experience by using spatial characteristics, such as body direction, as a means of expression. The direction in which the robot faces, i.e., towards the human or towards the display, is investigated here. User impressions from 25 subjects were examined. The experiment results show that the robot facing the display together with a user is effective for eliciting good feelings from the user, regardless of the user's personality characteristics. Results also suggest that extroverted subjects tend to have a better feeling towards a robot facing the user than introverted ones.
Proceedings of the 9th International Conference on Multimodal Interfaces, ICMI 2007, Nagoya, Aichi, Japan, November 12-15, 2007; 01/2007
ABSTRACT: The aim of this paper is to clarify the relationship between …
Proceedings of the 8th International Conference on Multimodal Interfaces, ICMI 2006, Banff, Alberta, Canada, November 2-4, 2006; 01/2006
ABSTRACT: To build smart human interfaces, it is necessary for a system to know a user's intention and point of attention. Since a person's head pose and gaze direction are deeply related to his/her intention and attention, detecting such information can be utilized to build natural and intuitive interfaces. We describe our real-time stereo face tracking and gaze detection system, which measures head pose and gaze direction simultaneously. The key aspect of our system is the use of real-time stereo vision together with a simple algorithm suitable for real-time processing. Since the 3D coordinates of the features on a face can be directly measured in our system, we can significantly simplify the algorithm for 3D model fitting to obtain the full 3D pose of the head, compared with conventional systems that use a monocular camera. Consequently, we achieved a non-contact, passive, real-time, robust, accurate, and compact measurement system for head pose and gaze direction.
4th IEEE International Conference on Automatic Face and Gesture Recognition (FG 2000), 26-30 March 2000, Grenoble, France; 01/2000
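Once stereo vision gives the 3D coordinates of facial features directly, a gaze ray can be formed with plain vector arithmetic: normalize the vector from an eyeball-center estimate to the measured pupil position. The sketch below illustrates that step only; the function name and the assumption that an eyeball-center estimate is available are illustrative, not the paper's exact formulation.

```python
import math

def gaze_direction(eyeball_center, pupil):
    """Unit vector from an estimated eyeball center to the measured
    3D pupil position, both expressed in the stereo camera frame.

    Illustrative sketch: assumes the eyeball center has already been
    estimated from the fitted head pose.
    """
    # Vector from eyeball center toward the pupil.
    v = [p - c for p, c in zip(pupil, eyeball_center)]
    norm = math.sqrt(sum(x * x for x in v))
    if norm == 0.0:
        raise ValueError("pupil coincides with eyeball center")
    # Normalize to a unit direction vector.
    return [x / norm for x in v]
```

Because both points are measured (or derived) in 3D, no monocular depth estimation is needed, which is the simplification the abstract emphasizes.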