Conference Paper

A study of a retro-projected robotic face and its effectiveness for gaze reading by humans.

DOI: 10.1145/1734454.1734471
Conference: Proceedings of the 5th ACM/IEEE International Conference on Human Robot Interaction, HRI 2010, Osaka, Japan, March 2-5, 2010
Source: DBLP

ABSTRACT: Reading gaze direction is important in human-robot interaction because it supports, among other things, joint attention and non-linguistic interaction. While most previous work focuses on implementing gaze-direction reading on the robot, little is known about how well the human partner in a human-robot interaction can read gaze direction from a robot. The purpose of this paper is twofold: (1) to introduce a new technology for implementing robotic faces using retro-projected animated faces, and (2) to test how well this technology supports gaze reading by humans. We briefly describe the robot design and discuss parameters that influence the ability to read gaze direction. We present an experiment assessing users' ability to read gaze direction for a selection of different robotic face designs, using an actual human face as the baseline. The results indicate that it is hard to match human-human interaction performance: when the robot face is implemented as a semi-sphere, performance is worst, while robot faces with a human-like physiognomy and, perhaps surprisingly, video projected on a flat screen perform equally well, suggesting that these are good candidates for implementing joint attention in HRI.
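The abstract does not state how gaze-reading accuracy was measured, so the snippet below is only an illustrative sketch of one common way such performance can be quantified: the angular error between the gaze ray the robot intends and the ray toward the target the participant reports. The function name, coordinate frame and numbers are assumptions for illustration, not the paper's protocol.

```python
import numpy as np

def angular_error_deg(robot_eye_pos, intended_target, reported_target):
    """Angle (degrees) between the gaze ray the robot intends and the
    ray toward the target the human observer reports.

    All arguments are 3D points in the same (arbitrary) reference frame.
    This is an illustrative metric, not the measure used in the paper.
    """
    v_intended = np.asarray(intended_target, float) - np.asarray(robot_eye_pos, float)
    v_reported = np.asarray(reported_target, float) - np.asarray(robot_eye_pos, float)
    cos_theta = np.dot(v_intended, v_reported) / (
        np.linalg.norm(v_intended) * np.linalg.norm(v_reported)
    )
    return np.degrees(np.arccos(np.clip(cos_theta, -1.0, 1.0)))

# Hypothetical trial: the robot looks at a marker 10 cm to the left of the
# one the participant picks, both roughly 1 m in front of the robot's eyes.
print(angular_error_deg(robot_eye_pos=[0, 0, 0],
                        intended_target=[-0.05, 1.0, 0.0],
                        reported_target=[0.05, 1.0, 0.0]))
```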

Related publications

  • ABSTRACT: This paper presents an innovative motion system used to control the motions and animations of a social robot. The social robot Probo is used to study Human-Robot Interactions (HRI), with a special focus on Robot Assisted Therapy (RAT). When used for therapy, it is important that a social robot can create an "illusion of life" so as to become a believable character that can communicate with humans. The design of the motion system presented here is based on insights from the animation industry. It combines operator-controlled animations with low-level autonomous reactions such as attention and emotional state. The motion system has a Combination Engine, which merges motion commands triggered by a human operator with motions that originate from different units of the robot's cognitive control architecture (a blending sketch follows this list). This results in an interactive robot that seems alive and has a certain degree of "likeability". The Godspeed Questionnaire Series is used to evaluate the animacy and likeability of the robot in China, Romania and Belgium.
    International Journal of Advanced Robotic Systems 05/2014; 11(72).
  • ABSTRACT: PAPILLON is a technology for designing highly expressive animated eyes for interactive characters, robots and toys. Expressive eyes are essential in any form of face-to-face communication [2], and designing them has been a critical challenge in robotics as well as in interactive character and toy development.
    ACM SIGGRAPH 2013 Emerging Technologies; 07/2013
  • ABSTRACT: We present a technology for designing curved display surfaces that can both display information and sense two dimensions of human touch. It is based on 3D printed optics: the surface of the display is constructed as a bundle of printed light pipes that direct images from an arbitrary planar image source to the surface of the display. This effectively decouples the display surface from the image source, allowing the design of displays to be iterated without requiring changes to the complex electronics and optics of the device. In addition, the same optical elements direct light from the surface of the display back to the image sensor, allowing for touch input and proximity detection of a hand relative to the display surface. The resulting technology is effective for designing compact, efficient displays of small size and has been applied in the design of interactive animated eyes (a pipe-mapping sketch follows this list).
    Proceedings of the 26th annual ACM symposium on User interface software and technology; 10/2013
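The first related abstract describes a Combination Engine that merges operator-triggered animations with autonomous reactions, but does not give its algorithm. The sketch below only illustrates one plausible scheme under that description, a per-joint weighted blend of motion commands; the class, field names and weights are hypothetical and not Probo's actual implementation.

```python
from dataclasses import dataclass

@dataclass
class MotionCommand:
    """A target pose for one degree of freedom, with a blend weight.

    source: 'operator' for operator-triggered animations,
            'autonomous' for low-level reactions (attention, emotion).
    """
    joint: str
    target_angle: float  # degrees
    weight: float        # relative influence, 0..1
    source: str

def combine(commands):
    """Blend all commands per joint with a weighted average.

    Illustrative stand-in for a combination engine whose actual
    policy is not described in the abstract.
    """
    blended = {}
    for joint in {c.joint for c in commands}:
        cs = [c for c in commands if c.joint == joint]
        total = sum(c.weight for c in cs)
        blended[joint] = sum(c.weight * c.target_angle for c in cs) / total
    return blended

# Hypothetical tick: the operator plays a "nod" animation while the
# autonomous attention system pulls the head toward a sound source.
print(combine([
    MotionCommand("head_pitch", 20.0, 0.7, "operator"),
    MotionCommand("head_yaw",   15.0, 0.4, "autonomous"),
    MotionCommand("head_pitch",  0.0, 0.3, "autonomous"),
]))
```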
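The third related abstract describes printed light pipes that decouple a flat image source from a curved display surface and route reflected light back for touch sensing. The sketch below illustrates only that decoupling idea with a hypothetical calibration lookup table; the table, names and values are invented for illustration and are not the paper's implementation.

```python
# Hypothetical calibration table: each flat-panel pixel (x, y) is routed by a
# printed light pipe to a position on the curved eye surface, given here as
# (azimuth, elevation) in degrees. A real device would derive such a table
# from per-device calibration; these entries are made up for illustration.
PIPE_MAP = {
    (10, 10): (-30.0, 5.0),
    (11, 10): (-28.5, 5.0),
    (10, 11): (-30.0, 6.5),
    (64, 48): (0.0, 0.0),   # pipe ending at the centre of the eye surface
}

def surface_position(pixel):
    """Where on the curved surface a given source pixel appears."""
    return PIPE_MAP.get(pixel)

def touched_pixel(surface_pos, tolerance=1.0):
    """Inverse lookup: which source pixel (hence which pipe) sees light
    reflected from a touch near the given surface position."""
    for pixel, (az, el) in PIPE_MAP.items():
        if abs(az - surface_pos[0]) <= tolerance and abs(el - surface_pos[1]) <= tolerance:
            return pixel
    return None

# Drawing a pupil pixel on the flat panel lights up the mapped surface spot,
# and a touch near the centre of the eye is reported back as pixel (64, 48).
print(surface_position((64, 48)))
print(touched_pixel((0.2, -0.4)))
```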
