K. Imanishi

Waseda University, Tōkyō, Japan


Publications (4) · Total impact: 0

  • Source
    M. Zecca · N. Endo · K. Itoh · K. Imanishi · M. Saito · N. Nanba · H. Takanobu · A. Takanishi
    ABSTRACT: Personal robots and Robot Technology (RT)-based assistive devices are expected to play a major role in Japan's elderly-dominated society, both in joint activities with their human partners and in community life. These new devices should be capable of smooth and natural adaptation to and interaction with their human partners and the environment, should be able to communicate naturally with humans, and should never have a negative effect, either physical or emotional, on their human partners. To achieve this smooth and natural integration between humans and robots, we first need to investigate and clarify how these interactions are carried out. We therefore developed the portable Bioinstrumentation System WB-1R (Waseda Bioinstrumentation system No.1 Refined), which can measure the movements of the head, the arms, and the hands (position, velocity, and acceleration), as well as several physiological parameters (electrocardiogram, respiration, perspiration, pulse wave, and so on), in order to objectively measure and understand the physical and physiological effects of the interaction between robots and humans. In this paper we present the development of the head and hand motion-capture systems as additional modules for the Waseda Bioinstrumentation system No.1 (WB-1). Given the low cost of the systems, the preliminary experimental results are adequate for our purposes.
    Full-text · Conference Paper · Oct 2007
  • K. Itoh · H. Miwa · Y. Onishi · K. Imanishi · K. Hayashi · A. Takanishi
    ABSTRACT: Personal robots, which are expected to become popular in the future, will need to take part in joint work and community life with humans. We have therefore been developing new mechanisms and functions for a humanoid robot able to communicate with humans in a human-like manner. During communication, humans distinguish individuals by facial features such as the silhouette and the position and shape of each facial part, and they gauge mental state from facial expressions. Many researchers have developed robots that can express emotions through facial expressions, but a robot that can change its individual facial features has not yet been studied. We consider it important for a personal robot to express not only its emotions but also faces that appeal to a partner. We therefore developed the face robot WD-1 (Waseda-Docomo face robot No.1), which expresses various faces by changing facial feature points. In this study, a mask of the average face was made in order to express complicated shapes with a small number of control points, and the position, number, and movable range of the feature points were optimized. In addition, we developed a 3-DOF drive unit that controls each feature point not at a single point but over a plane.
    No preview · Conference Paper · Jan 2006
  • Source
    ABSTRACT: In a society that is aging year by year, Robot Technology (RT) is expected to play an important role. To achieve this, the new generation of personal robots should be capable of natural communication with humans by expressing human-like emotion. In this sense, the hands play a fundamental role in communication, because they combine grasping, sensing, and emotional-expression abilities. This paper presents recent results of the collaboration between the Takanishi Lab of Waseda University, Tokyo, Japan, the Arts Lab of Scuola Superiore Sant'Anna, Pisa, Italy, and RoboCasa on a biologically inspired approach to the development of a new humanoid hand. In particular, the grasping and gestural capabilities of the novel anthropomorphic hand for humanoid robotics, RCH-1 (RoboCasa Hand No.1), are presented.
    Full-text · Article · Jan 2006
  • ABSTRACT: There are individual differences in facial features: the silhouette and the position and shape of each part (eyes, eyebrows, nose, mouth, and ears). Humans recognize individuality from facial features, and they express their mental states through facial expressions. Researchers have developed a robot that dynamically forms a personality based on its experience, and a robot that expresses emotion. We consider that a robot can communicate with a human more naturally when it not only expresses its emotion with its face but also has distinctive facial features. However, robots that express facial features by changing feature points have not been developed. We therefore developed a face robot that expresses its facial features, and we propose both a mechanical model and an optical model to express the feature points more clearly. As the mechanical model we introduced a push-pull method using piano wires, and as the optical model a projection method using a projector. In the mechanical model, the face is covered with a mesh of piano wires, and radio-control servomotors are used as actuators to control the facial feature points; we change the shape of the face by pushing and pulling feature points on the mesh with the actuators. In the optical model, we project images of several persons onto the mechanical model. We describe both the mechanical and the optical model of the face robot.
    No preview · Conference Paper · Oct 2004