Identification and Production of “simple” Tactile Gestures
Peter Gibbons
Cognitive Robotics Research Centre
University of Wales
Newport, Wales
Torbjørn S. Dahl
Cognitive Robotics Research Centre
University of Wales
Newport, Wales
Owain Jones
Intelligent Robotics Group
University of Aberystwyth
Aberystwyth, Wales
Abstract
This paper describes a method for identifying and producing three
“simple” tactile gestures of “tap,” “poke” and “push”. The gesture
data is collected from a large sample size of over 200 people using
a small humanoid robot fitted with 648 capacitive sensors. A
temporal pressure profile is produced for each tactile gesture and
reproduced by the robot. The profiles show a measurable
difference in both the pressure applied and the duration of the
applied pressure between each tactile gesture. The human profile
has a much greater variation in both time and pressure compared
with the gestures reproduced by the robot.
1. Introduction
Tactile interaction between humans is one of the most meaningful
forms of non-verbal communication [1] and central to human
social life [2]. Touch can be used to convey a variety of emotions
from anger, fear, sadness and disgust to love, happiness, gratitude
and sympathy [3]. It is also essential for our development from
before we are born through infancy and into adulthood [4, 5].
Whilst considered one of the most powerful senses [6], touch has
received relatively little research attention compared with other
senses, although more recently it has attracted interest from the
disciplines of cognitive and social psychology and neuroscience [7]. Across
different cultures, ages and genders tactile behavior and gestures
have many differences as well as many similarities [8-10]. An
obvious example is how people greet each other, for example by
shaking hands, hugging, touching cheeks, rubbing noses, patting
each other on the back or kissing the back of the hand. Touch also
plays an important role in the areas of object detection [11], and
providing awareness about our own bodies [12].
There is an increasing interest in tactile Human-Robot
Interaction (HRI) within humanoid robotics [13]. This is mainly driven by the
technological advances which are enabling increasingly
sophisticated tactile sensing skins to be developed [14, 15]. The
resolution of skin sensors has reached a point where patterns of
touch can now be captured and evaluated [15-18]. Other sensor
capabilities, such as pressure, temperature, vibration, joint
position and movement, are also being incorporated into robot skins.
The aim of our research is to produce simple tactile gestures using
a small humanoid Nao robot. In terms of simple tactile gestures
we refer to “simple” meaning “passive” and “static” sensing on
the skin [19]. This is also classified as “tactile perception” where
only cutaneous information is used [20]. The “static” refers to the
location of the tactile gesture as being fixed. Morrison et al. [21]
describe a ‘‘simple’’ touch as a touch that ‘involves brief,
intentional contact to a relatively restricted location on the body
surface of the receiver during a social interaction’. Our
experiment focused on the simple tactile gestures “tap,” “poke”
and “push” investigating the physical temporal and spatial
patterns of the tactile pressure.
The tactile gestures chosen are commonly used and widely
accepted gestures. Iwata and Sugano [22] selected 10 verbs ‘that
clearly represent psychological and physical aspects of Physical
InterFerence and intended contACT (PIFACT) states such as
“stroke,” “seize,” and “nudge”’. Naya et al. [23] investigated the
touch behaviors of “slap,” “pat,” “scratch,” “stroke” and “tickle”.
Whilst Chang et al. [24] investigated the gestures of “stroke,”
“slap,” “poke” and “pat” for a study into emotional touch. In
another study Stiehl and Breazeal [25] use a huggable bear to
classify nine different touch interactions that convey affective or
social content. These were “tickle,” “poke,” “scratch,” “slap,”
“pet,” “pat,” “rub,” “squeeze” and “contact”. Finally, Taichi et al.
[26] categorized the haptic interactions of "slap," "pat," "poke,"
"push," "grasp," "stroke," "touch," "skim," "shake" and "hold" at
different locations on a robot's body.
Figure 1. Experimental setup for human tactile gesture
recognition using a Nao robot fitted with skin sensors and
a GUI showing the activation of the sensors when touched.
2. Experimental Setup
2.1 Robot Skin
The robot skin has been specifically manufactured for the Nao
robot as part of the European RoboSkin project [27]. It is made up
of 6 patches each containing an array of 9 equilateral triangles
30mm long. Within each triangle there are 12 equally spaced 4mm
diameter capacitive pressure sensors, called taxels, giving a total
of 648 skin sensors. Each forearm has two skin patches
mirrored through the centre of the forearm and each upper arm
has a single skin patch anatomically located over the bicep
muscle. The sensors are covered by a 3mm layer of silicone foam
which deforms to increase the capacitance between the conductor
(human hand or metal object) and the sensor. The taxels are read
at a frequency of 50Hz giving a decreasing 8 bit reading as the
applied force increases. Figure 1 shows the Nao robot fitted with
the robot skin patches and a two-dimensional GUI representation of
the taxel layout of the right upper arm. The brighter area shows
the response of the taxels when touched.
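As a concrete illustration of the sensor behaviour described above, the raw 8-bit taxel values can be converted into activations that grow with applied force. This is a minimal sketch and not the RoboSkin driver; the function name, baseline value and sample data are assumptions.

```python
def taxel_activation(raw_value, baseline=255):
    """Convert a raw 8-bit taxel reading (which decreases as force
    increases) into an activation, where 0 means no contact and
    larger values mean more applied force."""
    return max(0, baseline - raw_value)

# One hypothetical 50Hz sample of five taxels: two untouched, three pressed.
frame = [255, 250, 180, 90, 255]
activations = [taxel_activation(v) for v in frame]
print(activations)  # -> [0, 5, 75, 165, 0]
```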
2.2 Tactile Gesture Data Collection
The aim of the research is to produce simple tactile gestures using
the Nao robot. This requires knowing the temporal and pressure
profile for each gesture. The temporal information is the pressure
applied over time rather than a measure of frequency, which is
sometimes associated with tactile perception [28]. The three
simple tactile gestures, “tap,” “poke” and “push” were
investigated using 202 participants from two local schools. The
pupils were aged between 12 and 16 years old, with a 55% male to
45% female split. Each participant was asked in turn to “tap,”
“poke” and “push” the robot on any part of the skin. The data was
collected from two seconds prior to the start of contact until
approximately two seconds after the contact had finished. Each
experiment lasted less than 10 seconds with an actual contact time
of less than a second. Taxel values from all six skin patches were
sampled at a rate of 50Hz.
2.3 Tactile Gesture Production
The tactile gestures were created using two Nao robots. One robot
produced the gestures and the second robot recorded the gesture.
The data was collected and analyzed using the same method as for
the human data. The gestures were produced perpendicular to the
skin patch on the upper right arm of the robot. The profile of the
tactile gesture was controlled by varying the duration over which
the initial pressure was applied to the skin. This allowed the
impulse of the tactile gesture to be controlled. The next phase of
the profile was recreated by varying the duration of constant
pressure before finally varying the duration over which the
pressure was released.
3. Data Analysis
The taxels were calibrated over a two second window prior to the
onset of touch by adding the difference between the average
individual taxel value and the overall average skin value for all
taxels. The calibration removed any static differences between
individual taxels values but did not remove small fluctuations due
to noise. This was removed by applying a threshold of 12% to all
taxel values which was determined by prior empirical evaluation.
A minimum contact area threshold, comparable in size to a child's
fingertip, was also applied to remove any noise from a single taxel
that was above the minimum pressure threshold. Throughout the
experiments the data did not experience any thermal drift [27]
although after every few experiments the hardware had to be reset
due to an increase in general noise across specific skin areas.
Within the contact areas, data outside the first and last frames of
contact was removed, along with multiple consecutive touches.
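The calibration and noise-filtering steps above can be sketched as follows. The 12% threshold and the fingertip-sized minimum contact area come from the text; the array shapes, function names and the taxel count standing in for the contact area are assumptions.

```python
import numpy as np

def calibrate(baseline_frames):
    """baseline_frames: (n_frames, n_taxels) activations from the two
    second window prior to touch. Returns the per-taxel offset that
    removes static differences between individual taxels."""
    per_taxel = baseline_frames.mean(axis=0)   # average of each taxel
    overall = per_taxel.mean()                 # overall average skin value
    return overall - per_taxel                 # offset added to each taxel

def denoise(frame, max_value=255, threshold=0.12, min_contact_taxels=3):
    """Zero sub-threshold readings, then discard contacts smaller than
    a minimum area (a taxel count standing in for a child's fingertip)."""
    frame = np.where(frame > threshold * max_value, frame, 0.0)
    if np.count_nonzero(frame) < min_contact_taxels:
        return np.zeros_like(frame)            # isolated noisy taxel
    return frame

offsets = calibrate(np.full((100, 12), 10.0))  # a quiet 2s window at 50Hz
touched = denoise(np.array([0, 5, 40, 50, 60, 0, 0, 0, 0, 0, 0, 0], float))
```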
Each taxel has a corresponding two-dimensional Cartesian location
on its skin patch. The taxel coordinates and the taxel values were
interpolated onto a regular grid with a resolution of 1mm spacing
for each anatomical skin area, as shown in Figure 2. This enabled
the maximum pressure value and location to be accurately
calculated. A profile for each tactile gesture from every
participant was produced by plotting the maximum pressure value
for each frame in the contact period. The profile was then
normalized and an overall gesture profile produced using the
average maximum pressure and average duration for all participants.
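The interpolation and peak-extraction step can be illustrated with a small bilinear example. This is only a sketch: the real taxel layout is triangular, and here four corner taxels on a 10mm square stand in for a patch, with a 1mm grid as in the text.

```python
import numpy as np

def bilinear_grid(corners, size_mm=10, step_mm=1):
    """Interpolate four corner taxel values onto a regular grid with
    step_mm spacing, as a stand-in for interpolating the scattered
    taxel coordinates onto the 1mm grid described above."""
    t = np.arange(0, size_mm + step_mm, step_mm) / size_mm
    (v00, v10), (v01, v11) = corners
    # Outer products blend the four corner values across the grid.
    return (np.outer(1 - t, 1 - t) * v00 + np.outer(1 - t, t) * v10 +
            np.outer(t, 1 - t) * v01 + np.outer(t, t) * v11)

grid = bilinear_grid([[0.0, 20.0], [20.0, 80.0]])
peak = grid.max()                                  # max pressure this frame
iy, ix = np.unravel_index(grid.argmax(), grid.shape)
print(peak, (ix, iy))  # -> 80.0 (10, 10)
```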
4. Results
Figure 3 gives a summary of the average maximum pressure
(taxel value) and average duration (ms) for each of the tactile
gestures “tap,” “poke” and “push” produced by the human and
reproduced by the robot. At present the robot skin is still a
prototype and as such there is no exact correlation between the
taxel values and the actual force applied. Therefore the pressure
values are quoted as the sensor values.

Figure 2. Graphical representation of a “poke” tactile gesture
on the right arm skin patch with the noise threshold applied
and the maximum force highlighted.

Figure 3. Average and standard deviation for the “tap,” “poke”
and “push” tactile gestures: average maximum pressure (top)
and average duration (bottom).

The most noticeable
difference between human tactile gestures and the robot
reproduction of the tactile gestures was the much smaller standard
deviations for the robot data. Since a robot is generally much
better than a human at both accuracy and repeatability this is not a
surprising result. An independent two-sample t-test, assuming
unequal variances and sample sizes, was performed on the average
maximum pressure and the average duration. The result was a
significant difference at the 99.9% level. This shows that each tactile gesture has its own unique
set of parameter values.
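The significance test reported above can be approximated with a Welch-style two-sample t statistic (unequal variances). This sketch uses invented duration samples, not the study's data, and computes only the statistic itself.

```python
import math
import statistics

def welch_t(a, b):
    """Two-sample t statistic without assuming equal variances."""
    mean_a, mean_b = statistics.fmean(a), statistics.fmean(b)
    var_a, var_b = statistics.variance(a), statistics.variance(b)
    return (mean_a - mean_b) / math.sqrt(var_a / len(a) + var_b / len(b))

# Invented per-participant durations (ms) for two gestures.
tap_durations = [120, 140, 130, 125, 150, 135]
poke_durations = [280, 300, 290, 270, 295, 285]

t = welch_t(tap_durations, poke_durations)
# A large |t| indicates the two gestures differ significantly in duration.
```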
A profile of the data collected for each tactile gesture for both
human and robot is shown in Figure 4. The data is displayed as an
individual profile for each participant and robot trial. The profiles
graphically illustrate the large difference in the standard deviation
between the human data and the robot data for all the gestures.
The profiles do not appear smooth, which suggests a problem with
the sampling rate: whilst the sampling rate was monitored at 50Hz,
the data readings suggest that the actual rate was lower. For the
“poke” tactile gesture the robot data shows occasional pressure
values exceeding the average maximum value.
This was due to small movements in the robot arm shifting the
contact point directly over a single taxel rather than across or
between multiple taxels. The profiles show there is a noticeable
difference in both the applied pressure and the duration for which
the pressure is applied between the different tactile gestures. This is
clearly visible in Figure 5 which shows the overall profile for each
of the tactile gestures. The data has been normalized then
multiplied by the average maximum pressure and average
duration before fitting a Loess locally weighted curve to the data.
The “simple” tactile gestures can be identified by the pressure
applied over time. A “tap” has a relatively light average pressure
(value of 55) applied over a short average duration of 134ms. The
“poke” gesture has a strong average pressure (value of 95) and
lasts for an average duration of 288ms. The “push” gesture has a
medium average pressure (value of 76) and an average duration of
491ms. The values of pressure over time provide the necessary
data required to reproduce a tactile gesture using the robot.
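Using the average pressures and durations above as prototypes, a gesture could be classified by nearest prototype. This is a hypothetical sketch; the axis scaling and distance metric are assumptions, not part of the study.

```python
# Prototype (average max pressure, average duration ms) from the text.
PROTOTYPES = {
    "tap": (55, 134),
    "poke": (95, 288),
    "push": (76, 491),
}

def classify(pressure, duration_ms):
    """Return the gesture whose prototype is nearest, after scaling
    each axis so neither pressure nor duration dominates."""
    def dist(proto):
        p, d = proto
        return ((pressure - p) / 100.0) ** 2 + ((duration_ms - d) / 500.0) ** 2
    return min(PROTOTYPES, key=lambda g: dist(PROTOTYPES[g]))

print(classify(60, 150))   # -> tap
print(classify(90, 300))   # -> poke
print(classify(70, 480))   # -> push
```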
The tactile gestures reproduced by the robot all roughly follow
the same profiles as those produced from the human data. The main difference is
the flatness of the profile once the pressure has been applied. This
reflects the simple method implemented on the robot to recreate
the profiles.
5. Conclusions and Future Work
From a sample size of 202 participants three “simple” tactile
gestures of “tap,” “poke” and “push” were categorized using the
parameters of pressure and time. Each gesture varied significantly
from one another in both the amount of pressure applied and
duration of applied pressure. A “tap” tactile gesture lasted for
approximately 0.125 seconds, a “poke” for 0.25 seconds and a
“push” for 0.5 seconds. The robot skin does not currently have an
exact relationship between the sensor reading and the applied
force so an exact measurement of the applied pressure for each
gesture can only be given in terms of the difference in the sensor
readings. Using a temporal profile of the pressure applied, each
tactile gesture was reproduced on the robot. This study has
shown that “simple” human tactile gestures can be categorized by
the pressure applied over time, which in turn enables “simple”
tactile gestures to be recreated by a robot.
Figure 4. Profiles of the maximum pressure for each contact
frame for all participants and robot trials for the tactile
gestures; “tap,” (top) “poke” (middle) and “push” (bottom).
Figure 5. Profile of average pressure applied over time for
the tactile gestures of “tap,” “poke” and “push” produced
by a human (top) and robot (bottom).
A natural extension of this work would be to incorporate more
complex tactile gestures such as a “stroke,” a “touch,” a “pinch”
or a “grab”. A method such as blob detection could be used to
identify these complex tactile gestures. Another useful addition to
this work would be to evaluate whether robot-produced gestures
can be recognized by humans. To reproduce more complex
tactile gestures would require a robot with greater dexterity. The
RoboSkin project has designed and manufactured skin sensors for
the arms, hands and fingers of the iCub humanoid platform [27].
Having the sensors on the finger tips and palm of the robot will
provide an ideal platform not only for producing more complex
tactile gestures but also for producing them on a variety of surfaces.
6. Acknowledgments
The research leading to these results has received funding from
the European Commission's Seventh Framework Programme
(FP7/2007-2013) under grant agreement ROBOSKIN ICTFP7-
7. References
[1] Barnett, K. E. 1972. A theoretical construct of the concepts
of touch as they relate to nursing. Nursing Research, 21, 2
(March 1972), 102-109.
[2] Hertenstein, M. J. Keltner, D., App, B., Bulleit, B. A., and
Jaskolka, A. R. 2006. Touch communicates distinct
emotions. Emotion. 6, 3 (Aug. 2006), 528-533.
[3] Hertenstein, M. J., Holmes, R., McCullough, M., and
Keltner, D. 2009. The communication of emotion via touch.
Emotion. 9, 4 (Aug. 2009), 566-573.
[4] Hertenstein, M. J. 2002. Touch: Its communicative functions
in infancy. Human Development, 45, 2 (March 2002), 70-94.
[5] Montagu, A. 1986 Touching: The Human Significance of the
Skin. Harper & Row, NY, 1986.
[6] Field, T. 2001. Touch. MIT Press, Cambridge, MA.
[7] Gallace, A. and Spence, C. 2010. The science of
interpersonal touch: An overview. Neuroscience &
Biobehavioral Reviews. 34, 2 (Feb. 2010), 246-259.
[8] Hall, J. 2011. Gender and status patterns in social touch. In
The Handbook of Touch: Neuroscience, Behavioral, and
Health Perspectives, M. Hertenstein and S. Weiss, Eds.
Springer, NY, 329-360.
[9] McDaniel, E. and Andersen, P. A. 1998. International
patterns of interpersonal tactile communication: A field
study. J. of Nonverbal Behavior. 22, 1 (March 1998), 59-75.
[10] Anderson, P. A. 2011. Tactile traditions: Cultural differences
and similarities in haptic communication. In The Handbook
of Touch: Neuroscience, Behavioral, and Health
Perspectives, M. Hertenstein and S. Weiss, Eds. Springer,
NY, 351-369.
[11] Raju, B. I. and Srinivasan, M. A. 1999. Encoding and
decoding of shape in tactile sensing. Touch Lab, MIT,
Cambridge, Technical Report, RLE TR-630, Sept. 1999.
[12] Serino, A. and Haggard, P. 2010. Touch and the body.
Neuroscience and Biobehavioral Reviews, 34, 2 (Feb. 2010).
human-robot interactions. Robotics and Autonomous
Systems, 58, 10 (Oct. 2010), 1159-1176.
[14] Lee, M. H. 2000. Tactile sensing: New directions, new
challenges. The Int. J. of Robotics Research, 19, 7 (July
2000), 636-643.
[15] Dahiya, R. S., Metta, G., Valle, M., and Sandini, G. 2010.
Tactile sensing—From humans to humanoids. IEEE Tran. on
Robotics, 26, 1 (Feb. 2010), 1-20.
[16] Ohmura, Y. and Kuniyoshi, Y. 2007. Humanoid robot which
can lift a 30kg box by whole body contact and tactile
feedback. In Proc. IEEE/RSJ Int. Conf. on Intelligent Robots
and System (San Diego, USA, Oct. 29-Nov. 2, 2007). IROS
2007. IEEE, 1136-1141.
[17] Wada, K. and Shibata, T. 2007. Social effects of robot
therapy in a care house - change of social network of the
residents for two months. In Proc. of the IEEE Int. Conf. on
Robotics and Automation (Roma, Italy, April 10-14, 2007).
ICRA 2007. IEEE, 1250-1255.
[18] Stiehl, W. D., Lieberman, J., Breazeal, C., Basel, L., Lalla,
L., and Wolf, M. 2005. Design of a therapeutic robotic
companion for relational, affective touch. In IEEE Int.
Workshop on Robot and Human Interactive Communication,
(Nashville, TN 37203 USA, August 13-15, 2005). ROMAN
2005. IEEE, 408-415.
[19] Jones, L. A. and Lederman, S. J. 2006. Tactile sensing. In
Human Hand Function. Oxford University Press, NY, 44-74.
[20] Loomis, J. M. and Lederman, S. J. 1986. Tactual perception.
In Handbook of Perception and Human Performance Vol. II,
K. Boff, L. Kaufman, and J. Thomas, Eds. John Wiley and
Sons, NY, chapter 31.
[21] Morrison, I., Löken, L. S., and Olausson, H. 2010. The skin
as a social organ. Experimental Brain Research. 204, 3 (July
2010), 305-314.
[22] Iwata, H. and Sugano, S. 2005. Human-robot-contact-state
identification based on tactile recognition. IEEE Trans. on
Industrial Electronics. 52, 6 (Dec. 2005), 1468-1477.
[23] Naya, F., Yamato, J., and Shinozawa, K. 1999. Recognizing
human touching behaviors using a haptic interface for a pet-
robot. In Proc. IEEE Int. Conf. on Systems, Man, and
Cybernetics (Tokyo, Japan, October 12-15, 1999). SMC
1999. IEEE, 2, 1030-1034.
[24] Chang, J., MacLean, K., and Yohanan, S. 2010. Gesture
recognition in the haptic creature. In Proc. Int. Conf. on
Haptics: Generating and perceiving tangible sensations,
Part I, (Amsterdam, Netherlands, July 8-10, 2010).
EuroHaptics 2010. Springer, Lecture Notes in Computer
Science 6191, 385-391.
[25] Stiehl, W. D. and Breazeal, C. 2005. Affective Touch for
Robotic Companions. In Proc. Int. Conf. on Affective
Computing and Intelligent Interaction, (Beijing, China,
October 22-24, 2005). ACII 2005. Springer, 747-754.
[26] Taichi, T., Takahiro, M., Hiroshi, I., and Norihiro, H. 2006.
Automatic categorization of haptic interactions - What are
the typical haptic interactions between a human and a robot?
In Proc. 6th IEEE-RAS Int. Conf. on Humanoid Robots,
(Genova, Italy, Dec. 5-6, 2006), HUMANOIDS 2006. IEEE.
[27] Schmitz, A., Maiolino, P., Maggiali, M., Natale, L., Cannata,
G., and Metta, G. 2011. Methods and Technologies for the
Implementation of Large-Scale Robot Tactile Sensors. IEEE
Trans. on Robotics, 27, 3, (June 2011), 389-400.
[28] Greenspan, J. D. and Bolanowski, S. J. 1996. The
psychophysics of tactile perception and its peripheral
physiological basis. In Pain and Touch, L. Kruger, Ed.
Academic Press, San Diego, CA, 25-103.