Identification and Production of “simple” Tactile Gestures
Peter Gibbons
Cognitive Robotics Research Centre
University of Wales
Newport, Wales
Peter.Gibbons@newport.ac.uk
Torbjørn S. Dahl
Cognitive Robotics Research Centre
University of Wales
Newport, Wales
Torbjorn.Dahl@newport.ac.uk
Owain Jones
Intelligent Robotics Group
University of Aberystwyth
Aberystwyth, Wales
odj@aber.ac.uk
ABSTRACT
This paper describes a method for identifying and producing three
“simple” tactile gestures of “tap,” “poke” and “push”. The gesture
data is collected from a large sample size of over 200 people using
a small humanoid robot fitted with 648 capacitive sensors. A
temporal pressure profile is produced for each tactile gesture and
reproduced by the robot. The profiles show a measurable difference between the tactile gestures in both the pressure applied and the duration of the applied pressure. The human profiles show much greater variation in both time and pressure than the gestures reproduced by the robot.
1. INTRODUCTION
Tactile interaction between humans is one of the most meaningful forms of non-verbal communication [1] and is central to human social life [2]. Touch can be used to convey a variety of emotions, from anger, fear, sadness and disgust to love, happiness, gratitude and sympathy [3]. It is also essential for our development from before we are born, through infancy and into adulthood [4, 5]. Whilst considered one of the most powerful senses [6], touch has received relatively little research attention compared with the other senses, although more recently it has attracted interest from the disciplines of cognitive and social psychology and neuroscience [7]. Across different cultures, ages and genders, tactile behavior and gestures have many differences as well as many similarities [8-10]. An obvious example is the way people greet each other, for example by shaking hands, hugging, touching cheeks, rubbing noses, patting each other on the back or kissing the back of the hand. Touch also plays an important role in object detection [11] and in providing awareness of our own bodies [12].
There is increasing interest in tactile Human-Robot Interaction (HRI) within humanoid robotics [13]. This is mainly driven by technological advances which are enabling increasingly sophisticated tactile sensing skins to be developed [14, 15]. The resolution of skin sensors has reached a point where patterns of touch can now be captured and evaluated [15-18]. Other sensing capabilities, such as pressure, temperature, vibration, joint position and movement, are also being incorporated into skin technologies.
The aim of our research is to produce simple tactile gestures using a small humanoid Nao robot. By "simple" tactile gestures we mean "passive" and "static" sensing on the skin [19]. This is also classified as "tactile perception", where only cutaneous information is used [20]. "Static" refers to the location of the tactile gesture being fixed. Morrison et al. [21] describe a "simple" touch as one that 'involves brief, intentional contact to a relatively restricted location on the body surface of the receiver during a social interaction'. Our experiment focused on the simple tactile gestures "tap," "poke" and "push", investigating the temporal and spatial patterns of the applied pressure.
The tactile gestures chosen are commonly used and widely
accepted gestures. Iwata and Sugano [22] selected 10 verbs ‘that
clearly represent psychological and physical aspects of Physical
InterFerence and intended contACT (PIFACT) states such as
“stroke,” “seize,” and “nudge”’. Naya et al. [23] investigated the
touch behaviors of "slap," "pat," "scratch," "stroke" and "tickle", whilst Chang et al. [24] investigated the gestures of "stroke," "slap," "poke" and "pat" for a study into emotional touch. In another study, Stiehl and Breazeal [25] used a huggable bear to classify nine different touch interactions that convey affective or social content: "tickle," "poke," "scratch," "slap," "pet," "pat," "rub," "squeeze" and "contact". Finally, Taichi et al. [26] categorized the haptic interactions of "slap," "pat," "poke," "push," "grasp," "stroke," "touch," "skim," "shake" and "hold" at different locations on a robot's body.
Figure 1. Experimental setup for human tactile gesture
recognition using a Nao robot fitted with skin sensors and
a GUI showing the activation of the sensors when touched.
2. EXPERIMENTAL SETUP
2.1 Robot Skin
The robot skin has been specifically manufactured for the Nao
robot as part of the European RoboSkin project [27]. It is made up
of 6 patches, each containing an array of 9 equilateral triangles with sides 30mm long. Within each triangle there are 12 equally spaced 4mm diameter capacitive pressure sensors, called taxels, giving a total of 648 skin sensors. Each forearm has two skin patches
mirrored through the centre of the forearm and each upper arm
has a single skin patch anatomically located over the bicep
muscle. The sensors are covered by a 3mm layer of silicone foam
which deforms to increase the capacitance between the conductor
(human hand or metal object) and the sensor. The taxels are read
at a frequency of 50Hz, giving an 8-bit reading that decreases as the applied force increases. Figure 1 shows the Nao robot fitted with the robot skin patches and a two-dimensional GUI representation of the taxel layout of the right upper arm. The brighter area shows
the response of the taxels when touched.
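As an illustration of how such readings might be handled in software, the minimal Python sketch below inverts the 8-bit readings (which decrease with force) into pressure-like values and polls the skin at the nominal 50Hz rate. The read_raw_taxels callable is a placeholder for whatever driver interface exposes the 648 taxel values; it is not part of the actual skin API.

    import time
    import numpy as np

    NUM_TAXELS = 648        # 6 patches x 9 triangles x 12 taxels
    SAMPLE_RATE_HZ = 50     # nominal taxel read frequency

    def to_pressure(raw_readings):
        # Raw 8-bit readings decrease as force increases, so invert them
        # into values that grow with the applied pressure.
        return 255 - np.asarray(raw_readings, dtype=np.int16)

    def sample_skin(read_raw_taxels, duration_s=1.0):
        # Poll the skin at the nominal 50 Hz rate for duration_s seconds.
        frames = []
        for _ in range(int(duration_s * SAMPLE_RATE_HZ)):
            frames.append(to_pressure(read_raw_taxels()))
            time.sleep(1.0 / SAMPLE_RATE_HZ)
        return np.vstack(frames)   # shape: (frames, NUM_TAXELS)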
2.2 Tactile Gesture Data Collection
The aim of the research is to produce simple tactile gestures using
the Nao robot. This requires knowing the temporal and pressure profile for each gesture. The temporal information is the pressure applied over time rather than a measure of frequency, which is sometimes associated with tactile perception [28]. The three simple tactile gestures, "tap," "poke" and "push", were investigated using 202 participants from two local schools. The pupils were aged between 12 and 16 years, with a 55% male to 45% female split. Each participant was asked in turn to "tap,"
“poke” and “push” the robot on any part of the skin. The data was
collected from two seconds prior to the start of contact until
approximately two seconds after the contact had finished. Each
experiment lasted less than 10 seconds with an actual contact time
of less than a second. Taxel values from all six skin patches were
sampled at a rate of 50Hz.
2.3 Tactile Gesture Production
The tactile gestures were created using two Nao robots: one robot produced the gestures and the second robot recorded them. The data was collected and analyzed using the same method as for the human data. The gestures were produced perpendicular to the skin patch on the upper right arm of the robot. The profile of the tactile gesture was controlled by varying the duration over which the initial pressure was applied to the skin. This allowed the impulse
of the tactile gesture to be controlled. The next phase of the
profile was recreated by varying the duration of constant pressure
before finally varying the duration over which the pressure was
released.
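A minimal sketch of this three-phase profile generation is given below, assuming pressure setpoints are issued at the 50Hz skin rate. The function and parameter names are our own, and the example peak value is a placeholder rather than the exact value used on the robot.

    import numpy as np

    def gesture_setpoints(rise_ms, hold_ms, fall_ms, peak, rate_hz=50):
        # Ramp the pressure up over rise_ms, hold it constant for hold_ms,
        # then release it over fall_ms (all durations in milliseconds).
        n_rise = max(1, int(rise_ms * rate_hz / 1000))
        n_hold = max(1, int(hold_ms * rate_hz / 1000))
        n_fall = max(1, int(fall_ms * rate_hz / 1000))
        rise = np.linspace(0.0, peak, n_rise, endpoint=False)
        hold = np.full(n_hold, float(peak))
        fall = np.linspace(float(peak), 0.0, n_fall)
        return np.concatenate([rise, hold, fall])

    # Example: an illustrative "push"-like setpoint trajectory.
    push_profile = gesture_setpoints(rise_ms=100, hold_ms=300, fall_ms=90, peak=76)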
3. DATA ANALYSIS
The taxels were calibrated over a two second window prior to the
onset of touch by adding the difference between the average
individual taxel value and the overall average skin value for all
taxels. The calibration removed any static differences between individual taxel values but did not remove small fluctuations due to noise. These were removed by applying a 12% threshold to all taxel values, determined by prior empirical evaluation. A minimum contact area threshold, equivalent in size to a child's fingertip, was also applied to remove noise from any single taxel that exceeded the pressure threshold. Throughout the experiments the data did not show any thermal drift [27], although after every few experiments the hardware had to be reset due to an increase in general noise across specific skin areas. Within the contact data, frames outside the first and last frames of contact were removed, along with multiple consecutive touches.
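These pre-processing steps can be sketched as follows, assuming each frame is a NumPy array of the 648 taxel values. The 12% threshold comes from the text above, whereas the fingertip-sized contact area is expressed here as an assumed minimum count of active taxels.

    import numpy as np

    FULL_SCALE = 255            # 8-bit sensor range
    NOISE_THRESHOLD = 0.12      # 12% noise threshold from the empirical evaluation
    MIN_ACTIVE_TAXELS = 3       # assumed fingertip-sized minimum contact area

    def calibrate(frames, baseline):
        # Offset each taxel so its resting value matches the overall skin
        # average, using a baseline window recorded before touch onset.
        per_taxel_mean = baseline.mean(axis=0)
        overall_mean = baseline.mean()
        return frames + (overall_mean - per_taxel_mean)

    def denoise(frame):
        # Zero taxels below the noise threshold and discard frames whose
        # contact area is smaller than the minimum number of active taxels.
        active = frame > NOISE_THRESHOLD * FULL_SCALE
        if active.sum() < MIN_ACTIVE_TAXELS:
            return np.zeros_like(frame)
        return np.where(active, frame, 0)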
Each taxel has a corresponding two-dimensional Cartesian location on its skin patch. The taxel coordinates and the taxel values were interpolated onto a regular grid with a 1mm spacing for each anatomical skin area, as shown in Figure 2. This enabled
the maximum pressure value and location to be accurately
calculated. A profile for each tactile gesture from every
participant was produced by plotting the maximum pressure value
for each frame in the contact period. The profile was then
normalized and an overall gesture profile produced using the
average maximum pressure and average duration for all
participants.
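A sketch of this interpolation and profile extraction, using SciPy's griddata for the 1mm grid, is shown below; the function names are our own and linear interpolation is assumed.

    import numpy as np
    from scipy.interpolate import griddata

    def frame_peak(taxel_xy, taxel_values, step_mm=1.0):
        # Interpolate one frame of taxel values onto a regular 1mm grid and
        # return the maximum pressure value and its grid location.
        x, y = taxel_xy[:, 0], taxel_xy[:, 1]
        gx, gy = np.mgrid[x.min():x.max():step_mm, y.min():y.max():step_mm]
        grid = griddata(taxel_xy, taxel_values, (gx, gy),
                        method='linear', fill_value=0.0)
        idx = np.unravel_index(np.nanargmax(grid), grid.shape)
        return grid[idx], (gx[idx], gy[idx])

    def gesture_profile(frames, taxel_xy):
        # Maximum interpolated pressure per contact frame, normalized to
        # [0, 1] in both pressure and time.
        peaks = np.array([frame_peak(taxel_xy, f)[0] for f in frames])
        t = np.linspace(0.0, 1.0, len(peaks))
        return t, peaks / peaks.max()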
4. RESULTS
Figure 2. Graphical representation of a "poke" tactile gesture on the right arm skin patch with the noise threshold and maximum force highlighted.

Figure 3. Average and standard deviation for the "tap," "poke" and "push" tactile gestures: average maximum pressure (top) and average duration (bottom).

Figure 3 gives a summary of the average maximum pressure (taxel value) and average duration (ms) for each of the tactile gestures "tap," "poke" and "push" produced by the humans and reproduced by the robot. At present the robot skin is still a prototype and as such there is no exact correlation between the taxel values and the actual force applied; the pressure values are therefore quoted as sensor values. The most noticeable difference between the human tactile gestures and the robot reproductions of the tactile gestures was the much smaller standard deviations for the robot data. Since a robot is generally much better than a human at both accuracy and repeatability, this is not a surprising result. An independent two-sample t-test, with unequal variances and sample sizes, was performed on the average maximum pressure and the average duration. The differences were significant at the 99.9% level, showing that each tactile gesture has its own distinct set of parameter values.
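A comparison of this kind can be reproduced with SciPy's independent two-sample t-test, using Welch's correction so that equal variances are not assumed; the arrays below are placeholders for the per-participant measurements of one parameter for two gestures.

    from scipy.stats import ttest_ind

    def gestures_differ(values_a, values_b, alpha=0.001):
        # Independent two-sample t-test without assuming equal variances
        # (Welch's test); alpha=0.001 corresponds to the 99.9% level.
        t_stat, p_value = ttest_ind(values_a, values_b, equal_var=False)
        return p_value < alpha, t_stat, p_value

    # Example usage with placeholder per-participant durations (ms):
    # differ, t, p = gestures_differ(tap_durations, poke_durations)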
A profile of the data collected for each tactile gesture for both
human and robot is shown in Figure 4. The data is displayed as an
individual profile for each participant and robot trial. The profiles
graphically illustrate the large difference in the standard deviation
between the human data and the robot data for all the gestures.
The profiles do not appear smooth, which indicates there was a problem with the sampling rate: whilst the sampling rate was monitored at 50Hz, the data readings suggest that the effective sample rate was lower. For the "poke" tactile gesture, the robot data shows occasional pressure values exceeding the average maximum value. This was due to small movements in the robot arm shifting the contact point directly over a single taxel rather than across or between multiple taxels. The profiles show a noticeable difference between the tactile gestures in both the applied pressure and the duration for which the pressure is applied. This is clearly visible in Figure 5, which shows the overall profile for each of the tactile gestures. The data was normalized, then multiplied by the average maximum pressure and average duration, before a Loess locally weighted curve was fitted to the data. The "simple" tactile gestures can be identified by the pressure applied over time. A "tap" has a relatively light average pressure (value of 55) applied over a short average duration of 134ms. The "poke" gesture has a strong average pressure (value of 95) and lasts for an average duration of 288ms. The "push" gesture has a medium average pressure (value of 76) and an average duration of 491ms. These pressure-over-time values provide the data needed to reproduce each tactile gesture using the robot.
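To illustrate, the average parameters reported above can be held in a small lookup table and used to rescale a normalized profile into the pressure-over-time target driving the robot reproduction; the data structure and function names here are our own.

    import numpy as np

    # Average gesture parameters reported above (sensor value, duration in ms).
    GESTURES = {
        "tap":  {"pressure": 55, "duration_ms": 134},
        "poke": {"pressure": 95, "duration_ms": 288},
        "push": {"pressure": 76, "duration_ms": 491},
    }

    def denormalize_profile(norm_t, norm_p, gesture):
        # Scale a normalized profile (time and pressure in [0, 1]) by the
        # average duration and maximum pressure of the named gesture.
        params = GESTURES[gesture]
        t_ms = np.asarray(norm_t) * params["duration_ms"]
        p = np.asarray(norm_p) * params["pressure"]
        return t_ms, p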
The tactile gestures reproduced by the robot all roughly follow the
same profiles produced by the human data. The main difference is
the flatness of the profile once the pressure has been applied. This
reflects the simple method implemented on the robot to recreate
the profiles.
5. CONCLUSIONS AND FUTURE WORK
From a sample size of 202 participants three “simple” tactile
gestures of “tap,” “poke” and “push” were categorized using the
parameters of pressure and time. The gestures varied significantly from one another in both the amount of pressure applied and the
duration of applied pressure. A “tap” tactile gesture lasted for
approximately 0.125 seconds, a “poke” for 0.25 seconds and a
“push” for 0.5 seconds. The robot skin does not currently have an
exact relationship between the sensor reading and the applied
force so an exact measurement of the applied pressure for each
gesture can only be given in terms of the difference in the sensor
readings. Using a temporal profile of the applied pressure, each tactile gesture was reproduced on the robot. This study has shown that "simple" human tactile gestures can be categorized by the pressure applied over time, which in turn enables "simple" tactile gestures to be recreated by a robot.
Figure 4. Profiles of the maximum pressure for each contact frame for all participants and robot trials for the tactile gestures: "tap" (top), "poke" (middle) and "push" (bottom).
Figure 5. Profile of average pressure applied over time for the tactile gestures of "tap," "poke" and "push" produced by a human (top) and robot (bottom).
A natural extension of this work would be to incorporate more complex tactile gestures such as a "stroke," a "touch," a "pinch" or a "grab". A method such as blob detection could be used to identify these complex tactile gestures. Another useful addition to this work would be to evaluate whether robot-produced gestures can be recognized by humans. Reproducing more complex tactile gestures would require a robot with greater dexterity. The RoboSkin project has designed and manufactured skin sensors for the arms, hands and fingers of the iCub humanoid platform [27]. Having sensors on the fingertips and palm of the robot will provide an ideal platform not only for producing more complex tactile gestures but also for producing them on a variety of surfaces.
6. ACKNOWLEDGMENTS
The research leading to these results has received funding from the European Commission's Seventh Framework Programme (FP7/2007-2013) under grant agreement ROBOSKIN ICT-FP7-231500.
7. REFERENCES
[1] Barnett, K. E. 1972. A theoretical construct of the concepts
of touch as they relate to nursing. Nursing Research, 21, 2
(March 1972), 102-109.
[2] Hertenstein, M. J., Keltner, D., App, B., Bulleit, B. A., and
Jaskolka, A. R. 2006. Touch communicates distinct
emotions. Emotion. 6, 3 (Aug. 2006), 528-533.
[3] Hertenstein, M. J., Holmes, R., McCullough, M., and
Keltner, D. 2009. The communication of emotion via touch.
Emotion. 9, 4 (Aug. 2009), 566-573.
[4] Hertenstein, M. J. 2002. Touch: Its communicative functions
in infancy. Human Development, 45, 2 (March 2002), 70-94.
[5] Montagu, A. 1986. Touching: The Human Significance of the Skin. Harper & Row, NY.
[6] Field, T. 2001. Touch. MIT Press, Cambridge, MA.
[7] Gallace, A. and Spence, C. 2010. The science of
interpersonal touch: An overview. Neuroscience &
Biobehavioral Reviews. 34, 2 (Feb. 2010), 246-259.
[8] Hall, J. 2011. Gender and status patterns in social touch. In The Handbook of Touch: Neuroscience, Behavioral, and
Health Perspectives, M. Hertenstein and S. Weiss, Eds.
Springer, NY, 329-360.
[9] McDaniel, E. and Andersen, P. A. 1998. International
patterns of interpersonal tactile communication: A field
study. J. of Nonverbal Behavior. 22, 1 (March 1998), 59-75.
[10] Andersen, P. A. 2011. Tactile traditions: Cultural differences
and similarities in haptic communication. In The Handbook
of Touch: Neuroscience, Behavioral, and Health
Perspectives, M. Hertenstein and S. Weiss, Eds. Springer,
NY, 351-369.
[11] Raju, B. I. and Srinivasan, M. A. 1999. Encoding and
decoding of shape in tactile sensing. Touch Lab, MIT,
Cambridge, Technical Report, RLE TR-630, Sept. 1999.
[12] Serino, A. and Haggard, P. 2010. Touch and the body. Neuroscience and Biobehavioral Reviews, 34, 2 (Feb. 2010),
224-236.
[13] Argall, B. D. and Billard, A. G. 2010. A survey of tactile
human-robot interactions. Robotics and Autonomous
Systems, 58, 10 (Oct. 2010), 1159-1176.
[14] Lee, M. H. 2000. Tactile sensing: New directions, new
challenges. The Int. J. of Robotics Research, 19, 7 (July
2000), 636-643.
[15] Dahiya, R. S., Metta, G., Valle, M., and Sandini, G. 2010.
Tactile sensing—From humans to humanoids. IEEE Tran. on
Robotics, 26, 1 (Feb. 2010), 1-20.
[16] Ohmura, Y. and Kuniyoshi, Y. 2007. Humanoid robot which
can lift a 30kg box by whole body contact and tactile
feedback. In Proc. IEEE/RSJ Int. Conf. on Intelligent Robots
and Systems (San Diego, USA, Oct. 29-Nov. 2, 2007). IROS
2007. IEEE, 1136-1141.
[17] Wada, K. and Shibata, T. 2007. Social effects of robot
therapy in a care house - change of social network of the
residents for two months. In Proc. of the IEEE Int. Conf. on
Robotics and Automation (Roma, Italy, April 10-14, 2007).
ICRA 2007. IEEE, 1250-1255.
[18] Stiehl, W. D., Lieberman, J., Breazeal, C., Basel, L., Lalla,
L., and Wolf, M. 2005. Design of a therapeutic robotic
companion for relational, affective touch. In IEEE Int.
Workshop on Robot and Human Interactive Communication,
(Nashville, TN, USA, August 13-15, 2005). ROMAN 2005. IEEE, 408-415.
[19] Jones, L. A. and Lederman, S. J. 2006. Tactile sensing. In
Human Hand Function. Oxford University Press, NY, 44-74.
[20] Loomis, J. M. and Lederman, S. J. 1986. Tactual perception.
In Handbook of Perception and Human Performance Vol. II,
K. Boff, L. Kaufman, and J. Thomas, Eds. John Wiley and
Sons, NY, chapter 31.
[21] Morrison, I., Löken, L. S., and Olausson, H. 2010. The skin
as a social organ. Experimental Brain Research. 204, 3 (July
2010), 305-314.
[22] Iwata, H. and Sugano, S. 2005. Human-robot-contact-state
identification based on tactile recognition. IEEE Trans. on Industrial Electronics, 52, 6 (Dec. 2005), 1468-1477.
[23] Naya, F., Yamato, J., and Shinozawa, K. 1999. Recognizing
human touching behaviors using a haptic interface for a pet-
robot. In Proc. IEEE Int. Conf. on Systems, Man, and
Cybernetics (Tokyo, Japan, October 12-15, 1999). SMC
1999. IEEE, 2, 1030-1034.
[24] Chang, J., MacLean, K., and Yohanan, S. 2010. Gesture
recognition in the haptic creature. In Proc. Int. Conf. on
Haptics: Generating and perceiving tangible sensations,
Part I, (Amsterdam, Netherlands, July 8-10, 2010).
EuroHaptics 2010. Springer, Lecture Notes in Computer
Science 6191, 385-391.
[25] Stiehl, W. D. and Breazeal, C. 2005. Affective Touch for
Robotic Companions. In Proc. Int. Conf. on Affective
Computing and Intelligent Interaction, (Beijing, China,
October 22-24, 2005). ACII 2005. Springer, 747-754.
[26] Taichi, T., Takahiro, M., Hiroshi, I., and Norihiro, H. 2006.
Automatic categorization of haptic interactions - What are
the typical haptic interactions between a human and a robot?
In Proc. 6th IEEE-RAS Int. Conf. on Humanoid Robots,
(Genova, Italy, Dec. 5-6, 2006), HUMANOIDS 2006. IEEE,
490-496.
[27] Schmitz, A., Maiolino, P., Maggiali, M., Natale, L., Cannata,
G., and Metta, G. 2011. Methods and Technologies for the
Implementation of Large-Scale Robot Tactile Sensors. IEEE
Trans. on Robotics, 27, 3, (June 2011), 389-400.
[28] Greenspan, J. D. and Bolanowski, S. J. 1996. The
psychophysics of tactile perception and its peripheral
physiological basis. In Pain and Touch, L. Kruger, Ed.
Academic Press, San Diego, CA, 25-103.