PROCEEDINGS OF THE EUROHAPTICS 2010
HAPTIC AND AUDIO-VISUAL STIMULI:
ENHANCING EXPERIENCES AND INTERACTION
Amsterdam, July 7, 2010
Anton Nijholt, Esko O. Dijk, Paul M.C. Lemmens, Steven Luitjens (eds.)
CIP DATA KONINKLIJKE BIBLIOTHEEK, THE HAGUE
Nijholt, A., Dijk, E.O., Lemmens, P.M.C., Luitjens, S.
Special Symposium at Conference EuroHaptics 2010
Proceedings of Special Symposium at EuroHaptics 2010
Haptic and Audio-Visual Stimuli: Enhancing Experiences and Interaction
A. Nijholt, E.O. Dijk, P.M.C. Lemmens, S. Luitjens (eds.)
Amsterdam, Universiteit Twente, Faculteit Elektrotechniek, Wiskunde en Informatica
CTIT Workshop Proceedings Series WP10-01
keywords: Audio, video and haptic stimulation; Influence of temporal and spatial patterns of (haptic) stimuli;
Relaxation, easing the mind, comforting; Applications of biofeedback; Intelligence and algorithms
to optimize the user experience.
© Copyright 2010; Universiteit Twente, Enschede
Ms. C. Bijron
University of Twente
Faculty of Electrical Engineering, Mathematics and Computer Science
P.O. Box 217
NL 7500 AE Enschede
tel: +31 53 4893740
fax: +31 53 4893503
Printing and binding: Ipskamp Drukkers, Enschede.
The intention of the symposium on Haptic and Audio-Visual Stimuli at the EuroHaptics 2010 conference is to deepen the
understanding of the effects of combined haptic and audio-visual stimuli. The knowledge gained will be used to enhance
experiences and interactions in daily life. To this end, a number of key lectures were organized, and the accompanying
papers can be found in these proceedings. With the lectures and the interaction between the researchers at the symposium,
we aim to initiate a community interested in multimodal stimulation involving haptic elements, with an emphasis on
experiences for entertainment, well-being and relaxation.
Multimodal stimulation is capable of creating strong effects on users, because the effects of the various stimuli can
reinforce one another. It can be used to enhance entertainment experiences, as well as experiences of well-being and relaxation.
In practice, multimodal stimulation often considers only visual and auditory stimulation, because these senses are most
prominent in our environment. However, humans have at least three more senses that can be used to create multimodal
sensations: touch, taste, and smell. The latter two are technologically difﬁcult to implement but stimulation and feedback
using the tactile sense is rapidly becoming more prevalent. An example application is to use haptic and tactile actuator
elements to provide the player of a game with a more thrilling experience. In this case, tactile stimulation is provided that
is linked to the visual and auditory information in the game and together, these stimuli create very strong experiences.
A deep understanding of the requirements to create a convincing multimodal experience is needed to create an experience
that is more than just the sum of the elements. This understanding is needed on the level of individual sensory modalities
but also on the interactions of information processing in each modality and covers aspects of the relative contribution of
individual modalities, aspects of timing and synchronization, et cetera. This especially applies to the tactile modality that
is relatively unexplored in the context of multimodal stimulation. Work on these topics is being carried out for separate
modalities [1-3] and also for multimodal stimulation, and covers topics as diverse as intensity, spatial distribution,
timing, tactile perception [1,5], tactile displays [6], et cetera. However, the effects of multimodal stimulation including
haptic elements on user experiences and interactions have not yet been thoroughly studied in the context of entertainment,
well-being, and relaxation applications.
About This Symposium
In this special symposium we address the speciﬁc effects of combined (multi-sensory) stimuli that aim to achieve total
effects that are more than just the sum of their elements. Topics range from basic elements such as mutual timing in
audio, video, and haptic stimuli, through actuator technologies, to how such "more than the sum of the elements" effects
of multimodal stimuli are created in a user’s perception and how to evaluate these experiences and perceptions.
Our guiding hypothesis is that an optimal user experience will be obtained by taking into account human perception, careful
personalization, and intelligent optimization. The latter should be based on both general knowledge of human perception,
and on (measured or inferred) knowledge of the individual user. Research on human perception will provide information on
the basic capabilities and limitations of individual modalities but also on how combined information processing in multiple
modalities operates. To this end we have planned a number of key lectures on the technologies employed, the psychological
and physiological sensitivities of people and the algorithms used to optimize the effect of multimodal stimuli. We have
been able to invite researchers working on the following topics:
• Haptic illusions
• Relaxation using haptic stimulation
• Mediated social touch
• Audiotactile interaction
• Personalized tactile feedback
• Tactile stimulation for entertainment
These presentations and the interaction between researchers could initiate a community of researchers who are interested
in multimodal stimulation involving haptic elements with particular emphasis on experiences in entertainment, well-being,
and relaxation. In these proceedings of the symposium you can ﬁnd the contributions of most of the key speakers. Short
summaries of these contributions follow below.
These proceedings start with a (preliminary) position paper ("Audio-tactile Stimuli to Improve Health and Well-being") by
Esko Dijk and his co-authors. The paper aims at deﬁning a research area where auditory and tactile stimulation, possibly
enhanced with visual information and stimuli, is combined and applied to improve people’s health and well-being. It
is argued that these combined stimuli can have effects on the human body and mind by, for example, reducing stress,
improving alertness or promoting sleep. Presently there is a variety of low-cost and miniature tactile actuators on the
market. They ﬁnd application in mobile phones, but also in jackets that provide dynamic and spatial tactile patterns on the
human body. Audio-tactile patterns can be designed for many applications, for example, for navigation, for entertainment or
for health and well-being purposes. The paper brieﬂy surveys research results on audio-tactile stimuli, available technology,
and audio-tactile composition. Scientiﬁc challenges are identiﬁed that need to be explored in order to design personalized
audio-tactile systems that adapt to their users either off-line or online. The aim of this position paper is to create a research
community for answering these challenges.
Clearly, before being able to interpret the effect of audio-tactile stimuli it is necessary to know the effect of uni-modal
stimuli. For example, how can tactile stimuli induce emotions? In "Tactile Experiences" Paul Lemmens and his co-authors
take William James’ viewpoint that every emotion has a distinct bodily reaction. They reversed this observation and
studied whether providing bodily stimuli while watching appropriate video clips could induce or enhance an emotion. The
design of an emotion jacket is described. The jacket provides tactile sensations on the torso with the help of sixty-four
actuators (eccentric rotating-mass motors) embedded in stretchable fabric. Various tactile emotion patterns were designed
for video clips that were chosen to elicit certain emotional responses. In a user study participants viewed the clips with and
without the emotion patterns projected onto their bodies. Questionnaires were used and psychophysiological responses
were recorded in order to obtain information about the emotional experience and immersion. The results convinced the
authors that adding the tactile emotion patterns enhanced the emotional experience of the viewers.
Hendrik Richter’s contribution ("Multi-Haptics and Personalized Tactile Feedback on Interactive Surfaces") builds further
on the recent trends of using haptic feedback for touch screen interaction. In this application area, the touch and visual
senses come together. While current systems can mostly provide haptic feedback for only a single point of interaction
(i.e. ﬁnger), he proposes a ﬁrst extension to multi-touch surfaces. A second extension is also proposed to take away one
important restriction of current solutions, namely that the haptic feedback is always given at the location of interaction on
the screen. It is proposed to spatially separate the body part used for interaction (finger, hand) from the resulting tactile feedback,
potentially leading to completely new touch screen interaction paradigms using haptics. Firstly, feedback can be given at
multiple body locations and/or using multiple actuation means (an approach called multi-haptics) and secondly the haptic
feedback can be personalized to each user in collaborative scenarios where multiple users are interacting on the same touch
surface. Three prototypes that have been used for initial explorations in this domain are described.
Valeria Occelli ("Assessing Audiotactile Interactions: Spatiotemporal Factors and Role of Visual Experience") provides a
well-founded overview of her work on the interaction of hearing and touch, an interaction that happens often in daily life
but has received relatively little attention in the scientific literature. She has studied monkeys, patients with brain damage,
blind people, and a non-patient population and shows that the location at which crossmodal audio-tactile stimulations are
presented strongly influences how much attention is given to the stimulus. Locations directly behind the head attract the
most attention; stimuli farther away in peri-personal space attract less attention and, with increasing distance from the
body, the resources allocated to the stimuli decrease further. Moreover, Occelli shows that certain types of sound interact with the
effects of spatial location: pure tones have different effects than white noise has.
Antal Haans and Wijnand IJsselsteijn ("Combining mediated social touch with vision: From self-attribution to telepresence?")
investigate the topic of mediated social touch, i.e. interpersonal touch over a distance by means of tactile display
technology. They investigate combining mediated touch with vision, allowing people simultaneously to both feel and see
how a remote partner is touching them. Adding another sensory modality (in this case vision) for the person receiving the
touches, can potentially increase a user’s sense of "being in the same environment" with the remote partner. The paper
conﬁrms this effect and also shows that adding vision can increase the perceived naturalness of the mediated touches. This
serves as a good example of how perceived quality in one modality can be increased by adding congruent stimuli in
another modality. The experimental findings illustrate that visual feedback, especially when the visual image resembles
a human body being touched, can improve mediated social touch. As such, the results of this work could be seen as
ingredients for future systems that improve people’s well-being, by facilitating closer contact with loved ones even though
they may be far away.
In the paper "’Breathe with the Ocean’: A System for Relaxation using Combined Audio and Haptic Stimuli" by Esko
Dijk and Alina Weffers, a breathing guidance system is introduced that uses audio, haptic, and visual stimuli and that was
created for the purpose of relaxing a user. The authors provide evidence from the literature that audio stimuli (in particular
music), haptic stimuli (in the forms of vibrations), and visual stimuli can induce relaxation. The breathing guidance system
makes use of a Touch Blanket, an actuation device developed by Philips that can provide haptic patterns on body parts.
The blanket contains 176 small vibration motors arranged in a 2D matrix. ’Haptic waves’ synchronized with audio can
move up and down the body in various cycles. These cycles can be ﬁxed (e.g., taking a rate similar to a breathing pace
that is associated with relaxation), they can follow the breathing behavior of the user or they can guide the user to an
optimal breathing behavior taking into account respiration and heart rate. Results of a first evaluation of these approaches are also reported.
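The fixed-cycle guidance mode described above can be sketched in a few lines. The 16 x 11 grid layout, the breathing rate, and the function names below are illustrative assumptions for a 176-motor matrix, not details of the actual Touch Blanket system.

```python
import math

ROWS, COLS = 16, 11          # assumed layout of the 176 motors (16 * 11 = 176)
BREATH_RATE_BPM = 6.0        # a slow pace often associated with relaxed breathing

def wave_frame(t: float, rate_bpm: float = BREATH_RATE_BPM) -> list[list[float]]:
    """Return per-motor intensities (0..1) for one 'haptic wave' frame at time t.

    The wave centre travels up the body during the first half of a breathing
    cycle and back down during the second half.
    """
    period = 60.0 / rate_bpm                 # seconds per breathing cycle
    phase = (t % period) / period            # position within the cycle, 0..1
    # triangular motion: wave centre goes 0 -> 1 -> 0 over one cycle
    centre = 2 * phase if phase < 0.5 else 2 * (1 - phase)
    frame = []
    for r in range(ROWS):
        row_pos = r / (ROWS - 1)             # 0 = feet, 1 = shoulders
        # Gaussian bump around the wave centre; the width sets how broad the wave feels
        intensity = math.exp(-((row_pos - centre) ** 2) / (2 * 0.1 ** 2))
        frame.append([intensity] * COLS)
    return frame
```

An adaptive variant would replace the fixed `rate_bpm` by a rate derived from the user's measured respiration signal.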
Stefania Seraﬁn et al., in "Identiﬁcation of virtual grounds using virtual reality haptic shoes and sound synthesis", report
on an experiment using a combination of haptic and auditory stimuli to simulate the sensation of walking on different
kinds of surfaces, for example beach sand, gravel, or metal. Haptic stimulation was provided by actuators mounted in
the soles of shoes. Audio stimulation was generated by using physical models of walking combined with sounds recorded
during walking on various kinds of surfaces. Both stimuli were coupled to the physical action of walking by sensors in
the soles of the shoes. The aim of this interesting experiment was to find out how much the sensation of walking is
enhanced by adding haptic feedback to the auditory feedback. From the user tests it appeared that auditory stimulation
played the main role in creating the sensation and in recognizing the kind of surface, although in some cases haptic
stimulation contributed as well.
Finally, Saskia Bakker et al. ("Design for the Periphery") work on designing for the periphery, which revolves
around designing technology interactions in such a way that only peripheral attention is needed to process and carry out these
interactions. As a foundation for their work, they discuss the notion of calm technology, and attention theory as developed
in psychological literature. Calm technology is technology that works in the background, not demanding our attention,
and that can be attended to with peripheral attention. Bakker et al. propose that interaction design for calm technology should
be guided by principles from psychological theories of attention such that the interaction with the technology can be done
without requiring major attentional effort. Because humans effortlessly interact with tangible objects, the haptic modality
seems a candidate with a lot of potential for interaction design in the periphery.
During the symposium some presentations were given that could not be included in these proceedings. George VanDoorn
gave a talk entitled "Haptics Can Lend a Hand to a Bionic Eye", and Maud Marchal gave a talk on "Pseudo-Haptics". In
addition to the oral presentations there were demonstrations of tactile vests by Sense-Company, Tilburg, in the Netherlands,
and by the Hogeschool voor de Kunsten, Utrecht, in the Netherlands.
We are grateful to the EuroHaptics 2010 organizers for allowing us to organize this special symposium under the ﬂag of
the EuroHaptics conference. A special word of thanks goes to Hendri Hondorp who took on the role of technical editor of
these proceedings.

References

[1] G.A. Gescheider, J.H. Wright, and R.T. Verrillo, Information-Processing Channels in the Tactile Sensory System, New York: Psychology Press, 2009.
[2] D. Marr, Vision: A Computational Investigation into the Human Representation and Processing of Visual Information, Cambridge, MA: MIT Press, 2010.
[3] E. Zwicker and H. Fastl, Psychoacoustics: Facts and Models, Berlin: Springer-Verlag, 1990.
[4] C. Spence, J. Ranson, and J. Driver, "Cross-modal Selective Attention: On the Difficulty of Ignoring Sounds at the Locus of Visual Attention," Perception & Psychophysics, vol. 62, pp. 410-424, 2000.
[5] R.W. Cholewiak, "The Perception of Tactile Distance: Influences of Body Site, Space, and Time," Perception, vol. 28, pp. 851-875, 1999.
[6] J.B.F. van Erp, "Tactile Displays for Navigation and Orientation: Perception and Behaviour," Ph.D. thesis, Utrecht University, 2007.
Anton Nijholt, Esko O. Dijk, Paul M.C. Lemmens & Steven Luitjens Enschede/Eindhoven, July 2010
Program and Organizing Committee
Anton Nijholt Human Media Interaction, University of Twente, The Netherlands
Esko O. Dijk Philips Research Eindhoven, The Netherlands
Paul M.C. Lemmens Philips Research Eindhoven, The Netherlands
Steven Luitjens Philips Research Eindhoven, The Netherlands
Dirk Brokken Philips Research Eindhoven, The Netherlands
Jan van Erp TNO Human Factors, Soesterberg, The Netherlands
Hendri Hondorp Human Media Interaction, University of Twente, The Netherlands
Lynn Packwood Human Media Interaction, University of Twente, The Netherlands
Charlotte Bijron Human Media Interaction, University of Twente, The Netherlands
09.45 Design for the Periphery
Saskia Bakker, Elise van den Hoven, Berry Eggen
Eindhoven University of Technology, The Netherlands
10.15 Pseudo-Haptics (preliminary title)
Maud Marchal, Anatole Lécuyer
INRIA, Rennes Cedex, France
11.00 Combining Mediated Social Touch with Vision: From Self-attribution to Telepresence?
Antal Haans, Wijnand A. IJsselsteijn
Eindhoven University of Technology, The Netherlands
11.30 Haptics Can Lend a Hand to a Bionic Eye
George VanDoorn, Barry Richardson
Monash University, Churchill, Australia
12.00 Multi-Haptics and Personalized Tactile Feedback on Interactive Surfaces
Hendrik Richter
University of Munich, Germany
13.45 Assessing Audiotactile Interactions: Spatiotemporal Factors and Role of Visual Experience
Valeria Occelli
University of Trento, Italy
14.15 Demonstrations and Talks by Sense-Company and HKU
Ewoud Kuyper, Sense-Company, Tilburg, The Netherlands
Gerard van Wolferen, Hogeschool voor de Kunsten, Utrecht
15.30 Tactile Experiences
Paul M.C. Lemmens, Dirk Brokken, Floris M.H. Crompvoets, Jack van den Eerenbeemd, Gert-Jan de Vries
Philips Research, Eindhoven, The Netherlands
16.00 ’Breathe with the Ocean’: A System for Relaxation using Combined Audio and Haptic Stimuli
Esko Dijk, Alina Weffers-Albu
Philips Research, Eindhoven, The Netherlands
16.30 Identiﬁcation of virtual grounds using virtual reality haptic shoes and sound synthesis
Stefania Seraﬁn, Luca Turchet, Rolf Nordahl, Smilen Dimitrov
Medialogy, Aalborg University, Copenhagen, Denmark
17.15 Discussion, Conclusions, Future
Position Paper: Audio-tactile stimuli to improve health and well-being . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 1
Esko O. Dijk, Anton Nijholt, Jan B.F. van Erp, Ewoud Kuyper, Gerard van Wolferen
Tactile Experiences . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 11
Paul M.C. Lemmens, Dirk Brokken, Floris M.H. Crompvoets, Jack van den Eerenbeemd, Gert-Jan de Vries
Multi-Haptics and Personalized Tactile Feedback on Interactive Surfaces . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 19
Hendrik Richter
Assessing Audiotactile Interactions: Spatiotemporal Factors and Role of Visual Experience . . . . . . . . . . . . . . . . . . . 29
Valeria Occelli
Combining Mediated Social Touch with Vision: From Self-Attribution to Telepresence? . . . . . . . . . . . . . . . . . . . . . . . 35
Antal Haans, Wijnand A. IJsselsteijn
Breathe with the Ocean: a System for Relaxation using Audio, Haptic and Visual Stimuli . . . . . . . . . . . . . . . . . . . . . 47
Esko O. Dijk, Alina Weffers
Identiﬁcation of Virtual Grounds using Virtual Reality Haptic Shoes and Sound Synthesis . . . . . . . . . . . . . . . . . . . . . 61
Stefania Seraﬁn, Luca Turchet, Rolf Nordahl, Smilen Dimitrov, Amir Berrezag, Vincent Hayward
Design for the Periphery . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 71
Saskia Bakker, Elise van den Hoven, Berry Eggen
List of authors . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 81
Audio-tactile stimuli to improve health and well-being
A preliminary position paper
Esko O. Dijk, Anton Nijholt, Jan B.F. van Erp, Ewoud Kuyper, Gerard van Wolferen
Philips Research, Eindhoven, The Netherlands
University of Twente, Enschede, The Netherlands
TNO Human Factors, Soesterberg, The Netherlands
Sense Company BV, Tilburg, The Netherlands
Utrecht School of the Arts (HKU), Hilversum, The Netherlands
From literature and through common experience it is known that stimulation of the tactile (touch) sense or
auditory (hearing) sense can be used to improve people's health and well-being. For example, to make
people relax, feel better, sleep better or feel comforted. In this position paper we propose the concept of
combined auditory-tactile stimulation and argue that it potentially has positive effects on human health
and well-being through influencing a user's body and mental state. Such effects have, to date, not yet been
fully explored in scientific research. The current relevant state of the art is briefly addressed and its
limitations are indicated. Based on this, a vision is presented of how auditory-tactile stimulation could be
used in healthcare and various other application domains. Three interesting research challenges in this
field are identified: 1) identifying relevant mechanisms of human perception of combined auditory-tactile
stimuli; 2) finding methods for automatic conversions between audio and tactile content; 3) using
measurement and analysis of human bio-signals and behavior to adapt the stimulation in an optimal way to
the user. Ideas and possible routes to address these challenges are presented.
1.1 Improving health and well-being through touch and hearing
People perceive the world through their senses: sight, smell, taste, hearing and touch. The tactile (touch)
sense is an important one: it is in fact the first sense to develop in the womb. Touch can give people
strong emotional experiences and is vital for health and well-being. Tactile stimulation or
somatosensory stimulation, applying touch to the human body, is often used as a way to make people feel
better or to reduce stress. Examples range from basic touch to comfort someone, massaging techniques,
whole-body vibration training and physiotherapy, to alternative treatments such as acupressure and Reiki.
Besides touch, the sense of hearing is also used to make people feel better: one can listen to spoken
encouragements, relaxing music, or nature sounds to sleep better. Music therapy [2,3] is an established
practice and has been extensively investigated in the scientific community.
The scientific literature shows evidence that specific methods of stimulation of the auditory (hearing)
or tactile senses can indeed effectively reduce stress and muscle tension, increase well-being, or promote
sleep. Furthermore, there are indications [4-7] that stimulating the two senses of hearing and touch at the
same time can have stronger effects on the human body and mind than stimulating only one of these
senses at a time. Hence a promising area of scientific research is the use of a combination of sound heard
and touch felt by a user at the same time to influence the user's body and mind in a positive way.
We refer to the research area discussed in this paper as combined auditory-tactile stimulation and its
effects on human health, well-being, body state and mental state. However, little scientific work has been
done so far in this field. The aim of this paper is to present our vision on this research field of auditory-
tactile stimulation, and to present research challenges and opportunities that we have identified.
Although we often refer to the term health as a goal of the systems we investigate, we do not mean to
replace established treatment methods with new ones. Instead, in healthcare contexts the goal of our
approach is to augment the existing care and treatment methods where possible by stimulating well-being,
relaxation or sleep.
1.3 Example applications
One particular use case is a small relaxation room in a care institute where a user can sit in a comfortable
chair with their eyes closed. Light, music and sounds are played in the room, and the user feels gentle taps
on the body and calming oscillations. The chair senses how the individual user reacts to these stimuli in
real-time. During a session the stimuli are composed by an intelligent system in such a way that the
combined effect is optimally relaxing for this user. Maybe one user prefers taps, while another prefers
gentle vibrations. And each user may have a personal level of intensity and patterns that he/she likes best.
A similar use case could be envisioned for people with autism, analogous to the multisensory
environment investigated earlier in the MEDIATE project.
Another use case example is in the home (consumer) environment. Imagine a user at home, who wants to
relax after a busy day at work. He owns a multisensory relaxation/entertainment system that consists of a
blanket with integrated tactile actuators (e.g. [5,6]) and headphones. The system provides a combination
of sounds, music and tactile stimulation that is designed to relax. After a session of 20 minutes, the user
feels much more relaxed than before.
1.3.3 Public transport
In public transport, it is vital that train drivers are alert during their work shift. However, the working
hours in this profession are often irregular, inducing the risk of decreased alertness at times when it is
most needed. The largest Dutch railway company, NS, has already experimented with special power-nap
relaxation rooms, which have the multi-sensory stimulation product AlphaSphere installed. The
goal is to enable personnel such as train drivers to take a quick 25-minute rest, e.g. during their break, in
order to increase alertness during their work.
1.4 Structure of the paper
To be able to clearly outline our vision and the research challenges ahead in Section 3, we first provide an
overview in Section 2 of the current relevant state of the art and its limitations. Section 4 ends with
discussion and conclusions.
2 Current state of the art
The present section does not aim to be a complete overview or review of the state of the art. Rather, we
briefly sketch the research and application fields that are considered relevant, with the help of a few key
references. We expect that this Special Symposium at EuroHaptics 2010, Haptic and Audio-Visual
Stimuli: Enhancing Experiences and Interaction, or possible follow-up events will contribute to a more
complete overview and hence an improved vision for the future of auditory-tactile stimulation.
2.1 Stimulating the sense of touch
Stimulating the tactile sense can give people strong emotional experiences and is vital for health and
well-being. Interpersonal touch is known to be an important element of human love and social bonding.
Tactile stimulation is used today in methods to reduce stress or muscle tension, train the body, or to make
people feel good, feel cared for, happy, energized, sleep better or simply more relaxed [10-14]. There are
studies on the subjective pleasantness of touch and studies on the mental, health-related and bodily
effects of low-frequency vibration [10-13,15].
These methods can involve a human performing the stimulation, a machine, or a human helped by a
machine. Of course it cannot be expected that touch by a machine will, in general, have an effect similar
to touch applied by a human. On the other hand, the properties of machine-generated or machine-mediated
touch are still being actively researched. A recent result suggests that the effect size of
machine-produced touch in a specific experimental situation could be similar to that of touch performed
by a human, although more research would be needed to substantiate such hypotheses.
2.2 Touch actuators
To fully understand the opportunities in the field of auditory-tactile stimulation, it is helpful to look at
tactile actuation (i.e. touch stimulation) technology. In recent years, advances in actuators and embedded
computing have enabled a wide range of machine-driven methods for tactile stimulation. The strong
growth of haptic (tactile feedback) technology in the mobile phone market has brought a variety of small
mechanical actuators onto the market. Such actuators are used in, for example, jackets (Figure 1) that
can stimulate different points on the upper body, or a blanket that can provide tactile stimuli to the
whole body. Miniature actuators can be combined with larger actuators, which enables interesting
compositions of effects. Today, a large variety of tactile effects can be achieved relatively easily, at low
cost, and in a form suitable for daily-use situations.
The types of tactile actuators that we currently consider for our purposes are:
1. Miniature ERM (Eccentric Rotating Mass) vibration motors, used in many cell phones. These do
not offer precise independent control of the frequency or amplitude of tactile effects. Used in [4,5].
2. Small tactile transducers, capable of playing effects with precise frequency/amplitude control.
Used in some cell phones.
3. Larger tactile transducers, used in certain home cinema products and theme parks for powerful
bass effects (called "rumblers" or "shakers").
4. Common bass loudspeakers, sometimes used as an alternative to option 3 above.
5. Actuator systems for providing mechanical displacement or pressure on the body, for example
solenoids, rotary driven pistons, or pneumatic/hydraulic actuators.
See Figure 1 (left side) for an example product: the Feel The Music Suit created by Sense Company
[sense-company.nl] with the Utrecht School of Arts [hku.nl] and TNO [tno.nl].
2.3 Stimulating the sense of hearing
Like touch, the human sense of hearing is often used in methods for health and well-being. Examples are
relaxation music, nature sounds, or self-help audio guides. In the literature, the effects of music and
therapeutic use of music have been well investigated (e.g. [2,3]). Various audio products exist for
influencing well-being and mental state, including so-called brainwave entrainment methods such as
binaural beats or isochronic tones.
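To make the binaural-beat idea concrete: each ear is presented with a pure tone at a slightly different frequency, and the listener perceives a beat at the difference frequency. The carrier and beat frequencies in the sketch below are illustrative choices, not values taken from any particular product.

```python
import math

def binaural_beat(carrier_hz=220.0, beat_hz=10.0, seconds=1.0, rate=44100):
    """Generate (left, right) sample lists for a binaural beat.

    The left channel plays the carrier frequency and the right channel is
    offset by beat_hz, so the perceived beat equals beat_hz (here 10 Hz,
    in the alpha range often targeted by relaxation products).
    """
    n = int(seconds * rate)
    left = [math.sin(2 * math.pi * carrier_hz * i / rate) for i in range(n)]
    right = [math.sin(2 * math.pi * (carrier_hz + beat_hz) * i / rate) for i in range(n)]
    return left, right
```

The two channels must be presented separately (one per ear, e.g. over headphones); mixing them into one speaker produces an ordinary monaural amplitude beat instead.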
Figure 1: Dutch minister of Economic Affairs wearing a Feel The Music Suit at a public event (top-
left). On the bottom left, design sketches of the suit. On the right, a demonstration of the TNO tactile
dance suit showing coordinated dance movements by three users.
A good example of innovative audio content with a well-being application is Meditainment (=
Meditation + Entertainment), which combines relaxing music, ambient soundscapes, nature sounds, voice
coaching, and guided meditation and visualisation techniques. One particular use of this content that is
reportedly being investigated is pain management for hospital patients. Typically, the audio content in
existing products such as Meditainment is static, i.e. it neither interacts with the user nor automatically
adapts to the user. One of our hypotheses is that making this content more adaptive to the user could
make audio stimulation much more effective and attractive.
2.4 Combined stimulation of touch and hearing: auditory-tactile stimulation
An interesting concept is to combine stimulation of the sense of hearing with the sense of touch. If each
one alone can have positive effects, can the combination be even more effective or more enjoyable? See
Figure 2 for an impression of this stimulation approach. As an example, the Meditainment audio content
presented in the previous section does not include tactile stimuli – could tactile stimuli significantly
increase the effectiveness of such content? Next, we look at what the scientific literature tells us about the
health and well-being effects of combined stimulation.
Some work has been done on a specific method of combined auditory-tactile stimulation called vibroacoustics [2,4,5]. Here, a tactile (vibration) effect is directly derived from the lower frequencies of music and played by one or two tactile actuators. Experimental results suggest that this combination may work well for relaxation or sleep. An experiment with didgeridoo playing, which evokes vibrations in the upper body along with sound, showed promise as a specific medical treatment.
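The vibroacoustic principle just described, deriving a vibration signal from the low-frequency content of music, can be sketched as follows. This is a simplified illustration using a first-order low-pass filter; actual vibroacoustic systems and their cutoff frequencies may differ.

```python
import math

def lowpass(signal, cutoff_hz, sample_rate):
    """First-order IIR low-pass filter: keeps the bass content that a
    vibroacoustic actuator would render."""
    dt = 1.0 / sample_rate
    rc = 1.0 / (2 * math.pi * cutoff_hz)
    alpha = dt / (rc + dt)
    out, y = [], 0.0
    for x in signal:
        y += alpha * (x - y)
        out.append(y)
    return out

# Toy input: a 50 Hz "bass" tone plus a weaker 2000 Hz component.
sr = 8000
audio = [math.sin(2 * math.pi * 50 * i / sr) + 0.5 * math.sin(2 * math.pi * 2000 * i / sr)
         for i in range(sr)]
tactile_drive = lowpass(audio, cutoff_hz=120.0, sample_rate=sr)
```

The filtered signal retains the 50 Hz component almost unchanged while strongly attenuating the 2000 Hz component, so the actuator is driven mainly by the bass.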
4 Esko O. Dijk, Anton Nijholt, Jan B.F. van Erp, Ewoud Kuyper, Gerard van Wolferen
Figure 2: Impression of auditory-tactile stimulation of a user
Initial experiments at Philips Research with a combination of ambient sounds, relaxing music and patterns on a tactile blanket also appear to be promising. The results suggest that the different modalities can mutually strengthen each other to provide a total experience that people really like.
On the other hand, a great deal of knowledge does exist in the literature about the mutual interactions of
the auditory, tactile and visual modalities, so-called cross-modal effects. This knowledge can be roughly
categorized into a number of sub-areas [20,21]: perception and sensory thresholds, information
processing performance, spatial attention, navigation/orienting in spaces, synaesthesia, neural plasticity,
aging, perceptual illusions and sensory substitution. But the aspects of health, well-being and pleasantness
are only addressed to a limited degree in this existing work.
2.5 Audio-tactile composition
A related area of research and creative work is audio-tactile composition [17,22,23], which refers to
composing a musical piece or audible (ring) tone and at the same time composing tactile vibration effects
that a user can feel. This is starting to become a commercially viable area, due to the rapid growth of
haptics features in cell phones. However, in this field a number of topics have not yet been addressed in the literature:
1. A link to human health and well-being has not been investigated;
2. Automatic audio to tactile conversion methods, suitable for driving multiple tactile actuators at
the same time have not yet been investigated in a well-being context;
3. There is a lack of well-founded composition tools to compose audio-tactile experiences, especially
when considering applications, like ours, that are outside the limited context of mobile phone
haptic ringtone composition.
3 Beyond the state of the art
The aim of the scientific research identified in the previous section is to employ a combination of sound
heard by a user and touch felt by a user at the same time, to influence the user's state of body and mind in
a positive way.
We conclude from the previous section that stimulating the human senses of hearing and touch at the
same time has great potential but needs further scientific study. We also found that existing approaches
for auditory, tactile and auditory-tactile stimulation for health and well-being use fixed content that does
not adapt to, nor interact with, the individual user. Content here refers to the combination of sounds and
touch effects and how these are arranged in time and how touch stimulation patterns are arranged across
the body. The adaptivity that a software-based solution for driving the auditory-tactile stimulation would provide has not yet been exploited anywhere. To make our vision more concrete, we have provided
example use cases in Section 1.3.
In this section, we first present in Section 3.2 three key scientific challenges/questions that have been
identified. In the subsequent sections 3.3-3.5 the research topics are presented in somewhat more detail.
Section 3.7 concludes by sketching a vision of the type of system that we believe is interesting for the research community to work towards, combining the results that should come out of the three research topics.
3.2 Scientific challenges
Within the wider area of auditory-tactile stimulation, we specifically want to highlight the following three scientific challenges:
1. What are the mechanisms of human perception of combined auditory-tactile stimuli, and how
can these mechanisms be modeled and used by a software-based system to influence the state of
the human body and mind towards a desired state?
2. What are good methods for conversion between the audio domain and the tactile domain, in this
context? Conversion here refers to converting content, for example music, from one domain to
another, but also at a meta level to converting methods or paradigms from one domain to
another. For example, how could a paradigm from music composition be converted into a
paradigm for tactile composition? Tactile to music conversion is also something we consider.
3. How should measurement and interpretation of a user's biosignals and behavior during a
stimulation session be done, to adapt the stimuli in an optimal way? Adaptation should help to
better and faster achieve the users' well-being goals.
To start addressing these scientific challenges, we will outline a number of related potential research
directions in the remaining text of Section 3.
3.3 Topic 1: Effects of multimodal stimuli on health and well-being
In the first proposed research topic, the goal is to study multi-actuator tactile stimulation of the human body and the effect of this stimulation on human health and well-being, alone and in combination (cross-modal effects) with the sense of hearing. Other senses, such as smell and vision, may also have to be taken into account here.
For a user, desired states can be (depending on the application) relaxed, peaceful, sleepy, engaged,
dreamy, satisfied, active, et cetera. The mechanisms of human perception here may include auditory-
tactile sensory illusions. Sensory illusions - perceiving things that are not really physically there - can be a
very powerful way to evoke emotions.
A first step is to investigate existing literature on auditory and tactile perception and the related
stimulation methods that use touch and hearing. Based on experimental tests, requirements from the
application field, and findings from literature, one could apply an iterative, user-centered design and
research process to come up with auditory-tactile stimuli that are likely to have a certain health or well-
being related effect. These effects will then have to be investigated in user tests. Artificial Intelligence
(AI) techniques such as rule-based systems, machine learning, personalization and (real-time) adaptation
need to be investigated and employed to design models and systems that make it possible to link user
characteristics and user experience to the properties of temporal and spatial patterns of auditory-tactile
stimuli. Results of this type of research can then be applied in the work described under Topic 3 (Section 3.5).
The models just mentioned may use or incorporate existing models of human mental state, known
from the literature. One example is the well-known valence/arousal model proposed by Russell. This
model could be used to represent the known arousal-decreasing effects of certain tactile stimuli as
described in [10,13].
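To make the modeling idea concrete, the sketch below represents the user state as a point in Russell's two-dimensional valence/arousal space and applies a stimulus as a small displacement of that point. The displacement values are illustrative assumptions, not measured effects.

```python
from dataclasses import dataclass

@dataclass
class AffectiveState:
    """A point in Russell's circumplex: valence (unpleasant..pleasant) and
    arousal (calm..excited), both kept in [-1, 1]."""
    valence: float
    arousal: float

def apply_stimulus(state, d_valence, d_arousal):
    """Hypothetical effect model: a stimulus shifts the state by a small
    displacement, clipped to the model's bounds."""
    clip = lambda v: max(-1.0, min(1.0, v))
    return AffectiveState(clip(state.valence + d_valence),
                          clip(state.arousal + d_arousal))

# A slow tactile stroking pattern modeled (by assumption) as mildly pleasant
# and arousal-decreasing:
s = apply_stimulus(AffectiveState(0.0, 0.6), d_valence=+0.2, d_arousal=-0.3)
```

A real system would estimate the displacements from experiments rather than assume them; the point here is only the state representation.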
3.4 Topic 2: Audio to tactile and tactile to audio conversion methods
The second research topic focuses on conversions between the audio domain and the tactile domain. One
purpose is for example automatic conversion of existing music and non-music audio content to
corresponding tactile stimuli. The audio and generated tactile stimuli can then be played simultaneously,
creating a combined auditory-tactile user experience. By using the music content as the basic ingredient, a
potentially large number of auditory-tactile compositions can be created from existing music.
Automatic methods for translating audio into corresponding tactile stimuli will have to take into account the (well-being) effects on the human, and they will have to be suitable for multi-actuator tactile
stimulation systems. Translation should be done in such a way that the user's health goal (e.g. muscle
relaxation, sleep, energizing, etc.) and other goals (e.g. pleasantness, compositional coherence) are
achieved. Conversion methods may include detecting the structural and symbolic expression of a piece of
music, and using this information such that the tactile composition will reflect the same expression.
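One simple instance of such a conversion, mapping the energy in a few frequency bands of the audio onto the intensities of different tactile actuators, could look as follows. The band choices and the Goertzel-based analysis are illustrative assumptions, not the methods proposed in this paper.

```python
import math

def goertzel_power(frame, freq_hz, sample_rate):
    """Power of one frequency component in a frame (Goertzel algorithm)."""
    k = 2 * math.cos(2 * math.pi * freq_hz / sample_rate)
    s_prev = s_prev2 = 0.0
    for x in frame:
        s = x + k * s_prev - s_prev2
        s_prev2, s_prev = s_prev, s
    return s_prev2 ** 2 + s_prev ** 2 - k * s_prev * s_prev2

def audio_to_actuators(audio, sample_rate, band_centers_hz, frame_len=256):
    """Map each analysis band to one tactile actuator: per frame, the
    normalized band power becomes that actuator's vibration intensity (0..1)."""
    frames = [audio[i:i + frame_len]
              for i in range(0, len(audio) - frame_len + 1, frame_len)]
    out = []
    for frame in frames:
        powers = [goertzel_power(frame, f, sample_rate) for f in band_centers_hz]
        peak = max(powers) or 1.0
        out.append([p / peak for p in powers])
    return out  # one intensity vector per frame, one entry per actuator

sr = 8000
audio = [math.sin(2 * math.pi * 100 * i / sr) for i in range(2048)]
intensities = audio_to_actuators(audio, sr, band_centers_hz=[100, 400, 1600])
```

For a pure 100 Hz tone, the actuator assigned to the 100 Hz band dominates in every frame, while the other actuators stay nearly silent.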
At a more general level, we also consider the possibility of conversion of methods and paradigms
between the audio and tactile domains. For example, the existing knowledge on music composition and
musical expression and communication of meaning could possibly be “translated” into approaches for
tactile composition and tactile expression and communication of meaning. For music to tactile conversion
and vice versa, a musical ontology can be used as a basis. Specifically, the system of Schillinger  is a
candidate. Schillinger explored the mathematical foundations of music, and was particularly inspired by
Fourier analysis and synthesis.
The topic of studying conversion methods that translate tactile stimuli into audio or music seems less
obvious at first sight. However, we also envision useful applications here such as translating an existing
tactile massage pattern that works well into matching music, in order to strengthen the psychological
effect of the stimulation on the user.
3.5 Topic 3: Audio-tactile systems that adapt to the user based on sensor information
The third research topic involves so-called closed-loop systems or biocybernetic loops: a sensory
stimulation system, in which biosignals and behavior from a user are measured and used to adapt the
stimuli that this user receives. To do the adaptation properly, relevant user influencing strategies should be used. Based on measurements of the user state, the system can then select the optimal influencing strategy.
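As an illustration of such a biocybernetic loop, the following sketch implements one deliberately simple influencing strategy: a proportional controller that softens the stimulation when the measured arousal is above the target and intensifies it otherwise. Both the controller and the toy user model are assumptions for illustration only.

```python
def adapt_intensity(intensity, measured_arousal, target_arousal, gain=0.5):
    """One step of a hypothetical biocybernetic loop: if the user is more
    aroused than the target state, soften the stimulation; if less, intensify.
    Intensity is clamped to [0, 1]."""
    error = measured_arousal - target_arousal
    return max(0.0, min(1.0, intensity - gain * error))

# Simulated relaxation session: the user's arousal drifts toward the
# stimulus intensity (a toy user model, pure assumption).
intensity, arousal, target = 0.8, 0.9, 0.2
for _ in range(20):
    intensity = adapt_intensity(intensity, arousal, target)
    arousal += 0.5 * (intensity - arousal)
```

In this simulation the loop steers the simulated arousal toward the target; a real system would replace both the user model and the controller with empirically grounded components.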
User-adaptive methods have an added potential to be more effective, and at the same time more
appealing to the user. This potential is still untapped today. Besides having the possibility of explicit
multimodal interaction [27,28,29] with a user, interactive content can also be created by implicit
personalization of auditory-tactile experiences. This is very useful in cases where the user cannot be expected to actively interact a lot with a system, for example the elderly, people with impairments, or
hospitalized people with temporary impairments. Research into multimodal (vision, hearing, touch,
speech, gesture) user interfaces that optimally combine explicit and implicit interaction to provide the best
level of personalization during a session could be a part of this research theme.
The biosignals that can be sensed and used for an adaptive system may include brain signals (EEG),
heart signals (ECG), respiration, or skin conductivity (SCL); but also behavioral signals such as the user's
movements, speech utterances or facial expressions during a session.
A particular research challenge for the use of biosignals in an automated system is that there are large inter-person variations. A system using a person's biosignals would first have to learn about the user, i.e. calibrate its interpretation of signals to the current user. This calibration challenge is also part of
the research area that we propose.
3.6 Synergies in the three research topics
The above three research activities may require close mutual cooperation. They also involve cooperation between the field of artistic content creation on the one hand, and scientific areas such as haptics, perception, brain and cognition on the other. The perceptual mechanisms studied in topic #1 are
on the one hand linked to lower-level brain mechanisms and to cognition, but on the other hand also to
topics such as aesthetic perception. The audio-tactile conversion in topic #2 is primarily linked to
composition and to the arts, but can only succeed if the physical and mental health goals are respected –
topics related to multi-sensory perception, brain and cognition. Similarly for topic #3: although
measurement and interpretation of user state mainly links to psychophysiology, perception, brain and
cognition, it is also necessary to have influencing strategies in place that guide a user towards a desired
state. These influencing methods will probably have a strong creative/artistic component in them.
3.7 Vision: Sensing + algorithms + content = optimal personalized experience
Combining the work proposed in the above three topics, we can sketch a vision to work towards. With
recent advances in ICT such as low-cost embedded data processing, solid-state storage growth, ubiquitous
networking, and recent progress in unobtrusive brain and biosignal sensors, a novel type of sensory
stimulation system becomes feasible. This type of system will adapt a stimulation session in real time towards an optimal, personalized experience for the current user. This personalization can be based on a
generic model of a user and his/her mental state, which is continuously updated based on sensor
interpretation and data mining. Here, the data is sensed (preferably in an unobtrusive way) from the user
during a stimulation session.
The measured signals and their interpretation can be used to construct a software model of the current
user state, which may describe current estimated levels of relaxation, sleepiness and comfort. Based on
the model, influencing strategies can be chosen to help achieve the user's health and well-being goals.
Artificial Intelligence methods could be used effectively in construction of the user state model and in
the optimal selection of influencing strategies. This would be a novel approach beyond the current state of
art for auditory-tactile or tactile stimulation. In addition, this approach could be extended in the future to
include also stimulating the visual sense (with images, video or light) or olfactory sense.
4 Discussion and next steps
4.1 General conclusions
In this position paper we have introduced auditory-tactile stimulation as a possible means to increase
health and well-being, applicable in various application areas. The existing state of the art has been
briefly addressed and based on our findings so far we conclude that there is clear potential for innovation
in auditory-tactile stimulation approaches. Three specific areas for further research have been identified.
Finally, a vision is presented of a software-based learning system that can automatically or semi-
automatically adapt to the individual user based on general knowledge plus sensor information obtained
from the user during a session. The system will then decide which specific auditory or tactile stimuli to
render, to optimally achieve the user's goals.
4.2 Relevance of the proposed topics
If we take a broader look at society as a whole, one trend is that due to an aging population, Western economies are increasingly struggling with rising healthcare costs and a shortage of healthcare personnel. Therefore, there is a growing need for preventive healthcare. Preventive care is a useful instrument, not only to improve the quality of people's lives, but also to partly avoid the cost of expensive regular treatments. The results of the research we propose could be applied in preventive healthcare, and can therefore have a positive impact on society.
Other potential users could be the healthcare workers themselves. Due to the ageing trend and
economic constraints, their work will become ever more efficiency-oriented, time-pressured and stressful.
Looking outside the domain of healthcare, other potential user groups can be identified who increasingly
have to cope with highly stressful events at work or who have to work under pressure, for example public transport personnel, school teachers, fire fighters or police officers. All these user groups
could benefit from innovative new ways of coping with stress, or ways of inducing relaxation or sleep,
quickly and on-demand. Auditory-tactile stimulation holds this promise.
One concrete example where results can be applied in the shorter term is the small enterprise Sense Company, an active supplier of sensory stimulation solutions to care organizations. Their portfolio includes tactile stimulation products. Other examples would be providing enjoyable relaxation solutions for hospital patients or medical personnel, or products that help to manage pain for chronically ill users at home.
As another example of the application of results, the Utrecht School of the Arts (HKU) has already done various projects with partner organizations over the past years, aimed at people with special needs such as people who are deafblind. They investigated how these people can benefit from musical and rhythmic stimulation.
4.3 Creating a research community
By organizing the Special Symposium "Haptic and Audio-Visual Stimuli: Enhancing Experiences and
Interaction" as part of the EuroHaptics 2010 conference, the authors aim to start the process of gathering a
research community around the topic of tactile (haptic) stimulation combined with other modalities, for
applications in healthcare, well-being, entertainment and user interaction. We plan to organize a follow-
up event around these topics in the future, possibly as a workshop linked to an existing conference.
Acknowledgements

This work was partially supported by the ITEA2 Metaverse1 (www.metaverse1.org) Project.
 Gallace, A., Spence, C. The science of interpersonal touch: an overview, Neurosci Biobehav Rev.
 Wigram, A.L. The effects of vibroacoustic therapy on clinical and non-clinical populations, PhD
thesis, St. George's Hospital Medical School, London University, 1996.
 de Niet G, Tiemens B, Lendemeijer B, Hutschemaekers G. Music-assisted relaxation to improve sleep
quality: meta-analysis. J Adv Nurs. 2009 Jul;65(7):1356-64, 2009.
 Lemmens, P., Crompvoets, F., Brokken, D., van den Eerenbeemd, J., and de Vries, G.-J. A body-
conforming tactile jacket to enrich movie viewing. In the Proceedings of the 3rd joint EuroHaptics
Conference and Symposium on Haptic Interfaces for Virtual Environment and Teleoperator Systems
(Salt Lake City, UT, Mar. 18-20), also known as WorldHaptics '09, 2009.
 Dijk, E.O., Weffers-Albu, A., & De Zeeuw, T., A tactile actuation blanket to intensify movie
experiences with personalised tactile effects. Demonstration papers proc. 3rd International
Conference on Intelligent Technologies for Interactive Entertainment (INTETAIN), Amsterdam,
2009. www.eskodijk.nl/haptics or wwwhome.ewi.utwente.nl/~dennisr/intetain/INTETAINsupplementary.pdf
 Dijk, E.O., Weffers, M.A., Breathe with the Ocean: A System for Relaxation using Combined Audio
and Haptic Stimuli, Proc. Special Symposium on Haptic and Audio-Visual Stimuli: Enhancing
Experiences and Interaction, of EuroHaptics 2010, July 7th, Amsterdam, The Netherlands, 2010,
 Sha-art, AlphaSphere relaxation chair or "Spaceship for the inner journey", website http://www.sha-
art.com, May 2010.
 Parés, N., Masri, P., van Wolferen, G., Creed, C. Achieving Dialogue with Children with Severe
Autism in an Adaptive Multisensory Interaction: The "MEDIATE" Project, IEEE Trans. on
Visualisation and Computer Graphics, 11(6), November 2005.
 van Panhuis, B. "Bijtanken met een dutje", newspaper Trouw July 17th, 2008.
 Prisby, R.D., et al. Effects of whole body vibration on the skeleton and other organ systems in man
and animal models: What we know and what we need to know, Ageing Research Reviews 7 319-329,
 Kvam, M.H. The Effect of Vibroacoustic Therapy, Physiotherapy, June, 83(6) 290-295, 1997.
 Patrick, G. The effects of vibroacoustic music on symptom reduction, IEEE Engineering in Medicine
and Biology Magazine, 18(2) 97-100, 1999. DOI 10.1109/51.752987
 Essick, G.K., et al. Quantitative assessment of pleasant touch, Neuroscience and Biobehavioural
Reviews 34 192-203, 2010.
 Puhan, M.A. et al. Didgeridoo playing as alternative treatment for obstructive sleep apnoea syndrome:
randomised controlled trial, BMJ Feb 4;332(7536):266-70, 2006. PMID: 16377643
Haans, A. and IJsselsteijn, W.A. The Virtual Midas Touch: Helping Behavior After a Mediated Social
Touch, IEEE Trans. on Haptics 2(3) 136-140 July-Sept, 2009.
 Gunter, E. Skinscape: A Tool for Composition in the Tactile Modality, M.Sc. thesis, Department of
Electrical Engineering and Computer Science, MIT, 2001.
 Virtual Relaxation Solutions website, http://www.vrelaxation.com/ , May 2010.
 Meditainment ltd., website http://www.meditainment.com, May 2010.
 Calvert, G. (ed.), The handbook of multisensory processes. MIT Press, MA, USA, 2004.
 Grünwald, M. (ed.), Human Haptic Perception – Basics and Applications. Birkhäuser Verlag,
 Chang, A. and O'Sullivan, C. An Audio-Haptic Aesthetic Framework Influenced by Visual Theory,
Proc. HAID 2008, LNCS Vol. 5270/2008, pp. 70-80, 2008.
van Erp, J.B.F. and Spapé, M.M.A. Distilling the underlying dimensions of tactile melodies.
Proceedings of Eurohaptics 2003, pp. 111-120.
Russell, J. A. A circumplex model of affect. Journal of Personality and Social Psychology, 39, 1161-1178, 1980.
 Schillinger, J. The Schillinger System of Musical Composition. Da Capo Press music reprint series,
1977. ISBN: 0306775522. Reprint of 1941 ed. published by C. Fischer, New York, 1941. See also
 Serbedzija, N.B., S.H. Fairclough. Biocybernetic loop: from awareness to evolution. Proc. 11th Conf.
on Evolutionary Computation, Trondheim, Norway, pp. 2063-2069, 2009.
 Cao, Y. and Theune, M. and Nijholt, A. Towards Cognitive-Aware Multimodal Presentation: The
Modality Effects in High-Load HCI. In: Engineering Psychology and Cognitive Ergonomics. 8th
International Conference, EPCE 2009, Held as part of HCI International 2009, 19-24 July, San
Diego, USA. pp. 3-12. Lecture Notes in Computer Science 5639. Springer Verlag, 2009.
 Pantic, M. and Nijholt, A. and Pentland, A. and Huang, T.S. Human-Centred Intelligent Human-
Computer Interaction (HCI²): how far are we from attaining it? Int. J. of Autonomous and Adaptive
Communications Systems, 1 (2). pp. 168-187, 2008.
 van Gerven, M. and Farquhar, J. and Schaefer, R. and Vlek, R. and Geuze, J. and Nijholt, A. and
Ramsay, N. and Haselager, P. and Vuurpijl, L. and Gielen, S. and Desain, P. The Brain-Computer
Interface Cycle. Journal of Neural Engineering, 6 (4). pp.1-10, 2009.
Paul M.C. Lemmens, Dirk Brokken, Floris M.H. Crompvoets,
Jack van den Eerenbeemd, Gert-Jan de Vries
Philips Research, High Tech Campus 34, NL-5656 AE Eindhoven, The Netherlands
A simple touch can result in a profound and deep experience. Tactile communication has been used in information displays or to increase the entertainment value of arcade and PC games. The study of communication of emotion via tactile stimulation started only recently. We have built an emotion jacket as
a research prototype to study the communication of emotions with vibrotactile stimulation. We recreated
bodily feelings related to emotional experiences (e.g., a shiver down one’s spine) as tactile stimulation
patterns and showed that these emotion patterns, in a movie-viewing context, can increase emotional
immersion with the content being viewed.
1 Introduction

Being touched can be a very powerful experience, and its effects can range from a scary and unnerving unexpected hand on a shoulder, to a soothing, relaxing massage, or a reinvigorating hug, with many more gradients in between. The tactile modality, contained within the human skin, is the largest sensory modality (in terms of surface area) that humans have. In the womb it starts to develop earlier than all other senses, and it is the most developed sense at birth. This strong and old connection between tactile
sensations and the intimacy and safety of the womb could be considered one of the reasons why a touch
can evoke such powerful emotions. Touches soothe and arouse infants, and a touch can also regulate an infant's state. Touches may even be a basic need, like food, water, or sleep. Following the principle of equipotentiality, it may be that touch is a particularly strong medium for children to communicate their (emotional) state.
Compared to modalities like vision and audition, scientific study of the properties of the tactile sensory system, and its communicative abilities in particular, was scarce for a long period of time.
Recently, Van Erp and colleagues used the information processing properties of the tactile modality to
create informative tactile displays to provide additional navigational cues to airplane and helicopter pilots.
They successfully used tactile stimulation to prevent overloading the visual and auditory senses that are
already highly taxed in a cockpit context [4,5].
To improve the quality of the experiences that they generate, the entertainment industry has been
using the tactile modality for some time. One can think of the arm-wrestling machine or the racing simulator with force feedback in its steering, both of which can be found in any arcade hall. More recently, the
availability of tactile stimulation systems for personal entertainment systems like game consoles and PCs has increased [5,6]. The available technology ranges from simple rumblers and force-feedback systems in
joysticks to full torso vests containing multiple air bladders that quickly inflate upon impact in, for
instance, first-person shooters. Unfortunately, however, both the research into the informational properties of tactile information and studies investigating the effects of added tactile stimulation in an entertainment context neglect the aforementioned strong and intimate link between tactile stimulation and emotions.
Some work in this area can be found in interaction and design research on virtual mediated touch (see Haans et al. for a review). A lot of that work focused on using the inherent intimacy or closeness of a touch to improve virtual communication between humans on a personal level [9-11]. The Lovebomb, on the other hand, provided the ability to anonymously indicate one's emotional state in public spaces and among strangers. Cramer showed that embodied agents and robots were judged less credible when their empathic responses were incongruent with those of their user, and that touches from pro-active agents resulted in a less machine-like character. Another work that connects emotion and tactile stimulation is the Emoti-chair, which uses a model of the human cochlea to provide tactile actuation based on the processing of, for instance, music.
The code of what makes a touch communicate a happy, sad, angry, or other emotional message is far from clear. Hertenstein and his group at DePauw University have been working on the code of tactile stimulation and emotion [2,15]. They showed that strangers could accurately (ranging from 48% to 83%) decode distinct emotions when touched by another person. Moreover, when Hertenstein et al. video recorded these touches and showed them to another group of participants, these participants also recognized the intended emotion with high accuracy. For specific emotions, percentages of correctly recognized emotions ranged from 38% to 71%.
In our own work we have taken a different approach to the study of the communication of emotion via
tactile stimulation. We worked from William James’ observation that every emotion has a distinct bodily
reaction [16-18] and have listed various bodily reactions that result from an emotional experience. For instance, we considered responses like a shiver down one's spine, butterflies in one's stomach, a racing heartbeat, etc. We then reversed James' idea and studied whether providing one of these bodily
reactions could actually induce an emotion.
Note that it is important to stress the difference between enhancing emotional experiences and enhancing movie effects. For instance, in a movie scene in which Bruce Lee is surrounded by evil
henchmen, one can try to enhance the experience by converting the visually presented punches and kicks
into tactile sensations. This is the movie-effects approach. On the other hand, one could also try and
provide a tactile experience of the anxiety that Bruce Lee feels in such a tight situation and the relief once
he won the fight and survived. This latter approach of enhancing emotional experiences is the one that we
have taken in the current study.
We asked whether tactually recreating these bodily reactions and using them as stimuli could enhance
the emotional experience of watching movie content. By measuring psychophysiological signals that
change due to changes in emotional state as well as taking questionnaire responses regarding emotional
state, we measured the emotional state of viewers before, during, and after movie viewing. We expected that these measures would be indicative of deeper immersion when comparing a film clip without and with tactile actuation.
In the next section, we describe the emotion jacket that we developed to be able to project tactile
sensations on the torso of our viewers. In section 3, the user test to evaluate our idea is briefly described
and we round up with some conclusions based on our findings.
2 Body–Conforming Jacket With Tactile Actuators
The main design criteria for the emotion jacket were: the ability to stimulate the back and front of the human torso and the arms, battery-powered operation, smooth integration of electronics with the fabric for good aesthetics, good accessibility of the electronics, and light weight. The design aimed to enable projection of tactile patterns on the entire torso while keeping the electronic design simple.
This resulted in a jacket with 64 uniformly distributed actuators in a layout covering the entire torso with
roughly 15 cm distance between neighboring actuators (see Figure 1).
A stretchable fabric was chosen to create a tight fit. This ensured that the actuators were close enough
to the skin for the best tactile sensation possible. Small, medium, large, and extra large vests were built to
accommodate different sizes.
2.1 Electronics Design
For the actuators we opted for pancake-shaped (coin type) generic eccentric rotating-mass (ERM) motors because they were lightweight, thin, and inexpensive compared to other offerings. A disadvantage is that we were limited to vibrotactile stimulation only. The ERM motors were glued onto the back of custom-made PCBs that were connected to the segment-driver PCBs using thin flexible wires. Each segment-driver PCB controlled 4 motors.
The driver segments were daisy-chained to form a serial bus that started and terminated at a custom-made interface PCB. This PCB combined the SPI bus from the external USB-to-SPI interface with the
power supply line from the two AA batteries. The electronics design was a compromise between the number of cables, the number of drivers, and the current-carrying capacity of the (flexible) cabling.
The jacket was operated on 2 AA-sized batteries. With rechargeable batteries that each deliver 2500 mAh, the jacket had an operational lifetime of 1.5 hours when continuously driving 20 motors at the same time.
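The reported lifetime can be checked with a back-of-the-envelope calculation, ignoring driver and interface electronics overhead and assuming the two series-connected cells share one pack current:

```python
# Rough consistency check of the reported battery life (overheads ignored).
capacity_mah = 2500        # per AA cell; two cells in series share one current
lifetime_h = 1.5
active_motors = 20

total_current_ma = capacity_mah / lifetime_h             # ~1667 mA from the pack
current_per_motor_ma = total_current_ma / active_motors  # ~83 mA per running motor
```

Roughly 83 mA per running motor is in the range one would expect for small coin-type ERM motors, so the reported figure is plausible.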
2.2 Textile Integration
The jacket consisted of two layers: an outer lining and an inner lining. The PCBs were sewn onto the
inner lining using holes along their outer perimeter and were then covered by the outer lining for
protection and aesthetics. At the bottom of the torso and at the ends of the sleeves the linings were not
sewn together, for easy access to the electronic components. The photograph on the right of Figure 1
shows the jacket inside-out, exposing the PCBs and wires. The total weight of one vest, including
electronics and batteries, was approximately 700 grams.
2.3 Designing Tactile Stimuli
The actuators in the jacket were controlled from a PC using a LabVIEW™ (National Instruments, Austin,
TX, USA) software interface that was developed in-house. The system was principally designed to
support a 10 ms resolution for specifying changes in the tactile stimuli; in practice, we often reverted to a
20 ms resolution.
The LabVIEW application allowed us to generate tactile stimuli at various levels of granularity. First,
we created different types of shapes. Shapes have, in principle, an unlimited duration, and the amplitude
specified for each 10 ms step reflects the intensity of the vibration of the motors. Example shapes are sine
waves, block waves, sawtooth waves, etc. Thus, shapes define the vibration intensity over time and are
the building blocks for patterns.
Patterns specify at what point in time a particular motor has to render the given shape. An example
pattern is a series of sine wave shapes that run from the left wrist over the shoulder to the right wrist.
Patterns thus define the spatial and temporal distribution of vibrations over the torso.
Finally, these patterns were played back on the emotion jacket at predetermined times. For the present
study, most of the patterns were based on tactile sensations that are linked to common sayings, like having
butterflies in your stomach or having a shiver down one's spine.
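The shape/pattern hierarchy described above can be sketched as follows (a minimal Python illustration, not the in-house LabVIEW implementation; the 10 ms tick comes from the text, while the 0-255 intensity range and the motor ids are assumptions):

```python
import math

TICK_MS = 10  # the system's design resolution for stimulus changes

def sine_shape(duration_ms, freq_hz, peak=255):
    """A 'shape': vibration intensity per 10 ms step (0-255 range assumed)."""
    steps = duration_ms // TICK_MS
    return [int(peak * 0.5 * (1 - math.cos(2 * math.pi * freq_hz * i * TICK_MS / 1000)))
            for i in range(steps)]

def pattern(shape, motor_ids, offset_ms):
    """A 'pattern': (start_ms, motor, shape) triples -- the same shape started
    on successive actuators with a fixed delay, so the vibration travels."""
    return [(i * offset_ms, motor, shape) for i, motor in enumerate(motor_ids)]

# A 2 Hz sine burst travelling over four actuators (motor ids are made up),
# e.g. a wave running from one wrist toward the shoulder.
wave = pattern(sine_shape(duration_ms=500, freq_hz=2), [12, 13, 14, 15], offset_ms=100)
```

Separating intensity-over-time (shapes) from placement-over-the-body (patterns) is what lets the same sine burst be reused in many spatial arrangements.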
Figure 1. Outer lining (left) and jacket turned inside-out, showing
the inner lining with electronics and wires (right-hand panel). Red
(thicker) wiring is the serial bus that connects all segment drivers
and provides communication and power; white (thinner) wiring
connects motor PCBs to the respective segment driver.
Tactile Experiences 13
3 User Study
So, starting from James' idea that each emotion has a distinct bodily component [16-18], we tried to
recreate these bodily sensations to see whether they could be used to induce an emotion. We set up a user
test in which participants viewed movie clips that had been validated to elicit a certain emotional
response. We created tactile emotion patterns for each of these clips, on the one hand by drawing on
common sayings like shivers down one's spine, a racing heartbeat, exploding with anger, an arm around
your shoulders, or a sigh of relief, and on the other hand by creating patterns that were specific to
particular movies.
Fourteen participants (ages 24 to 58, 4 females) viewed each clip in two versions: first the
original version without tactile emotion patterns, and on a second viewing with the emotion patterns
projected onto their body. Participants wore the vest during both viewings. The presentation order
of the clips was randomized, although we made sure that each clip was first shown without tactile
patterns.
Before and after each movie clip, the participants self-rated their emotional state using the Self-
Assessment Manikin (SAM; ), a pictorial questionnaire that can be used to assess the
positivity/negativity, level of arousal, and level of dominance/potency of an emotion. We also used a
questionnaire that was employed in earlier work to determine immersion experiences in TV applications.
This questionnaire included elements of emotional experience and immersion . We expected that the
after-measurement would show indications of increased emotions (for instance, a higher score on the
arousal scale).
During viewing, psychophysiological responses related to emotional changes were recorded, such as
the electrodermal response, heart rate, skin temperature, and respiration [21-25]. Again, we expected that
these responses would show larger deflections (i.e., indications of stronger emotional experiences) when
participants were viewing the movie clips with the emotion patterns present.
3.1 Questionnaire Data
The SAM data did not show that participants reported more positive states for positive clips
when emotion patterns were present during viewing. Similarly, we did not find an effect of increased
emotional arousal during movie viewing with emotion patterns present. Because we used a 5-point SAM,
we presume that this absence of significant findings was due to the limited number of options for
indicating changes. That is, the effects of the additional tactile emotion patterns appeared to be rather
subtle and would have required a more detailed scale that enabled scoring of fine-grained differences.
For the immersion questionnaire, we obtained several significant effects indicating that
participants felt more involved in the movie viewing. Participants had more intense experiences, felt more
drawn into the movie scenes, and felt that all their senses were stimulated at the same time (see Table 1).
Note that, interestingly, a question from the immersion questionnaire that explicitly taxed the experience
of emotions did not reach the level of significance. We will return to this point in the discussion.
Table 1. Relevant questions from the immersion questionnaire  with average scores (on a 5-point scale)
for the actuation-absent and actuation-present conditions; * indicates a significant difference from the
actuation-absent condition.
• My experience was intense.
• I had a sense of being in the movie scenes.
• I felt that all my senses were stimulated at the same time.
14 Paul M.C. Lemmens et al.
3.2 Psychophysiological Responses
We observed significant effects on two of the three psychophysiological responses that we recorded. For
skin-conductivity, we found higher levels of conductivity (indicative of higher arousal) in the viewing
condition when the emotion patterns were present. Because this applied to all clips, it most likely
reflected a generic increase in arousal due to the tactile stimulation itself (i.e. ignoring the emotional
communication of the tactile stimulation).
More interesting was the statistically significant interaction showing that the increase in skin
conductivity due to the presence of emotion patterns was different for the various movie clips. Figure 2
shows that for three clips, we observed stronger increases in skin conductivity than for the other clips.
Interestingly, these clips were all intended to evoke negatively valenced emotions.
Figure 2. The effect of actuation on skin-conductivity level (left panel) and heart rate (right panel) for all clips. The
blue line is for actuation absent; the green line is for actuation present. On the x-axis, the clips are numbered and
ordered from left to right as: Braveheart, When Harry Met Sally, Jurassic Park III, The Lion King, My Bodyguard,
Silence of the Lambs, and Tom & Jerry. On the y-axis are the normalized physiological measurements.
For heart rate (Figure 2, right-hand panel), we observed a similar differentiation over movie clips of
the effect of the added tactile emotion patterns. In this case, a correlation with the valence of the clips for
which we observed significant changes in heart rate was not evident, because one clip evoked positive
emotions whereas another evoked negative emotions. Instead, it appeared as if the emotion patterns
in these clips replicated the increases in heart rate that the actors portrayed in the scenes.
4 Discussion
The user study that we carried out showed that adding emotion patterns, projected onto the torso using
tactile actuators, to movie clips results in a stronger, more emotionally immersive movie-viewing
experience. A few findings deserve attention.
First, we did not observe a change in emotional state using the SAM of Bradley and colleagues .
We have already noted that this may have been due to a response scale that was not sufficiently
granular to capture subtle changes in emotional state. However, a question from the immersion
questionnaire that specifically asked about emotional experience also did not change in a statistically
significant way. So did we change emotional state after all? In our view, we did, because much of the
subjective feedback from our participants that was not captured by the formal questionnaires indicated
that they truly felt stronger immersion in the movie content when the emotion patterns were present. This
is corroborated by the psychophysiological data. Regarding subjective feedback, then, it is apparent that
the questionnaires we chose were not optimal in terms of granularity. The subtlety of our effects may
require finer granularity than the questionnaires could deliver. In follow-up studies it may even be
necessary to select questionnaires other than those employed here.
The other point for discussion is that the psychophysiological recordings show that the emotion
patterns, on average, had the strongest effects for negatively valenced clips. This may be a side effect of
the type of tactile actuator that we chose. As already mentioned, we were limited to vibrotactile
stimulation due to our choice of eccentric rotating-mass motors. It is exactly this type of tactile
stimulation that is often used for alerting functions, for instance in cell phones. Therefore, it may be that
these types of actuators are better suited for negatively valenced emotions because these emotions have an
alerting and arousing function. It is clear that we need to study the efficacy of individual emotion patterns
and that further detailed study of the design of the emotion patterns is needed to confirm or contradict this
hypothesis.
On a higher level, we conclude that our findings highlight that it is possible to convey emotional
communication via relatively simple tactile technology. This corroborates the findings of Hertenstein and
colleagues on the tactile communication of emotional messages [2,15]. Our work shows that common
sayings regarding the bodily effects of emotions can actually be used to generate tactile stimuli that can
trigger (or at least enhance) emotions. In our case, we still require other content to set an emotional
context. However, Hertenstein's finding that people can accurately decode a touch as an intended
emotional communication shows that, in principle, our tactile stimuli could evoke emotions without a
surrounding emotional context.
To conclude, by exploiting relatively simple and cheap tactile technology, we have been able to evoke
strong psychological effects that show in increased emotional immersion during movie viewing. This
finding shows that it is feasible to tactilely enhance the emotional experience of movie clips. This is a
very useful addition to enhancing movie special effects with tactile stimulation, enabling perceptually and
emotionally rich movie-viewing experiences. Perhaps it is not without good reason that we say "I was
touched" after having experienced a powerful emotion!
Acknowledgements. The authors would like to thank two anonymous internal reviewers for their
comments that helped to improve this manuscript.
References
 A. Haans and W.A. IJsselsteijn, "Mediated Social Touch: A Review of Current Research and
Future Directions," Virtual Reality, vol. 9, pp. 149-159, 2006.
 M.J. Hertenstein, D. Keltner, B. App, B.A. Bulleit, and A.R. Jaskolka, “Touch Communicates
Distinct Emotions,” Emotion, vol. 6, pp. 528-533, 2006.
 M.J. Hertenstein, “Touch: Its Communicative Functions in Infancy,” Human Development, vol.
45, pp. 70-94, 2002.
 J.B.F. Van Erp, “Tactile displays for navigation and orientation: perception and behavior,” Ph.D.-
thesis, Utrecht University, 2007.
 R.W. Lindeman, Y. Yanagida, H. Noma, and K. Hosaka, “Wearable Vibrotactile Systems for
Virtual Contact and Information Display,” Virtual Reality, vol. 9, pp. 203-213, 2006.
 K. Kaczmarek, J. Webster, P. Bach-y-Rita, and W. Tompkins, “Electrotactile and vibrotactile
displays for sensory substitution systems,” IEEE Transactions on Biomedical Engineering, vol. 38,
pp. 1-16, 1991.
 “Tactile Gaming Vest - PC Gaming Hardware & Accessories with Physical 3D,”
 A. Israr and I. Poupyrev, "Exploring Surround Haptics Displays," Proceedings of the 28th
International Conference Extended Abstracts on Human Factors in Computing Systems (CHI EA
'10), Atlanta, Georgia, USA, pp. 4171-4176, 2010.
 K. Tollmar, S. Junestrand, and O. Torgny, "Virtually living together," Proceedings of the
Conference on Designing Interactive Systems (DIS '00), New York City, New York, USA,
pp. 83-91, 2000.
 A. Chang, S. O'Modhrain, R. Jacob, E. Gunther, and H. Ishii, "ComTouch," Proceedings of the
Conference on Designing Interactive Systems (DIS '02), London, England, p. 312, 2002.
 K. Dobson, D. boyd, W. Ju, J. Donath, and H. Ishii, "Creating visceral personal and social
interactions in mediated spaces," CHI '01 Extended Abstracts on Human Factors in Computing
Systems, Seattle, Washington, p. 151, 2001.
 R. Hansson and T. Skog, "The LoveBomb: Encouraging the Communication of Emotions in Public
Spaces," CHI '01 Extended Abstracts on Human Factors in Computing Systems, New York, NY,
USA: ACM, pp. 433-434, 2001.
 H.S.M. Cramer, “People's Responses To Autonomous And Adaptive Systems,” Ph.D.-thesis,
Universiteit van Amsterdam, 2010.
 M. Karam, C. Branje, G. Nespoli, N. Thompson, F.A. Russo, and D.I. Fels, "The Emoti-Chair: An
Interactive Tactile Music Exhibit," Proceedings of the 28th International Conference Extended
Abstracts on Human Factors in Computing Systems (CHI EA '10), Atlanta, Georgia, USA,
pp. 3069-3074, 2010.
 M.J. Hertenstein, R. Holmes, M. McCullough, and D. Keltner, “The communication of emotion via
touch,” Emotion, vol. 9, pp. 566-573, 2009.
 W. James, “What is an Emotion?,” Mind, vol. 9, pp. 188-205, 1884.
 J.J. Prinz, Gut Reactions: A Perceptual Theory of Emotion, Oxford University Press, 2004.
 J.J. Prinz, “Are Emotions Feelings?,” Journal of Consciousness Studies, vol. 12, pp. 9-25, 2005.
 M.M. Bradley and P.J. Lang, "Measuring emotion: the Self-Assessment Manikin and the Semantic
Differential," Journal of Behavior Therapy and Experimental Psychiatry, vol. 25, pp. 49-59, Mar.
1994.
 R.J.E. Rajae-Joordens, “Measuring Experiences in Gaming and TV Applications - Investigating the
Added Value of a Multi-View Autostereoscopic 3D Display,” Probing Experience - From
Assessment of User Emotions and Behaviour to Development of Products, J.W.D. Westerink, M.
Ouwerkerk, T.J.M. Overbeek, W.F. Pasveer, and B. De Ruyter, eds., pp. 77-90, 2008.
 M.M. Bradley, "Emotion and Motivation," Handbook of Psychophysiology, J.T. Cacioppo, L.G.
Tassinary, and G.G. Berntson, eds., Cambridge University Press, pp. 602-642, 2000.
 J.T. Cacioppo, G.G. Berntson, D.J. Klein, and K.M. Poehlmann, "The Psychophysiology of
Emotion Across the Lifespan," Annual Review of Gerontology and Geriatrics, vol. 17, pp. 27-74,
1997.
 P. Gomez, W. Stahel, and B. Danuser, “Respiratory Responses During Affective Picture Viewing,”
Biological Psychology, vol. 67, pp. 359-373, 2004.
 A. Kistler, C. Mariauzouls, and K. Von Berlepsch, “Fingertip Temperature as an Indicator For
Sympathetic Responses,” International Journal of Psychophysiology, vol. 29, pp. 35-41, 1998.
 Task Force of the European Society of Cardiology and the North American Society of Pacing and
Electrophysiology, "Heart Rate Variability: Standards of Measurement, Physiological
Interpretation, and Clinical Use," Circulation, vol. 93, pp. 1043-1065, 1996.
Multi-Haptics and Personalized Tactile Feedback
on Interactive Surfaces
Hendrik Richter
University of Munich, Amalienstr. 17, 80333 Munich, Germany
Tactile feedback on interactive surfaces such as touch screens provides significant benefits in terms of
reduced error rates, enhanced interaction speed, and minimized visual distraction . However, current
digital multitouch surfaces do not present any tactile information to the interacting user. Existing scientific
approaches to providing tactile feedback on direct-touch surfaces share the assumption that haptic feedback
has to be given at the location of the interaction. We propose spatially disuniting the body part of
interaction (finger, hand) and the resulting tactile feedback. This approach is potentially beneficial for
providing multi-haptic feedback on multi-touch surfaces and for communicating additional
personalized information via the sense of touch. In this paper, we describe the potential benefits of our
approach, present our first three prototypes, and discuss our current findings and intended future work.
Figure 1: First prototypes (from left to right):
Tactile Thimble, Haptic Armrest, Edge Matrix
1 Introduction
Research on tactile feedback for direct-touch surfaces can be categorized as follows. First, mobile
actuator systems like those used by Kaaresoja et al.  or Koskinen et al.  move the mobile device or
the device's screen as a whole using motors or piezoelectric actuators. With this approach, only a single
touch input can be augmented haptically. Second, shape displays such as FEELEX  or Lumen  are
based on segmenting the interactive surface into individually movable 'haptic pixels'. Currently,
these systems provide only a small number of actuated points due to mechanical constraints.
The approaches mentioned above are based on the assumption that tactile feedback associated with an
interaction should be applied to the interacting body part. Imagine touching a button on a tactile
touchscreen with the tip of your index finger. As a result, the whole screen would vibrate, causing a
repeated deformation of the skin of the finger touching the screen. The body part of interaction (the skin
on your fingertip) is the location of tactile feedback. Only solitary user inputs can be augmented
haptically; the whole screen or device moves as a single 'haptic pixel'. Or imagine touching an interactive
shape display (like Relief  or Lumen, mentioned above) with your hands. The moving pins stimulate the
mechanoreceptors in the skin of the hand resting on the device. Again, the body part of interaction (the
skin on the palm of your hand) is the location of tactile feedback. The location of input and visual
feedback on the device coincides with the location of haptic feedback. As a consequence, the visual
output has to be mapped to the reduced resolution of the pin array.
In contrast, our approach is to disunite the body part of interaction (finger, hand) and the
resulting tactile feedback. In other words: while the user explores a virtual element on the interactive
surface with his finger or hand, the resulting haptic stimuli are applied somewhere else on the body.
Decisive for the position of application are human physiological conditions, the character of the tactile
information to be conveyed, and the nature of the haptic interface used. We assume that our approach is
beneficial for multi-haptic feedback on multi-touch surfaces, for the accuracy of interaction, and for
conveying additional personalized information via the sense of touch.
In order to investigate the potential of our approach, we developed three first prototypes: the Tactile
Thimble, the Haptic Armrest, and the Edge Matrix (Section 5). The prototypes differ in construction,
body location of application, and type of conveyed tactile information. They have one approach in
common: tactile output (interactive surface → user) is spatially separated from manual input (user →
interactive surface).
In the following, we describe the possible outcomes of our approach, present the three prototypes in
detail, and discuss our current findings and planned future work.
2 Tactile Feedback on Interactive Surfaces
Interactive surfaces exist in various types, from small resistive single-touch screens (e.g., the one used in
mobile devices like the Nintendo DS) through capacitive screens (e.g., used in the Apple iPad) to large
interactive tables  or walls . The term 'interactive surface' as used in this paper refers to human-
computer interfaces with the following characteristics :
• Combination of manual input and visual output: In contrast to user interfaces like mice or touch-
pads, the sensor area of input and the display area are congruent. Accordingly, positioning is
absolute; the user's finger or hand is the tracking symbol.
• Manipulation of interactive elements using fingers or hands
• Direct interaction with interactive elements
• Multiple points of sensed contact: This way it is possible for multiple users to work on a shared
interactive surface. Also gestures can be defined and interpreted as input.
During interaction with most current touch-sensitive displays, touching equals activation.
Tactile exploration on the device is not possible; tracking is done visually by the user while the finger is
still "in the air". Constant visual attention is required during this targeting/tracking phase. Tactile
feedback during exploration and manipulation provides benefits: as stated above, tactile feedback during
the interaction with touch-sensitive surfaces helps to reduce errors, enhances interaction speed,
and reduces the required visual attention. This applies especially to situations or environments with
increased cognitive or visual workload for the user  .
3 Related Work
Touch screens are used in a variety of mobile devices today. Mobile phones or portable music players
often rely on vibrations to provide users with tactile feedback using built-in vibration motors. Vibrotactile
feedback could help the user to know where he is on the screen and could support him in tasks like the
selection of list items or during text entry. Poupyrev et al. used a single vibrotactile actuator to convey
information. Different vibration patterns were used to communicate scrolling rate or position on the
screen to the user. Selection of list items was 22% faster than when no tactile feedback was provided .
20 Hendrik Richter
Mobile actuator systems like the one used by Kaaresoja et al.  or Koskinen et al.  move the mobile
device or the device’s screen as a whole using motors or piezoelectric actuators.
Another approach to providing the user with tactile information about interactive elements is the use of
tactile displays. Existing electronic Braille displays for accessibility are limited to text-based information.
Graphic tactile displays allow images to be perceived by the sense of touch on a reusable surface and
enable substitution of the visual/auditory sense. Tactile substitution can be used to augment accessibility
for the blind or deaf in order to (a) enhance access to computer graphical user interfaces and (b) enhance
mobility in controlled environments . Shape displays bring three-dimensionality to a table surface by
using a matrix of individually movable elements. We already mentioned shape displays that combine
input and output   . However, the resolution of haptic displays is still limited due to mechanical
constraints.
A third possibility of bringing tactile feedback to the interactive surface is the application of tangible
interfaces that are placed atop the table. On the one hand, this type of interface can be used as input
device for the interaction with virtual elements on the surface. On the other hand, it works as output
device to convey tactile information about the virtual objects underneath. An example for this approach is
the Haptic Tabletop Puck by Marquardt et al. . The prototype allows the user to explore vertical
relief, malleability of materials, and horizontal friction of objects on the interactive surface. The tactile
information is generated by a mechanically activated rod that moves vertically within a column to reach
different heights above the table. Active tactile exploration of object characteristics and interaction with
elements is enabled by a sensor on top of the device that detects the amount of pressure being applied to
its top. The point of input is defined by a virtual arrow that starts under the device; the tracking symbol is
the end of this arrow. This approach is inspiring, but it also has some drawbacks: interaction using
multiple fingers or even gestures is inhibited, and the interaction is not direct; the Haptic Tabletop Puck
works as a mechanical separator between the hand and the interactive elements on the surface.
We propose a different approach that tries to avoid these drawbacks.
4 Spatially Disuniting Interaction and Tactile Feedback
Our research is based on the approach of moving the actively generated tactile feedback away from the
interacting user's hand. The user works with virtual elements on a multi-touch surface and perceives
edges, areas, or characteristics of these elements via tactile stimuli applied to his body. The interaction
with virtual elements is still direct. In a multi-user environment, such as collaborative work at a tabletop,
additional tactile feedback could improve the accuracy during targeting tasks and the
interaction with small objects . Another possible goal we plan to examine is the exploration of distal
objects on the interactive surface using tactile information. We are particularly investigating possible
consequences such as Multi-Haptics (see Section 4.1) and Personalized Tactile Feedback (see Section 4.2).
4.1 Multi-Haptics
Using the stated approach, the first matter of our scientific interest is how to provide haptic feedback on
multi-touch surfaces. In other words: the simultaneous transmission of tactile cues about a virtual object’s
geometrical, functional and semantic characteristics to more than one finger or hand. We refer to this goal
as Multi-Haptics. Accordingly, the area of tactile stimulation is not limited to the size of a fingertip;
haptically enlarging the tactile resolution of multiple contacts with the interactive surface becomes possible.
Imagine the user exploring an interactive surface with both hands. The location, form, or surface
characteristics of the virtual objects under the user's fingers are conveyed to the user's body (e.g., his
underarm). The user is free to use both hands and to interact directly with the surface and the depicted
objects.
4.2 Personalized Tactile Feedback
In the next step, we plan to investigate the personalized transmission of tactile cues to multiple users of a
shared interactive display using the aforementioned approach (Personalized Tactile Feedback). This
means that one user is enabled to perceive tactile characteristics of an interactive element but another user
of the shared touch-surface perceives different sensory cues (or none at all) during the tactile exploration
of the same virtual object. One benefit would be the creation of an individual channel of tactile
information to each user of a shared interactive surface. We plan to investigate how this channel can be
used to carry private information. Imagine two people working on a shared interactive surface; one person
is touching a virtual object that has a soft surface. When the other person touches the same virtual
object, it could "feel" completely different, e.g., it could present the object's state using tactons
. The tactile feedback is personalized. One could also think of invisible objects on interactive surfaces
that could be tactilely sensed only by some users.
5 First Prototypes
In order to advance our research on the spatial separation of tactile feedback and body part of interaction,
we designed and implemented our first three prototypes of tactile interfaces (see Figure 1). All three
prototypes stimulate the skin, i.e., are related to cutaneous touch. Our tactile interfaces are concerned with
the mechanical deformation of the skin and leave out pain and temperature sensations. Pasquero 
names basic engineering attributes for the coding of artificial perceptual information (e.g., amplitude,
frequency, duration, resolution, and signal waveform). An extra dimension is the locus of interaction. Our
three prototypes provide the user with tactile signals that differ in the aforementioned attributes. The
design of the prototypes and first results of our studies are presented in the following.
5.1 Tactile Thimble
This prototype applies tactile information to the user's interacting finger or hand (see Figure 2). Using
off-the-shelf components such as cylindrical vibration motors, tactile location cues about close-by virtual
objects are conveyed. A direct and private human-computer connection based on the perceived stimuli
can be established.
Figure 2: The Tactile Thimble
The Tactile Thimble is a haptic interface that consists of a textile glove and three attached cylindrical
vibration motors. The locations of the three actuators (forefront of fingernail on thumb, forefront of
fingernail on index-finger and outside section of the palm) are based on physiological conditions of
mechanoreceptors in the glabrous skin of the human hand  . The glove's position and orientation
are tracked optically using a camera. A force sensor is placed under the tip of the user's index finger, so the
user is able to "press" a virtual button by placing his finger on it. The user who is wearing the glove
may now be asked to "find" an invisible virtual object on a 2D surface. This exploration is based on the
tactile information provided by the actuators. For example, if the virtual object is positioned to the right
of the hand, the actuator on the right side of the glove may convey a signal. This concept is based on
tactile way-finders such as .
The Tactile Thimble follows the principle of spatially separating the body location of interaction and the
location of tactile feedback. The tactile cues are applied to the same body half, to the interacting hand, but
not to the index finger's skin that is in contact with the interactive surface. The Tactile Thimble is used to
analyze the possibilities of supporting the process of tactile exploration using abstract tactile messages as
described in . In order to examine the potential of abstract cues for location information, we reduced
the number of tactile actuators to three. Information like "move your hand backwards" is encoded using
signal attributes such as duration or frequency instead of the location of the actuator.
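One way such an encoding could look is sketched below: which of the three actuators fires carries a coarse direction cue, while a signal attribute (pulse duration here) carries distance. The mapping and thresholds are illustrative assumptions, not the authors' actual scheme; only the actuator placements come from the text.

```python
# Hypothetical mapping from a virtual object's offset to a cue on one of the
# glove's three actuators (thumb, index finger, palm placements from the text).
# Direction -> choice of actuator; distance -> pulse duration (made-up values).
ACTUATORS = {"thumb": 0, "index": 1, "palm": 2}

def location_cue(dx_cm, dy_cm, dist_cm):
    """Return (actuator, pulse_ms) for an object at offset (dx, dy) from the hand."""
    if abs(dx_cm) >= abs(dy_cm):
        # Left/right dominates: signal on the palm-side or thumb-side actuator.
        actuator = ACTUATORS["palm"] if dx_cm > 0 else ACTUATORS["thumb"]
    else:
        actuator = ACTUATORS["index"]        # forward/backward cue
    pulse_ms = 50 if dist_cm < 5 else 200    # shorter pulses when close to the object
    return actuator, pulse_ms
```

With only three actuators, most of the information load falls on the signal attributes rather than on actuator location, which is exactly the trade-off the prototype is meant to explore.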
5.1.3 Intended Research
With the Tactile Thimble, it is possible to perceive distant objects on an interactive surface using the
sense of touch. In a pretest, we examined how our participants search for objects on a surface using only
their sense of touch. We observed two search strategies. At the moment, we are trying to transfer these
findings to the prototype in order to improve searching speed and object discrimination. Another focus of
our ongoing studies lies in the design of the tactile signals that are conveyed to the user. The user has to
be able to distinguish between states like "on the object", "object to the right", or "object too far away".
In the future, we plan to further examine the composition of vibrotactile signals to convey information
about distance or location of an object. At the application level, we intend to use more than one device in
order to study user-experience and potential benefits.
5.2 Haptic Armrest
While the user's dominant hand is interacting with virtual objects on an interactive surface, the non-
dominant hand is placed on this prototype (see Figure 3). Again, the locations of exploration and of the
resulting sensory feedback are apart. Geometrical object characteristics such as edges are conveyed using
resulting sensory feedback are apart. Geometrical object-characteristics such as edges are conveyed using
the movement of linear solenoid magnets. Additionally, tactile properties and functional or semantic
characteristics of the virtual object are applied using multiple vibration motors. Based on these multiple
motors, an additional carrier of abstract information is enabled by spatial encoding.
Figure 3: The Haptic Armrest. Left: two individually movable platforms and vibration
actuators that are positioned under each fingertip. Right: the non-interacting hand is
resting upon the actuators
The Haptic Armrest is a tactile interface that consists of a wooden board on which the user can rest the
non-interacting hand.
Multi-Haptics and Personalized Tactile Feedback on Interactive Surfaces 23
The interface contains two types of actuators: two solenoid magnets are used to move two individual
platforms (~1 cm stroke). The little finger and ring finger of the user’s hand rest on the first platform;
the middle finger and index finger are placed on the second platform. Vibration motors are attached to
the two platforms so that one is positioned under each fingertip. Following this approach,
the Haptic Armrest combines properties of vibrotactile interfaces and shape displays. In other words,
tactile information can be encoded using amplitude, frequency, duration or location of the signal, but also
by actual displacement of the user’s fingers.
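The two encoding channels can be illustrated with a minimal dispatch function; the event names and parameter values are hypothetical, chosen only to show how geometric characteristics could map to displacement and tactile or semantic characteristics to vibration.

```python
# Illustrative sketch of the Haptic Armrest's two feedback channels:
# finger displacement via the solenoid-driven platforms (shape display)
# and vibrotactile cues via the per-fingertip motors. Event names and
# parameter values are hypothetical, for illustration only.

def render_event(event):
    """Map a virtual-object event to an actuator command."""
    if event == "edge":
        # Geometric characteristic: push the first platform up.
        return {"channel": "platform", "platform": 0, "raised": True}
    if event == "texture":
        # Tactile/semantic characteristic: vibrate under the index finger.
        return {"channel": "vibration", "fingertip": "index",
                "frequency": 180, "duration": 120}
    raise ValueError(f"unknown event: {event}")
```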
The Haptic Armrest applies tactile information to the non-dominant hand of an interacting user (see
Figure 4). The tactile feedback can be described as being “mirrored” to the other side of the user’s body.
Haptic information is applied to a body area significantly larger than the area that is in contact with
the interactive surface; an enrichment of the cutaneous experience is therefore possible. Edges of virtual
elements can be presented as pushes of the platforms, and the characteristics of virtual objects can be
conveyed using vibrotactile cues. Again, the attributes used for encoding are signal duration, amplitude,
frequency and location.
5.2.3 Intended Research
This prototype is used to assess the importance of haptically perceivable edges and areas for object
identification. An edge can be reduced to binary information (up/down) using the platforms; the
orientation and course of an object’s edge cannot be perceived. At the moment, we are analyzing how the
sense of touch is used to discriminate between virtual interactive elements that have the same visual
appearance (see Figure 4). We are interested in the importance of object edges and the possibilities to
describe these edges using actively generated tactile feedback.
Figure 4: Usage of the Haptic Armrest
in a first user-study (video still)
24 Hendrik Richter
5.3 Edge Matrix
This prototype is a 3×3 matrix of linear solenoids (see Figure 5). Again, this haptic interface
reacts to interactions of the user’s hand on an interactive surface. A virtual object’s orientation and
rotation are conveyed to the non-interacting hand or arm that rests on top of the matrix.
Figure 5: The Edge Matrix is a basic haptic display
The Edge Matrix is an electromagnetic haptic display that can vibrate, for active feedback, as well as
dynamically change its shape, for passive feedback. It consists of a matrix of electromagnetic actuators
with individually controllable pins that can be pushed upwards. Since the solenoids can be activated
individually, the nine pins can be moved up and down (9 mm stroke). Additionally, it is possible
to convey vibrotactile cues by repeatedly activating and deactivating single pins, and certain degrees of
surface hardness can be simulated by changing the applied voltage. With this prototype, it is possible to
communicate shape information of objects on the interactive surface to other parts of the user’s body. The
device is designed so that the matrix can be enlarged by adding solenoid actuators, and its housing enables
the researcher to adapt the stroke of the pins as well as the pressing force to the intended research question.
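A controller for such a pin matrix can be sketched as follows; the pattern representation and method names are assumptions for illustration, and a real device would replace the state array with solenoid driver calls.

```python
# Minimal sketch of driving a 3x3 pin matrix like the Edge Matrix.
# The state-array representation and method names are illustrative
# assumptions, not the prototype's actual control software.

class EdgeMatrix:
    SIZE = 3

    def __init__(self):
        # 1 = pin raised, 0 = pin lowered
        self.pins = [[0] * self.SIZE for _ in range(self.SIZE)]

    def show_edge(self, orientation):
        """Raise a line of pins to render an edge (passive feedback)."""
        self.pins = [[0] * self.SIZE for _ in range(self.SIZE)]
        if orientation == "horizontal":
            self.pins[1] = [1, 1, 1]
        elif orientation == "vertical":
            for row in self.pins:
                row[1] = 1
        elif orientation == "diagonal":
            for i in range(self.SIZE):
                self.pins[i][i] = 1
        else:
            raise ValueError(f"unknown orientation: {orientation}")

    def vibrate(self, row, col, cycles=3):
        """Emit a vibrotactile cue (active feedback) by toggling one pin."""
        states = []
        for _ in range(cycles):
            self.pins[row][col] = 1
            states.append(self.pins[row][col])
            self.pins[row][col] = 0
        return states
```

The same nine pins thus serve both passive feedback (a static raised line for an edge) and active feedback (toggling a single pin as a vibrotactile cue).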
Again, this tactile interface dislocates tactile feedback to non-interacting body parts. The body location of
application is not fixed for this interface: it may be positioned to convey tactile cues to the interacting arm
(e.g. the skin of the underarm), an approach similar to the Tactile Thimble, or the cues may be applied to
the non-interacting hand, as with the Haptic Armrest.
The actuator technology combines attributes of vibrotactile interfaces and shape displays. Thus,
information can be encoded by variations in the frequency, duration or amplitude of vibrotactile signals;
additionally, shape information can be communicated by the individually movable pins. As a result,
the possible bandwidth of conveyed information is higher than that of the other prototypes. For example,
the orientation and course of an edge can be conveyed by passive haptic feedback, while an object’s
surface characteristics or abstract state can be conveyed using active haptic feedback.
5.3.3 Intended Research
In the future, we plan to assess where to place the tactile display on the user’s body in order to convey
tactile object characteristics. Along with that, we are experimenting with the size and form of the device.
Similar to the Tactile Thimble, more than one tactile display could be placed on the user’s body.
This prototype also places requirements on the tracking technology: in order to analyze the relevance of
the haptically perceivable orientation and course of virtual elements, it is necessary to track the orientation
and moving direction of the user’s finger. We are currently working on solutions involving optical and
sensor-based tracking technologies.
6 Conclusion and Future Work
This paper describes our approach to bringing tactile feedback to an interactive surface. We analyze the
potential of spatially separating the body part of interaction (finger, hand) from the resulting tactile feedback.
Three first prototypes of haptic interfaces are described in this paper: the Tactile Thimble, the Haptic
Armrest and the Edge Matrix. The Tactile Thimble is a glove that communicates the proximity and
position of distant virtual elements on a touch-sensitive surface. The information is encoded using
vibrotactile signals. The Haptic Armrest is a haptic interface that is used to convey tactile information to
the non-interacting hand. This prototype combines properties of vibrotactile interfaces and shape displays.
Our third prototype, the Edge Matrix, is a basic pin array. We plan to use this interface to convey tactile
characteristics and spatial information (orientation, height) of virtual edges on touch-sensitive surfaces to
the non-interacting hand or arm.
Our research is based on the assumption that the separation of manual input and tactile stimuli
provides a number of potential benefits. Using our approach of providing the user with tactile feedback,
the interaction with virtual objects on the interactive surface remains direct. Multi-touch surfaces could be
explored using multiple hands; every hand initiates a distinct transfer of information to the sense of touch.
We refer to this goal as Multi-Haptics. On a shared interactive surface, our approach could result in
personalized tactile feedback. The human sense of touch could be used as a channel carrying private
information.
Our research is at an initial stage; our prototypes are currently being evaluated in user studies. We are
improving our methods to measure the perception and causal connection of spatially separated input and
feedback. In the future, we intend to improve our understanding of the perception of haptics on tabletops
through more formal evaluations. We are constantly enhancing our tracking technologies in order to
identify the orientation of a finger or hand on the table. Another important part of our research is the design
of the interfaces: we are experimenting with how and where to apply the tactile cues on the user’s body and
with how the form factor of the interfaces can affect the user experience. At the application level, we assume
that our approach is particularly beneficial in collaborative and dynamic scenarios.
Acknowledgements
The author would like to thank Hüsnü Güney, Doris Hausen, Kadri Januzaj and Sebastian Löhmann for
technical support and valuable feedback.
References
R. Leung, K. MacLean, M. Bertelsen, and M. Saubhasik, "Evaluation of haptically augmented
touchscreen GUI elements under cognitive load," Proceedings of the 9th International Conference
on Multimodal Interfaces, ACM, 2007, pp. 374–381.
 T. Kaaresoja, L. Brown, and J. Linjama, "Snap-Crackle-Pop: Tactile feedback for mobile touch
screens," Proc Eurohaptics, Citeseer, 2006, p. 565–566.
 E. Koskinen, T. Kaaresoja, and P. Laitinen, "Feel-Good Touch: Finding the Most Pleasant
Tactile Feedback for a Mobile Touch Screen Button," Methodology, 2008, pp. 297-304.
 H. Iwata, H. Yano, F. Nakaizumi, and R. Kawamura, "Project FEELEX: adding haptic surface to
graphics," Proceedings of the 28th annual conference on Computer graphics and interactive
techniques, 2001, p. 476.
 I. Poupyrev, T. Nashida, S. Maruyama, J. Rekimoto, and Y. Yamaji, "Lumen: interactive visual
and shape display for calm computing," ACM SIGGRAPH 2004 Emerging technologies, 2004, p.
 D. Leithinger and H. Ishii, Relief, New York, New York, USA: ACM Press, 2010.
 H. Benko, M. Morris, A. Brush, and A. Wilson, "Insights on Interactive Tabletops: A Survey of
Researchers and Developers," research.microsoft.com, 2009.
 G. Morrison, "Interactive wall displays: interaction techniques and commercial applications,"
SIGGRAPH '07: ACM SIGGRAPH 2007 courses, ACM, 2007, pp. 54-64.
 K. Ryall, C. Forlines, M. Morris, and K. Everitt, Experiences with and Observations of Direct-
Touch Tabletops, IEEE, 2003.
 S. Brewster, F. Chohan, and L. Brown, "Tactile feedback for mobile interactions," Proceedings
of the SIGCHI conference on Human factors in computing systems, 2007, p. 162.
 I. Poupyrev, S. Maruyama, and J. Rekimoto, "Ambient touch: designing tactile interfaces for
handheld devices," Proceedings of the 15th annual ACM symposium on User interface software
and technology, ACM, 2002, p. 60.
 V. Chouvardas, A. Miliou, and M. Hatalis, "Tactile displays: a short overview and recent
developments," ICTA ‘05: Proceedings of Fifth International Conference on Technology and
Automation, 2005, pp. 246-251.
 N. Marquardt, M. Nacenta, J. Young, S. Carpendale, S. Greenberg, and E. Sharlin, "The Haptic
Tabletop Puck: Tactile Feedback for Interactive Tabletops," ACM Interactive Tabletops and
Surfaces - ITS'09. (Banff, Canada), 2009, pp. 93-100.
 S. Brewster and L. Brown, "Tactons: structured tactile messages for non-visual information
display," Proceedings of the fifth conference on Australasian user interface-Volume 28, 2004, p.
 J. Pasquero, "Survey on communication through touch," Center for Intelligent Machines-McGill
University, Tech. Rep. TR-CIM, vol. 6, 2006.
 H. Oey and V. Mellert, "Vibration thresholds and equal vibration levels at the human fingertip
and palm," 5th International Congress on Acoustics, 2004.
 R.S. Johansson, "Tactile sensibility in the human hand: receptive field characteristics of
mechanoreceptive units in the glabrous skin," J. Physiol., vol. 281, 1978, pp. 101-123.
 W. Heuten, N. Henze, S. Boll, and M. Pielot, "Tactile wayfinder," Proceedings of the 5th Nordic
conference on Human-computer interaction building bridges - NordiCHI '08, 2008, p. 172.
Assessing audiotactile interactions:
Spatiotemporal factors and role of visual experience
Valeria Occelli
Department of Cognitive and Education Science, University of Trento,
Corso Bettini 31, Rovereto (Trento)
In the present paper, a brief overview of the results obtained in recent studies investigating audiotactile
sensory interactions is provided. In particular, data emerging from studies on either monkeys or – brain-
damaged and neurologically-intact – humans will be described, showing how the relative spatial position
of the stimuli, the portion of space stimulated and the presence/absence of visual cues affect the sensory
interplay between hearing and touch.
1 Introduction
Humans continuously interact with an environment that provides a large amount of sensory
information. The process by which the human nervous system tends to merge the available
pieces of information into unitary events is commonly known as ‘multisensory integration’ .
The pioneering contribution to the understanding of the neural correlates of this process is owed
to the studies in the superior colliculus performed by Stein and Meredith . This midbrain structure is
characterized by a high proportion of neurons responding to stimuli from more than a single sense (i.e.,
multisensory neurons). Multisensory integration is commonly assessed by considering the effectiveness
of a crossmodal stimulus combination, relative to that of its component stimuli, in evoking a response
from the organism: integration is inferred when the crossmodal combination evokes a number of impulses
significantly higher than the number evoked by the most effective of these stimuli presented individually.
Based on the study of such neurons’ response properties, some principles for sensory
integration have been formulated. The spatial principle is based on the evidence that only stimuli in
spatial register (likely originating from the same external source), fall within the overlap between
receptive fields of different sensory modalities, thus inducing an enhanced response; on the contrary,
stimuli from disparate locations fall outside this area, failing to induce any enhancement or even causing a
response depression. Moreover, only stimuli which occur close in time cause response enhancement,
whereas stimuli separated in time merely induce responses comparable to those evoked by unisensory
stimuli (temporal rule). As can be inferred, both spatial and temporal factors are of great importance for
the assessment of multisensory interactions.
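These two rules, together with the standard index of multisensory enhancement (the crossmodal response expressed relative to the most effective unisensory response), can be summarized in a short sketch; the receptive-field overlap and temporal-window values are illustrative assumptions, not fixed physiological constants.

```python
def multisensory_enhancement(crossmodal_impulses, unisensory_impulses):
    """Percent enhancement relative to the most effective unisensory
    response, following the common convention in the superior colliculus
    literature: ME = (CM - max_unisensory) / max_unisensory * 100."""
    best = max(unisensory_impulses)
    return (crossmodal_impulses - best) / best * 100.0

def integration_expected(spatial_disparity_deg, temporal_gap_ms,
                         rf_overlap_deg=20.0, temporal_window_ms=100.0):
    """Apply the spatial and temporal rules: enhancement is expected only
    when stimuli fall within overlapping receptive fields and occur close
    in time. The window sizes here are illustrative assumptions."""
    return (spatial_disparity_deg <= rf_overlap_deg
            and temporal_gap_ms <= temporal_window_ms)
```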
Unlike audiovisual and visuotactile sensory pairings, the interactions occurring at both neuronal
and behavioural level between hearing and touch have been much less explored [3, 4]. This is somewhat
surprising, considering the wide range of everyday-life situations in which we can experience – even
though often in subtle and unconscious ways – the interplay occurring between these two sensory
modalities : perceiving the buzzing and the itchy sensation of an insect approaching the back of
our neck, or reaching for a mobile phone ringing and vibrating in our trouser pocket. All these situations
have in common the exclusive – or predominant – reliance on cues provided by senses other than vision.
Beyond these anecdotal reports, empirical evidence further supports the existence of correlations
between hearing and touch, thus justifying and corroborating additional investigations of this topic.
2 Experimental evidence
2.1 Studies on monkeys
Since the body is directly involved in the emergence of tactile perceptual sensations, it
follows that interactions between audition and touch are stronger within the space close to the body, the
portion of space commonly known as ‘peripersonal space’ . This has been shown in monkeys ,
in brain-damaged patients  and in neurologically intact humans .
For instance, Graziano and his colleagues [7, 10], have documented the existence of a population
of trimodal neurons (i.e., neurons that respond to tactile, visual, and auditory stimuli) in the ventral
premotor cortex (PMv, or ‘polysensory zone’, PZ). These neurons have receptive fields (RFs) that extend
to a limited distance from the head, being able to respond to visual and auditory stimuli presented within
roughly 30 cm from the tactile RFs. The RFs cover the contralateral side of the head as well as the space
behind the head, thus providing a complete representation of the space surrounding the animal’s head.
Interestingly, the gradient of firing of these neurons was found to vary not only as a function of the
distance of the auditory stimuli from the animal’s head, but also as a function of their spectral complexity.
Indeed, these neurons were found to preferentially respond to complex sounds (i.e., white noise bursts),
with pure tones of different frequencies failing to elicit any significant response . Given this evidence,
these neurons would appear to make good candidates for the coding of multisensory characteristics of the
space in close proximity to the body, i.e., within reach.
2.2 Studies on brain-damaged human patients
Evidence in support of audiotactile interactions in peripersonal space has been shown to have
unique properties in human patients as well. In right brain-damaged patients suffering from left tactile
extinction, the concurrent presentation of sounds on the right side of the head strongly interfered with the
processing of the tactile stimuli on the left side of the neck (crossmodal auditory-tactile extinction) .
Interestingly, sounds interfered strongly with the processing of tactile inputs when they were
delivered from close to the head (i.e., at a distance of 20 cm), but the effect was substantially reduced
when they were presented far from the head (i.e., at a distance of 70 cm). Furthermore, this pattern of
results primarily emerged with complex sounds, with pure tones inducing only a mild form of crossmodal
extinction. Lastly, the portion of space from which the stimuli were presented has proved to play a
profound role in modulating audiotactile interactions. In particular, while white noise bursts exerted a
stronger influence on tactile processing when the sounds were presented in the rear (vs. front) space, the –
mild – effects induced by pure tones were selectively observed in rear space but not in frontal space ,
and furthermore were not modulated by the distance from which the sounds were presented.
2.3 Studies on neurologically-intact humans
The first study of audiotactile interactions taking place in the region of space surrounding the
head, in particular, the space behind the head (i.e., in the part of space where visual cues are not available)
of neurologically-intact people was conducted by Kitagawa and his coworkers . In Kitagawa et al.’s
second experiment, a distractor interference task was used, with participants performing a tactile left/right
discrimination task while auditory distractors (which were to be ignored) were presented simultaneously
from the same or opposite side. In this task, the participants responded more slowly (and less accurately)
when the auditory distractors were presented on the opposite side from the target tactile stimuli.
Furthermore, the magnitude of this crossmodal interference effect varied significantly as a function of the
distance and complexity of the auditory stimuli used. Whereas white noise bursts exerted a stronger
crossmodal interference when they were presented from close to the participants’ head (i.e., within 20
cm) than when they were presented further from the head (i.e., 70 cm or more away), when the auditory
stimuli consisted of pure tones, the overall effect was lower and was not modulated by the distance from
which the sounds were presented.
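The magnitude of such a crossmodal interference effect is conventionally expressed as the reaction-time cost of incongruent (opposite-side) relative to congruent (same-side) distractors; the helper below is a generic sketch of that computation, not code from Kitagawa et al.'s study.

```python
def crossmodal_interference(congruent_rts, incongruent_rts):
    """Mean reaction-time cost (ms) of auditory distractors presented on
    the opposite side relative to the same side as the tactile target."""
    mean = lambda xs: sum(xs) / len(xs)
    return mean(incongruent_rts) - mean(congruent_rts)
```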
30 Valeria Occelli
Across the studies so far described, it has been demonstrated that crossmodal audiotactile
interactions vary as a function of different parameters (e.g., distance between the stimuli and the body,
spatial location of stimulation, and auditory complexity). First, stimuli presented in peripersonal space are
distinctive and better able to attract attentional resources than stimuli presented in more distant regions.
Second, the spatial arrangement of the stimuli has proved to play a significant role in modulating
audiotactile interactions. Lastly, the interactions occurring between tactile and auditory complex stimuli
are more pronounced than any interaction between tactile stimuli and pure tones.
Besides pointing to a high degree of similarity of the neural mechanisms coding for audiotactile
interactions between primates and humans, the data emerging from these studies support the assumption
that audiotactile spatial interactions are more prevalent in the region of space behind the head [8, 9].
Thus, the suggestion that has emerged from this kind of research is that the absence of vision (or
of visual information), as for stimulation occurring behind the head [8, 9] or as a result of blindness ,
seems to be related to the emergence of a more pronounced spatial modulation of audiotactile interactions
than occurs in frontal space .
This conjecture has received further support from a recent investigation that was designed to
assess the potential existence of an audiotactile version of the so-called Colavita effect, as it has been
termed after the name of its first investigator . This term has been adopted to define the phenomenon
by which neurologically-normal participants, performing speeded detection/discrimination tasks,
preferentially report the visual component of bimodal pairs of stimuli, sometimes neglecting to report
the non-visual – auditory [14, 15] or tactile  – component. As in a typical study of the Colavita
effect, the participants in Occelli et al.’s study  had to make speeded detection responses to unimodal
auditory, unimodal tactile, or bimodal audiotactile stimuli. The physical features of the auditory stimuli
presented in frontal (Experiment 1) or rear space (Experiment 3), and the relative and absolute position of
auditory and tactile stimuli in frontal (Experiment 2) and rear space (Experiment 3) were manipulated.
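The Colavita effect itself is typically quantified from the bimodal trials as the difference between the proportions of trials on which only one or only the other component is reported; the function below is a generic illustration under that convention, not the analysis actually used in the study.

```python
from collections import Counter

def colavita_effect(bimodal_responses, dominant="auditory", neglected="tactile"):
    """Proportion of bimodal trials on which only the dominant component
    was reported minus the proportion on which only the neglected one was.
    `bimodal_responses` lists one response label per bimodal trial:
    'both', 'auditory', or 'tactile'."""
    counts = Counter(bimodal_responses)
    n = len(bimodal_responses)
    return (counts[dominant] - counts[neglected]) / n
```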
The most interesting results for the present theoretical context were observed in the third
experiment, in which the tactile stimuli were presented to the side (right or left) of the back of the
participant’s neck, whereas the auditory stimuli – consisting of either white noise bursts or pure tones –
were presented either in close spatial proximity to the body (i.e., over headphones) or far from it (i.e.,
from the loudspeakers located approximately 60 cm behind the participant’s head).
The results demonstrated that sound complexity modulated the audiotactile Colavita effect,
which was only observed when the auditory stimuli consisted of white noise bursts. If it is true that
complex auditory stimuli are more likely to interact – and possibly to be integrated – with the tactile
stimuli in rear space [7, 9], the auditory pure tone stimuli might have been more discriminable as
compared to the white noise bursts . This, in turn, could have facilitated the detection of both discrete
sensory components of the bimodal trials in the present study.
Interestingly, the spectral complexity of the auditory stimuli had a differential effect as a
function of the portion of space from which the stimuli were presented. Namely, when the frontal space
was stimulated (Experiment 1), no audiotactile Colavita effect was observed for any kind of auditory
stimuli (i.e., pure tones or white noise bursts), whereas when the stimuli were presented from the rear
space (Experiment 3), the Colavita effect was selectively observed for those conditions in which the
stimuli consisted of white noise bursts and not of pure tones. These data strengthen the hypothesis, already
put forward in previous studies [8, 9], that in rear space complex auditory stimuli (rather than pure
tones) and tactile stimuli are more likely to interact.
The fact that a significant audiotactile Colavita effect was reported for stimuli presented from the
same (vs. opposite) side only in rear space, and not in frontal space, can be explained by considering
that spatial factors differentially affect audiotactile interactions as a function of the region of space in
which they occur. As
already mentioned, interactions involving auditory and tactile stimuli presented in frontal space tend to be
less sensitive to spatial manipulations than are sensory interactions involving vision as one of the sensory
components [18, 19, 20]. By contrast, the processing of the auditory and tactile spatial cues can be
improved by presenting the stimuli in the portion of space where visual cues are typically not available
. The fact that this benefit was selectively observed for auditory white noise stimuli and not for pure
tones is consistent with evidence showing that spectrally dense auditory stimuli (i.e., stimuli that
contain a wide range of frequencies) yield noticeable benefits in auditory localization, benefits observed
for white noise bursts but not for pure tones .
Assessing Audiotactile Interactions: Spatiotemporal Factors and Role of Visual Experience 31
Interestingly, the Colavita effect was exclusively observed when the auditory stimuli were
presented via headphones, supporting the notion that the stimulation possibly involved a more
pronounced integration of the signals, making the detection of the two discrete sensory components
harder . This result therefore suggests that when stimulation involves sounds originating from behind
and close to the head it is somehow distinctive and gives rise to effects that cannot necessarily be detected
when the stimulation is delivered far from the head  or in the peri-hand space .
Taken together, these results therefore suggest that the audiotactile Colavita effect is affected by
both the relative and absolute locations from which the auditory and tactile stimuli are presented, as well
as by the spectral complexity of the auditory stimuli. Of particular interest here is