Design and development of a user centric affective
haptic jacket
Faisal Arafsha · Kazi Masudul Alam · Abdulmotaleb El Saddik
© Springer Science+Business Media New York 2013
Abstract Affective haptic research is a rapidly growing field. This article aims to improve on the existing literature by involving consumers directly in the design of a smart haptic jacket, adding heat and vibration actuators, and enhancing portability. The proposed system is designed for six basic emotions: love, joy, surprise, anger, sadness, and fear. It can also support several interactions, such as a hug, poke, tickle, or touch. An online survey was designed based on the literature and conducted on 92 respondents, who gave their opinions about the physiological impact of emotions and interactions on the human body. The results of this survey assisted in the general design and implementation of the system. 86% of the volunteers who participated in the final experiment expressed their interest in the system and said that the quality of their multimedia experience was improved through use of the jacket. A detailed design architecture is provided, along with details of the hardware and software used for the implementation.
Keywords Wearable haptic jacket · Basic emotion · Vibrotactile actuators · Heat actuators · Multimedia feedback
1 Introduction
According to Parrot [14], emotions are at the heart of social psychology and are part of what makes social psychology so interesting. Human emotions can be defined as a set of feelings that occur in a certain manner and affect behaviour. Some emotions can be evoked socially by touch, such as a handshake, a hug, or a tickle [9]. Improving emotional involvement in entertainment, for example, by enhancing video and audio alone is approaching its limits [12]. For some time, however, research has been concerned with stimulating
Multimed Tools Appl
DOI 10.1007/s11042-013-1767-3
F. Arafsha · K. M. Alam · A. El Saddik (*)
Multimedia Communications Research Lab (MCRLab), School of Electrical Engineering and Computer
Science, University of Ottawa, Ottawa, Ontario, Canada
e-mail: elsaddik@site.uottawa.ca
F. Arafsha
e-mail: faraf069@uottawa.ca
K. M. Alam
e-mail: mkazi078@uottawa.ca
emotions using haptic feedback systems [6, 7]. This approach is referred to as Affective Haptics [20], an area of research that is becoming increasingly popular. It is concerned with the design of systems and devices that can intensify, evoke, or otherwise influence a person's emotional state through the sense of touch [20, 21].
Involving emotional haptics is also considered an important aspect of research today. There are some differences in the classification of basic human emotions among researchers and theorists. However, the majority of them classify the basic emotions as love, joy, surprise, anger, sadness, and fear; some use love instead of disgust [10, 13, 15, 19].
Recent research reveals that the absence of a comforting touch can cause social problems and may even lead to an early death [8]. In Romania, thousands of orphans lacked affectionate touch as they were raised in overcrowded facilities, which caused the spread of emotional and social problems [4]. People want to become more immersed in new technology and are always looking for something new [12]. The involvement of haptic wearable devices can also become a major assistance to visually impaired people in the areas of multimedia and social involvement. Current research is exploring haptic wearable devices as a complement to audio-described movies [22]. In addition, audio/visual descriptions of multimedia are reaching their limits with today's advances in technology, such as HD and/or 3D video and high-quality audio. Further improvements are needed for these different aspects [5, 12].
This research is intended to incorporate consumers' opinions and currently available haptic technologies to improve Human Computer Interaction (HCI) by providing consumer-centric affective haptic clothes. The proposed system will support different applications with
the availability of proper interfacing software design. One application is enhancing video gaming
and movie watching experiences similar to the concept of subtitles in movies. Synchronized
instructions can be sent to the haptic jacket at specific times to evoke emotions based on the
current scene in a movie or a video game. Another possible application for the proposed haptic
jacket is helping visually impaired people understand the emotions of the people in front of them
by analyzing facial expressions. Facial expression recognition for emotional analysis is already
being addressed in current research [24]. Such software can analyze facial expressions and use
the data with simple interfacing software to evoke emotions in the proposed haptic jacket.
2 Related works
Hossain et al. [9] developed interpersonal haptic communication in Second Life,¹ where social
physical interactions, such as a handshake, a hug, or a tickle, could stimulate emotions and
cause an emotional reaction for people. This research developed an add-on that works with the
Second Life virtual environment to provide physical communication functions to users
through a haptic jacket. This jacket, using a set of carefully placed vibrotactile actuators,
provides the sensation of touch to the user based on the signals received from the Second Life
add-on. These signals are produced by annotating different parts of the Second Life avatar, and
then actuating the corresponding actuators in the haptic jacket.
Cha et al. [5] implemented HugMe, a synchronous haptic teleconferencing system which
aims to provide the ability to express intimacy remotely. The system presented is compatible
with a tolerable bandwidth of 30–60 Hz for haptic data. Using a 3-DOF force feedback device,
the user can touch a remote user who, in turn, can feel the touch on the contacted skin through
¹ Second Life, http://www.secondlife.com
a wearable jacket. This jacket contains an array of vibrotactile actuators that activate depending
on the locations of the touches.
A wearable jacket was also developed in Philips Research Europe [11,12] that focuses on
influencing the emotions of movie watchers by creating tactile stimuli to their bodies. The
jacket is body-conforming and contains 64 tactile stimulators. It attempts to increase the user's emotional immersion by rendering emotional effects to the user's skin. This project is based on
the idea that emotions are accompanied by different physical reactions, and that by stimulating
these physical reactions using the jacket, the accompanied emotions could be evoked.
Rahman et al. [1] introduced a custom-made web browser with a plug-in that creates tactile
content to be simulated on a wearable vibrotactile jacket. This plug-in enables online video
viewers to feel haptic feedback through the jacket while watching YouTube videos. The plug-
in works by creating a sequence of arrays in XML format. These arrays are composed of
timestamps and actuating values for the operation of the jacket. Viewers can feel the haptic
feedback by choosing YouTube videos that contain annotated files with tactile content, which
is saved on an on-demand server.
Alam et al. introduced complete text analysis of SMS messages, looking for secondary-level emotions [14] and communicating them to the users of haptic vibrotactile devices [2].
Using a haptic phone or a haptic jacket, this application prototype maps every emotion into a
distinct vibration pattern. The overall system is composed generally of an emotion server and a
mobile client. The application comes into action when the receiving phone forwards received
messages to the remote emotions server (REM) using Simple Object Access Protocol
(SOAP). The message is then analyzed and related emotions are extracted. Finally, emotion
vectors are created and sent back to the phone which, in turn, forwards that vector to the haptic
device (which may be the same phone, another phone, or a haptic jacket).
iFeel_IM! [20,21] introduced a system that uses haptic and visual devices during online
conversations to communicate experienced emotions during that conversation. It consists of
four affective haptic components: HaptiHeart, HaptiHug, HaptiTickler, and HaptiButterfly.
These four affective haptic components activate to represent emotions experienced during
online conversations. Huggy Pajama [17,18] is a wearable and mobile HCI system that allows
its users to remotely feel and hug each other. It is mainly intended for parents who travel a lot
and want to keep in touch with their children and express their affection regularly. The system
is composed of a small, mobile doll that has a pressure sensing circuit embedded in it to sense
different levels of force produced by human interaction (i.e. the hug). Then, it sends the sensed
hug signals to a haptic jacket (or pajama) that, in turn, simulates the measured hug and gives its
wearer the feeling of a hug. The doll (the input device / hugging interface) and the jacket (the
output device) communicate through the internet, and simulate the hug using hardware that includes air pockets, heating elements, and color-changing fabric.
The Hug Shirt² allows users to send hugs at a distance. The system works by transferring
received hug instructions via SMS to actuators and sensors embedded in the shirt using
Bluetooth technology. Hugs can be sent either by creating a hug SMS instruction using a
provided mobile java application, or by touching specified locations on another input shirt that
is designed for this purpose. The Hug Shirt runs on rechargeable batteries, and is built using
components that comply with the Restriction of use of Hazardous Substances (RoHS³).⁴
² Hug Shirt, CuteCircuit, http://www.cutecircuit.com/products/thehugshirt/
³ What is RoHS, Department for Business Innovation and Skills - National Measurement Office, UK, http://www.bis.gov.uk/nmo/enforcement/rohs-home
⁴ Best Inventions of 2006, TIME Specials, http://www.time.com/time/specials/packages/article/0,28804,1939342_1939424_1939709,00.html, 2006.
From the related studies, it is found that some researchers focus on the haptic interfacing aspect, others focus on the emotional aspect, and very few focus on the integration of emotional stimulation and haptic interfacing in the same system.
Our research contributes to the topic of emotional haptics by designing and
implementing an emotion stimulating haptic jacket which supports six basic emotions
as well as different types of interactions such as a hug, poke, touch, or tickle, and is
based on an empirical study of crowd participation. It includes vibrotactile and heat
actuators in addition to temperature sensors on the body locations agreed upon by the
users.
3 Proposed system and design
In this section, a detailed description of the proposed system is presented, including an
explanation of the overall architecture design and the different modules associated with it. It
starts by describing how the data was collected using an online survey and presents the results
of the survey. After that, it explains the general architecture of the implemented system based
on the results of this survey by providing a system overview and an explanation of the
affective haptic components of the system.
3.1 Online survey
The main concept used to construct this survey was based on the list of emotions defined
in the Parrot 2001 [14] model of social psychology as described earlier. The goal was to
figure out which part of the body represents which emotion according to general users. A
total of 92 responses were collected in this survey. Fifty-eight of the respondents were
male and 34 were female. All except for one were between the ages of 18 and 29. The
respondents had varying degrees of haptic experience, from none to expert, but on average the population was experienced with haptics, and most came from an engineering background.
3.1.1 Data collection
In the conducted survey, respondents were asked to give their own opinion on how each
(but not necessarily all) of the basic emotions mentioned above can be represented
physically using actuators embedded in smart clothes. In the survey, in addition to
providing the ability for respondents to give their own ideas, three basic types of
feedback were suggested: vibration, beats, and warmth. As a reference, Fig. 1 was added to the survey to assist in providing more accurate and unified results. The areas of interest are the neck, chest (right and left sides), abdomen (left, middle, right), shoulder back, back bone, and arms. Fig. 1 was compiled by analyzing the available literature and applying our own intuition.
The survey was created on Survey Monkey⁵ as the response-collecting link. It was started
on February 8, 2011, and closed on February 22, 2011 (a duration of 2 weeks). The number of
responses varied for each part of the survey because respondents were only asked to answer
the parts they wanted, and had the option to skip questions.
⁵ www.surveymonkey.com
In the survey, we used check boxes to count the number of responses for each type of
feedback, for each body area, and for each emotion. Also, a text box was provided for any
other ideas the users may have (Fig. 1).
3.1.2 Survey results
The results of the conducted survey are described below. Each time a respondent checks a
checkbox in the survey, the count for that specific item is increased. The most interesting and
implementable ideas provided by the respondents are also described below. Overall responses
from the survey participants were analyzed and an overall system was proposed which is
discussed in Section 3.2.
Love (affection, lust, and longing) The survey results showed that the majority of respondents suggested beats simulation in the jacket to represent love, specifically slow beats in the left chest. It was also noticed that a good number of responses suggested warmth in the neck
and the abdomen area. One written response also suggested warmth all over the body. Total
number of responses: 91.
Joy (cheerfulness, zest, contentment, pride, optimism, enthrallment, and relief) To evoke 'joy', the results of the survey showed that the majority of responses voted for beats in the left chest side and a slight pressure on the complete chest area. Some of the written responses
suggested the simulation of beats to be soft, and to simulate one long breath. Total number of
responses: 87.
Fig. 1 Example given to assist in responding to the online survey

Surprise (amazement, and astonishment) According to the survey, 'surprise' can be simulated by vibration in almost all the upper body in a shiver-like manner, coupled with a sudden, slightly strong, and fast heartbeat that fades out after a few seconds. Some respondents also
suggested simulating a benign sense of goose bumps, lowering body temperature, and a
hardening of the body. The last part can be implemented by tightening the worn jacket
mechanically. Total number of responses: 88.
Anger (irritation, exasperation, rage, disgust, envy, and torment) 'Anger' can be evoked by some warmth around the neck and vibration in the arms, with strong beats on the heart at a
normal speed. Written ideas included tightening the neck and the chest for harder breathing,
and simulating fast breathing. Total number of responses: 90.
Sadness (suffering, disappointment, shame, sympathy, and neglect) According to the survey results, 'sadness' can be emulated by a tightness on the chest, using some pressure, and by providing slight vibrations all over the abdomen area. Other written responses included
slow heartbeats and a cold body temperature. Total number of responses: 83.
Fear (horror, and nervousness) Results for 'fear' seem very similar to those for anger. However, for fear, vibration is suggested around the neck, arm, and backbone area, with faster beats on the left chest. Vibration along the backbone gives a shivering experience. Total
number of responses: 89.
At the end of the survey, respondents were asked if having a haptic jacket with a focus on
emotions was of interest to them. The results show that 78 responded positively, 13 responded
negatively, and one respondent skipped this question.
3.2 Affective clothing component organization

Smart clothing is becoming increasingly popular in the area of human-computer interaction (HCI). The main focus of affective haptics is on systems that evoke or affect a human's emotional condition using the sense of touch [20]. The data collected from the conducted survey was highly valuable in the overall design of the operation of our system. The design decisions were made based on the direct input of the consumers, and the scenarios were agreed upon as shown in Fig. 2.
As shown in Fig. 2, different scenarios are to be implemented, and different affective haptic
components are required. Based on these scenarios, the system is designed to be composed of
the following affective haptic body parts:
Jacket (J) = {Neck (N), Right Chest (RC), Left Chest (LC), Right Abdomen (RA), Middle Abdomen (MA), Left Abdomen (LA), Back Shoulder (BS), Back Bone (BB), Arms (A)}
In our system, each type of emotional status is represented using the following mechanical
features:
Mechanical Features = {Warmth (W), Vibration (V), Beats (B), Pressure (P)}
From Table 1, we find that 'love' is represented using warmth and heartbeats. Based on the users' responses, we have placed warmth and beats on different body parts of the jacket. We have used a vector representation of body parts to formulate different emotional statuses. The presence of different body parts of the proposed jacket, such as the neck, abdomen, arms, etc., for a particular emotion is represented using 1/0 (on/off) in vector J, which we call the localized jacket vector. Additionally, the intensity of one mechanical feature, such as warmth
or vibration, on a body part of the jacket is maintained in a range of [0–1]. Each mechanical feature used to simulate a basic emotion is represented by a feature intensity vector (Table 2) to control its strength on different body parts. The feature intensity vector of Table 2 was produced following empirical studies conducted by haptic and emotion stimulation domain experts from August 2011 to May 2012. Several iterations of jacket component rearrangements were made, and subsequent user feedback was collected, to reach a consensus among the domain experts on the average feature intensity value for each mechanical feature. It is to be noted that the experts agreed that an ideal solution would be an adaptive system in which feature intensities vary according to the user (e.g., male, female, children, or visually impaired users).
In our system, W_love is placed in both the neck and abdomen parts of the jacket. We maintain different intensities of warmth in the neck and abdomen parts using the feature intensity vector. Beats simulation is naturally placed in the left side of the chest and has its own intensity vector. Hence, the final vector representation of 'Love' is {W_love, B_love}. The jacket's system controller generates the W_love and B_love vectors following Table 2 and transmits them to the microcontroller to drive the hardware components and produce the right emotional effect.
Fig. 2 Proposed system implementation
Here, Love = {W_love, B_love}

W_love = Localized Jacket Vector * Feature Intensity Vector
       = J_love * I_love
       = {1, 0, 0, 1, 1, 1, 0, 0, 0} * {0.4, 0, 0, 0.4, 0.4, 0.4, 0, 0, 0}
       = {0.4, 0, 0, 0.4, 0.4, 0.4, 0, 0, 0}

B_love = J_love * I_love
       = {0, 0, 1, 0, 0, 0, 0, 0, 0} * {0, 0, 0.5, 0, 0, 0, 0, 0, 0}
       = {0, 0, 0.5, 0, 0, 0, 0, 0, 0}
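The element-wise product above translates directly into code. The following is a minimal sketch (the function name is ours; the vectors follow the body-part order N, RC, LC, RA, MA, LA, BS, BB, A):

```python
# Body-part order used throughout: N, RC, LC, RA, MA, LA, BS, BB, A
def feature_vector(localized, intensity):
    """Element-wise product of a localized jacket vector (0/1 per body
    part) and a feature intensity vector ([0-1] per body part)."""
    return [j * i for j, i in zip(localized, intensity)]

# W_love: warmth on the neck and the three abdomen zones, intensity 0.4
J_love = [1, 0, 0, 1, 1, 1, 0, 0, 0]
I_love = [0.4, 0, 0, 0.4, 0.4, 0.4, 0, 0, 0]
W_love = feature_vector(J_love, I_love)
# W_love == [0.4, 0, 0, 0.4, 0.4, 0.4, 0, 0, 0]
```

The same helper produces every feature vector in this section by swapping in the corresponding rows of Tables 1 and 2.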
Similarly, Hug is represented using pressure, vibration, and beats (Fig. 2). Actuators following a certain pattern are placed across the complete chest, a vibration beat is placed in the back shoulder, and light vibrations are also used on the middle abdomen.
Hug = {P_hug, V_hug, B_hug}

P_hug = J_hug * I_hug
      = {0, 1, 1, 0, 0, 0, 0, 0, 0} * {0, 0.7, 0.7, 0, 0, 0, 0, 0, 0}
      = {0, 0.7, 0.7, 0, 0, 0, 0, 0, 0}

V_hug = J_hug * I_hug
      = {0, 0, 0, 0, 1, 0, 1, 0, 0} * {0, 0, 0, 0, 0.4, 0, 0.5, 0, 0}
      = {0, 0, 0, 0, 0.4, 0, 0.5, 0, 0}

B_hug = J_hug * I_hug
      = {0, 0, 0, 0, 0, 0, 1, 0, 0} * {0, 0, 0, 0, 0, 0, 0.6, 0, 0}
      = {0, 0, 0, 0, 0, 0, 0.6, 0, 0}
Table 1 Body local vector for emotions and interactions

              N   RC  LC  RA  MA  LA  BS  BB  A
Love      W   1   0   0   1   1   1   0   0   0
          B   0   0   1   0   0   0   0   0   0
Joy       P   0   1   1   0   0   0   0   0   0
          B   0   0   1   0   0   0   0   0   0
Surprise  V   0   1   1   1   1   1   0   0   1
          B   0   0   1   0   0   0   0   0   0
Anger     W   1   0   0   0   0   0   0   0   0
          V   0   0   0   1   1   1   0   0   1
          B   0   0   1   0   0   0   0   0   0
Sadness   P   0   1   1   0   0   0   0   0   0
          V   0   0   0   1   1   1   0   0   0
Fear      V   1   0   0   0   0   0   0   1   1
          B   0   0   1   0   0   0   0   0   0
Tickle    V   0   0   0   1   0   1   0   0   0
Poke      B   0   0   0   0   1   0   0   0   0
Touch     V   0   0   0   0   0   0   1   0   0
          B   0   0   0   0   0   0   1   0   0
          P   0   1   1   0   0   0   0   0   0
Hug       V   0   0   0   0   1   0   1   0   0
          B   0   0   0   0   0   0   1   0   0
Different types of vibrations, pressures, and warmth used in the jacket design are described
below.
• Chest Vibration and Pressure
A number of carefully localized vibrotactile actuators are distributed in the inside of the
jacket in the chest area. They were distributed approximately 12 cm apart in a grid pattern.
These actuators are instructed by the microcontroller to activate in order to simulate the 'surprise' emotion. In order to simulate joy, sadness, or a hug, these same actuators are used in a different pattern so that a pressure is felt in the chest.
• Neck Vibration

A few small vibrotactile actuators were placed on each side of the inside of the jacket's collar. These actuators vibrate when activating the 'fear' emotion in the jacket.
• Arms and Backbone Vibration

A set of vibrotactile actuators was embedded into the inside of the sleeves around the arms. The microcontroller gives instructions to these actuators to vibrate in order to stimulate 'fear', 'anger', and 'surprise'.
A shiver-like feeling is simulated by using the set of vibrotactile actuators inside the
sleeves. The microcontroller is set up to instruct the actuators to vibrate in a specific and
calibrated order for specific durations and give a shiver-like feeling around the arms/back. This
function is used in the case of stimulating 'surprise' or 'anger'. Five vibrotactile actuators are
placed on the back and front of each bicep, and are programmed to take turns actuating for
Table 2 Mechanical feature intensity vector for emotions and interactions

              N    RC   LC   RA   MA   LA   BS   BB   A
Love      W   0.4  0    0    0.4  0.4  0.4  0    0    0
          B   0    0    0.5  0    0    0    0    0    0
Joy       P   0    0.4  0.4  0    0    0    0    0    0
          B   0    0    0.3  0    0    0    0    0    0
Surprise  V   0    0.4  0.4  0.5  0.5  0.5  0    0    0.3
          B   0    0    0.3  0    0    0    0    0    0
Anger     W   0.3  0    0    0    0    0    0    0    0
          V   0    0    0    0.4  0.4  0.4  0    0    0.5
          B   0    0    0.4  0    0    0    0    0    0
Sadness   P   0    0.7  0.7  0    0    0    0    0    0
          V   0    0    0    0.3  0.3  0.3  0    0    0
Fear      V   0.3  0    0    0    0    0    0    0.7  0.5
          B   0    0    0.8  0    0    0    0    0    0
Tickle    V   0    0    0    0.7  0    0.7  0    0    0
Poke      B   0    0    0    0    0.8  0    0    0    0
Touch     V   0    0    0    0    0    0    0.3  0    0
          B   0    0    0    0    0    0    0.4  0    0
Hug       P   0    0.7  0.7  0    0    0    0    0    0
          V   0    0    0    0    0.4  0    0.5  0    0
          B   0    0    0    0    0    0    0.6  0    0
100 ms from shoulder to elbow. The number of vibrators and the durations of vibrations were selected to represent the shiver feeling in the best way possible after several trials. Similarly, eight vibrotactile actuators are placed along the backbone and produce the shiver-like experience for 'anger'.
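The turn-taking actuation just described can be sketched as a schedule generator; `shiver_schedule` is a hypothetical helper, not the actual firmware:

```python
def shiver_schedule(num_actuators, step_ms=100):
    """Return (start_ms, actuator_index) pairs so the actuators take
    turns vibrating for step_ms each, e.g. from shoulder to elbow."""
    return [(i * step_ms, i) for i in range(num_actuators)]

# Five actuators per arm, 100 ms each -> one 500 ms shiver wave
wave = shiver_schedule(5)
# wave == [(0, 0), (100, 1), (200, 2), (300, 3), (400, 4)]
```

Replaying the same schedule, or running it in reverse, repeats or inverts the shiver wave.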
• Abdomen and Shoulder Back Vibration

In the case of abdomen and shoulder back vibration, a number of carefully localized vibrotactile actuators are distributed on the inside of the jacket. They were distributed approximately 12 cm apart in a grid pattern. These actuators are instructed by the microcontroller to activate in order to simulate 'surprise', 'anger', 'sadness', tickle, touch, and hug.
• Neck and Abdomen Warmth

Two thermoelectric coolers are also placed on each side of the inside of the jacket's collar to give a warm feeling in order to stimulate 'love' and 'anger'. Also, for love, three thermoelectric coolers are placed on the left, middle, and right abdomen. In order to avoid accidental increases in temperature, tiny electronic thermometers were attached directly to the heat actuators. Their role is to strictly monitor the temperature of the actuators and give feedback while they are activated.
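This feedback loop can be approximated with simple on/off control plus hysteresis, so the heater does not chatter around the cut-off point. The sketch below uses assumed thresholds; the paper does not state the actual temperature limits:

```python
class ThermalGuard:
    """Cut power to a heat actuator above max_c; resume below resume_c.
    The 40/38 degree thresholds are illustrative assumptions."""
    def __init__(self, max_c=40.0, resume_c=38.0):
        self.max_c = max_c
        self.resume_c = resume_c
        self.enabled = True

    def update(self, temp_c):
        """Feed one thermometer reading; return whether the heater may run."""
        if temp_c >= self.max_c:
            self.enabled = False   # too hot: cut power immediately
        elif temp_c <= self.resume_c:
            self.enabled = True    # cooled down enough: resume heating
        return self.enabled
```

The gap between the two thresholds keeps the actuator off until it has cooled measurably, rather than toggling on every reading.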
• Beat Simulation

It has been proven in research that changing the rate of heartbeats can change a human's emotional state, and that each emotion has a different and distinct heartbeat rate [20]. Also, the Merck Manual of Medical Information states that the normal heart rate at rest is usually between 60 and 100 beats per minute, which is approximately between 1.00 Hz and 1.67 Hz. However, lower frequencies may also be normal in young adults, especially if they exercise regularly. Besides responding to exercise and inactivity, a heartbeat's frequency also responds to stimuli such as pain and anger [3]. In the implementation of the prototype, this information was used so that the heartbeat rate differed depending on the activated emotion.
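The quoted rates convert to beat timing as follows (a sketch; the per-emotion rates the prototype actually used are not listed here):

```python
def beat_period_ms(bpm):
    """Milliseconds between successive heartbeats for a given rate."""
    return 60_000 / bpm

def beat_freq_hz(bpm):
    """Heart rate expressed in hertz."""
    return bpm / 60.0

# 60 bpm  -> 1000 ms between beats (1.00 Hz)
# 100 bpm ->  600 ms between beats (about 1.67 Hz)
```

Choosing a bpm per emotion and feeding the resulting period to the beat actuators reproduces the rate differences described above.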
3.3 Detailed system architecture
The jacket is designed so that it can be used in a variety of applications. It operates by
connecting to a computer or a pre-set digital system that gives direct instructions and evokes
each emotion for the desired duration.
Here, we used an Arduino Duemilanove⁶ microcontroller to control the jacket. The microcontroller is connected directly to the computer that runs an interfacing application. This interfacing application is responsible for giving instructions that the jacket (or the microcontroller) can understand (Fig. 3).
The microcontroller is programmed to receive six simple and direct instructions
representing the six emotions discussed previously (see Fig. 2). Also, it receives the desired
duration associated with each emotion. The format of receivable instructions is a simple one.
The microcontroller receives instructions as a set of three bytes from the interfacing program. The first byte represents the emotion number (1: love, 2: joy, 3: surprise, 4: anger, 5: sadness, and 6: fear). The second and third bytes together represent the duration, in seconds, for which the emotion shall be activated in the jacket. For example, sending the bytes 456 will activate anger for 56 s, and 654 will activate fear for 54 s.
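On the PC side, this three-byte format can be sketched as encode/decode helpers. We assume, as the example suggests, that each byte is an ASCII digit and the two duration digits are read as a two-digit number:

```python
EMOTIONS = {1: "love", 2: "joy", 3: "surprise",
            4: "anger", 5: "sadness", 6: "fear"}

def encode(emotion_num, duration_s):
    """Pack an emotion number (1-6) and a duration (0-99 s) into the
    three ASCII-digit bytes sent to the microcontroller."""
    if not (1 <= emotion_num <= 6 and 0 <= duration_s <= 99):
        raise ValueError("emotion 1-6, duration 0-99 s")
    return f"{emotion_num}{duration_s:02d}".encode("ascii")

def decode(payload):
    """Inverse of encode: return (emotion name, duration in seconds)."""
    emotion_num = int(chr(payload[0]))
    duration_s = int(payload[1:3])
    return EMOTIONS[emotion_num], duration_s

# decode(b"456") -> ("anger", 56);  encode(6, 54) -> b"654"
```

The interfacing application would write `encode(...)` to the serial port opened to the Arduino.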
⁶ Arduino Duemilanove, http://arduino.cc/en/Main/arduinoBoardDuemilanove
The microcontroller's input and output pins are connected to what we call the networking board. The networking board contains all the circuitry needed to control the jacket. It also contains the necessary voltage regulators to accommodate the different voltage requirements of the jacket's components, as well as BJT transistors to control power-hungry components. This networking board is designed to be connected to the jacket through pluggable wires, making it easy to improve and to replace parts when necessary.
4 Implementation and evaluation
4.1 Implementation
To implement this system, various hardware and electronic parts are needed. Many products were dismissed, either for crucial technical reasons or simply because of high prices for nearly identical quality. The following is a description of the main devices used for the implementation of our system, indicating the important characteristics needed to ensure acceptable quality.
4.1.1 Vibrotactile actuators
Vibrotactile actuators are one of the key requirements for the system since most affective
haptic components highly depend on them. Therefore, the quality of actuators chosen from the
market must be comparatively high. The actuator's weight should be very light, and barely noticeable when worn. It must be small in size, must use low voltage (less than 5V), and must consume very little power. The reason for these requirements is that most microcontrollers available on the market cannot provide more than 5V, and provide very little current; it is therefore preferred that connected devices comply as much as possible.
4.1.2 Heat actuators
For the same reasons mentioned above, heat actuators must be small in size and require no
additional hardware to operate. There exist many heat actuators on the market, but very few are small in size. The main issue with heat actuators is that they consume a lot of energy in
comparison to any other electronic hardware. One other important issue concerning heat
actuators is the duration they need to warm up and to cool down.
Fig. 3 System architecture
4.1.3 Temperature sensors
Temperature sensors are used to ensure safe operation of the heat actuators and to avoid
accidental burns.
4.1.4 Microcontroller
The main points concerning the selection of a suitable microcontroller include the availability
of a sufficient number of controlling I/O pins for the many sensors and actuators connected to
it, providing enough power to operate most of the connected devices, and ease and efficiency
of programming.
4.2 Difficulties in implementation
During the implementation of this system, valuable lessons were learned as some difficulties
arose related to power, wiring, and lack of specific devices. In this section some of the most
important problems are listed, followed by the solutions used to overcome them.
4.2.1 High power requirement
The purchased thermoelectric coolers were the most suitable heat actuators found during this
research. However, they consume very high power and require a power supply of around 15V and 7A DC to operate in ideal conditions. This was an issue, since the Arduino microcontroller board cannot provide sufficient power to meet this demand. Therefore, an external power supply was used in multiple trials with different voltage and current settings until the most suitable supply for the system was found: 5.3V and 1.8A. The tests involved connecting the heat devices in parallel and in series, keeping in mind that the maximum acceptable supply voltage was 6V, in order to make the system functional with batteries and as portable as possible. In addition, BJT transistors
were used to control the power coming from the external power supply and provide high
current.
4.2.2 Safety
In order to avoid incidental burns from the thermoelectric coolers, it was necessary to monitor
their temperature digitally and control their output. Therefore, digital temperature sensors were
purchased and attached to each one of the thermoelectric coolers.
4.2.3 Calibration
For the shivering sensation on the arms, each vibrotactile actuator had to operate in its own
turn in order to provide a realistic shiver-like feeling. Therefore, each actuator was instructed to operate for 100 ms and then stop, after which the next one would operate, and so on. Also, to simulate heartbeats, intensive research was done until the best timings were agreed upon. Each heartbeat consists of two separate sub-beats, and each sub-beat lasts for a specific duration; the interval between sub-beats must also be known. This information
was necessary and different for each emotion. Finally, power calibration was needed for the
different affective haptic components in the jacket. Neck vibration, for example, requires very
little DC current, while the ten chest vibrators required much more. The heartbeat vibration
Multimed Tools Appl
actuators also needed to be distinguishable while the chest was vibrating. All of these
observations were taken into account, and different power levels were assigned to different
parts of the system.
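The shiver sequencing and heartbeat timing can be captured as simple schedules. The 100 ms pulse comes from the text; the heartbeat sub-beat durations below are illustrative placeholders, since the calibrated per-emotion values are not listed.

```cpp
#include <utility>
#include <vector>

// Start/stop times (ms) for n arm actuators fired strictly in turn,
// each vibrating for 100 ms as described in the text.
std::vector<std::pair<int, int>> shiverSchedule(int nActuators) {
    const int PULSE_MS = 100;
    std::vector<std::pair<int, int>> plan;
    for (int i = 0; i < nActuators; ++i)
        plan.push_back({i * PULSE_MS, (i + 1) * PULSE_MS});
    return plan;
}

// One heartbeat = two sub-beats ("lub" and "dub") separated by a short
// gap, followed by a rest before the next beat; each emotion would use
// its own set of durations.
struct Heartbeat {
    int lubMs, gapMs, dubMs, restMs;
    int periodMs() const { return lubMs + gapMs + dubMs + restMs; }
};
```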
4.2.4 Synchronization
The Arduino microcontroller cannot be programmed for true multithreading. This was a problem
when actuating heartbeats while another affective haptic component was operating. To overcome
it, the code instructions were interleaved so that each affective haptic component activates
at the appropriate instant and for the appropriate duration; in other words, the overall code
was written as if multithreading were available.
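This "as if multithreading" structure corresponds to the common non-blocking timing pattern on Arduino: each component stores its own next-due timestamp and a single loop polls them all. The sketch below is a minimal off-board version that passes the current time in explicitly; on the board, `millis()` would supply it.

```cpp
#include <cstdint>

// One haptic component scheduled cooperatively: it fires whenever the
// loop's clock passes its next-due time, then reschedules itself.
struct Channel {
    uint32_t periodMs;
    uint32_t nextDue = 0;
    int fired = 0; // stand-in for actually driving the actuator
    explicit Channel(uint32_t p) : periodMs(p) {}
    void poll(uint32_t nowMs) {
        if (nowMs >= nextDue) {
            ++fired;
            nextDue = nowMs + periodMs;
        }
    }
};
```

A heartbeat channel and a chest-vibration channel can then run "simultaneously" from one loop, e.g., `Channel heart(500), chest(200);` polled on every pass.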
4.3 Evaluation procedure
The evaluation of the overall system was done in the following steps:
- Media player software was built that plays videos synchronously with timed instructions
sent to the haptic jacket. The player plays six previously selected short videos
(< 1 min each), each representing one of the six emotions used in our research. We also
prepared four videos (< 30 s) to induce the sensations of touching, tickling, hugging, and
poking.
- Fourteen volunteers aged 18–28 years participated in our test: ten males and four females.
They were asked to wear the haptic jacket throughout the test, and each session lasted
approximately 45 min with a 10 min break. In the first session, every user experienced each
of the six videos three times and rated the quality of experience (QoE) after each viewing.
In the second session, each user experienced the actual-haptics and interaction videos twice
each, once with haptics and once without. All participants received a token remuneration and
welcome snacks.
- The first test bed was designed to verify the component organization for each emotion. Each
emotive video was played once with the actual haptic features and twice with fake haptic
features, where fake haptics denotes a haptic pattern deliberately mismatched with our design;
for example, when a love video was played, the surprise or anger haptic pattern was triggered
in the jacket. Every user was asked to select all three buttons of the same row of the GUI
player in random order. After completing one emotion-verification session, users were asked
to rate the QoE of each clip. From this random order, we classified the QoE results into
actual-haptics and fake-haptics groups (Fig. 5).
- In the second test bed, each video clip was played twice, once with the haptic jacket
active with the actual/original haptic features and once without haptics. The order was
completely random for every video and every participant. After watching all the videos,
participants were asked to fill out a short QoE questionnaire [16]. The touch, poke, tickle,
and hug videos were also played in this test bed, both with and without haptic features.
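The per-participant randomization of the with/without-haptics order can be sketched with a standard shuffle; the clip labels and seed handling below are illustrative, not the actual test harness.

```cpp
#include <algorithm>
#include <random>
#include <string>
#include <utility>
#include <vector>

// Build the presentation list (each clip once with haptics and once
// without) and shuffle it independently for every participant.
std::vector<std::pair<std::string, bool>>
presentationOrder(const std::vector<std::string>& clips, unsigned seed) {
    std::vector<std::pair<std::string, bool>> order;
    for (const auto& c : clips) {
        order.push_back({c, true});   // with haptics
        order.push_back({c, false});  // without haptics
    }
    std::mt19937 rng(seed);
    std::shuffle(order.begin(), order.end(), rng);
    return order;
}
```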
4.3.1 Custom built software
The software built specifically for these tests played the previously selected videos
synchronized with timed haptic instructions sent to the haptic jacket. The GUI of this
media player (see Fig. 4) contains a window for viewing the video, buttons for playing
emotive/interactive videos, and a checkbox for selecting whether to enable or disable haptic
feedback while playing the video. To avoid biasing participants' responses, the buttons
carried no description of the type of emotion represented in each video. The GUI haptic
media player contains three groups of buttons: the first group triggers the actual haptics,
the second the fake haptics, and the third the four other interactions.
4.3.2 The videos
Short videos cannot stimulate emotions as easily as long movies: in a movie, viewers have
time to understand the story and become emotionally involved. For short videos, however,
viewers can use the haptic features of the jacket to expedite their emotional involvement.
4.3.3 Questionnaire
After each participant finished watching the evaluation videos from the second test bed, they
were asked to fill out a short questionnaire composed of eight statements. The first six
statements were selected from a list based on intensive research related to the QoE evaluation
of video games [16]. They were selected based on their relevance to our subject, and to allow a
comparison with the evaluation results of a similar system [12]. The last two statements also
serve our purpose but were inspired by other research [9]. Each participant was asked to rate
these statements on a 5-point Likert scale, with 5 being Strongly Agree and 1 being
Strongly Disagree. The Likert scale is commonly used for the cognitive perception
evaluation of users [23]. The statements used in our questionnaire and related results are found
in Table 3.
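Scoring such a questionnaire reduces to averaging the 5-point responses per statement. The ratings in the usage example are invented for illustration; the actual averages appear in Table 3.

```cpp
#include <numeric>
#include <vector>

// Mean of a set of 5-point Likert responses
// (1 = Strongly Disagree .. 5 = Strongly Agree).
double likertMean(const std::vector<int>& ratings) {
    if (ratings.empty()) return 0.0;
    return std::accumulate(ratings.begin(), ratings.end(), 0.0)
           / static_cast<double>(ratings.size());
}
```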
Fig. 4 GUI of the evaluation media player
4.4 Result analysis
In the first test bed (Fig. 5), we verified the component organization of the Haptic clothe. In our
system, every emotion or interaction is represented differently or uniquely in the haptic jacket.
The goal of this test setup was to measure whether the emotion-wise component localization is
recognized, differentiated and accepted by the end user. From Fig. 5we find that our original
orientation of the haptic components (i.e. actual haptics) and corresponding emotion augmen-
tation based movie viewing experience was considerably better than movie viewing with false
or fake haptics for the participants. The users clearly had better immersive experience with our
actual haptics augmentation for all the six emotions.
After the completion of the first test bed, we asked the users to select, in random order,
video clips from the actual-haptics series (i.e., the first column) as well as all of the
interaction (hug, tickle, touch, and poke) clips, one by one. Each video was experienced once
with haptics and once without. These tests formed the second test bed. At the end of the
complete session, each user answered the questionnaire of Table 3. The results in Table 3
clearly demonstrate that haptics-augmented video viewing provides a more immersive experience
than viewing without haptics. In response to the question of whether users would like
Table 3 Users' immersiveness to emotive video with and without haptics

Questionnaire                                                Absent  Present
I felt myself drawn in                                        2.14    3.94
I enjoyed myself                                              2.00    4.27
My experience was intense                                     1.79    3.49
I had a sense of being in the movie scenes                    1.43    3.36
I responded emotionally                                       1.71    3.87
I felt that all my senses were stimulated at the same time    1.64    3.36
The system is easy to get familiar with                       –       4.11
I would consider using the system                             –       3.71
Fig. 5 Average Likert scale rating of emotion wise localized component organization
to use our haptic jacket, 86 % answered affirmatively. Comparing our results with a similar
haptic jacket [12], which is intended to influence the emotions of movie watchers by creating
tactile stimuli on their bodies, we find that our system performs better and additionally
supports various emotions as well as different interactions such as hug, poke, touch, and
tickle.
5 Conclusion and future work
The research presented in this article outlines a wearable haptic system that is aimed at
enhancing user involvement in movie watching, video gaming, and similar activities by
influencing their emotional immersion. The design of this system directly incorporated
consumers' opinions on what the jacket should and should not do. The results of this study
assisted greatly in the design of the
basic architecture of the system. The main emotions we focused on were love, joy, surprise,
anger, sadness, and fear. We also included a few other interactions, including poke, tickle,
touch, and hug. We used vibrotactile and heat actuators, along with temperature sensors, in the
implementation of the jacket. 86 % of the respondents expressed interest in our system and
are willing to experience it further. In a nutshell, our proposed system is more engaging and
provides better immersion. As future improvements, air-pumping devices may be added to
simulate pressure and chest tightness when evoking joy, sadness, and hug. Furthermore, since
heart rates differ between people depending on age, sex, fitness, and other factors, a
heart-rate sensor could be attached to the jacket to read the user's actual heart rate; the
heartbeat stimulation in the jacket could then be a percentage increase or decrease relative
to the measured rate, depending on the intended emotion. In addition, future work on this
project could include analyzing facial expressions and communicating emotions to visually
impaired persons to improve social interaction.
References
1. Abdur Rahman M, Alkhaldi A, Cha J, El Saddik A (2010) Adding haptic feature to YouTube. MM '10: Proceedings of the International Conference on Multimedia. ACM, New York
2. Alam KM, El Saddik A, Hardy S, Akther A (2011) SMS text based affective haptic application. Proceedings of the Virtual Reality International Conference (VIRC 2011), April, Laval, France
3. Berkow R, Beers MH, Fletcher AJ (1997) Abnormal heart rhythms. In: The Merck manual of medical information, home edition. Merck Research Laboratories, New Jersey, p 79
4. Carlson M (1998) Understanding the Mother's Touch. Harvard Mahoney Neuroscience Institute Letter to the Brain 7(1):12–13
5. Cha J, Eid M, Barghout A, Rahman ASMM, El Saddik A (2009) HugMe: synchronous haptic teleconferencing. Proceedings of the 17th ACM International Conference on Multimedia, New York, NY, USA, pp 1135–1136
6. El Saddik A (2007) The potential of haptics technologies. IEEE Instrum Meas 10:10–17
7. El Saddik A, Orozco M, Eid M, Cha J (2011) Haptics technologies: bringing touch to multimedia. Springer-Verlag, ISBN 978-3-642-22657-1
8. Field T (2002) Infants' need for touch. Hum Dev 45:100–103
9. Hossain SKA, Rahman ASMM, El Saddik A (2010) Interpersonal haptic communication in Second Life. IEEE International Symposium on Haptic Audio-Visual Environments and Games, October
10. Izard CE (1977) Human emotions. Plenum Press, New York, p 120
11. Jones WD (2009) Jacket lets you feel the movies. IEEE Spectrum [online]. Available: http://spectrum.ieee.org/biomedical/devices/jacket-lets-you-feel-the-movies [accessed Dec. 1, 2011]
12. Lemmens P, Crompvoets F, Brokken D, van den Eerenbeemd J, de Vries G (2009) A body-conforming tactile jacket to enrich movie viewing. EuroHaptics Conference and Symposium on Haptic Interfaces for Virtual Environment and Teleoperator Systems 2009, pp 7–12
13. Ortony A, Turner TJ (1990) What's basic about basic emotions? Psychol Rev 97:315–331
14. Parrott WG (2001) Emotions in social psychology. Psychology Press, Philadelphia
15. Plutchik R (1980) A general psychoevolutionary theory of emotion. In: Plutchik R, Kellerman H (eds) Emotion: theory, research, and experience, vol 1: Theories of emotion. New York, pp 3–33
16. Rajae-Joordens RJE (2008) Measuring experiences in gaming and TV applications: investigating the added value of a multi-view auto-stereoscopic 3D display. In: Westerink JWD et al. (eds) Probing experience: from assessment of user emotions and behaviour to development of products, pp 77–90
17. Teh JKS, Cheok AD, Choi Y, Fernando CL, Peiris RL, Fernando ONN (2009) Huggy Pajama: a parent and child hugging communication system. Proceedings of the 8th International Conference on Interaction Design and Children (IDC), Como, Italy, pp 290–291
18. Teh JKS, Cheok AD, Peiris RL, Choi Y, Thuong V, Lai S (2008) Huggy Pajama: a mobile parent and child hugging communication system. Proceedings of the 7th International Conference on Interaction Design and Children (IDC), Chicago, IL, USA, pp 250–257
19. Tomkins SS (1984) Affect theory. In: Scherer KR, Ekman P (eds) Approaches to emotion. Lawrence Erlbaum, Hillsdale, pp 163–196
20. Tsetserukou D, Neviarouskaya A (2010) iFeel_IM!: augmenting emotions during online communication. IEEE Comput Graph Appl 30(5):72–80
21. Tsetserukou D, Neviarouskaya A, Prendinger H, Kawakami N, Tachi S (2009) Affective haptics in emotional communication. 3rd International Conference on Affective Computing and Intelligent Interaction and Workshops (ACII), pp 1–6
22. Viswanathan LN, McDaniel T, Krishna S, Panchanathan S (2010) Haptics in audio described movies. 2010 IEEE International Symposium on Haptic Audio-Visual Environments and Games (HAVE), pp 1–2
23. Wu W, Arefin A, Rivas R, Nahrstedt K, Sheppard R, Yang Z (2009) Quality of experience in distributed interactive multimedia environments: toward a theoretical framework. Proceedings of the 17th ACM International Conference on Multimedia (MM '09), New York, NY, USA
24. Yun T, Guan L (2009) Automatic fiducial points detection for facial expressions using scale invariant feature. IEEE International Workshop on Multimedia Signal Processing (MMSP '09), October
Faisal Arafsha received his M.A.Sc. degree in Electrical and Computer Engineering from the University of
Ottawa, Canada, in early 2012, and his B.Sc. in Computer Engineering from King Fahd University of Petroleum
and Minerals (KFUPM), Saudi Arabia, in 2009. He is now a Ph.D. candidate at the University of Ottawa and
has been working on embedded computing systems and haptic feedback research in the MCRLab since 2010.
Before that, he worked in the automation industry at one of the largest oil and gas corporations worldwide.
Kazi Masudul Alam received his M.A.Sc. degree in Computer Science from the University of Ottawa, Canada, in
2012, and his B.Sc. in Computer Science and Engineering from Khulna University, Bangladesh, in 2007. He is
now a Ph.D. candidate at the University of Ottawa, working on human-computer interaction and 3D big-data
visualization. His master's thesis concerned the design and development of a next-generation e-book reader
system providing an immersive reading experience. He has authored and co-authored 15 peer-reviewed research
articles in renowned conferences and journals. He is a faculty member (on leave) of the Computer Science and
Engineering discipline at Khulna University and also works as an independent information technology specialist.
Abdulmotaleb El Saddik (F2009) is University Research Chair and Professor in the School of Electrical
Engineering and Computer Science at the University of Ottawa. He held regular and visiting positions in Canada,
Spain, Saudi Arabia, UAE, Germany and China. He is an internationally-recognized scholar who has made strong
contributions to the knowledge and understanding of multimedia computing, communications and applications.
He has authored and co-authored four books and more than 400 publications, chaired more than 40 conferences
and workshops, and received research grants and contracts totaling more than $18 million. He has supervised
more than 100 researchers and has received several international awards, among them ACM Distinguished
Scientist, Fellow of the Engineering Institute of Canada, Fellow of the Canadian Academy of Engineers,
Fellow of IEEE, and the IEEE Canada Computer Medal.
... Nine devices were created for specific domains and seven were for general or unspecified domains. The specific application domains included aged care [29,30], health care [31], arts [32,33], psychology [34,35] and accessibility [36,37]. 1 of the included devices was completely passive, with a tactile button placed at a fixed position in relation to the hand providing feedback [38]. ...
... Six of these devices utilised vibrotactile feedback exclusively, while one implemented vibration as part of the force actuation [41] and four implemented other forms of feedback in addition to vibration. Other forms of haptic feedback included heat [33], pressure [29], and electrical muscle stimulation [43]. Vibrotactile feedback was used in a wide variety of applications compared to force feedback, and also was mostly used to portray abstracted information rather than literal simulations of touch. ...
... One device utilised vibration to portray distances from objects [36], while the other utilised vibration to alert the user to obstacles [37]. The devices utilising vibration also varied in position on the body, including the torso [32,33,36], arms [29,34,43] and head [37]. In contrast, all the devices utilising force feedback were mounted on the hand. ...
Preprint
This paper outlines the development of a wearable game controller incorporating vibrotacticle haptic feedback that provides a low cost, versatile and intuitive interface for controlling digital games. The device differs from many traditional haptic feedback implementation in that it combines vibrotactile based haptic feedback with gesture based input, thus becoming a two way conduit between the user and the virtual environment. The device is intended to challenge what is considered an "interface" and draws on work in the area of Actor-Network theory to purposefully blur the boundary between man and machine. This allows for a more immersive experience, so rather than making the user feel like they are controlling an aircraft the intuitive interface allows the user to become the aircraft that is controlled by the movements of the user's hand. This device invites playful action and thrill. It bridges new territory on portable and low cost solutions for haptic controllers in a gaming context.
... This review identified a number of devices under development that utilised different mechanisms of haptic feedback, including vibrotactile feedback. Many of these devices rely purely on vibrotactile feedback, though in some cases this was combined with other forms of haptic feedback including heat [19], pressure [20], and electrical muscle stimulation [21]. ...
... The devices utilising vibration also varied in position on the body, including the torso [19,23,25], arms [20][21][22] and head [24]. In contrast, the review identified that only the devices utilising force feedback were mounted on the hand. ...
Preprint
This paper outlines the development of a sensory feedback device providing a tangible interface for controlling digital environments, in this example a flight simulator, where the intention for the device is that it is relatively low cost, versatile and intuitive. Gesture based input allows for a more immersive experience, so rather than making the user feel like they are controlling an aircraft the intuitive interface allows the user to become the aircraft that is controlled by the movements of the user's hand. The movements are designed to allow a sense of immersion that would be difficult to achieve with an alternative interface. A vibrotactile based haptic feedback is incorporated in the device to further enhance the connection between the user and the game environment by providing immediate confirmation of game events. When used for navigating an aircraft simulator, this device invites playful action and thrill. It bridges new territory on portable, low cost solutions for haptic devices in gaming contexts.
... A major challenge in communicating emotions through vibrotactile encoding that they are a non-homogeneous category of information [26]. This means the psychophysiological effects of emotions can highly vary [4] and can differ depending on the body location of the vibrotactile stimulation [1]. A number of approaches have explored encoding across locations such as the wrist [25], forearm [22], hands [19,33], and the back [1,8,23,39]. ...
... This means the psychophysiological effects of emotions can highly vary [4] and can differ depending on the body location of the vibrotactile stimulation [1]. A number of approaches have explored encoding across locations such as the wrist [25], forearm [22], hands [19,33], and the back [1,8,23,39]. Predominantly, pattern creation strategies involve arbitrarily combining vibrotactile dimensions and then asking users to rate valence and arousal [23,25,28,37], rating the locations users assume an emotion is represented in the body [2], mapping facial expressions to vibration intensities [33], or user-generated patterns [8,36]. ...
... By combining different haptic technologies like vibrotactile and thermal, the available range of emotions evoked by the haptic stimulation can be enriched (e.g., Wilson & Brewster, 2017). At the moment, combining different haptic technologies like warmth and vibration has been rare, despite the fact that wearable devices which enable touch stimulation via multiple methods have been evaluated positively (Arafsha et al., 2015). ...
... State: Numerous actuation technologies (see Section 3) and materials (Biswas and Visell, 2019;Cruz et al., 2018) providing both tactile (Coe et al., 2019;Farooq et al., 2020;Evreinov et al., 2021) and kinesthetic output (Kim and Follmer, 2019;Elvitigala et al., 2022), have been developed for skin stimulation controlled with physical parameters (e. g., displacement, acceleration, electrical current, pressure) (Farooq et al., 2015). The studies that integrated more than one of these technologies into a single haptic interface showed that this can improve social and affective responses to the distant touch (Farooq et al., 2016b, Coe et al., 2019, Ahmed et al., 2016Arafsha et al., 2015;Wilson & Brewster, 2017;Messerschmidt et al., 2022). Using a combination of technologies, we can ensure the resulting feedback can deliver a wider bandwidth of haptic information (Tan et al., 2010). ...
Article
Full-text available
Touch between people is an integral part of human life. Touch is used to convey information, emotions, and other social cues. Still, everyday remote communication remains mainly auditive or audio-visual. The theme of this article, interpersonal haptic communication, refers to any communication system that supports mediation of touch between two or more persons. We first present a scoping review of the state of the art in interpersonal haptic communication, including physiological and psychological basis of touch, affective and social touch, and mediated social touch. We then discuss emerging research themes that shape the future of interpersonal haptic communication, identify research gaps and propose key research directions for each theme. Finally, societal impact and ethical aspects are discussed.
... As the human torso provides an extensive skin area to convey tactile information, torso-worn haptic displays deploying tactile spatial cues have gained increasing attention in recent years (Rupert 2000;Lemmens et al. 2009;Arafsha et al. 2015;Lentini et al. 2016;Wacker et al. 2016;Buimer et al. 2018;Garcia-Valle et al. 2018). Moreover, while providing tactile information on the torso, a person's active body parts, such as hands and fingers, remain fully available for daily living activities. ...
... The majority of torso-worn tactile displays developed in the last 2 decades have commonly adopted miniature affordable vibrotactile stimulators in commercial and experimental frameworks (Arafsha et al. 2015;Karafotias et al. 2017;Garcia-Valle et al. 2018). For instance, Van Erp and colleagues have employed torso-worn vibrotactile displays for use as a pedestrian navigation system (van Erp et al. 2003(van Erp et al. , 2005(van Erp et al. , 2005b(van Erp et al. , 2007. ...
Article
Full-text available
There is a steadily growing number of mobile communication systems that provide spatially encoded tactile information to the humans’ torso. However, the increased use of such hands-off displays is currently not matched with or supported by systematic perceptual characterization of tactile spatial discrimination on the torso. Furthermore, there are currently no data testing spatial discrimination for dynamic force stimuli applied to the torso. In the present study, we measured tactile point localization (LOC) and tactile direction discrimination (DIR) on the thoracic spine using two unisex torso-worn tactile vests realized with arrays of 3 × 3 vibrotactile or force feedback actuators. We aimed to, first, evaluate and compare the spatial discrimination of vibrotactile and force stimulations on the thoracic spine and, second, to investigate the relationship between the LOC and DIR results across stimulations. Thirty-four healthy participants performed both tasks with both vests. Tactile accuracies for vibrotactile and force stimulations were 60.7% and 54.6% for the LOC task; 71.0% and 67.7% for the DIR task, respectively. Performance correlated positively with both stimulations, although accuracies were higher for the vibrotactile than for the force stimulation across tasks, arguably due to specific properties of vibrotactile stimulations. We observed comparable directional anisotropies in the LOC results for both stimulations; however, anisotropies in the DIR task were only observed with vibrotactile stimulations. We discuss our findings with respect to tactile perception research as well as their implications for the design of high-resolution torso-mounted tactile displays for spatial cueing.
... As the human torso provides an extensive skin area to convey tactile information, torso-worn haptic displays deploying tactile spatial cues have gained increasing attention in recent years (Rupert 2000;Lemmens et al. 2009;Arafsha et al. 2015;Lentini et al. 2016;Wacker et al. 2016;Buimer et al. 2018;Garcia-Valle et al. 2018). Moreover, while providing tactile information on the torso, a person's active body parts, such as hands and fingers, remain fully available for daily living activities. ...
... The majority of torso-worn tactile displays developed in the last two decades have commonly adopted miniature affordable vibrotactile stimulators in commercial and experimental frameworks (Arafsha et al. 2015;Karafotias et al. 2017;Garcia-Valle et al. 2018). For instance, Van Erp and colleagues have employed torso-worn vibrotactile displays for use as a pedestrian navigation system (van Erp et al. 2003;Van Erp et al. 2005;Van Erp 2005b;van Erp 2007). ...
Preprint
Full-text available
There is a steadily growing number of mobile communication systems that provide spatially encoded tactile information to the humans' torso. However, the increased use of such hands-off displays is currently not matched with or supported by systematic perceptual characterization of tactile spatial discrimination on the torso. Furthermore, there are currently no data testing spatial discrimination for dynamic force stimuli applied to the torso. In the present study, we measured tactile point localization (PL) and tactile direction discrimination (DD) on the thoracic spine using two unisex torso-worn tactile vests realized with arrays of 3x3 vibrotactile or force feedback actuators. We aimed to, firstly, evaluate and compare the spatial discrimination of vibrotactile and force stimulations on the thoracic spine and, secondly, to investigate the relationship between the PL and DD results across stimulations. Thirty-four healthy participants performed both tasks with both vests. Tactile accuracies for vibrotactile and force stimulations were 60.7% and 54.6% for the PL task; 71.0% and 67.7% for the DD task, respectively. Performance correlated positively with both stimulations, although accuracies were higher for the vibrotactile than for the force stimulation across tasks, arguably due to specific properties of vibrotactile stimulations. We observed comparable directional anisotropies in the PL results for both stimulations; however, anisotropies in the DD task were only observed with vibrotactile stimulations. We discuss our findings with respect to tactile perception research as well as their implications for the design of high-resolution torso-mounted tactile displays for spatial cueing.
... Furthermore, other studies have designed tactile devices, such as vests to enhance people's affective communication through multimedia [34], or to evoke specific emotions through vibration and heat [35]. Additional works have focused on enhancing the moviewatching experience through tactile stimulation [10,36,37], similar to Lemmens et al.'s [10] multisensory jacket, designed with 16 different segments of 4 motors each capable of generating synchronized vibrotactile stimuli during moments in a film. ...
Article
Full-text available
Recent psychology and neuroscience studies have used tactile stimuli in patients, concluding after their experiments that touch is a sense tightly linked to emotions. In parallel, a new way of seeing films, 4D cinema, has added new stimuli to the traditional audiovisual via, including the tactile vibration. In this work, we have studied the brain activity of audience while viewing a scene filmed and directed by us and with an emotional content, under two different conditions: 1) image + sound, 2) image + sound + vibro-tactile stimulation. We have designed a glove where pulse trains are generated in coin motors at specific moments and recorded 35 viewers’ electroencephalograms (EEGs) to evaluate the impact of the vibro-tactile stimulation during the film projection. Hotelling’s T-squared results show higher brain intensity if the tactile stimulus is received during the viewing than if no tactile stimulus is injected. Condition 1 participants showed activation in left and right orbitofrontal areas, whereas Condition 2 they also showed activities in right superior frontal and right-medial frontal areas. We conclude that the addition of vibrotactile stimulus increases the brain activity in areas linked with attentional processes, while producing a higher intensity in those related to emotional processes.
... During the development of devices for rendering affective haptic stimuli, approaches have been investigated at various sites on the body with a variety of different working mechanisms [17]- [27]. Specifically for the forearm, different systems were investigated, mainly with vibration [16], [28]- [30]. ...
Article
Full-text available
Physically accurate (authentic) reproduction of affective touch patterns on the forearm is limited by actuator technology. However, in most VR applications a direct comparison with actual touch is not possible. Here, the plausibility is only compared to the user's expectation. Focusing on the approach of plausible instead of authentic touch reproduction enables new rendering techniques, like the utilization of the phantom illusion to create the sensation of moving vibrations. Following this idea, a haptic armband array (4x2 vibrational actuators) was built to investigate the possibilities of recreating plausible affective touch patterns with vibration. The novel aspect of this work is the approach of touch reproduction with a parameterized rendering strategy, enabling the integration in VR. A first user study evaluates suitable parameter ranges for vibrational touch rendering. Duration of vibration and signal shape influence plausibility the most. A second user study found high plausibility ratings in a multimodal scenario and confirmed the expressiveness of the system. Rendering device and strategy are suitable for a various stroking patterns and applicable for emerging research on social affective touch reproduction.
Article
Emotion recognition based on electroencephalogram (EEG) signals has been one of the most active research topics of affective computing. In previous studies of emotion recognition, the selection of stimulus sources was usually focused on single stimuli, such as visual or auditory. In this work, we propose a novel emotional stimulation scheme that synchronizes haptic vibration with audiovisual content to form a mixed sense of visual-auditory-haptic to trigger emotions. Fifteen subjects were recruited to watch the four kinds of emotional movie clips (happiness, fear, sadness, and neutral) with haptic or not, and their EEG signals were collected simultaneously. The power spectral density (PSD) feature, differential entropy (DE) feature, wavelet entropy (WE) feature, and brain function network (BFN) feature were extracted and fused to reflect the time-frequency-spatial domain of emotional EEG signals. The t-distributed stochastic neighbor embedding (t-SNE) was utilized for dimensionality reduction and feature selection. In addition, the fusion features are classified by the stacking ensemble learning framework. The experimental results show that the proposed haptic vibration strategy can enhance the activity of emotion-related brain regions, and the average classification accuracy was 85.46%.
Article
Wearable haptic garments for communicating emotions have great potential in various applications, including supporting social interactions, improving immersive experiences in entertainment, or simply as a research tool. Shape-memory alloys (SMAs) are an emerging and interesting actuation scheme for affective haptic garments since they provide coupled warmth and compressive sensations in a single actuation---potentially acting as a proxy for human touch. However, SMAs are underutilized in current research and there are many unknowns regarding their design and use. The goal of this work is to map the design space for SMA-based garment-mediated emotional communication through warm, compressive actuation (termed 'warm touch'). Two online surveys were deployed to gather user expectations in using varying 'warm touch' parameters (body location, intensity, pattern) to communicate 7 distinct emotions. Further, we also investigated mental models used by participants during the haptic strategy selection process. The findings show 5 major categories of mental models: representation of body sensations, replication of typical social touch strategies, metaphorical representation of emotions, symbolic representation of physical actions, and mimicry of objects or tasks; the frequency of use of each mental framework in relation to the selected 'warm touch' parameters in the communication of emotions is presented. These gathered insights can inform more intuitive and consistent haptic garment design approaches for emotional communication.
Article
Full-text available
The iFeel_IM! system employs six haptic devices and visual stimulation to convey and augment the emotions experienced during online conversations. The Web extra is a video demonstrating the system.
Article
The longer-term use of multi-view 3D displays in gaming applications was investigated with 20 experienced gamers. During two gaming sessions, one in 2D and one in 3D, galvanic skin response and heart rate were measured (both assumed to assess emotions and presence), followed by questionnaires at the end of each session. The results show that 3D displays provoke significantly higher positive emotions and stronger feelings of presence than 2D displays, indicating that 3D displays have added value in gaming applications.
Book
Human emotions
Conference Paper
In this article, we propose a new mobile application that automatically detects secondary-level emotions from SMS text by applying lexical rule-based analysis and maps the emotions to unique vibrotactile haptic actions. In our prototype application, an SMS receiver can feel an affective haptic effect through her mobile phone or wearable haptic jacket before reading any SMS. We envision such an application as a preview of an SMS's emotional content, guiding the receiver to read or discard a message in various contexts. We present our emotion analysis method, the application architecture, a performance analysis, and users' perception of the application.
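A lexical-rule mapping from SMS text to vibrotactile actions could be sketched as follows. The keyword lexicon and the (on_ms, off_ms, repeats) pattern encoding are illustrative assumptions, not the cited prototype's actual rules, which handle richer secondary-level emotions.

```python
# Hypothetical keyword lexicon; a real system would use a fuller
# emotion lexicon and rules (negation, intensifiers, emoticons).
LEXICON = {
    "love": "love", "miss": "love",
    "happy": "joy", "great": "joy",
    "angry": "anger", "hate": "anger",
    "scared": "fear", "afraid": "fear",
}

# Each emotion maps to a vibrotactile pattern: (on_ms, off_ms, repeats).
PATTERNS = {
    "love": (400, 200, 2),
    "joy": (150, 100, 4),
    "anger": (600, 50, 3),
    "fear": (80, 80, 6),
    "neutral": (200, 0, 1),
}

def emotion_of(sms: str) -> str:
    """Return the first emotion whose keyword appears in the message."""
    for word in sms.lower().split():
        token = word.strip(".,!?")
        if token in LEXICON:
            return LEXICON[token]
    return "neutral"

def haptic_pattern(sms: str):
    """Vibrotactile pattern to play before the message is read."""
    return PATTERNS[emotion_of(sms)]
```

On receipt, the phone (or jacket) would play `haptic_pattern(text)` as its emotion preview before the user opens the message.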
Article
Hertenstein [2001] readily points out that the communicative functions of touch during infancy have been thoroughly neglected relative to the other senses. He extensively reviews the literature on infants’ responses to positive and negative forms of touch and postulates a model in which the stimulus qualities interact with the mother-infant interactive context. As he suggests, this model helps ‘conceptualize the dynamic and bi-directional nature of tactile communication ...’.
Article
The general psychoevolutionary theory of emotion that is presented here has a number of important characteristics. First, it provides a broad evolutionary foundation for conceptualizing the domain of emotion as seen in animals and humans. Second, it provides a structural model which describes the interrelations among emotions. Third, it has demonstrated both theoretical and empirical relations among a number of derivative domains including personality traits, diagnoses, and ego defenses. Fourth, it has provided a theoretical rationale for the construction of tests and scales for the measurement of key dimensions within these various domains. Fifth, it has stimulated a good deal of empirical research using these tools and concepts. Finally, the theory provides useful insights into the relationships among emotions, adaptations, and evolution.
Chapter
The longer-term use of multi-view 3D displays in gaming and TV applications was investigated in two experiments. Participants played a video game on a 20″ 3D monitor (experiment 1) or watched a movie on a 42″ 3D display for 90 minutes (experiment 2); half of the time in 3D, the other half in 2D with the same temporal and spatial resolution as the 3D mode. Meanwhile, galvanic skin response and heart rate were measured (both assumed to assess emotions and presence), followed by questionnaires at the end of each session. Gaming performance (experiment 1) and memory performance (experiment 2) were also measured. The results show that 3D displays provoke significantly higher positive emotions and stronger feelings of presence than 2D displays in the gaming application, and are highly preferred by a large majority (85%) of the participants. Watching TV on a 3D display does not significantly evoke more emotions than on 2D displays, although a trend towards an increase in emotions is found, and 3D is preferred above 2D, or at least not found to be annoying by 95% of the participants. In conclusion, 3D displays have added value in gaming applications, and to a lesser extent to TV applications with a relatively low amount of depth added.
Conference Paper
Detecting fiducial points successfully in facial images or video sequences can play an important role in numerous facial image interpretation tasks such as face detection and identification, facial expression recognition, emotion recognition, and face image database management. In this paper we propose an automatic and robust method of facial fiducial point detection for facial expression analysis in video sequences using scale-invariant-feature-based Adaboost classifiers. The face region is first located using a face detector with local normalization and an optimal adaptive correlation technique. Candidate points are then selected over the face region using local scale-space extrema detection, and the scale-invariant feature for each candidate point is extracted for further examination. We choose 26 fiducial points on the face region from training samples to build the fiducial point detectors with Adaboost classifiers. All candidate points in the test samples are examined through these detectors, and all 26 facial fiducial points are located on each frame of the test samples. The Cohn-Kanade database and the Mind Reading DVD are used for the experiments. The results show that our method achieves an average recognition rate of 90.69%.
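The Adaboost classifiers used for the point detectors combine weighted weak learners, each too weak on its own, into a strong vote. A minimal sketch with one-dimensional threshold stumps follows; the scalar feature is a stand-in for the scale-invariant descriptors used in the paper.

```python
import math

def train_adaboost(X, y, rounds=10):
    """Minimal AdaBoost with threshold stumps on scalar features.
    X: list of feature values; y: labels in {-1, +1}.
    Returns a list of (threshold, polarity, alpha) weak classifiers."""
    n = len(X)
    w = [1.0 / n] * n  # uniform initial sample weights
    ensemble = []
    for _ in range(rounds):
        best = None
        # Exhaustively pick the stump with lowest weighted error.
        for thr in sorted(set(X)):
            for pol in (1, -1):
                preds = [pol if x >= thr else -pol for x in X]
                err = sum(wi for wi, p, yi in zip(w, preds, y) if p != yi)
                if best is None or err < best[0]:
                    best = (err, thr, pol, preds)
        err, thr, pol, preds = best
        err = min(max(err, 1e-10), 1 - 1e-10)  # avoid log(0)
        alpha = 0.5 * math.log((1 - err) / err)  # stump's vote weight
        ensemble.append((thr, pol, alpha))
        # Up-weight misclassified samples, then renormalize.
        w = [wi * math.exp(-alpha * yi * p) for wi, yi, p in zip(w, y, preds)]
        s = sum(w)
        w = [wi / s for wi in w]
    return ensemble

def predict(ensemble, x):
    """Sign of the alpha-weighted vote of all stumps."""
    score = sum(alpha * (pol if x >= thr else -pol)
                for thr, pol, alpha in ensemble)
    return 1 if score >= 0 else -1
```

In the fiducial-point setting, one such ensemble per landmark would score each candidate point's descriptor, and the highest-scoring candidate is taken as that landmark's location.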