As Light as your Footsteps: Altering Walking Sounds to Change Perceived Body Weight, Emotional State and Gait

Ana Tajadura-Jiménez¹, Maria Basia¹, Ophelia Deroy², Merle Fairhurst², Nicolai Marquardt¹, Nadia Bianchi-Berthouze¹

¹ UCL Interaction Centre, UCL, London, UK
[a.tajadura, maria.basia.13, n.marquardt, n.berthouze]@ucl.ac.uk

² Centre for the Study of the Senses, University of London, London, UK
[ophelia.deroy, merle.fairhurst]@sas.ac.uk
ABSTRACT
An ever more sedentary lifestyle is a serious problem in our
society. Enhancing people’s exercise adherence through
technology remains an important research challenge. We
propose a novel approach for a system supporting walking
that draws from basic findings in neuroscience research.
Our shoe-based prototype senses a person’s footsteps and
alters in real-time the frequency spectra of the sound they
produce while walking. The resulting sounds are consistent
with those produced by either a lighter or heavier body. Our
user study showed that modified walking sounds change
one’s own perceived body weight and lead to a related gait
pattern. In particular, augmenting the high frequencies of
the sound leads to the perception of having a thinner body
and enhances the motivation for physical activity, inducing a more dynamic swing and a shorter heel strike. We discuss the opportunities and questions that our findings open.
Author Keywords
Auditory body perception; multimodal interfaces; sonification; interaction styles; emotion; evaluation method
ACM Classification Keywords
H.5.2. User Interfaces: Auditory (non-speech) feedback;
interaction styles.
INTRODUCTION
Our societies are seeing an increase in inactive and sedentary lifestyles, which causes an estimated 6% of deaths each year and is a risk factor for many chronic diseases [36,61]. Further, concern about general body appearance is reflected in the growth of the fitness industry and of surgical procedures to change one’s appearance [32]. This has led to a growing awareness of the need to make people feel good about their bodies and
motivate them toward physical activity so that they stay
physically and mentally healthy and independent [33].
Figure 1. Overview of the experimental setup: (left) manipulating sound feedback; (right) sensing gait and galvanic skin response (GSR).
To tackle the problem, various activity tracking devices and
applications are appearing on the market, such as Fitbit or
Nike Fuelband. To increase exercise adherence, these tech-
nologies typically exploit cognitive behavioral strategies
based on setting goals and providing rewards [21]. Here we
take a very different and complementary approach that al-
ters the perception of one’s own body in order to enhance
self-esteem, body feelings and the motivation for and quali-
ty of physical activity. Specifically, we propose to exploit
the multisensory nature of body perception [55] and induce
changes in perceived body weight by manipulating the
sound feedback received while walking (Figure 1).
Our physical body is our means of perceiving the world but
also our means of expression, of acting and interacting [2,
16,45]. How we perceive our body in terms of its appear-
ance and its action capabilities is thus vital for self-esteem
and for engaging in physical activity and social interaction.
Interestingly, recent neuroscience research has shown that,
rather than being fixed, the perception of our body (mental
body model) is continuously updated by the body-related
multisensory experiences that accompany our interactions
with the environment [3,51,55]. Theories of ‘forward internal models’ of motor-to-sensory transformations [60]
suggest that we predict the sensory feedback (e.g., the sound of
our steps) we should receive from our actions by consider-
ing, among other factors, the mental model of one’s body
dimensions and configuration. When the received sensory
feedback does not match these predictions an update of our
internal body model may occur.
The possibility of altering the perceived physical appear-
ance and the physical capabilities of one’s own body
through sensory feedback derived from one’s actions may
offer a practical way to make people feel good about their
bodies [9,41], thus facilitating healthy changes in self-
esteem and motor behavior. In our paper, we explore how
sound feedback received while walking may be altered to
induce changes in body perception and related emotion and
behavior. Walking was chosen as it is a gentle and easy everyday form of physical activity, suitable for people of all ages and most abilities, with many associated health benefits such as a reduced risk of many chronic diseases [34].
Sound feedback was chosen for three reasons further devel-
oped in the background section. First, sound offers an ex-
cellent potential for consumer applications used during
walking, when headphones and portable sound devices are
often used. Second, the link between walking sounds and
the perceived walker’s weight is known. Heavier bodies
produce lower spectral mode sounds than lighter bodies and
listeners pick up on these cues when estimating properties
of the heard walker [17,29]. While prior studies explored
the effects of passive listening to walking sounds on the perceived body of an unknown walker, our study focuses on
changes in the perception of one’s own body due to altering
self-produced walking sounds. Finally, sound has been shown to have positive effects in sport, dance and motor rehabilitation [11,19,42,46,58], such as enhancing body awareness and movement coordination. Sensory feedback has been shown, for example, to ease motor learning by enhancing body-related information, such as the distance to a target posture [30,48], and to improve self-efficacy [49]. Yet, none of
these studies has explored the possibility of altering one’s
own body size perception with sound, to enhance physical
performance, self-esteem and positive attention to one’s
body. Such feedback could be integrated in applications
such as Fitbit to complement its strategies to increase exer-
cise adherence. Our paper aims to make both a theoretical
and technological contribution in this direction by propos-
ing and evaluating a system that effectively uses bodily-
related sound feedback to evoke changes in the perceived
body and measure changes in behavior and emotion.
BACKGROUND AND RELATED WORK
Walking Patterns (Gait) and Acoustic Signals
Gait is a periodic movement of each of the lower limbs
from one position of support to the subsequent one [56]. It
varies greatly across individuals, due to factors such as age, gender or weight. Troje et al. [54] explored the link between gait and
these various factors. They developed an application1,
based on motion data captured from eighty walkers, which
displays an animated point-light walker whose gait changes
according to various settings. In particular, the light-heavy
setting appears to change the acceleration of the limbs and
the straightness of the walker’s posture. Other work showed that obese walkers, as compared to normal-weight walkers, display a longer heel strike and reduced ankle mechanical joint power, which relates to lower acceleration of the forward progression [25].
When walking, the force exerted by the foot against the
ground can be represented as a time varying spectrum [57].
The low frequency components (under 300 Hz) are respon-
sible for the displacement of the walker’s center of mass.
They depend on the walker’s weight and walking rate and
are essentially independent of footgear or walking surface
[15], unlike the high frequency components, which can
depend on the footgear and ground material [57]. The im-
pact of the foot on the ground gives rise to acoustic signals whose frequency spectra share properties with the ground force spectra. For instance, walking sounds produced by
heavier bodies contain more energy in the low frequency
components than those sounds produced by lighter bodies.
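As an illustration of this spectral distinction, the following minimal Python sketch (not part of the study) estimates how the energy of a recorded footstep splits between the low-frequency band tied to body mass and a higher band tied to footgear and ground; the file name footstep.wav and the exact band edges are assumptions made for the example.

```python
# A minimal sketch (not the authors' code) of the spectral energy split of a footstep
# recording; 'footstep.wav' is a hypothetical mono file.
import numpy as np
from scipy.io import wavfile
from scipy.signal import welch

rate, signal = wavfile.read("footstep.wav")
if signal.ndim > 1:
    signal = signal[:, 0]                      # a mono recording is assumed
signal = signal.astype(float) / np.max(np.abs(signal))

freqs, psd = welch(signal, fs=rate, nperseg=2048)  # power spectral density

low_band = psd[(freqs >= 20) & (freqs < 300)].sum()     # components tied to body mass
high_band = psd[(freqs >= 300) & (freqs <= 4000)].sum() # components tied to footwear/ground

print(f"low/high energy ratio: {low_band / high_band:.2f}")
# Heavier walkers would be expected to yield a larger ratio (more low-frequency energy).
```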
Information Conveyed by Walking Sounds
Listeners can extract various sorts of information from
walking sounds (for a review see [57]). For instance, they
can discriminate between solid and aggregate ground mate-
rials based solely on walking sounds [18]. They can also
extract properties of a walker’s body from the acoustic fea-
tures of the walking sounds, including the gender, the emo-
tional state and the size/hardness of the shoe soles of the
heard walker [17]. Judgments of gender and shoe character-
istics depend on the sound spectral characteristics and
judgments of emotional state depend on the sound intensity
and temporal features (average pace, pace irregularity).
Listeners also rely on the sound spectra when asked to
judge the posture of a heard walker (upright, stooped) [37].
Li et al. [29] found that the acoustic features from which
listeners deduce the gender of a heard walker, including
spectral peak and high frequency components, change with
the weight and height of the walker. In fact, these authors
demonstrated that shifting the spectral mode of the walking
sounds to lower frequencies (at least to 125 Hz) increased
the ‘male’ reports, while shifting it to higher frequencies
(1000 Hz) increased the ‘female’ reports.
Hearing One’s Footsteps Changes Gait and Emotion
While several studies have investigated the effects of vibro-
tactile patterns [59], musical temporal patterns [50] and
musically induced arousal [28] on human gait, as well as
the effects of a broad range of sounds in emotion [5], few
studies have focused on the specific effect of self-produced
walking sounds. Some of these studies have suggested that
1 biomotionlab.ca/Demos/BMLwalker.html
the use of sounds, including footsteps and other sounds,
synthesized in real-time from users’ gait pattern improves
navigation and interaction with a virtual reality (VR) envi-
ronment [35]. Sound also enhances robot-assisted lower
extremity motor adaptation during physical rehabilitation
[62]. Another study [31] showed that delaying the provision
of self-produced footstep sounds altered the gait period
without the walkers being aware of the change. This study
also showed that the delays in sound feedback reduced
agency, i.e., the feeling of having produced the sounds.
Two recent studies have looked at how changing the per-
ceived ground or shoe material by altering self-produced
walking sounds can affect a walker’s emotional state and
motor behavior. Listening to pre-recorded footstep sounds
produced by high heels of different materials and different
types of ground affects women’s emotional state when
these sounds are presented in synchrony with their own
footsteps [53]. Similarly, Bresin et al. [7] showed that hear-
ing self-produced footstep sounds as if they were produced
on different ground surfaces seems to influence both walk-
ers’ emotion and their walking speed. However, the behav-
ioral results did not reach significance, perhaps due to par-
ticipants being over-conscious about their movement, as
they were asked to walk in a specific emotion-related style.
Using Walking Sounds to Change the Perceived Body
Most neuroscience studies on the plasticity of our mental
body model have focused on visual and tactile feedback to
alter people’s perception of their body (e.g., perceiving and
acting as if one’s arm was longer). Recently, a few studies
have shown that sound can also be used for this purpose.
For example, sounds generated by tapping one’s hand on a
surface inform us of how long our tapping arm is [51] as
well as the strength applied when tapping [14]: altering
these sounds results in changes in arm extent and strength,
as well as in movements and emotional state [14]. Sound
may also change the perceived body material properties.
For example, one’s hand feels stiffer and heavier when al-
tering the sound received when an object hits the hand so
that it resembles the sound produced when the object hits
marble [47]; or one’s body feels as if composed of metallic
parts or ‘robotized’ if, when moving it, one receives sound
and vibro-tactile feedback constructed from recordings of a
real robot actuation [26]. The variation of sound spectra and
amplitude also alters the perceived body material proper-
ties. Increasing both the high frequency components and the
overall amplitude of the sound elicited when rubbing two
hands together increases the perceived smoothness and the
perceived dryness of the skin [20,24].
The studies discussed above suggest we use the sounds de-
rived from bodily movements to build a model of our body
appearance and of its capabilities for interaction, and that
these sounds influence behavior and emotion. Here we ex-
plore for the first time how these sounds can alter the per-
ception of one’s own body weight, and look at how these
changes may impact on emotion and gait behavior.
METHOD
Motivation and Hypothesis
The aim of the study is to investigate the effects of walking
sounds on one’s body perception to inform the design of
ubiquitous technology for behavioral changes and wellbe-
ing. Given the previous work showing how shifting the
frequency mode of walking sounds influences the perceived
weight of the heard walker (i.e., another person’s body
[29]), we posited that altering self-produced footstep sounds
may likewise result in changes in the perceived weight of
one’s own body. We hypothesized that shifting the spectral
mode of self-produced footstep sounds to lower frequencies
may result in a perceived heavier body, while shifting it to
higher frequencies may result in a perceived lighter body.
We further posit that these changes in perceived body
weight may come together with behavioral and emotional
changes given the tight links between these three dimen-
sions: by perceiving one’s body as lighter, one may feel
more positive about this body and behave as if it were
lighter. For instance, this perception may accelerate move-
ments of the lower limbs [25,54] or induce an upright pos-
ture, which in turn may affect emotional arousal and domi-
nance (i.e., feeling in control, stronger). Indeed, upright
posture and dominance are known to relate to each other
[8,37,54]. On the contrary, feeling heavier may result in a
longer heel strike [25] and/or a larger force exerted by the feet against the ground [15,57]. Feeling lighter may also be reflected in other experiential aspects; for instance, one may feel faster or stronger with this new body.
Participants
Twenty-two paid participants (aged 18-35; four male, eighteen female; normal hearing), naïve to the study aim, took part in the experiment. Their mean (SD) body weight and height were 59.25 (10.55) kg and 164.82 (6.91) cm.
Figure 2. (left) Sandals with microphones, FSRs and accelerometer; (right) a participant during one condition2.
2 Ten additional EMG/accelerometer sensors were attached to the walker’s body for further analyses (not presented here).
Materials
Our prototype allows the dynamic modification of footstep
sounds, as people walk, and measurement of walking be-
havior changes. We used strap sandals (EU size 42) that are easy to wear, fit a wide range of foot sizes, and have a hard rubber sole, so that they elicit clear and distinctive footstep sounds. As shown in Figures 1 and 2, each sandal was equipped with a microphone (Core Sound, frequency response 20 Hz-20 kHz) that captured the walking sounds, and two force-sensitive resistors (FSR; 1.75x1.5" sensing area) attached to the front and rear parts of the sandal insole that detected the force exerted by the feet against the ground (as
in [7]). A triple axis accelerometer (Sparkfun) was attached
to the walker’s left ankle. FSRs and an accelerometer were
connected to an Arduino Uno microcontroller board linked
via Bluetooth (Arduino XBee) to a computer that acquired
their data (Arduino 1.0.5, Processing 2).
As shown in Figure 1, the microphones were connected to a small stereo pre-amplifier (SP-24B), which in turn was connected to a stereo 9-band graphic equalizer (Behringer FBQ800) that
changed the sound spectra. The resulting sound was fed
back via closed headphones (Sennheiser HDA 300) with
high passive ambient noise attenuation (>30 dBA) that muf-
fled the actual sound of footsteps. The analogue sound loop
had minimal latency (<1 ms). The Arduino board, pre-amp and equalizer were fitted into a small backpack that the walker carried (~2 kg, 35x29x10 cm). The cables from the sandals to the backpack were attached with Velcro straps to the walker’s legs. Finally, a GSR sensor (Affectiva Q Sensor; 32 Hz) on the walker’s wrist measured emotional arousal [4].
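For readers wishing to reproduce a comparable acquisition chain, a minimal logging sketch is given below. It is not the study’s actual software; the serial port name and the comma-separated message format streamed by the Arduino are assumptions.

```python
# A minimal sketch, not the study's acquisition code, of logging the FSR and
# accelerometer values that an Arduino board streams over a (Bluetooth) serial port.
import csv
import time
import serial  # pyserial

PORT = "/dev/tty.ARDUINO-XBEE"   # hypothetical Bluetooth serial port name
with serial.Serial(PORT, 9600, timeout=1) as link, \
        open("gait_log.csv", "w", newline="") as out:
    writer = csv.writer(out)
    writer.writerow(["t", "fsr_heel", "fsr_toe", "acc_x", "acc_y", "acc_z"])
    while True:                      # stop with Ctrl+C (sketch only)
        line = link.readline().decode(errors="ignore").strip()
        if not line:
            continue
        try:
            heel, toe, ax, ay, az = (float(v) for v in line.split(","))
        except ValueError:
            continue                 # skip malformed packets
        writer.writerow([time.time(), heel, toe, ax, ay, az])
```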
The experiment was conducted in a quiet and dimly lit
room, with an 8.5x1.22 m corridor created using wooden boards (medium-density fibreboard, 2.5 cm thick; Figure 2). The ground and footwear materials are relevant as they affect the resulting sounds [29]; the hard rubber soles in contact with the wooden boards produce clear sounds. We placed vertical panels along the corridor sides to occlude objects in the
lab that could distract people’s attention or could have
served as a reference when judging one’s body dimensions.
Figure 3. (left) Body visualization tool; (right-top) examples of FSR data and (right-bottom) accelerometer data.
Experimental Design
Sound Feedback Conditions
Three sound feedback conditions were designed based on
[29]: a ‘Control’ condition in which participants were pro-
vided with their natural footsteps sounds equally amplified
across frequency bands; a ‘High frequency’ condition in
which the frequency components of the footsteps sounds in
the range 1–4 kHz were further amplified by 12 dB and
those in the range 83–250 Hz were attenuated by 12 dB;
and a ‘Low Frequency’ condition in which the frequency
components in the range 83–250 Hz were further amplified
by 12 dB and those above 1 kHz were attenuated by 12 dB.
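A purely digital approximation of these two conditions can be sketched as follows. The study itself used the analogue pre-amplifier/equalizer chain described above, so this is only an illustrative offline equivalent; footstep.wav is a hypothetical input file, and a 1–4 kHz band stands in for the ‘above 1 kHz’ attenuation.

```python
# A digital approximation (not the analogue Behringer chain used in the study) of the
# two manipulated feedback conditions: +/-12 dB gains on the 83-250 Hz and 1-4 kHz bands.
import numpy as np
from scipy.io import wavfile
from scipy.signal import butter, sosfiltfilt

def band(signal, rate, lo, hi):
    sos = butter(4, [lo, hi], btype="bandpass", fs=rate, output="sos")
    return sosfiltfilt(sos, signal)

def equalize(signal, rate, low_gain_db, high_gain_db):
    low = band(signal, rate, 83, 250) * (10 ** (low_gain_db / 20) - 1)
    high = band(signal, rate, 1000, 4000) * (10 ** (high_gain_db / 20) - 1)
    return signal + low + high        # add boosted/attenuated band energy back in

rate, steps = wavfile.read("footstep.wav")   # hypothetical recording
if steps.ndim > 1:
    steps = steps[:, 0]                      # a mono recording is assumed
steps = steps.astype(float)

high_freq_condition = equalize(steps, rate, low_gain_db=-12, high_gain_db=+12)
low_freq_condition = equalize(steps, rate, low_gain_db=+12, high_gain_db=-12)
```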
Multi-Measurement Approach
The user experience was evaluated by combining self-
reporting, physiological and objective behavioral measures:
Perceived Body Weight: To measure perceived body
weight, we used a body visualization tool (bodyvisualizer.com; see Figure 3) adopted by other studies for the same purpose [39]. People adjusted the weight-related dimension
of the body of a 3D avatar displayed on the screen to corre-
spond to their perceived body size [10,27,39].
Behavioral Changes (Gait Patterns): The two phases of the gait cycle (i.e., the time between two successive steps made by the same foot [12]), the ‘stance’ and the ‘swing’, were analysed. The stance phase starts with the strike of the heel
on the ground and ends when the toes lose contact with the
ground. The FSR data of the left foot were used to quantify the force exerted by the heel and toes against the ground (peak and mean values) and their contact times, as well as
the stance and the gait cycle times (see Figure 3). We fo-
cused on data from the left foot since we did not have any
specific hypothesis on left/right asymmetries (additionally,
data from one right sensor seemed abnormal). The swing
phase starts with the foot lifting, first accelerating and then
decelerating (midswing) while preparing for the next heel
strike and while the other foot is on the ground. The foot
accelerates again when the flexor muscles are activated to
move the foot forward and downwards [56]. The accel-
erometer data was used to quantify the foot lifting accelera-
tion and deceleration, as well as the downward acceleration.
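The following sketch illustrates, under assumptions and not as the authors’ analysis code, how per-step heel contact, stance and gait cycle times could be derived from such force traces; the sampling rate, contact threshold and .npy file names are hypothetical.

```python
# A minimal sketch of per-step timing extraction from FSR traces; sampling rate,
# threshold and file names are assumptions for the example.
import numpy as np

SAMPLE_RATE = 100          # Hz, assumed sampling rate of the FSR stream
THRESHOLD = 0.1            # normalized force level taken to indicate ground contact

def contact_intervals(force, threshold=THRESHOLD):
    """Return (start, end) index pairs for each period the sensor reads ground contact."""
    on = (force > threshold).astype(int)
    starts = np.flatnonzero(np.diff(on) == 1) + 1
    ends = np.flatnonzero(np.diff(on) == -1) + 1
    if on[0]:                          # trace begins mid-contact: drop the partial interval
        ends = ends[1:]
    n = min(len(starts), len(ends))
    return list(zip(starts[:n], ends[:n]))

heel = np.load("heel_fsr.npy")         # hypothetical left-heel FSR trace, one walking phase
toe = np.load("toe_fsr.npy")           # hypothetical left-toe FSR trace

heel_contacts = contact_intervals(heel)
toe_contacts = contact_intervals(toe)

heel_contact_times = [(e - s) / SAMPLE_RATE for s, e in heel_contacts]
# Stance: from each heel strike to the toe-off that follows it (assumes both sensors
# register the same number of steps).
stance_times = [(toe_off - heel_strike) / SAMPLE_RATE
                for (heel_strike, _), (_, toe_off) in zip(heel_contacts, toe_contacts)]
# Gait cycle: time between two successive heel strikes of the same foot.
cycle_times = np.diff([hs for hs, _ in heel_contacts]) / SAMPLE_RATE

print(np.mean(heel_contact_times), np.mean(stance_times), np.mean(cycle_times))
```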
Emotional Responses: Emotional valence, dominance, and
arousal felt by participants were quantified by using the 9-
item graphic scales of the self-assessment manikin ques-
tionnaire [6]. Arousal was also quantified based on physio-
logical changes recorded by the GSR biosensor [4].
Perceived Body Behavior: We quantified other aspects of
the experience by asking participants to rate their level of
agreement with some statements (7-point Likert-type re-
sponse items, from strongly disagree to strongly agree). We
checked whether participants felt as if they were the agents
of the sounds (agency [31]), as many studies have shown
that large discrepancies between modalities and delays be-
tween actions and sensory feedback disrupt agency and
diminish the sensory-induced bodily illusions [31,51]. The
questionnaire included statements assessing whether partic-
ipants had a vivid feeling (vividness [51]) or had unex-
pected feelings about their body (surprise). We also looked
at feet localization as sound may interact with propriocep-
tion [51], and used 7-point Likert-type response items to
assess the perceived walking speed (slow vs. quick), body
weight (light vs. heavy), body strength (weak vs. strong)
and body straightness (stooped/hunched vs. straight).
Experimental Procedure
A within-subjects design was preferred given the greater
variability of body size perception between subjects [22]
due to psychological condition, thoughts about ideal body
size, media exposure or body dissatisfaction [39]. As in
other studies assessing perceived body size [10,27,39] we
adopted various strategies to compensate for practice/habituation bias. First, condition randomization served to
discard differences due to practice. Second, while the ava-
tar’s parameters were set to match the participant’s gender
and height, the initial avatar’s weight varied across trials to
avoid anchor effects of the initial value [10,27,39]. This
was set to match the participant’s weight ± 25% (whether it
was + or - was counterbalanced across the two repetitions).
Participants changed the avatar’s weight by pressing two
keys. Finally, to distract participants from the manipulation of their own body weight, we mixed the avatar task with two other ‘fake’ tasks related to object weight: a ‘spanner task’
and a ‘lifting weights task’. For the ‘spanner task’ we
showed participants three metal spanners (13, 15, 17 mm)
and told them that in each trial they would have to carry one
of them and later provide assessments of its length and
weight. In fact, in order to keep weight constant we always
gave participants the 15 mm spanner, though this was not
made clear to them. For the ‘lifting weights task’ we used a
self-efficacy scale assessing the perceived ability to lift
various weights [1]. Participants were fully debriefed as to
the purpose of the study after the experiment ended.
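As an illustration of this counterbalancing scheme, the sketch below generates one participant’s block order and avatar starting weights; it is a simplified reconstruction, not the script used in the experiment.

```python
# An illustrative sketch of condition randomization and of the avatar's starting weight
# (participant's weight +/- 25%, counterbalanced across the two repetitions).
import random

CONDITIONS = ["Control", "High frequency", "Low frequency"]

def build_session(participant_weight_kg, seed=None):
    rng = random.Random(seed)
    offsets = [+0.25, -0.25]
    rng.shuffle(offsets)                    # which repetition starts heavier vs. lighter
    blocks = []
    for repetition, offset in enumerate(offsets, start=1):
        order = CONDITIONS[:]
        rng.shuffle(order)                  # randomized condition order per repetition
        for condition in order:
            blocks.append({
                "repetition": repetition,
                "condition": condition,
                "avatar_start_kg": round(participant_weight_kg * (1 + offset), 1),
            })
    return blocks

for block in build_session(59.3, seed=7):
    print(block)
```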
After participants had been equipped with all the sensors
and had been provided with task instructions, they per-
formed two initial practice blocks (one without and one
with the equipment on) in order to familiarize themselves
with the task. Next, they completed a set of three experi-
mental blocks differing in the sound feedback condition
(Low frequency, High frequency and Control) presented in
a randomized order. Then the full set of three conditions was repeated in another randomized order, in order to collect more data points. In each block, participants were given the
15 mm spanner and, while holding it in their hand, they
walked in place for 10 s (marching phase). After a go-ahead
signal they walked along the 8.5 m wooden-floor corridor
(walking phase). Note that the marching phase was intro-
duced to increase sound exposure but that the analyses of
gait patterns focused on the walking phase. Participants
were asked to walk at a self-paced, comfortable speed. At
the end of the corridor, participants placed the spanner on a
black cloth bag and adjusted the avatar on the body visuali-
zation tool so that it matched their own perceived body di-
mensions (see Figure 3 – left).
After this body visualization task, participants were asked
to complete the questionnaire that assessed, in this order,
(1) the perceived spanner length and weight; (2) self-
efficacy regarding lifting different weight objects; (3) body
feelings (speed, weight, strength, straightness, agency over
footstep sounds, vividness, surprise, feet localization); and
(4) emotional feelings (valence, arousal, dominance).
RESULTS
First of all, we confirmed that all participants attributed the sounds to themselves as a product of their walking. As shown in
Table 1, participants agreed that the sounds they heard were
produced by their own body (agency scale), and did not
find the experience more or less vivid or surprising than
normal (non-significant differences between conditions).
Given the confirmation of perceived agency, we proceeded
with the analysis of the other collected data.
For the gait analyses (FSR and accelerometer), for each trial
and for each extracted parameter (see Methods) we calcu-
lated the average of all steps in the walking phase. Then, for
all physiological data (FSR, accelerometer and GSR), indi-
vidual z-scores were calculated to reduce intra-subject vari-
ability [4]. We analyzed normally distributed data (normality tested with Shapiro-Wilk) with two-way repeated-measures analyses of variance (ANOVA), with the 3x2 within-subject factors sound condition and repetition. Significant effects were followed by paired-samples one-tailed t-tests, with the significance alpha level adjusted for multiple comparisons. If
the ANOVA did not show an effect of repetition, we aver-
aged the data across repetitions and used t-tests to compare
the three sound conditions3. In case of non-normal data, we
attempted normalization by LOG-transformation. If normal-
ization was not achieved (i.e., GSR and questionnaire data)
we analyzed the data with non-parametric Wilcoxon tests.
Results of these analyses are presented in the next sections.
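The analysis pipeline can be sketched as follows for a single measure. The long-format data layout, column names and the use of the pingouin package are assumptions made for the example rather than a description of the authors’ actual scripts; the data are assumed to contain one value per participant, condition and repetition.

```python
# A minimal re-analysis sketch under assumed data layout: a long-format DataFrame with
# columns participant, condition ('Control'/'High'/'Low'), repetition and value.
import pandas as pd
import pingouin as pg
from scipy import stats

df = pd.read_csv("measure.csv")                 # hypothetical export of one measure

# Within-participant z-scores to reduce intra-subject variability.
df["z"] = df.groupby("participant")["value"].transform(lambda v: (v - v.mean()) / v.std())

# Normality check (Shapiro-Wilk), then 3x2 repeated-measures ANOVA.
print(stats.shapiro(df["z"]))
print(pg.rm_anova(data=df, dv="z", within=["condition", "repetition"], subject="participant"))

# Pairwise one-tailed t-tests, Bonferroni-corrected for three comparisons (alpha = 0.017).
wide = df.groupby(["participant", "condition"])["z"].mean().unstack()
for a, b in [("High", "Control"), ("High", "Low"), ("Low", "Control")]:
    t, p = stats.ttest_rel(wide[a], wide[b], alternative="greater")
    print(f"{a} vs {b}: t={t:.2f}, one-tailed p={p:.3f}")

# Non-normal measures (e.g., GSR, questionnaire items) would instead use Wilcoxon tests:
# stats.wilcoxon(wide[a], wide[b])
```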
Perceived Body Weight (Body Visualization)
As shown in Figure 4, the sound condition affected per-
ceived body weight, as measured by the body visualization
tool. The ANOVA on the LOG-transformed data showed
significant differences between sound conditions
(F(2,42)=4.02, p=0.026, η2=0.16). T-tests comparing the
three sound conditions revealed that the High frequency
feedback was associated with a significantly lighter body
than the Control feedback (t(21)=2.73; p=0.006). The High
vs. Low frequency comparison indicated a trend towards
statistical significance (p=0.04), but the Control vs. Low
frequency comparison was far from significance (p>0.2).
These results indicate that the High frequency condition caused participants to feel as if they had a thinner body.
3 A p-value equal to 0.017 corresponds to a significance level of alpha=0.05 when Bonferroni correction for 3 levels is applied.
Figure 4. Mean (±SE) results for all three sound conditions. * marks significant differences between means.
Foot Pressure (FSR Sensor Data)
The pressure data for one participant and for one trial for
two participants were lost. For the remaining 21 partici-
pants, data showed an average of 5.34 (SD=0.64) steps in
each walking phase. Of all parameters, only the heel contact
time with the ground revealed a significant effect of sound
condition (F(2,36)=6.38, p=0.004, η2=0.26; Figure 4). T-
tests on the z-scored data revealed that the heel remained in
longer contact with the ground in the Low than in the High
frequency condition (t(20)=2.79, p=0.006). The Low fre-
quency vs. Control comparison indicated a trend towards
statistical significance (p=0.039). No other differences were
found. These results suggest an increase in body weight
perception in the Low frequency condition.
Foot Acceleration (Accelerometer Data)
Acceleration data for one participant and for one trial for
another participant were lost. For the remaining 21 partici-
pants, the net acceleration was calculated as the square root
of the sum of the squares of the three acceleration compo-
nents. Of all parameters, only the upward foot acceleration
revealed significant effects of sound condition
(F(2,38)=3.94, p=0.028, η2=0.17; Figure 4). T-tests on the
z-scored data revealed larger acceleration during the foot
upward movement in the High than in the Low frequency
condition (t(20)=2.62; p=0.008). The High frequency vs.
Control comparison indicated a trend towards statistical
significance (p=0.029), but no other differences were iden-
tified. These results suggest a decrease in body weight per-
ception in the High frequency condition.
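As a brief illustration, the net-acceleration computation described above can be written as follows; the array layout and the assumption that the third axis points upward are hypothetical.

```python
# A brief sketch of the net-acceleration computation: square root of the sum of the
# squared components. The (N, 3) layout and the upward z axis are assumptions.
import numpy as np

acc = np.load("ankle_acc.npy")                 # hypothetical (N, 3) array: x, y, z
net = np.sqrt((acc ** 2).sum(axis=1))          # net acceleration per sample

upward = acc[:, 2]
peak_upward_acc = upward.max()                 # proxy for the foot-lifting acceleration
print(net.mean(), peak_upward_acc)
```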
Table 1. Median (range) for questionnaire data (7-level Likert items except for the 9-level valence, arousal and dominance scales). * marks significant mean differences.

Scales               Control         High freq       Low freq
Emotional valence*   5.5 (2.5-7.5)   6 (5-8.5)       5.25 (3-8)
Arousal              4.5 (2-7.5)     5 (2.5-7.5)     4.5 (2-7.5)
Dominance            5 (2.5-8.5)     5.5 (4-8.5)     5 (2.5-8)
Speed*               4 (2-6.5)       5 (3-6.5)       4 (1.5-6)
Weight               4 (2-6.5)       4 (1.5-5.5)     4.25 (2-6.5)
Strength             4 (2.5-6)       4.5 (3.5-6)     4 (2.5-5.5)
Straightness         5.25 (2.5-7)    5.25 (2.5-7)    5 (2-7)
Agency               6 (1-7)         6.5 (1-7)       6.5 (2-7)
Vividness            3 (1-6)         2.75 (1-6.5)    3.5 (1-6)
Surprise             4 (1-6.5)       4.25 (1-7)      4 (1-7)
Feet localization*   5.25 (3-7)      6 (4-7)         5.5 (2-7)
Emotional Response (GSR, Self-assessment Manikin)
GSR data from three participants were lost due to technical
problems. For the remaining 19 participants, GSR change
scores were calculated for each condition by subtracting the minimum from the maximum response within that condition (from the beginning of the marching phase to the end of the walking phase). Wilcoxon paired comparisons on the
z-scored data did not reveal an effect of repetition; thus we
averaged the data across repetitions and used further Wil-
coxon tests to compare the three sound conditions3. As
shown in Figure 4, the sound condition affected the GSR
scores. Higher changes were elicited by the High than the
Low frequency (T=39, p=0.011) and the Control condition
(T=28, p=0.003), while the Low frequency vs. Control
comparison did not achieve statistical significance.
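A minimal sketch of this GSR change-score computation is given below; the CSV layout and condition labels are assumptions for the example.

```python
# A minimal sketch of the GSR change score: maximum minus minimum skin-conductance
# value per condition window, z-scored per participant, compared with Wilcoxon tests.
import pandas as pd
from scipy import stats

gsr = pd.read_csv("gsr_samples.csv")           # assumed columns: participant, condition, value

change = (gsr.groupby(["participant", "condition"])["value"]
             .agg(lambda v: v.max() - v.min())
             .reset_index(name="delta"))
change["z"] = change.groupby("participant")["delta"].transform(
    lambda v: (v - v.mean()) / v.std())

wide = change.pivot(index="participant", columns="condition", values="z")
print(stats.wilcoxon(wide["High"], wide["Control"]))
print(stats.wilcoxon(wide["High"], wide["Low"]))
```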
The questionnaire data provided further insight into partici-
pants’ emotional feelings. As shown in Table 1, they felt
more positive in the High than in the Low frequency condi-
tion (T=22, p=0.008). A trend towards statistical signifi-
cance was observed for the High frequency vs. Control
comparison (p=0.037). No significant main effects of sound
condition on arousal and dominance were found, but results
indicate a trend in the direction of greater feelings of arous-
al in the High frequency than in the Control condition
(p=0.029). There was also a trend towards greater feelings
of dominance in the High frequency than in the Low fre-
quency (p=0.022) and the Control conditions (p=0.041).
In brief, our results indicate that the High frequency condi-
tion caused participants to feel more aroused and positive.
Perceived Body Behavior
For the questionnaire items on perceived body behavior,
Wilcoxon paired comparisons did not reveal a repetition
effect; thus we used further Wilcoxon tests to compare the
three sound conditions (mean across repetitions). We report
the comparisons that reached significance (p<0.017) or that
showed a trend towards significance (p<0.05).
The sound condition affected perceived speed. As shown in
Table 1, participants felt quicker in the High than in the
Low frequency condition (T=23, p=0.005), and they tended
to feel quicker in the High frequency than in the Control
condition (p=0.022). Although no significant effects of sound condition were found for perceived body weight, strength and straightness, the statistical comparisons revealed trends towards a significantly lighter (p=0.024), stronger (p=0.037) and straighter (p=0.029) body in the High than in the Low frequency condition.
Attention to one’s body
Finally, it was found that participants were more confident
in localizing their feet in the High frequency than in the
Control condition (T=100, p=0.011); moreover they tended
to feel more confident in the High than in the Low frequen-
cy condition (p=0.020). These results and the more positive
emotion reported suggest that the High frequency condition
may have enhanced the proprioceptive feedback or possibly
even generated a more positive attention to one’s body.
DISCUSSION
We investigated the alteration of one’s own footstep sounds to
modulate the perception of one’s body and enhance self-
esteem, body feelings and the quality of walking. We found
that shifting the frequency of self-produced footstep sounds
alters the perceived body weight, emotional state, perceived
physical abilities and gait. This modulation process could
be used to improve the efficacy of technology for wellbeing
and for motivating physical activity. The next sections dis-
cuss the specific effects and their implications and possible
applications, and perspectives for further work.
Body dimensions
We had hypothesized that shifting the spectral mode of self-
produced footstep sounds to lower frequencies may result in
a perceived heavier body, while shifting it to higher fre-
quencies may result in a perceived lighter body. The second
part of this hypothesis was confirmed by our results in the
body visualization task as, after the High frequency feed-
back, participants set the dimensions of the avatar that rep-
resented their body to correspond to a lighter body. We also
observed that participants tended to feel lighter in the High
than in the Low frequency condition (questionnaire data).
Our approach exploits the multisensory nature of body per-
ception and builds on previous neuroscience research show-
ing that how people perceive their body can be changed by
altering body-related multisensory cues [3,13,38,51,55].
Our findings constitute a novel contribution to this research
in which the majority of works have focused on visual, tac-
tile and proprioceptive cues. Although walkers receive in-
formation about their body from vision, sound, touch (via
the tactile sensory receptors in the skin of the feet) and pro-
prioception [57], we show that altering sound cues alone
can change perceived body weight and other related body
percepts. In this way we contribute to the growing interest
in HCI on the possibilities offered by multisensory integra-
tion mechanisms for enhancing the user experience.
Moreover, our findings advance current research looking
specifically at walking sounds as a source of information
about events. While previous research has explored how
material identity in VR can be conveyed even when some
modalities are absent [14], we focused on walking sounds
as a way of altering body perception. A few studies have
looked at walking sounds as information on the appearance
of an unknown walker’s body during passive listening
[17,29,37]. However, to the best of our knowledge, this is
the first study looking at walking sounds as a source of real-
time information as to the appearance of one’s own body.
Emotional State and Physical Abilities
We further hypothesized that changes in perceived body
weight would come together with emotional changes, given
the tight links between body perception and self-esteem
[9,41]. We showed that listening to walking sounds with
higher frequencies resulted in people reporting feeling more
positive. People were also more physiologically aroused
during this sound feedback condition, as evidenced by the
GSR recordings. In addition, we observed trends towards an
increase in participants’ perceived abilities, as they reported
feeling more in control (dominance ratings) when listening
to the high frequency version of their footsteps.
There were further changes in participants’ perception of
their physical capabilities as they felt quicker and more
confident in localizing their feet (i.e., better proprioception)
after this condition. They also tended to feel stronger. The
enhancement in perceived strength and proprioception
caused by changes in self-produced footstep sounds relates
to previous findings showing similar effects of self-
produced tapping sounds [14]. Finally, we observed that
participants tended to feel as if they were walking with a
straighter posture when listening to higher than lower fre-
quency versions of their footsteps. This adds to previous
findings on the links between the perceived posture of the
heard walker and the spectral frequency of walking sounds
[37] and between body posture and emotional state [8].
The possibility of operating at the level of the perception of
one’s own body using sound feedback can apply to the de-
sign of systems aimed at enhancing wellbeing. In fact, feeling
more positive and energized, quicker and in better control
of one’s body relates to enhanced self-esteem and a better
predisposition for physical activity. These systems could
benefit the increasing number of people who are concerned
with the appearance of their bodies [32] and about what
their body can do. There are a number of clinical cases of
body- and action-distortions leading to body impairments
[9] and to deficits in emotional states, such as some cases of
chronic pain [49] or anorexia nervosa [41]. Apart from
these cases, many young people in their teen years, older
adults, and a significant proportion of the remaining popu-
lation often have these concerns to the detriment of their
emotional state and motor performance. Altered feedback
on walking sounds might help to recalibrate distorted feel-
ings of one’s own body weight and capabilities, and to feel
more positive and in control of one’s body. Our findings
can inform the design of more effective therapies using VR
and serious gaming technologies to enhance wellbeing.
Sound feedback in these applications can optimize the user’s embodiment in a virtual character that may have different
anthropomorphic characteristics than the user [44,57].
Gait
We further posited a link between changes in perceived
body weight and gait. Based on prior research we hypothe-
sized that a condition eliciting a lighter perceived body
would result in a larger acceleration of lower limbs [25,54],
while a condition eliciting a heavier perceived body would
result in longer duration of the heel strike [25] and a larger
exerted force by the feet against the ground [15].
Our results provide support for these hypotheses. First, a
higher frequency mode resulted in larger acceleration of the
upward movement of the lower limbs during walking,
which is consistent with previously observed gait patterns
of lighter walkers [25,54]. This result is interesting from the
point of view of increase in physical activity. Second, a
lower frequency mode resulted in an increase in the time
participants kept their heel in contact with the ground, as
measured by the force sensitive resistors, which is con-
sistent with walking with “heavier” steps [25]. It should be
noted that, although most of the self-reported changes seem
to derive from shifting the spectral mode to higher but not
to lower frequencies, we were able to observe behavioral
changes for the low frequency mode. Thus, these changes
seem to occur without users being aware of them [51].
Our study provides more insight into the updating of ‘for-
ward internal models’ of motor-to-sensory transformations
[60], as sound has rarely been investigated as feedback to
body models. We suggest that the observed gait changes
may result from an attempt to reduce the sensory discrepan-
cies that our feedback introduces. Gait changes may have contributed to maintaining the sound-induced bodily illusion.
In fact, all observed changes in body perception, emotion
and gait may reinforce each other during the process [43].
Sound feedback is currently used in many HCI applications
aimed at increasing exercise adherence, facilitating move-
ment and enhancing positive emotions, including applica-
tions used in sport and motor rehabilitation contexts
[11,19,42,45,46,58]. Here we show that operating at the
level of the perception of one’s body by using sound feed-
back can also enhance physical performance (reflected in
the acceleration of lower limbs), self-esteem and positive
attention to one’s body. While we show the short-term ef-
fect of brief exposure to manipulated walking sounds, it
remains to be tested whether the effects may differ after
longer exposure due to habituation [40], or generalize to
other environments with different lighting, environmental
noise or ground/footgear materials [29,53], as well as the
longer term effects. Moreover, while our subject sample
does not allow testing gender differences, it is possible they
exist given the media pressure on women regarding body size [52].
Further, as alterations in the sound can lead to a different perception of shoe material and style (e.g., high heels vs. heavier shoes), the subject’s gender may have inhibitory or en-
hancing effects on the experience. Research has indeed
shown that different sounds are preferred for walkers of
different genders [57]. We observed that when removing
the four males from the current sample, the data show even
stronger effects in perceived body weight (sound condition:
p=0.021), heel contact time (sound condition: p=0.001),
perceived speed (High vs. Low: p=0.003) and emotional
arousal (High vs. Control: p=0.016), thus suggesting gender
differences and the need for further investigation.
Implications Beyond Walking
In this study, we focused on walking but future research
should explore a similar approach of using sound feedback
to enhance other types of physical activity. It should be
noted that we used shifts in frequency mode due to their
relation to body weight [17,29] but that different sound
manipulations might be needed to induce similar effects
during physical activities other than walking. Our study
highlights the importance of a careful selection of sounds in
order to optimize the desired effects. Prior research has
shown the importance of keeping sensorimotor discrepan-
cies under a certain threshold to maintain the feeling of
agency and enhance the multisensory-induced effects
[31,51]. Furthermore, we suggest that the relative contribu-
tions of shifts in frequency spectra and in loudness to our
observed effects need to be considered. We were surprised
to see that the perceived body weight, as quantified in the
body visualization task, was slightly (but not significantly)
higher in the Control than in the Low frequency condition.
While the self-reported weight in the questionnaire was not
higher for the Control condition, it is possible that the muf-
fling of the high frequencies in the Low frequency condi-
tion was not completely achieved or that our results might
depend on an interaction between spectral mode and loud-
ness, rather than spectral mode alone. We opted for ma-
nipulating frequency spectra based on prior research find-
ings [29], but this resulted in the three sound conditions
differing also in loudness [23]. Past studies have demon-
strated that overall sound amplification of body-related
sounds (e.g., rubbing hands) may lead to changes in per-
ceived sound source characteristics [20,24]. Future research
may also test the optimum magnitude of frequency shifts.
Finally, while we presented results from a lab study, our
prototype could be further developed to allow running stud-
ies in the open. A more refined and compact version of our system could comprise wireless headphones and microphones connected to a smartphone, in which sound equalization takes place and in which the integrated accelerometer/gyroscope is used to record the body’s activity.
This system could be used for walking, running or physical
rehabilitation applications that could increase the users’
performance, creating a more pleasurable experience of
physical activity and providing motivation for it [49].
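As a rough illustration of such a system, the sketch below implements a real-time equalization loop of the kind a smartphone version would need, here using the Python sounddevice library on a computer; it is an assumption-laden prototype outline rather than a description of any existing implementation, with gains following the High frequency condition described earlier.

```python
# A rough sketch (not a product implementation) of a real-time equalization loop:
# microphone input is band-filtered and gain-adjusted block by block, then played
# back over headphones.
import numpy as np
import sounddevice as sd
from scipy.signal import butter, sosfilt, sosfilt_zi

RATE = 44100
low_sos = butter(4, [83, 250], btype="bandpass", fs=RATE, output="sos")
high_sos = butter(4, [1000, 4000], btype="bandpass", fs=RATE, output="sos")
low_zi = sosfilt_zi(low_sos)
high_zi = sosfilt_zi(high_sos)
LOW_GAIN, HIGH_GAIN = 10 ** (-12 / 20), 10 ** (12 / 20)   # -12 dB / +12 dB

def callback(indata, outdata, frames, time, status):
    global low_zi, high_zi
    mono = indata[:, 0]
    low, low_zi = sosfilt(low_sos, mono, zi=low_zi)
    high, high_zi = sosfilt(high_sos, mono, zi=high_zi)
    # Reinsert the two bands with their condition-specific gains.
    outdata[:, 0] = mono + (LOW_GAIN - 1) * low + (HIGH_GAIN - 1) * high

with sd.Stream(samplerate=RATE, channels=1, callback=callback):
    sd.sleep(10_000)   # run the feedback loop for ten seconds
```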
CONCLUSIONS
Our results broaden the understanding of how auditory sen-
sory feedback can be used to design technology that chang-
es the perceived physical appearance and physical capabili-
ties of one’s body. We make a theoretical contribution to
HCI, by bridging psychological research on multisensory
stimuli updating one’s body model and HCI research on the
design of sensory-augmentation technologies supporting
wellbeing. This study therefore forms the basis of a larger
effort to design sound and motion technology that triggers
changes in how we perceive and use our bodies. Technolo-
gies integrating our proposed feedback may become an
extension of the users themselves, providing a different
experience of their bodies and impacting their self-esteem
and motivation to perform physical activity.
ACKNOWLEDGMENT
Supported by an ESRC ES/K001477/1 grant ‘The Hearing Body’ and partially by the AHRC AH/L007053/1 ‘Rethinking the Senses’ and EPSRC EP/H016988/1 ‘Emo&Pain’ grants.
We also thank Dr Torsten Marquardt for his advice.
REFERENCES
1. Bandura, A. Guide for constructing self-efficacy scales.
Self-efficacy beliefs of adolescents 5 (2006), 307-337.
2. Bianchi-Berthouze, N. Understanding the role of body
movement in player engagement. Human Computer In-
teraction 28(2013), 42-75.
3. Botvinick, M., and Cohen, J. Rubber hands 'feel' touch
that eyes see. Nature 391 (1998), 756.
4. Boucsein, W. Electrodermal activity. New York: Plenum Press, 1992.
5. Bradley M.M., and Lang, P.J. Affective reactions to
acoustic stimuli. Psychophysiology 37(2000), 204-215.
6. Bradley M.M., and Lang, P.J. Measuring emotion: the
self-assessment manikin and the semantic differen-
tial. Journal of behavior therapy and experimental psy-
chiatry 25, 1(1994), 49-59.
7. Bresin, R., de Witt, A., Papetti, S., Civolani, M., and
Fontana, F. Expressive sonification of footstep sounds.
Proc. ISON (2010), 51-54.
8. Carney, D.R., Cuddy, A.J., and Yap, A.J. Power posing: brief nonverbal displays affect neuroendocrine levels
and risk tolerance. Psychological Science 21(2010),
1363-1368.
9. Carruthers, G. Types of body representation and the
sense of embodiment. Consciousness and Cognition
17(2008), 1302-1316
10. Cazzato, V., Mian, E., Serino, A., Mele, S., and Urgesi,
C. Distinct contributions of extrastriate body area and
temporoparietal junction in perceiving one’s own and
others’ body. Cognitive, Affective, & Behavioral Neu-
roscience (2014), 1-18.
11. Cesarini, D., Hermann, T., and Ungerechts, B. A real-
time auditory biofeedback system for sports swimming.
Proc. ICAD (2014).
12. Cunado, D., Nixon, M. S., and Carter, J.N. Automatic
extraction and description of human gait models for
recognition purposes. Computer Vision and Image Un-
derstanding 90, 1(2003), 1-41.
13. de Vignemont, F., Ehrsson, H.H., and Haggard, P. Bodi-
ly illusions modulate tactile perception. Current Biology
15, 14(2005), 1286-1290.
14. Furfaro, E., Bianchi-Berthouze, N., Bevilacqua, F., and
Tajadura-Jiménez, A. Sonification of surface tapping:
Influences on behaviour, emotion and surface percep-
tion. Proc. ISON (2013), 21-18.
15. Galbraith F., and Barton M. Ground loading from foot-
steps. JASA 48, 5B (1970).
16. Gallagher, S. How the body shapes the mind. Clarendon
Press, Oxford; New York, 2005.
17. Giordano, B.L., and Bresin, R. Walking and playing:
what’s the origin of emotional expressiveness in music?
Proc. ICMPC9 (2006), 149.
18. Giordano, B.L., Visell, Y., Yao, H.Y., Hayward, V.,
Cooperstock, J.R., and McAdams, S. Identification of
walked-upon materials in auditory, kinesthetic, haptic,
and audio-haptic conditions. JASA 131(2012), 4002-12.
19. Großhauser, T., Bläsing, B., Spieth, C., Hermann, T.
Wearable sensor-based real-time sonification of motion
and foot pressure in dance teaching and training. J. of
the Audio Engineering Society 60, 7/8(2012),580–589.
20. Guest S., Catmur C., Lloyd D., and Spence C. Audiotac-
tile interactions in roughness perception. Experimental
Brain Research 146 (2002), 161-171.
21. Harrison, D., Marshall, P., Bianchi-Berthouze, N., and
Bird, J. Tracking physical activity: Problems related to
running longitudinal studies with commercial devices.
Proc. Ubicomp & ISWC, ACM Press (2014), 699-702.
22. Hennighausen, K., Enkelmann, D., Wewetzer, C., and
Remschmidt, H. Body image distortion in Anorexia
Nervosa--is there really a perceptual deficit? European
Child & Adolescent Psychiatry 8 (1999), 200-206.
23. ISO 226:2003. Acoustics -- Normal equal-loudness-
level contours.
24. Jousmäki, V., and Hari, R. Parchment-skin illusion:
sound-biased touch. Current Biology 8(1998), R190-
191.
25. Ko, S.-u., Stenholm, S., and Ferrucci, L. Characteristic
gait patterns in older adults with obesity—Results from
the Baltimore Longitudinal Study of Aging. Journal of
Biomechanics, 43(2010), 1104-1110.
26. Kurihara, Y., Hachisu, T., Kuchenbecker, K.J., Kajimo-
to, H. Jointonation: robotization of the human body
by vibrotactile feedback. Proc. SIGGRAPH, ACM Press
(2013).
27. Legenbauer, T., Vocks, S., Betz, S., Baguena
Puigcerver, M.J., Benecke, A., Troje, N.F., and Ruddel,
H. Differences in the nature of body image disturbances
between female obese individuals with versus without a
comorbid binge eating disorder: an exploratory study
including static and dynamic aspects of body image.
Behavior Modification,35, 2 (2011), 162-186.
28. Leman, M., Moelants, D., Varewyck, M., Styns, F., van
Noorden, L., and Martens, J.P. Activating and relaxing
music entrains the speed of beat synchronized walk-
ing. PloS one 8, 7(2013), e67932.
29. Li, X.F., Logan, R.J., and Pastore, R.E. Perception of
acoustic source characteristics: walking sounds. JASA
90, 6(1991), 3036-3049.
30. Magill, R.A., and Anderson, D.I. The roles and uses of
augmented feedback in motor skill acquisition. Skill Ac-
quisition in Sport: Research, Theory and Practice. N.
Hodges, A.M. Williams, eds. Routledge, 2012.
31. Menzer, F., Brooks, A., Halje, P., Faller, C., Vetterli,
M., and Blanke, O. Feeling in control of your footsteps:
Conscious gait monitoring and the auditory conse-
quences of footsteps. Cognitive Neuroscience 1(2010),
184-192.
32. NHS, Cosmetic surgery.
www.nhs.uk/conditions/Cosmetic-surgery
33. NHS, Health and fitness. www.nhs.uk/livewell/fitness
34. NHS, Walking for Health. www.nhs.uk/Livewell
35. Nordahl, R. Increasing the motion of users in photo-
realistic virtual environments by utilising auditory ren-
dering of the environment and ego-motion. Proc. Pres-
ence (2006), 57-63.
36. OECD. Health at a Glance 2013: OECD Indicators.
OECD Publishing, 2013.
37. Pastore, R.E., Flint, J.D., Gaston, J.R., and Solomon,
M.J. Auditory event perception: the source—perception
loop for posture in human gait. Perception & psycho-
physics 70(2008), 13-29.
38. Petkova, V.I. and Ehrsson, H.H. If I were you: Percep-
tual illusion of body swapping. PLoS ONE 3, 12(2008).
39. Piryankova, I.V., Stefanucci, J.K., Romero, J., Rosa, S.
D.L., Black, M.J., and Mohler, B.J. Can I recognize my
body's weight? The influence of shape and texture on
the perception of self. ACM Trans. Appl. Percept.
11(2014), 1-18.
40. Polich, J. Habituation of P300 from auditory stimuli.
Psychobiology 17, 1 (1989), 19-28.
41. Pollatos, O., Kurz, A.-L., Albrecht, J., Schreder, T., et
al. Reduced perception of bodily signals in anorexia
nervosa. Eating Behaviours 9, 4(2008), 381-388.
42. Rosati, G., Rodà, A., Avanzini, F., and Masiero, S. On
the role of auditory feedback in robotic-assisted move-
ment training after stroke. Computational Intelligence
and Neuroscience, (2013), ID586138.
43. Sakurai, S., Katsumura, T., Narumi, T., Tanikawa, T.,
Hirose, M. Interactonia balloon. SIGGRAPH (2012).
44. Sanchez-Vives, M.V., and Slater, M. From presence to
consciousness through virtual reality. Nature Reviews
Neuroscience 6, 4(2005), 332-339.
45. Savva, N., Scarinzi, A., and Bianchi-Berthouze, N. Con-
tinuous recognition of player's affective body expression
as dynamic quality of aesthetic experience. IEEE Transactions on Computational Intelligence and AI in Games 4 (2012), 199-212.
46. Schaffert, N., Mattes, K., and Effenberg, A.O. Listen to
the boat motion: acoustic information for elite rowers.
Proc. ISON (2010), 31-38.
47. Senna, I., Maravita, A., Bolognini, N., and Parise, C.V.
The Marble-Hand Illusion. PloS one 9, 3 (2014).
48. Sigrist, R., Rauter, G., Riener, R., Wolf, P. Augmented
visual, auditory, haptic, and multimodal feedback in
motor learning: A review. Psychonomic Bulletin & Re-
view 20, 1 (2013), 21–53.
49. Singh, A., Klapper, A., Jia, J., Fidalgo, A., Tajadura-
Jimenez, A., Kanakam, N., Bianchi-Berthouze, N., and
CdeC Williams, A. Motivating people with chronic pain
to do physical activity: opportunities for technology de-
sign. Proc. CHI’14, ACM Press (2014), 2803-2812.
50. Styns, F., van Noorden, L., Moelants, D., and Leman,
M. Walking on music. Human movement science 26,
5(2007), 769-785.
51. Tajadura-Jiménez, A., Väljamäe, A., Toshima, I., Ki-
mura, T., Tsakiris, M., and Kitagawa, N. Action sounds
recalibrate perceived tactile distance. Current Biology
22, 13(2012), R516-R517.
52. Tiggemann, M. Gender differences in the interrelation-
ships between weight dissatisfaction, restraint, and self-
esteem. Sex Roles 30 (1994), 319-330.
53. Tonetto, P.L.M., Klanovicz, C.P., Spence, C. Modifying
action sounds influences people’s emotional responses
and bodily sensations. i-Perception 5 (2014), 153-163.
54. Troje, N.F. Retrieving information from human move-
ment patterns. In Understanding Events: How Humans
See, Represent, and Act on Events (pp. 308-334). Ox-
ford University Press, 2008.
55. Tsakiris, M. My body in the brain: a neurocognitive
model of body ownership. Neuropsychologia 48,
3(2010), 703-712.
56. Vaughan, C.L., Davis, B.L., and O'Connor, J.C. Dynam-
ics of human gait. Champaign, Illinois: Human Kinetics
Publishers, 1992.
57. Visell, Y., Fontana, F., Giordano, B. L., Nordahl, R.,
Serafin, S., and Bresin, R. Sound design and perception
in walking interactions. International Journal of Hu-
man-Computer Studies 67, 11(2009), 947-959.
58. Vogt, K., Pirro, D., Kobenz, I., Holdrich, R., and Eckel,
G. PhysioSonic: evaluated movement sonification as
auditory feedback in physiotherapy. Proc.
CMMR/ICAD, (2009), 103-120.
59. Watanabe, J., and Ando, H. Pace-sync shoes: intui-
tive walking-pace guidance based on cyclic vibro-tactile
stimulation for the foot. Virtual Reality 14(2010), 213-9.
60. Wolpert, D.M., and Ghahramani, Z. Computational
principles of movement neuroscience. Nature Neurosci-
ence 3(2000), 1212-1217.
61. World Health Organization. Global health risks: mortal-
ity and burden of disease attributable to selected major
risks. Geneva, 2009.
62. Zanotto, D., Rosati, G., Spagnol, S., Stegall, P., and
Agrawal, S.K. Effects of complementary auditory feed-
back in robot-assisted lower extremity motor adapta-
tion. IEEE Transactions on Neural Systems and Reha-
bilitation Engineering, 21, 5(2013), 775-786.