
TactJam: An End-to-End Prototyping Suite for Collaborative Design of On-Body Vibrotactile Feedback

Dennis Wittchen
dennis.wittchen@htw-dresden.de
University of Applied Sciences
Dresden
Dresden, Saxony, Germany
Katta Spiel
katta.spiel@tuwien.ac.at
TU Wien
Vienna, Austria
Bruno Fruchard
bruno.fruchard@inria.fr
Univ. Lille, Inria, CNRS, Centrale Lille,
UMR 9189 CRIStAL
Lille, France
Donald Degraen
donald.degraen@dfki.de
Saarland University & German Research Center for Artificial Intelligence (DFKI)
Saarland Informatics Campus
Saarbrücken, Germany
Oliver Schneider
oliver.schneider@uwaterloo.ca
University of Waterloo
Waterloo, Ontario, Canada
Georg Freitag
georg.freitag@htw-dresden.de
University of Applied Sciences
Dresden
Dresden, Saxony, Germany
Paul Strohmeier
paul.strohmeier@mpi-inf.mpg.de
Sensorimotor Interaction,
Max Planck Institute for Informatics
& HCI Lab - Saarland University
Saarland Informatics Campus
Saarbrücken, Germany
ABSTRACT
We present TactJam, an end-to-end suite for creating and sharing low-fidelity prototypes of on-body vibrotactile feedback. With TactJam, designers can create, record, and share vibrotactile patterns online. This opens up new ways of collaboratively designing vibrotactile patterns in collocated as well as remote settings. We evaluate TactJam in a two-part distributed online workshop exploring the design of on-body tactons. Participants were able to successfully use TactJam to learn about tacton design. We present an overview of mappings between tactons and their associated concepts before comparing tactons created using solely a GUI with tactons created through experimenting with placements directly on the body. Conducting both parts of the workshop separately highlighted the importance of designing directly with bodies: fewer implicit assumptions were made, and designs were guided by personal experience. We reflect on these results and close with deliberations on the future development of TactJam.
This work is licensed under a Creative Commons Attribution-NonCommercial
International 4.0 License.
TEI ’22, February 13–16, 2022, Daejeon, Republic of Korea
©2022 Copyright held by the owner/author(s).
ACM ISBN 978-1-4503-9147-4/22/02.
https://doi.org/10.1145/3490149.3501307
CCS CONCEPTS
• Human-centered computing → Haptic devices; Empirical studies in HCI; Interface design prototyping; User interface toolkits.
KEYWORDS
vibrotactile feedback, tactile feedback, tactile prototyping, tactons, collaborative sketching, on-body design, embodied design, embodied interaction
ACM Reference Format:
Dennis Wittchen, Katta Spiel, Bruno Fruchard, Donald Degraen, Oliver
Schneider, Georg Freitag, and Paul Strohmeier. 2022. TactJam: An End-to-
End Prototyping Suite for Collaborative Design of On-Body Vibrotactile
Feedback. In Sixteenth International Conference on Tangible, Embedded, and
Embodied Interaction (TEI ’22), February 13–16, 2022, Daejeon, Republic of
Korea. ACM, New York, NY, USA, 13 pages. https://doi.org/10.1145/3490149.3501307
Figure 1: An overview of the TactJam suite.
1 INTRODUCTION
Tactile feedback and communication continues to receive less attention than vision or audio in interactive technology. Its importance, however, is known; touch is among the first senses we learn to rely on [1], and serves purposes ranging from simple reflexes to the communication of complex emotions [18, 19, 53]. There have been almost 20 years of investigation into guidelines and design tools to support the creation of tactile feedback [7, 17, 40]. Yet, we still see this modality used far less frequently.
We believe there are two reasons why tactile feedback is still often added as an afterthought rather than explicitly included in design. First, when people are concurrently confronted with visual or acoustic as well as tactile stimuli, they tend to prioritize the visual or acoustic element. This is reflected in visual and audio media pipelines, which are well established and highly streamlined, while tactile processes are only recently being identified [33, 41, 46]. Second, design processes for all modalities rely heavily on collaboration, which remains one of the most significant challenges for both expert and novice haptic design [41, 46]. In practice and in research, it is comparatively easy to communicate visual or acoustic information through pictures and recordings; however, the same is not true for tactile feedback.
In this paper, we address the collaboration problem as a step towards improving our ability to design tactile information throughout the design process. We introduce TactJam (Figure 1), an open-source suite of software and hardware tools for designing, storing, sharing, and testing on-body vibrotactile feedback patterns. TactJam supports placing up to eight actuators on a body, as shown on the left in Figure 1. Designers can then jam by controlling each actuator in real time using button presses on dedicated control hardware. These patterns can be recorded on the device and then uploaded to a cloud-based sharing application. The device is further able to download previously shared designs, enabling the playback of tactile patterns designed by others.
This system complements industrial design tools by demonstrating how to support a collaborative haptic design process. Haptic design tools are becoming more common, with Apple's API (limited to single actuators) [24], the Lofelt [29] and Interhaptics [25] editors and middleware that enable sharing designs, and the Syntacts [38] vibrotactile rendering framework as prominent examples. With TactJam, we expand upon these by offering end-to-end support for collocated and remote collaborative work throughout the haptic design pipeline, from the inception of a design to recording and sharing.
TactJam enables prototyping, recording, and sharing tactile icons within the same environment. It supports in-situ exploration of haptic experiences and allows design outcomes to feed back into design processes. This particularly suits haptic design, as we often lack a language which fully captures the richness of tactile experience [35]. It also supports the design of on-body haptics, as designers are faced with a plethora of bodily shapes and sizes, which interact with how a given tactile design is experienced. Fast sharing mechanisms enable designers to rapidly explore a range of designs.
We put TactJam to a practical test through a two-part remote workshop on vibrotactile feedback design. Workshop results highlighted that TactJam supported remote collaborative haptic design. The workshops formed a type of natural experiment comparing the design of vibrotactile patterns using a GUI with prototyping directly on bodies. This highlighted how using a GUI, compared to designing directly on bodies, leads to very different design results, but also to very different ways of talking about and reflecting on these designs.
In summary, we make the following contributions: We present TactJam, an open-source suite for collaboratively designing on-body vibrotactile feedback, and present a qualitative evaluation comprising a two-part workshop. We further share anecdotal results of tacton designs from the workshops, as well as implications of how designing on bodies or using a GUI affects design choices.
2 RELATED WORK
2.1 Vibrotactile information display
Tactile perception is largely mediated through vibration [5]. Consequently, vibrotactile actuators are a powerful tool for simulating physical touch sensations such as textures [36], forces [3], or compliance [50]. Such work, however, is predated by the use of vibrotactile feedback as information display. In 1957, Frank Geldard developed Vibratese, a vibrational representation of an alphabet across five actuators on the body [16]. In 1959, the movie The Tingler provided vibration feedback in theater seats [23]. This structured information display employed multiple vibrotactile actuators. The goal of such displays is to convey information as clearly as possible, rather than provide physical realism. Correspondingly, the design of vibrotactile interactions usually involves a manual, iterative process, which TactJam was designed to support.
The design of structured vibrotactile information displays was outlined by Brewster and Brown in their investigation of tactons, or tactile icons [7], a type of haptic icon [14]. Follow-up inquiry looked at the effectiveness of information transfer [8], usability considerations such as how large a set could be learned [52], and connections to emotions [57], user experience [30], and player experience in games [47]. This work typically identifies key design parameters like frequency, amplitude, waveform, and careful timing (e.g., duration, number of pulses, rhythm).
Tactile icons have been used in scenarios like collaborative turn-taking [10] and in mobile/wearable situations, like a spatial vibrotactile belt to help wayfinding [39]. Other examples include emotion-inducing cues using spatio-temporal vibrotactile patterns [22], and approaches combining vibrations with thermal cues [54]. The workshop used to evaluate TactJam focused on the design of such tactile icons.
2.2 Haptic design tools
As haptics has become more common within HCI research, we see a pattern of design tools being developed for each new device. Early work in artistic use of vibrotactile feedback, here for tactile-music composition, recognized the need for compositional tools [17]. The Hapticon editor [14] was an early example of an editor for a 1-degree-of-freedom force feedback knob. Since then, a plethora of tools have been developed for various haptic modalities, including variable friction displays [32] and wearable pneumatic pressure displays [13].
Of these design tools, tools for the design of vibrotactile feedback are the most common, first with research prototypes and now with industry tools. The maturity of vibrotactile actuation technology makes it an easy target. In research, these show increasing complexity, from graphical editors for single actuators [40] and multiple on-body actuators [26, 37], to the use of tactile illusions to abstract into a single “animation object” [42], and finally to tools that combine vibrotactile feedback with audio and video feedback in game design [51] and VR [11, 12]. In industry, companies have long used in-house tools (e.g., D-Box's and Immersion's editors for motion planning and vibrotactile feedback), but more recently some companies have emerged as software-focused to meet this growing need for haptic design (e.g., Lofelt, Interhaptics).
Custom DIY kits have also emerged, such as Stereo Haptics [58] for accessible 2-actuator control, and Syntacts [38] for more general wearable vibrotactile feedback. This mirrors similar DIY kits for other haptic modalities, like the Hapkit and Haply for force feedback display [15, 31]. TactJam is similarly designed to be easily replicable by other makers and tinkerers; its main contribution over existing kits is its intended use in remote collaboration settings.
2.3 Collaboration in haptic design
Researchers have not just built design tools, but also applied design processes to haptics [34]. For example, haptic sketching [33] has become a standard tool for people designing haptics or other physical interactive systems. Now, haptic experience design (HaXD) is an emerging design practice with similarities to other fields of design, but unique challenges [41], especially for novices [46]. One of those central challenges is collaboration.
A series of tools and techniques have helped to contribute to collaboration in haptics, but the field is still young. Mirroring two actuators' output on a “haptic instrument” can enable two hapticians to rapidly try out and discuss ideas in a collocated setting [43]. Using open-source examples of vibrotactile effects in an online tool can help people start from scratch and learn the idioms of haptic design [44] in an asynchronous manner. “HapTurk” showed that crowdsourced studies of vibrotactile icons are similar to those run in person, lending them credibility, and that existing commodity hardware can be used as a proxy for feedback with rarer, higher-fidelity actuators [45].

However, these systems typically focus on a subset of the collaborative process or on a specific way of collaborating. TactJam can be used for both collocated and distributed collaboration; it supports both synchronous and asynchronous sharing, as well as private or shared control.
3 TACTJAM
3.1 Design Considerations
TactJam provides a tool chain for a fast and iterative design process of on-body vibrotactile patterns, and supports collocated and remote collaborative work. In this section, we outline this process, the requirements for tools needed to implement it, and the opportunities for cooperation that this opens up.
3.1.1 Design Process. While sketching, designers typically create a large number of designs with simple tools and later select a subset to refine [9]. With TactJam, we focus on this first rapid, iterative sketching process, but with vibration. Much like scribbling with a pen on paper, haptic designers should be able to quickly develop vibrotactile patterns directly on their body. Iterative sketching helps identify strong ideas through multiple processes. For one, capturing an idea concretely helps designers think about their sketch and identify strengths and weaknesses. Beyond that, an important part of this iterative sketching process is the ability to share and discuss sketches with others to solicit feedback and outside opinions. TactJam is also explicitly designed to support these processes by providing the ability to record, display, and share vibrotactile patterns.
In summary, with TactJam we intend to support three main activities:
(1) Sketching, which is done by attaching actuators to the body and jamming with them, playing vibrotactile patterns live to experiment with experiences.
(2) Recording vibrotactile patterns, and providing vibrotactile playback to support reflecting on a captured idea.
(3) Documenting the vibrotactile patterns and sharing them with others, for collecting feedback and for remote collaboration.
Throughout all steps, designers should be able to experience the
tactile feedback they produced.
3.1.2 Tools. To support the proposed activities we use two separate tools. The TactJam hardware allows for sketching and reflecting. The TactJam GUI supports documenting and collaborating remotely. Both parts of TactJam communicate with each other, while being fully functional as standalone systems.
Figure 2: (a) Initial prototype of TactJam used for teaching, (b) breadboarded version of TactJam designed for the TEI workshop, (c) first PCB version of the workshop device, and (d) re-designed workshop device with improved ergonomics for bimanual hand-held use.
The TactJam hardware is portable, so users can design vibrotactile patterns without requiring access to a PC. The device can be held and used like a game controller in mobile situations. It also sits comfortably on a desk and can be used like a keyboard. Eight actuators can be individually connected to the TactJam hardware; their long cables allow attaching them anywhere on the body.
The TactJam hardware has three modes, which mirror the main activities outlined above (a brief firmware sketch follows the list):
(1) Jamming, where each actuator has a corresponding button that turns the vibration on or off. The global amplitude of vibration can be modulated with a rotary controller.
(2) Record/Play, where users can record button presses and amplitude modulations. These can then be played back.
(3) Data Transfer, where the physical device connects to the GUI. Recorded vibration designs can then be uploaded to a server, or downloaded from it to the device.
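To make the mode handling concrete, the following is a minimal sketch of how firmware might dispatch on the three-position mode switch. The pin number, the analog decoding of the rotary switch, and the handler bodies are hypothetical illustrations; the actual firmware is available in the TactJam repository.

```cpp
#include <Arduino.h>

// Mode names follow the paper; everything else here is an assumption.
enum class Mode { Jam, RecordPlay, DataTransfer };

const int MODE_SWITCH_PIN = 32; // hypothetical ADC pin for the mode switch

Mode readModeSwitch() {
  // One plausible wiring: the rotary switch taps a voltage divider,
  // so each position maps to a band of the ESP32's 0..4095 ADC range.
  int raw = analogRead(MODE_SWITCH_PIN);
  if (raw < 1365) return Mode::Jam;
  if (raw < 2730) return Mode::RecordPlay;
  return Mode::DataTransfer;
}

void handleJam()          { /* map button presses to actuators in real time */ }
void handleRecordPlay()   { /* capture or replay timed button/amplitude events */ }
void handleDataTransfer() { /* exchange slot contents with the GUI over USB */ }

void setup() {}

void loop() {
  switch (readModeSwitch()) {
    case Mode::Jam:          handleJam();          break;
    case Mode::RecordPlay:   handleRecordPlay();   break;
    case Mode::DataTransfer: handleDataTransfer(); break;
  }
}
```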
To compare and contrast designs, TactJam has three independent ‘slots’ to work with. This allows, for example, recording patterns in slots one and two and switching back and forth between them. Concurrently, one might use slot three for testing new ideas, or for downloading a third pattern created by someone else using the TactJam GUI.
The TactJam GUI handles the persistent storage in a remote
database as well as viewing, sorting, and retrieving vibrotactile
patterns from this database.
3.1.3 Collocated and Remote Collaboration. Schneider and MacLean described several design dimensions for haptic instruments [43]. Some of these consider the collaborative aspects of such tools, e.g., collocated/distributed output, synchronous/asynchronous output, and private/shared control [43]. TactJam is designed to provide designers with flexibility by supporting all the described aspects (to varying extents). Similarly, TactJam is designed to be used on oneself or on another person, which provides further flexibility.
When two or more designers are collocated, TactJam can be set up so that the actuators of designer one are placed on the body of designer two. Here, designer two perceives the tactile sketches of the collaborator synchronously, but designer one needs to rely on verbal feedback from designer two. Alternatively, both designers can place the actuators on their own bodies, which allows them to experience their respective sketches in real time, but requires asynchronous sharing of sketches. In addition to private control (one device per user), multiple designers can use a device together for shared control over the actuation. Similarly, one device can be used to attach actuators to, and provide output for, multiple people. These uses, however, are constrained by the number of available actuators and the ability of multiple people to reach the device concurrently.
The focus of the design of TactJam, though, is to support distributed usage, that is, design in a context where multiple designers are working from different locations. This focus emerged as TactJam was originally designed to facilitate shared sketching of vibrotactile patterns during the COVID-19 pandemic. In this case, TactJam provides asynchronous output via playback of recorded patterns, where each user has private control over their own device. Equipped with TactJam, users can share their designs, discuss ideas, and further elaborate them iteratively.
3.2 Implementation
TactJam is a fully open-source hardware (Figure 2) and software (Figure 4) suite for designing on-body vibrotactile patterns. Hardware and software can be used together or as standalone systems. All source materials are available online: https://github.com/TactileVision.
3.2.1 Hardware: The TactJam hardware consists of a 160 × 80 mm PCB with SMD and through-hole components. As it is intended to be used without a casing, all buttons, switches, encoders, LEDs, and the display are mounted on the top side of the PCB, while all non-interactive components are on the side facing downwards. These downward-facing components are protected by an acrylic plate attached with 20 mm spacers.
The main PCB acts as an extension board to the ESP32 NodeMCU with the ESP32-WROOM-32 chip from AZ-Delivery. The ESP32 connects to the main PCB through standard headers mounted on the downward-facing part of the PCB. The top side holds eight tactile switches (M1–M8, Figure 3) to trigger up to eight actuators simultaneously and a potentiometer for setting the global amplitude. Rotary switches are used to toggle between the device's modes (jam, record/play, and data transfer) and for choosing one of three slots to work with vibrotactile patterns. Finally, the board has three general-purpose buttons to trigger context-relevant functions, e.g.,
Figure 3: Hardware user interface, with eight actuator buttons (M1–M8), three function buttons (F1–F3), knobs to select the mode, slot, and amplitude, and the display.
to start/stop recording and playing (F1–F3, Figure 3). Feedback regarding the device's state and settings is provided with a monochrome OLED display (128 × 64 px), and errors are indicated with a buzzer.
The device supports up to sixteen Eccentric Rotating Mass vibration motors (ERMs) as actuators, connected to the PCB via JST connectors and a 1.5 m long cable. We use NFP-C1030L brushed coin vibration motors (10 mm in diameter and 3 mm thick) for actuation, powered by a separate 5 V DC power supply via an additional micro-USB port.
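As an illustration of the jam mode described earlier, the sketch below shows how button presses might gate PWM output to the ERMs while the potentiometer sets the global amplitude. It assumes the ESP32 Arduino core's LEDC API (version 2.x) and hypothetical pin assignments; the real firmware in the TactJam repository differs in detail.

```cpp
#include <Arduino.h>

// Hypothetical pin assignments; the real board routes buttons and motor
// drivers differently (see the schematics in the TactJam repository).
const int NUM_CHANNELS = 8;
const int buttonPins[NUM_CHANNELS] = {4, 5, 12, 13, 14, 15, 16, 17};
const int motorPins[NUM_CHANNELS]  = {18, 19, 21, 22, 23, 25, 26, 27};
const int POT_PIN = 34; // global amplitude potentiometer (ADC input)

void setup() {
  for (int i = 0; i < NUM_CHANNELS; i++) {
    pinMode(buttonPins[i], INPUT_PULLUP);
    ledcSetup(i, 1000 /* Hz */, 8 /* bit resolution */); // one PWM channel per motor
    ledcAttachPin(motorPins[i], i);
  }
}

void loop() {
  // Map the potentiometer (0..4095 on the ESP32 ADC) to an 8-bit duty cycle.
  int amplitude = analogRead(POT_PIN) >> 4;
  for (int i = 0; i < NUM_CHANNELS; i++) {
    bool pressed = (digitalRead(buttonPins[i]) == LOW);
    ledcWrite(i, pressed ? amplitude : 0); // each button gates its actuator
  }
}
```

Because ERM amplitude is controlled via PWM duty cycle, this simple loop also reflects the frequency/amplitude coupling discussed below.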
3.2.2 Firmware & Data Format: The rmware is implemented us-
ing PlatformIO [
27
], the Espressif 32 platform [
28
] and the Arduino
framework [
2
]. The rmware has three major tasks: (1) handle in-
puts (e.g. button presses), (2) record and play vibrotactile patterns,
and (3) communicate with the PC to transfer data between both
entities.
The device can record and store up to three tactile patterns in the on-board flash memory of the ESP32. Patterns are stored as a series of 32-bit binary instructions. The encoding and decoding system is implemented using libvtp [20] for memory-efficient binary encoding and decoding. Each instruction consists of an instruction type (increment time, set amplitude, or set frequency), an actuator ID, a time offset to the last instruction, and finally the value to set. For example, a command to change the amplitude is structured as follows:
- 4-bit instruction code (0010)
- 8-bit actuator ID (all actuators if zero)
- 10-bit time offset in milliseconds to the last instruction
- 10-bit amplitude
If the time oset exceeds the 10-bit limit of 1023 ms the increment
time instruction can be used with the following structure:
- 4-bit instruction code (0000)
- 28-bit time offset in milliseconds to the last instruction
As frequency and amplitude are coupled for ERMs (the parameters influence each other and can therefore not be tuned independently), the current iteration of TactJam does not use the set frequency command; however, it is implemented for compatibility with other vibrotactile pattern generation tools such as that provided by bARefoot [50].
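To make the bit layout concrete, the following sketch packs the two instruction types described above into 32-bit words. The exact field ordering within the word (opcode in the top four bits, remaining fields packed towards the least significant bit) is our assumption for illustration; the authoritative packing is defined by libvtp [20].

```cpp
#include <cstdint>

// Opcodes as described in the text.
constexpr uint32_t OP_INCREMENT_TIME = 0x0; // 0000
constexpr uint32_t OP_SET_AMPLITUDE  = 0x2; // 0010

// "Set amplitude": 4-bit opcode, 8-bit actuator ID (0 = all actuators),
// 10-bit time offset in ms, 10-bit amplitude value.
uint32_t encodeSetAmplitude(uint8_t actuatorId, uint16_t offsetMs, uint16_t amplitude) {
    return (OP_SET_AMPLITUDE << 28)
         | (uint32_t(actuatorId) << 20)
         | (uint32_t(offsetMs & 0x3FF) << 10)
         |  uint32_t(amplitude & 0x3FF);
}

// "Increment time": 4-bit opcode, 28-bit time offset in ms, used when a
// pause between instructions exceeds the 10-bit limit of 1023 ms.
uint32_t encodeIncrementTime(uint32_t offsetMs) {
    return (OP_INCREMENT_TIME << 28) | (offsetMs & 0x0FFFFFFF);
}

// A decoder would dispatch on the top four bits: uint8_t op = word >> 28;
```

Under this assumed layout, a two-second pause before setting actuator 3 to half amplitude could be expressed as encodeIncrementTime(2000) followed by encodeSetAmplitude(3, 0, 512).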
3.2.3 GUI and Backend: An important piece of information which TactJam cannot automatically record is the location of a given actuator. However, knowing where an actuator is placed is absolutely crucial for recreating an on-body vibrotactile pattern. Therefore, the automatically encoded pattern needs to be complemented with clear documentation of where actuators are placed. The purpose of the GUI is to provide this documentation before uploading the pattern to a server. Additionally, the GUI provides a visualization of the vibrotactile design.

The TactJam GUI automatically connects to the firmware as soon as the device is plugged into the computer via USB. To enable sharing vibrotactile patterns remotely with others, we use a centralized backend server to which the GUI also connects automatically. All vibrotactile patterns are saved on the server and can be accessed by all users at any time.
Once a vibrotactile pattern is loaded on the GUI, the time prole
of each actuator is depicted as a continuous line with an area of
varying thickness corresponding to the vibrations amplitude (Fig-
ure 4, left). To document what actuators are being used and where
they are located on a body, the GUI provides a 3D mannequin (Fig-
ure 4, right), the location of actuators on a body must be manually
indicated by moving colored dots on to the corresponding locations
on the mannequin.
Before uploading patterns to the server, users are prompted to provide a title and description of the pattern, as well as tags indicating the body parts engaged in the design and custom tags defined by users. To download a pattern, users browse a page that lists all existing patterns including meta information, and select one of them. To simplify searching for vibrotactile patterns, a search text field filters the list dynamically.
4 TACTJAM WORKSHOPS
TactJam was put to a practical test during a two-session online workshop at TEI 2021 [55]. Overall, 17 participants from North America and Europe (six countries total) designed ten tactons in the first session and implemented 32 tactons in the second. Participants were almost equally split between students and faculty, with eight
Figure 4: Screenshot showing the graphical user interface. A visualization of the pattern recorded by the TactJam hardware can be seen on the left. The positions of the actuators are manually indicated on the right, using this GUI.
participants having nished their PhD and nine still studying (one
in a BA degree, two in a Master’s degree and six working towards a
PhD). About 2/3 of participants presented as more masculine and at
least one participant was non-binary. The participants’ research in-
terests spanned from haptic material and experience design, safety
critical systems, animal-computer interaction to interaction design
more generally.
The workshop was held in two distinct sessions, two months apart: the first focused on theory and designing tactons without experiencing them; the second focused on implementing those ideas and reflecting on theoretical design choices with the device available to participants. Both sessions were approximately four hours long.
In the rst session, participants were asked to design tactons us-
ing three prompts: topics,contexts, and design goals. For all prompts
participants were provided with examples and participants were en-
couraged to add their own ideas. We proposed topics such as tactile
emotions,social presence, or time mapping, contexts such as biking,
gaming or physical activities, and design goals such as enhancing
immersive experiences or designing unambiguous feedback. We ac-
companied each topic and context with a list of examples presented
as textual label and an icon. Participants worked in groups to design
vibrotactile patterns and discuss their ideas. They used the TactJAM
GUI to illustrate the actuator placements and added descriptions to
specify the spatio-temporal properties of each vibrotactile pattern.
The main activity of the second session was physical on-body design with a functional prototype. As we observed differences in the designs resulting from these activities that we deem relevant and interesting, we first report a summary of results for each session individually.
Our analysis is grounded in textual data, such as documents created by individual groups, personal notes, chats, and video recordings of final group discussions through their transcripts, as well as visual data in the form of digital representations of the resulting tacton designs. We analyzed textual data using a lightweight coding and theming approach [6] involving four steps: 1) familiarizing ourselves with the data (sensing), 2) creating a coding frame to ensure reliability, 3) identifying relevant codes regarding, e.g., intent, inspiration, metaphors used, or envisioned experiences, and 4) interpreting those in context. These steps are of an iterative character and can be repeated and returned to as needed [ibid]. For visual data, we conducted a schematic analysis [49] to identify common and differing characteristics between the different designs and interpreted them together with the textual data. This analysis was led by the second and last authors and discussed with all authors to ensure a coherent narrative.
4.1 Workshop 1: GUI-based Design
The rst workshop was conducted without the TactJam hardware.
Tactile icons were designed using only the TactJam GUI. To provide
some constraints to this exploration, participants and organizers
formed small groups to explore tactile icons for specic settings.
In the end, four distinct groups formed who chose the contexts of
bicycling (G1), Virtual Reality (G2), Sports (G3) and Hospital Inten-
sive Care Units (ICU) (G4), respectively (see also, Table 1). Each
Figure 5: Locations selected during workshop one
Each group consisted of three to four participants and independently conducted its design deliberations. These were presented in a group discussion at the end of the workshop, which was recorded and forms the core data basis for the second part of our analysis regarding designers' reflections.
4.1.1 Actuator Locations. Participants designed tactons which used actuators all over their bodies. Emphasis was along the arms, on the chest, down the spine, and around the head. An overview of selected locations is presented in Figure 5.
4.1.2 Design Reflections of Participants. The group discussion at the end of the workshop was structured around each group presenting their individual tacton design(s) and reflecting on their intent, inspirations, and experiences for half an hour. During the presentations, group members commented on and added to what was said, while all workshop participants were invited to ask questions or comment on the designs and deliberations. We report here from the thematic analysis (along Boyatzis's style [6]) of the video recording and transcript thereof across presentations. Instead of zeroing in on the individual designs, we focus here on how participants communicated their designs and which points were prominent in presenting them.
Primarily, participants were interested in communicating their intent around the designs. Here, we observed a tendency to focus on logically driven, clear mappings between information and sensory experiences, with an emphasis on the information conveyed. P3, for example, indicates how information flows in their concept oriented towards guiding cyclists through traffic.
Another idea was to indicate, on the cycler’s back, which
direction to turn next or whether to stop at all. P3
In cases where sensory experiences were mentioned, they were largely relegated to notions of equivalence, i.e., aiming at allowing immersive experiences that would not be accessible without using a set of vibrating motors. Here, participants aimed at providing people who lost a limb an option to counteract phantom pain, or at offering up sensory experiences of different embodiments, most prominently that of an elephant, as P7 describes.
It's this idea of, if you touch something, can you give the feeling how an elephant would interact with things? P7
Identier Group Context Examples
G1 Bicycling Alerting for oncoming trac
Indicating information regarding the weather forecast
G2 Virtual Reality Supporting immersion when playing as non-human characters (e.g., elephants)
Facilitating management of phantom pain in the case of limb loss
G3 Sports Providing suggestions for workout intensity
Structuring timings in high intensity interval training
G4 Intensive Care Units (ICU) Guiding hospital sta to where care is most urgently needed
Communicating to and calming patients when sta is on their way
Table 1: Contexts of investigation during both workshops
When talking about the groups they designed for, participants often used phrasings indicating that they preferred to design for others, meaning they chose scenarios that feasibly had someone else interacting with the resulting artefact. P2 notes this explicitly:
... we want to support caretakers in knowing where to
be at what time and what priority it is. P2
Hence, intended target groups were imagined as external to the set of workshop participants. As participants' familiarity with the target contexts varied, they made assumptions about others' experiences. This reasoning might lead to complicated implications for the appropriateness and suitability of those designs (cf. [4]), for instance in the context of disabilities that participants had no personal experience with themselves. The underlying assumption is the equivalence of experiences made in a specific bodily area and produced with sensory input at another.
We replaced the sensors on the foot they have lost and
moved it to another place of the body. P4
However, some participants also discussed the impossibility of actually understanding how sensory experiences are made in other contexts, e.g., for animals. P6 muses about this extensively in the context of the above-mentioned elephant.
... if you have a trunk, then it’s an arm with ngers at
the end and it’s also a nose, and it’s also a tongue, and
it’s actually also a noisemaker, and it’s something that
you put food in your mouth with. It has so many uses.
How is it possible for us to imagine that? P6
Finally, participants aimed at imagining experiences, largely due to the abstract nature of the designs grounded in a lack of knowledge of the corresponding experiences. Here, they often drew on known interactive experiences and specific ones they had made personally, from which they envisioned how the populations they designed for would experience and feel about interacting with the potential artefacts.
That’s like moving from slow to intense or something
coming towards you. P5
Participants also noted how these populations would make meaning of the interactive experiences. Particularly the following statement by P2 indicates how experiences for others were imagined with a caring intent behind them.
You want people to know what’s coming, maybe do it
in a reassuring way like a stroke on the arm, and try
and get that aspect of sense of touch. P2
Given the setup of the rst workshop, participants provided
largely structured and logically driven designs. They oriented them-
selves towards an expanded set of embodiments not just represent-
ing the ones already present among participants; however, that also
means they prioritized their own assumptions about these expe-
riences, as well informed as they might have been, over those of
people who make them rst hand. Ultimately, experiences had to
be imagined and could not be veried, leading to reections more
oriented on intent than actuality of designs.
4.1.3 Workshop Results: Tacton Mappings. Something we were especially interested in were the metaphors and mappings participants might choose when creating tactons. We therefore summarized these mappings using affinity diagrams and identified four major categories:

Nominal mappings, where participants mapped a discrete design to a specific purpose. Such mappings might be naturalistic, where a tacton simulates the concept it represents. An example of a naturalistic mapping was a zig-zag pattern on the back of a cyclist, which communicated upcoming thunderstorms (Figure 6a). Another type of mapping might be logical, where the tacton reflects the logical organization of a concept. An example of this were two actuators placed on top of each other: if the higher one vibrated, this communicated to a cyclist that they should switch into a higher gear; if the other, lower one vibrated, the cyclist should switch into a lower gear (Figure 6b). Participants also used visual analogies, for example by arranging a set of tactons to represent familiar visual icons such as the play or pause symbol.
Another type of mapping were Body-Based mappings. These again included spatial mappings, where a tacton might communicate a location on a body or directions relative to a body (Figure 6c), for example highlighting contextual cues such as surrounding traffic or directly guiding cyclists towards specific directions. Another type of body-based mapping was perceptual. Here, tactons were used to communicate perceptual events, for example by providing feedback on the shoulder when a prosthetic hand would touch an object, or providing a human with feedback on the nose so they might experience what it feels like to be an elephant (Figure 6d).
Figure 6: Tactons shared in the wrap-up session of workshop one. a) Naturalistic mapping: tactons display the weather on the back. Rain is presented by intermittent random vibrations, while lightning is shown as a zig-zag pattern. b) Logical mapping: top of wrist for switching to a higher gear, bottom of wrist for switching to a lower gear; front of shin for faster, back of shin for slower. c) Temporal mapping: heart rate and body temperature are communicated based on the speed of circular vibration patterns. d) Spatial mapping: directions in VR are interpolated using the actuators distributed on the head. e) Perceptual mapping: actuators are placed at locations where an elephant would have sensory experiences a human usually would not have access to. f) Affective mapping: actuators are placed so they might be used to provide a calming gesture.
A further type of mapping we encountered was Magnitude, where variations in the tacton design provided a quantitative element to the concept they communicated. Examples included, again, spatial mappings: actuators might be arranged in a row, and depending on which actuator vibrates, this might indicate how urgent something is. Similar strategies were represented in a temporal manner: the speed with which a pattern repeated might provide feedback to a runner on their pulse or body temperature (Figure 6e). Finally, the magnitude of a concept might also be communicated by the magnitude of vibration; for example, the number of actuators vibrating might indicate how soon an event will occur.
The nal type of mapping we found was
Aective
. Here, tactons
were designed to communicate a concept to support creating a
target mood in the person with the tacton. For example, in a hospital
setting, a patient might receive a tactile notication telling them
that a nurse is on their way. The tacton might then be designed
to simulate a gentle touch, with the intention of soothing and
reassuring the patient (Figure 6f).
4.2 Workshop 2: On-Body Design
In the second workshop, the same groups came together to implement the designs from the previous workshop using the TactJam hardware (see the Appendix for additional visualizations of vibrotactile patterns). As participants invested most of their time in unstructured experimentation with the hardware, the final round was held as an open discussion.
4.2.1 Actuator Locations. The locations chosen for the on-body prototyping were noticeably less diverse than during the first session. Most tactons were placed on the left arm, while some were placed around the chest (Figure 7). To our surprise, the face and neck were also used extensively. Actuators were additionally placed on dog noses. We speculate that two different factors might have influenced this. For one, the social awkwardness of some locations was experienced much more strongly in the actual on-body context. Further, as most participants were alone during the workshop, locations were constrained to those they could reach themselves, as well as to body parts participants were willing to expose in an online setting.
4.2.2 Design Reflections of Participants. The concluding discussion and reflections on the second workshop were analyzed in the same fashion as for the first workshop. However, the conversation was less structured along groups, and participants overall spoke more freely, largely about the experiences they had and, to a lesser extent, about design intents.
During the second workshop, participants focused their reec-
tion on their activities experimenting with their own bodies and
exploring
sensory experiences
they made themselves. In describ-
ing these explorations, they used less concrete words and relied on
a shared sense of how things should feel.
I think I started with pulses and then I went with swoops.
So I was really trying to play around with the volume.
And I found that like having that ease in, that ease out,
that fade in, fade out. P2
Figure 8 and the supplementary gures in the appendix illustrate
this iterative approach. There was a fairly hedonic drive in exper-
imenting with the sensory potentials with participants seeking
out pleasurable experiences. Subsequently, they reported feeling
‘good’ about their explorations and modulating them in ways they
personally found rewarding.
I really like the feeling to use vibrations in dierent
positions. P14
Figure 7: Locations selected during workshop two
Figure 8: Example of tacton design from workshop two. Note the iterative nature of the temporal patterns and the increasing
use of amplitude dynamics (swoops).
Having that extra amplitude is the only way we can get continuous stuff, which is more biological and more, I don't know, just pleasant, right? P2
Participants also experimented with objects (and non-human animals) that surrounded them and reflected on how vibrotactile experiences differed along different body parts and configurations.
I put some on a headset because I happened to have one
of those, stuck some on here. And that was a real buzz,
because it’s right next to your skull and everything.
That was a huge shock. I also tried it on my dog’s nose
and I got a quite strong reaction from her when I did
that. P6
The surprise P6 expresses here at how their imagination differed from their experiences was shared among participants in a range of ways. For example, P9 describes how they experimented with actuator placements in ways they had previously found conceptually ill-fitting or imagined as unpleasant.
I was trying a bit on the throat which is super interesting
and very not aggressive at all. Whereas I would have
thought, without trying it, that it would be like, “No,
don’t put something on my throat." P9
Participants further reected on their previous designs and how
assumptions about the experiences shaped them. In some cases,
however, as P10 describes, exploring dierent designs through on-
body placements proved crucial to
assess the feasibility
of prior
ideas sometimes indicating that they could not be implemented
in the way previously intended.
I thought it would be well more distinctive with regard to the play shape. When I tried it out on my arm, it was hard to recognize a certain pattern or recognize which shape underlies the different vibrations. P10
In another case, participants indicated that a previously assumed equivalence between experiences made with vibrotactile patterns placed on arms and those placed on a chest did not hold. They started reconsidering their assumption and reflecting on how they could account for the differences going forward.
It failed gloriously because it was an entirely dier-
ent experience and that we treated it then as well as a
dierent experience. P11
While participants were more focused on exploring sensory experiences for themselves on their own bodies, they also shared tactons with each other. In these cases, designs that were initially intended for personal use or stemmed from individual explorations were then made available to other participants and became designs not just for the self, but also for others. Two participants explicitly mentioned the resulting intimacy they experienced, with P12 pondering the interpersonal meanings created by this practice.
The very rst time, I downloaded [another person’s]
tacton, I had this moment where I wondered what the
social meaning or what that means to download some-
body else’s tacton. And I guess we were also, especially,
we were designing things that feel pleasant. P12
As participants noticed this mismatch between earlier assumptions and the experiences they had themselves, some tried to find an explanation as to why this was the case. P13 hinted at a lack of training regarding participants' vibrotactile sensitivities.
I wanted to share that it's really hard to imagine how stuff feels. [...] I think humans are just really not used to imagine how tactile input feels and tactile output on the skin. P13
Participants also provided the development team with a range of suggestions for further improving the interplay of the TactJam hardware and software in future design iterations. Overall, though, participants largely explored sensory experiences for themselves during the second workshop, re-evaluated their prior designs, and sought out explanations for the meaning behind sharing on-body experiences, as well as for the divergence between the assumptions driving designs in the first workshop and the experiences made in the second.
4.2.3 Workshop Results: Tacton Mappings. The re-evaluations made when working on the body led to new insights about tacton mappings.

Participants identified a new type of mapping: action-centric mappings. These are tactons which are modified based on what the user is doing. For example, their intensity might increase while the user is reaching for an object.
Participants also found that affective mappings were almost unavoidable, and more complex than assumed. This manifested in participants' simple desire to continuously test some patterns because they felt nice, even though they were not designed with that in mind. Other examples included tactons created with spatial metaphors in mind which were experienced as arousing or as calming. However, factors such as the variability of body shape, e.g., the size and distribution of muscles and fatty tissue, impact how vibration propagates, making it difficult to transfer these experiences between participants.
Participants found that borrowing visual analogies did not work well. For example, actuators organized in the shape of a play or pause symbol lost their meaning when the body (and consequently the icon) changed orientation. We share visual impressions from workshop two in Figure 9.
5 DISCUSSION
TactJam enabled 17 people from six countries, two continents, and three timezones to come together and collaboratively prototype on-body tactile experiences. There was something profoundly satisfying about being able to speak with a person sitting thousands of kilometers away and then, with a couple of mouse clicks and button presses, share a bodily experience with that person. TactJam also supported serendipitous discovery, both accidental, for example by erroneously uploading tactons designed for the arm to an actuator pattern placed on the chest, and through playful experimentation, for example by designing tactons on the lips, or together with a pet.
We argue this sense of discovery and playfulness was in part enabled by TactJam's features for sharing vibrotactile patterns. Participants did not need to worry about uploading specific files online for their collaborators to download, but could instead focus on testing and discussing their designs. TactJam further helped participants document their work by integrating means to report on the logic of each design.
Consequently, TactJam not only enabled playful collaboration, but through this collaboration allowed participants to learn about vibrotactile feedback design and deepen their knowledge of tacton design. It should be highlighted that experts who had been working with vibrotactile feedback for years were able to deepen their knowledge through playing with TactJam, while complete novices were similarly able to not only produce tactons, but through these tactons produce insights, which we share in the results sections of this paper.
5.1 Comparisons between Workshops
A further exciting aspect of the workshops is that, together, they created something akin to a natural experiment comparing design by means of a GUI tool to designing directly on bodies. We observed how the tools used led to relevant differences in how participants reflect on their designs and associated experiences. On a language level, when using the GUI, participants used many comparisons and metaphors to describe their intent and the inspirations they drew from. Hence, metaphors indicate a design potential. In contrast, when working directly on bodies, participants relied on analogies when describing their experiences and used more abstract notions to convey how things felt to them. Hence, as the envisioned interactions become concrete and are translated into sensory experiences, metaphors and clear design constraints become less relevant for creating new designs and communicating them to others. Additionally, the designs created become more personal, not just in how they are experienced but also in how they have meaning for those engaging with them.
Across both workshops, we re-armed that designing for bodies
requires designing with bodies (as suggested by [
21
]). However, in
both workshops participants prioritized their individual perspec-
tives on bodily experiences. Working with the GUI these were used
for making assumptions about how something might feel, whereas
when working with their bodies, participants extrapolated from
their own experiences onto others’. This was partly due to the
specic setup of the workshops, though we need to stress here
that to account for the diversity of bodies, designers will need to
actively seek out the perspectives of those they design for and con-
sider designing with their target populations especially if they hold
dierent embodied experiences from their own (see [4]).
Even though participants mentioned the intimacy involved in
sharing sensory experiences amongst each other, this experience
was limited, presumably due to the professional setting. Here, our
workshops provide only a glimpse into the potential we see for
TactJam to facilitate interactions between humans and allow for
shared meaning making.
Ultimately, the workshops highlighted that the tools we use shape the way we think about embodied interactions with vibrotactile patterns. The choice of tools can therefore lead not only to different ideas and design proposals; the tools also convey different experiences and allow for more distance or closeness between designers and their designs.
Using the GUI, participants largely designed for populations that they were not necessarily part of, while when working on their own bodies, designs were largely driven by personal preferences and assessed in light of personal experiences. However, given how much more limited the placement of tactons was in the second workshop, socio-technical factors, such as presenting and discussing bodily experiences to a group of relative strangers and having cameras focused on faces while individually sitting in single rooms, play additional roles in how people design around and with their bodies. The workshops provide initial comparative insights regarding the qualitative differences of designs and reflections grounded in an embodied design process as compared to design with a more traditional GUI.
5.2 Limitations and Future Work
This is not to say that everything worked perfectly during the workshops. As with any first deployment of a system, we encountered bugs, as well as areas where we wish we had done things differently or done more. Here, we wish to again stress that TactJam is open source, and that anyone is invited to reproduce the devices, use the software, and iterate on it, maybe even addressing some of our own critique (see also https://github.com/TactileVision).

While TactJam was explicitly designed to support the rapid creation of a breadth of lo-fi prototypes, rather than as a tool for editing and refining designs towards higher fidelity, this was perceived as
Figure 9: Participants trying out several actuator placements during workshop two.
a major missing feature of the TactJam suite. The ability to edit the time profiles of actuators using the GUI (as is done with the haptics composer by Interhaptics [25] or in work by Schneider et al. [44]), as well as copying, pasting, and swapping profiles between actuators, was the feature most commonly requested.
The virtual mannequin standing in a T-pose, which is currently used by TactJam for documenting the placement of actuators on the body, also proved a non-ideal choice given its lack of flexibility (see also Figure 4). For one, many workshop participants did not identify with it and would have preferred the ability to customize it to better match their own bodies, or the bodies of the target group they were designing for. This also had practical design consequences, as it masked difficulties which arise when designing a fixed layout for multiple bodies (see also [48]). One might also speculate whether the chest would have been used as frequently for placing tactons in workshop one if the mannequin had had larger breasts. Finally, the fixed mannequin posed problems for participants who intended to design tactons for people with missing limbs, or for non-human bodies (such as elephants and dogs).
In terms of the implementation, a concrete lesson learned from the hardware design was to stay away from SMD micro-USB connectors. These were the most frequent point of failure of the hardware, both during assembly and during use. Another hardware design question which caused much discussion was whether ERM motors were the right tool to use when better actuators are available. ERM motors were chosen in the spirit of the lo-fi prototyping which TactJam is designed for. Recoil-Style Tactile Actuators [56] or Linear Resonant Actuators (LRAs) would have opened up a whole new dimension of experimenting with vibration frequency. While we would like to see a version of TactJam using high-fidelity actuators in the future, we speculate that this might have hampered the rapid iteration process we aimed for.
6 CONCLUSION
In this paper, we presented TactJam, an end-to-end suite for creating and sharing low-fidelity prototypes of on-body vibrotactile feedback. With TactJam, designers can create, record, and share vibrotactile patterns. The ease of sharing between local and remote collaborators using TactJam opens up new ways of collaboratively designing vibrotactile patterns.
We showed the utility of TactJam by putting it into practice in a two-part distributed online workshop exploring the design of on-body tactons. Participants were able to successfully use TactJam to learn about tacton design. Participants used a wide range of mappings between tactons and the concepts they represent, which we shared in this paper.
Additionally, comparing workshop discussions and results when a GUI was used for creating tactons to when participants worked directly on their bodies highlighted that the tools used heavily influence the outcome. Tactons created using the GUI were more structured, used a larger area of the body, and had clearer mappings, while tactons created through sketching directly on bodies came with fewer implicit assumptions about other bodies and focused more strongly on the experiential dimension of vibrotactile feedback.
Finally, we wish to again highlight that all source materials of the work presented here are available online, and we invite researchers and professionals interested in designing tactile feedback to explore and extend TactJam.
ACKNOWLEDGMENTS
We would like to thank all participants of the two workshops; without them, this paper would not exist. We also thank Alexander
Mamedow, Richard Adler, Martin Johne, Christopher Praas, Lucas
Hindenberger and Jonas Bruschke for their support while building
TactJam. This project received funding from the European Research
Council (ERC StG Interactive Skin 714797) and the Hertha-Firnberg
Grant T 1146-G by the Austrian Science Fund (FWF).
REFERENCES
[1] Margaret Addabbo, Elena Longhi, Nadia Bolognini, Irene Senna, Paolo Tagliabue, Viola Macchi Cassia, and Chiara Turati. 2015. Seeing Touches Early in Life. PLOS ONE 10, 9 (Sep 2015), e0134549. https://doi.org/10.1371/JOURNAL.PONE.0134549
[2] Arduino AG. 2020. Arduino. http://arduino.cc/en/Reference/HomePage Last accessed 06 July 2021.
[3] Tomohiro Amemiya and Taro Maeda. 2008. Asymmetric Oscillation Distorts the Perceived Heaviness of Handheld Objects. IEEE Transactions on Haptics 1, 1 (2008), 9–18. https://doi.org/10.1109/TOH.2008.5
[4] Cynthia L. Bennett and Daniela K. Rosner. 2019. The Promise of Empathy: Design, Disability, and Knowing the "Other". Association for Computing Machinery, New York, NY, USA, 1–13. https://doi.org/10.1145/3290605.3300528
[5] Sliman Bensmaïa and Mark Hollins. 2005. Pacinian representations of fine surface texture. Perception & Psychophysics 67, 5 (01 Jul 2005), 842–854. https://doi.org/10.3758/BF03193537
[6] Richard E Boyatzis. 1998. Transforming qualitative information: Thematic analysis and code development. Sage.
[7] Stephen A Brewster and Lorna M Brown. 2004. Tactons: structured tactile messages for non-visual information display. (2004).
[8] L.M. Brown, S.A. Brewster, and H.C. Purchase. 2005. A First Investigation into the Effectiveness of Tactons. In First Joint Eurohaptics Conference and Symposium on Haptic Interfaces for Virtual Environment and Teleoperator Systems. IEEE, 167–176. https://doi.org/10.1109/WHC.2005.6
[9] Bill Buxton. 2010. Sketching user experiences: getting the design right and the right design. Morgan Kaufmann.
[10] Andrew Chan, Karon MacLean, and Joanna McGrenere. 2008. Designing haptic icons to support collaborative turn-taking. International Journal of Human-Computer Studies 66, 5 (2008), 333–355. https://doi.org/10.1016/j.ijhcs.2007.11.002
[11] Fabien Danieau, Philippe Guillotel, Olivier Dumas, Thomas Lopez, Bertrand Leroy, and Nicolas Mollet. 2018. HFX Studio: Haptic Editor for Full-Body Immersive Experiences. In Proceedings of the 24th ACM Symposium on Virtual Reality Software and Technology (Tokyo, Japan) (VRST '18). Association for Computing Machinery, New York, NY, USA, Article 37, 9 pages. https://doi.org/10.1145/3281505.3281518
[12] Donald Degraen, Bruno Fruchard, Frederik Smolders, Emmanouil Potetsianakis, Seref Güngör, Antonio Krüger, and Jürgen Steimle. 2021. Weirding Haptics: In-Situ Prototyping of Vibrotactile Feedback in Virtual Reality through Vocalization. In Proceedings of the 34th Annual ACM Symposium on User Interface Software and Technology (Virtual) (UIST '21). Association for Computing Machinery, New York, NY, USA. https://doi.org/10.1145/3472749.3474797
[13] Alexandra Delazio, Ken Nakagaki, Jill Fain Lehman, Roberta L. Klatzky, Alanson P. Sample, and Scott E. Hudson. 2018. Force Jacket: Pneumatically-actuated jacket for embodied haptic experiences. Conference on Human Factors in Computing Systems - Proceedings 2018-April (2018), 1–12. https://doi.org/10.1145/3173574.3173894
[14] M.J. Enriquez and K.E. MacLean. 2003. The hapticon editor: a tool in support of haptic communication research. In HAPTICS '03. IEEE Comput. Soc, 356–362. https://doi.org/10.1109/HAPTIC.2003.1191310
[15] Colin Gallacher, Arash Mohtat, and Steve Ding. 2016. Toward Open-Source Portable Haptic Displays with Visual-Force-Tactile Feedback Colocation. (2016).
[16] Frank A. Geldard. 1957. Adventures in tactile literacy. American Psychologist 12, 3 (1957), 115–124. https://doi.org/10.1037/h0040416
[17] Eric Gunther, Glorianna Davenport, and Sile O'Modhrain. 2002. Cutaneous grooves: composing for the sense of touch. In NIME '02. New York, NY, USA, 73–79. http://dl.acm.org/citation.cfm?id=1085171.1085181
[18] Antal Haans and Wijnand IJsselsteijn. 2006. Mediated social touch: a review of current research and future directions. Virtual Reality 9, 2-3 (2006), 149–159.
[19] Matthew J Hertenstein, Rachel Holmes, Margaret McCullough, and Dacher Keltner. 2009. The communication of emotion via touch. Emotion 9, 4 (2009), 566.
[20] Lucas Hinderberger. 2020. libvtp: A library for encoding and decoding vibrotactile patterns. https://github.com/lhinderberger/libvtp Last accessed 06 July 2021.
[21] Kristina Höök. 2018. Designing with the body: Somaesthetic interaction design. MIT Press.
[22] Gijs Huisman, Aduén Darriba Frederiks, Betsy van Dijk, Dirk Heylen, and Ben Kröse. 2013. The TaSSt: Tactile sleeve for social touch. In 2013 World Haptics Conference (WHC). 211–216. https://doi.org/10.1109/WHC.2013.6548410
[23] W.A. IJsselsteijn. 2003. Presence in the past: what can we learn from media history? Being there: concepts, effects and measurements of user presence in synthetic environments 5 (2003), 17–40.
[24] Apple Inc. 2021. Core Haptics. https://developer.apple.com/documentation/corehaptics Last accessed August 1st 2021.
[25] Interhaptics. 2021. Haptic Composer. https://www.interhaptics.com/haptic-composer Last accessed August 1st 2021.
[26] Yeongmi Kim, Jongeun Cha, Ian Oakley, and Jeha Ryu. 2009. Exploring Tactile Movies: An Initial Tactile Glove Design and Concept Evaluation. IEEE Multimedia PP, 99 (2009), 1. https://doi.org/10.1109/MMUL.2009.63
[27] PlatformIO Labs. 2014. Professional collaborative platform for embedded development. https://platformio.org/ Last accessed 06 July 2021.
[28] PlatformIO Labs. 2016. Espressif 32: development platform for PlatformIO. https://github.com/platformio/platform-espressif32 Last accessed 06 July 2021.
[29] Lofelt. 2021. Lofelt Studio. https://lofelt.com/design Last accessed August 1st 2021.
[30] Emanuela Maggioni, Erika Agostinelli, and Marianna Obrist. 2017. Measuring the added value of haptic feedback. In 2017 Ninth International Conference on Quality of Multimedia Experience (QoMEX). 1–6. https://doi.org/10.1109/QoMEX.2017.7965670
[31] Melisa Orta Martinez, Joseph Campion, Tara Gholami, Michal K. Rittikaidachar, Aaron C. Barron, and Allison M. Okamura. 2017. Open source, modular, customizable, 3-D printed kinesthetic haptic devices. 2017 IEEE World Haptics Conference, WHC 2017 (2017), 142–147. https://doi.org/10.1109/WHC.2017.7989891
[32] David J. Meyer, Michael A. Peshkin, and J. Edward Colgate. 2016. Tactile Paintbrush: A procedural method for generating spatial haptic texture. In 2016 IEEE Haptics Symposium (HAPTICS). IEEE, 259–264. https://doi.org/10.1109/HAPTICS.2016.7463187
[33] Camille Moussette. 2010. Sketching in Hardware and Building Interaction Design: tools, toolkits and an attitude for Interaction Designers. In Processing. https://public.me.com/intuitive
[34] Camille Moussette, Stoffel Kuenen, and Ali Israr. 2012. Designing Haptics. In Proceedings of the Sixth International Conference on Tangible, Embedded and Embodied Interaction (Kingston, Ontario, Canada) (TEI '12). Association for Computing Machinery, New York, NY, USA, 351–354. https://doi.org/10.1145/2148131.2148215
[35] Marianna Obrist, Sue Ann Seah, and Sriram Subramanian. 2013. Talking about Tactile Experiences. Association for Computing Machinery, New York, NY, USA, 1659–1668. https://doi.org/10.1145/2470654.2466220
[36] Allison M Okamura, Katherine J Kuchenbecker, and Mohsen Mahvash. 2008. Measurement-Based Modeling for Haptic Rendering. In Haptic Rendering: Algorithms and Applications. A. K. Peters, Chapter 21, 443–467.
[37] Sabrina A. Paneels, Margarita Anastassova, and Lucie Brunet. 2013. TactiPEd: Easy Prototyping of Tactile Patterns. INTERACT '13 8118 (2013), 228–245. https://doi.org/10.1007/978-3-642-40480-1
[38] Evan Pezent, Brandon Cambio, and Marcia K. O'Malley. 2020. Syntacts: Open Source Framework for Audio-Controlled Vibrotactile Haptics. (2020).
[39] M Pielot, N Henze, and S Boll. 2009. Supporting map-based wayfinding with tactile cues. Proceedings of the 11th International Conference on Human-Computer Interaction with Mobile Devices and Services (MobileHCI '09) (2009), 1. https://doi.org/10.1145/1613858.1613888
[40] Jonghyun Ryu and Seungmoon Choi. 2008. posVibEditor: Graphical authoring tool of vibrotactile patterns. In HAVE '08. IEEE, 120–125. https://doi.org/10.1109/HAVE.2008.4685310
[41] Oliver Schneider, Karon MacLean, Colin Swindells, and Kellogg Booth. 2017. Haptic experience design: What hapticians do and where they need help. International Journal of Human-Computer Studies 107 (2017), 5–21. https://doi.org/10.1016/j.ijhcs.2017.04.004 Multisensory Human-Computer Interaction.
[42] Oliver S. Schneider, Ali Israr, and Karon E. MacLean. 2015. Tactile Animation by Direct Manipulation of Grid Displays. In UIST '15.
[43] Oliver S. Schneider and Karon E. MacLean. 2014. Improvising design with a Haptic Instrument. In 2014 IEEE Haptics Symposium (HAPTICS). 327–332. https://doi.org/10.1109/HAPTICS.2014.6775476
[44] Oliver S. Schneider and Karon E. MacLean. 2016. Studying Design Process and Example Use with Macaron, a Web-based Vibrotactile Effect Editor. In HAPTICS '16: Symposium on Haptic Interfaces for Virtual Environment and Teleoperator Systems.
[45] Oliver S. Schneider, Hasti Seifi, Salma Kashani, Matthew Chun, and Karon E. MacLean. 2016. HapTurk: Crowdsourcing Affective Ratings for Vibrotactile Icons. In CHI '16. ACM Press, New York, New York, USA, 3248–3260. https://doi.org/10.1145/2858036.2858279
[46] Hasti Seifi, Matthew Chun, Colin Gallacher, Oliver Schneider, and Karon E. MacLean. 2020. How Do Novice Hapticians Design? A Case Study in Creating Haptic Learning Environments. IEEE Transactions on Haptics 13, 4 (2020), 791–805. https://doi.org/10.1109/TOH.2020.2968903
[47] Tanay Singhal and Oliver Schneider. 2021. Juicy Haptic Design: Vibrotactile Embellishments Can Improve Player Experience in Games. Association for Computing Machinery, New York, NY, USA. https://doi.org/10.1145/3411764.3445463
[48] Katta Spiel. 2021. The Bodies of TEI – Investigating Norms and Assumptions in the Design of Embodied Interaction. In Proceedings of the Fifteenth International Conference on Tangible, Embedded, and Embodied Interaction (Salzburg, Austria) (TEI '21). Association for Computing Machinery, New York, NY, USA, Article 32, 19 pages. https://doi.org/10.1145/3430524.3440651
[49] Katta Spiel and Lennart E. Nacke. 2020. What Is It Like to Be a Game?—Object Oriented Inquiry for Games Research, Design, and Evaluation. Frontiers in Computer Science 2 (2020), 18. https://doi.org/10.3389/fcomp.2020.00018
[50] Paul Strohmeier, Seref Güngör, Luis Herres, Dennis Gudea, Bruno Fruchard, and Jürgen Steimle. 2020. BARefoot: Generating Virtual Materials Using Motion Coupled Vibration in Shoes. In Proceedings of the 33rd Annual ACM Symposium on User Interface Software and Technology (Virtual Event, USA) (UIST '20). Association for Computing Machinery, New York, NY, USA, 579–593. https://doi.org/10.1145/3379337.3415828
[51] Colin Swindells, Seppo Pietarinen, and Arto Viitanen. 2014. Medium fidelity rapid prototyping of vibrotactile haptic, audio and video effects. In 2014 IEEE Haptics Symposium (HAPTICS). 515–521. https://doi.org/10.1109/HAPTICS.2014.6775509
[52] D Ternes and K E MacLean. 2008. Designing Large Sets of Haptic Icons with Rhythm. Haptics: Perception, Devices and Scenarios 5024 (2008), 199–208. http://www.springerlink.com/index/atl5686222k242m4.pdf
[53] Marc Teyssier, Gilles Bailly, Éric Lecolinet, and Catherine Pelachaud. 2017. Survey and Perspectives of Social Touch in HCI. In Proceedings of the 29th Conference on l'Interaction Homme-Machine (Poitiers, France) (IHM '17). Association for Computing Machinery, New York, NY, USA, 93–104. https://doi.org/10.1145/3132129.3132136
[54] Graham Wilson and Stephen A. Brewster. 2017. Multi-Moji: Combining Thermal, Vibrotactile and Visual Stimuli to Expand the Affective Range of Feedback. Association for Computing Machinery, New York, NY, USA, 1743–1755.
APPENDIX: TACTON EXAMPLES
Tactons in the context of Bicycling. Participants placed the actuators on the feet (a) to provide instructions to the user, e.g. to
speed up (b) and (c) or to slow down (d).
Tactons in the context of Virtual Reality. Participants placed the actuators on the head and throat (a) to provide spatial cues
for out-of-sight events (b), (c), and (d).
Tactons in the context of Sports. Participants placed the actuators on the forearm to indicate certain phases during physical training, e.g. get ready (b), start (c), and pause/stop (d).
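As a reading aid for the examples above, the sketch below shows one plausible way a tacton could be represented in code: as a named set of per-actuator intensity envelopes over time. This is purely illustrative; all type and field names are our own invention, and TactJam's actual on-device encoding is handled by libvtp [20].

#include <cstdint>
#include <vector>

// Hypothetical representation of a tacton as per-actuator intensity
// envelopes over time. Names and fields are illustrative, not TactJam's
// actual format (which is defined by libvtp [20]).
struct Keyframe {
  uint32_t timeMs;    // offset from the start of the tacton
  uint8_t intensity;  // 0 = off, 255 = full vibration strength
};

struct ActuatorTrack {
  uint8_t actuatorId;            // which of the (up to eight) ERM motors
  std::vector<Keyframe> frames;  // intensity envelope for this actuator
};

struct Tacton {
  const char* name;
  std::vector<ActuatorTrack> tracks;
};

// Two 150 ms pulses on a single foot-mounted actuator, loosely mirroring
// the "speed up" cue from the bicycling example above.
Tacton speedUp{"speed up",
               {{0, {{0, 255}, {150, 0}, {300, 255}, {450, 0}}}}};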