Designing Future Social Wearables with Live Action Role
Play (Larp) Designers
Elena Márquez Segura, James Fey, Ella Dagan, Samvid Niravbhai Jhaveri, Jared Pettitt,
Miguel Flores, Katherine Isbister
University of California, Santa Cruz
Santa Cruz, CA, USA
{elena.marquez; jfey; ella; snjhaver; jpettitt; mflore21; katherine.isbister}@ucsc.edu
ABSTRACT
Designing wearable technology that supports physical and
social engagement in a collocated setting is challenging. In
this research, we reached out to an expert community of
crafters of social experiences: larpers (live action role
players). Larpers and larp designers have a longstanding
tradition of designing and making use of a variety of
elements, such as costumes, physical objects, environments,
and recently also digital artifacts. These are crafted in
support of co-experience values that we argue can inform
the design of social wearables. We engaged in a co-design
process with a game designer and co-founder of a larp
production company, and embedded the resulting social
wearables in a larp. Here, we present the results of this
design and implementation process, and articulate design
affordances that resonate with our larp designer's values.
This work may inspire and inform researchers and
designers creating wearable technology that is aimed at
supporting collocated engagement.
Author Keywords
Social wearables; wearables; Research through Design;
larp; collocated social play; embodied interaction; DiY.
ACM Classification Keywords
H.5.m. Information interfaces and presentation (e.g., HCI):
Miscellaneous.
INTRODUCTION
Wearables–smart watches, glasses, wristbands, etc.—are a
growing segment of computing technology in everyday life
[54]. These devices by their nature impact in-person social
interactions–they are meant to be used while on the go, and
thus become intertwined with everyday social contexts and
activities. Yet the first wave of consumer wearables has
focused primarily on personal data collection and personal
data monitoring, with little concern for the impact of these
technologies on interaction between people in the same
space [53]. As we introduce technology ever deeper into
everyday social situations, we argue it is important both to
avoid degrading the quality of in-person social interaction,
and ideally, to enhance it. In this paper we focus on the
following question: How can we design social wearables,
i.e. wearables that support and augment users’ in-the-
moment collocated social experience?
To help answer this question, we engaged in a Research
through Design [17,74,75] process in collaboration with a
member of a group of experts in designing rich collocated
co-experiences: larp (live action role play) designers. Larps
fall somewhere between physical games and participatory
theatre, and typically feature highly interactive narratives
[60,62] played through performance in the physical world.
Larp designers aim to create rich and immersive physical
and social collocated game experiences. These designers
put the experience of the group—the collective
experience—at the core of their design values [41]. To
deliver a quality shared experience, they craft designed
elements such as props, costumes, and transformations of
the physical play space [41,60,72]. We argue that larp
designers’ values and design practices can potentially offer
important insights relevant to the design space of social
wearables, which could be inspirational to us and others
trying to augment collocated social interaction.
Here, we report on a co-design process with a game
designer and co-founder of a games and larp production
company, which culminated in two design instances that
reflected a set of social affordances aligned with larp
designers’ co-experience values. These designs were
deployed in a Battlestar Galactica larp (live action role play
game) at the renowned American game convention
Dexcon20.
The contributions of this paper are: 1) we present designed exemplars of social wearables for larping, which are new to larps [41] and can inspire others in that domain; 2) we share results from our co-design and implementation experience with larp designers; and 3) we introduce social affordance strategies that could be of potential future value in contexts outside larp, for others interested in designing wearables to augment in-person social interaction. These social affordances support larp designers' primary value of supporting co-presence and co-experience
above all, privileging the collective experience. As such,
the work could be inspirational for others designing
collocated physical and social interaction.
BACKGROUND
Here we provide context for larp practice, then discuss HCI
work focused on collocated play and interaction.
Larps, Larpers, and Technology in Larps
Larps are collaborative role-playing experiences in which
players engage in pretend play, acting out their characters’
actions in a fictional world, represented by the physical
world where the gameplay takes place [34,60,62]. Larp is
an incredibly diverse medium [62], which can involve
costumes, make up, accessories, set or improvised rules,
props, objects embedded in the space, etc. Some of these
are provided by larp organizers, and some are crafted,
appropriated, or used by the players, in order to create
expressive characters [60].
There are many larp styles and genres, but they usually
have some common elements [34]: i) physical enactment of the characters' actions, often carrying out actions similar to those their characters would perform; ii) a strong
improvisational component, useful to respond to challenges
posed by larp organizers and other larpers, as well as for
bridging differences between the fictional larp world and
the physical world where the game takes place; iii) a lack of
audience beyond the other players—larpers’ engagement in
pretend play and theatrical performances happens for their
own sake, not for spectators.
The type of larp we embedded our technology in for this
research is American freeform. This style typically involves
a simplified rules system and an emphasis on players’
intense focused play, on improvisation, and commitment to
the co-creation of a good story [62]. Freeform larps are
strongly narrative, and very often involve game masters
(GMs) or storytellers who present players with story
events and challenges that they have to role-play [52]. This
style typically also includes physical props to help the
players get immersed and act in the fictional world [62].
During these larps, a very interesting "magic circle" [22] frames the play experience, characterized by the specific space where the larp takes place, a set of agreed-upon rules, shared values and goals, and a rich ecology of physical and digital
artifacts in support of those values, rules, stories, and the
larp world. Many elements that constitute this circle are
crafted in advance by larp organizers and larpers, who work
together to co-create characters’ stories, scenarios and props
used in-game [60]. The magic circle also evolves as the
gameplay unfolds; lived, sustained, and changed by players
in support of their experience [2,13,16,42].
Larpers as Designers, Crafters, and Technologists
Many larpers and larp designers are crafters, supporting
their target aesthetic experiences by assembling and
building elements for the larp characters and world
[7]. In recent years, larpers have adopted open hardware
(e.g. Arduino) and digital fabrication tools, equipment, and
materials to create props and costumes [7], sometimes
combining these materials with traditional design materials,
such as older analog technology (e.g. oscilloscopes) and
everyday technology (phones, GPS, cameras) [41]. Larp
designers share with maker/DiY communities their
bricolage and repurposing culture [7,48]– like makers, larp
designers and crafters often see these materials as
“unfinished” or “raw” products, which they modify to suit
their agenda [57]. They also share with maker/DiY
communities an emphasis on sharing and creativity over
commercial and financial profit purposes [31]. Finally, DIY
communities are typically considered early adopters of new
cultural practices and represent a vibrant creative spirit
[31]. We consider larp designers a community of early adopters and crafters of technology, with a unique focus on joint social experience that is not at the center of the maker/DiY movements, and which can help us frame values and affordances for wearable technology.
Technology in Larps to Support Co-experience
Our recent taxonomy of technology use in larps maps some
of the ways these artifacts support collocated co-experience
[41]. Larpers use technology to simulate objects and
phenomena in the fictional world, e.g. technology that
represents particular powers, magic or skills of certain
characters (similar to the commercial props from
LARPtronics [32,33] and ThinkGeek [68]). Larpers also
use technology to simulate complex machines, such as a
spaceship navigation system or the control room from a
nuclear plant (similar to e.g. the commercial Sci-fi scanner
apps in [58,59]). Larpers also use technology to support
communication between players, between players and
NPCs (non-player characters1) or larp organizers, and
communication between organizers.
Some of these technologies support game rules, mechanics, and states required for gameplay. For example,
technology is used to keep track of players’ scores or
location2. Some technologies are useful in the gameplay but
are hidden because they do not fit the story world (are non-
diegetic). For example, a watch is used to make sure the
player is somewhere on time, or a GPS to find a location, or
a music device used by a larper to get in a certain mood and
better connect with their character.
Larpers filter technologies through the primary question of
how well they will fit into the larp world, before deciding to
use them. If the larp uses non-diegetic technology to make the experience work, then players work hard to hide or camouflage it appropriately [41]. When choosing or excluding technologies, larp designers care most about whether they will support immersion and the experience of the many [41]. Technology that works well to support game mechanics and to engage individual players may be discarded if it fails to support all players' engagement with one another and with the larp world [41]. Quotes from larp designers concerning this design value include: "[…] the player's *eyes* should be reserved for interacting with the larp, not with the tech." [41], and that tech should enable players to "interact with the rest of the team socially", and to remain connected to "the narrative aspects" of the larp.

1 Players not fully participating in the game for their own larp experience, but filling some practical role in the game, such as providing players with an interesting story element.

2 Although at the edge of larps, inspiring examples are laser-tag-inspired wearables (e.g. [64] or [69]).
Relevant HCI Concepts and Other Related Work
An important concept in the arena of collocated social
interaction, widely used in HCI, is that of space. Drawing
from embodied, situated, and ecological perspectives of
perception and action, we follow Dourish [14] in considering this concept to be concerned not only with architectural elements or physical properties and objects, but also with how
people are configured in it: “how far apart they are, how
they interfere with lines of sights, how actions fall off at a
distance, and so on. By configuring the space in different
ways, different kinds of behaviors can be supported.” [14]
Socio-spatial configurations have been explored in the
context of games (e.g. [12]). They are shown to impact the
player’s emotional experience and their experience of fun
[26,36,37,55]. De Kort and Ijsselsteijn [12] explain how
these configurations give rise to particular social
affordances, allowing for particular social interaction
processes “such as awareness, monitoring, mimicry,
reinforcement, verbal communication and nonverbal
immediacy behaviors." [12]. We argue that socio-spatial
contingencies can be influenced by the technology design,
and that technology design should be influenced by socio-
spatial contingencies [39,40].
To describe socio-spatial configuration and social
affordances, we found concepts in proxemics interaction
helpful. Proxemics [19,20] help us understand interesting
spatial relationships between objects and people and how
these impact and are impacted by social activities and
interactions [38,61]. Proxemics has been used in HCI for
the design of interactive technology [3,18,38,51], and
extended to generate design-specific theory (e.g. [30]).
In our project we used proxemics concepts to think about
social affordances and here we will use them to describe
important design features and insights. The original
proxemic distances introduced by Hall were useful to
describe affordances in relation to the space around the wearer or the technology. Ordered in terms of proximity to one's body and closeness of interaction, these are: the intimate distance zone, with access to sensorial stimuli such as touch, smell, and low sound (e.g. a whisper); the personal zone, with access to normal sound, smell, and touch (people still within reach); the social zone, where people are out of reach but a normal voice can still be heard; and the public distance zone that characterizes public speaking, with access to a loud or projected voice [20,38]. Visual stimuli are accessible from all zones, but from farther zones they depend more on whether the target is within view.
Regarding zones in multi-person configurations, we also found Kendon's concepts of F-formation (face- or
facing-formation) [11,28] useful. This work looks at
people’s bodily orientation when engaged in mutual
interaction, and transactional segments, which refers to the
immediate space in front of a person, and the focus of their
attention and action [11,28]. Kendon illustrated how people
orient themselves in different formations to share a relevant
transactional space with their co-interactant [28]. These
formations range from more closed (e.g. face-to-face formations) to more open (e.g. side-to-side formations).
Another concept from the HCI literature that was useful in the design process was Reeves et al.'s taxonomy of interaction interfaces, classified depending on whether their manipulations and effects are visible from a spectator's point of view [56]. When these are not, we have a secretive
interface; when they are, an expressive one. Magical
interfaces hide the manipulations but show their effects,
while suspenseful interfaces do the opposite: show the
manipulations and hide the effect. While technology
affordances shape much of the visibility of manipulations
and effects from a spectator’s perspective, socio-spatial
contingencies influence this visibility: while cell phones are
typically secretive interfaces for people in the public zone,
they are not for those e.g. in the user’s personal zone in a
side-by-side formation. This taxonomy has been useful in
many applications in HCI (e.g [24,45,67]); here, we used it
to think about social affordances related to users in different
personal distance zones.
Our work intersects with others in pervasive games [49,71]
and more generally in HCI centered around designing rich
collocated social experiences [4,5,21,43,44,50]. However,
the use of wearables in games and other play activities is an
area that has just started to form [1,23,65,66] and research
papers in the area are scarce. A research focus that is
gathering attention is wearables as game controllers
[8,9,47,65,66]. In [65], Joshua and Karen Tanenbaum
present interesting conceptual ways to approach this design
space centered around the use of costumes and props to
support physical and social game play. The authors argue
that props and costumes are essential to construct,
experience, express, and transform players' identities. They
illustrate the potential of their approach through the design
of costumes, props, narration, and sets in their latest Magia
Transformo mixed reality game [27]. Relevant work in the
domain of wearables for role play games is Buruk et al.’s
[8,9]. In [9] they present design implications, although in
the domain of table top games. In the arena of collocated
social play, Isbister et al.'s work with wearables [1,23] is
inspiring, in particular the concept of interdependent
wearables: “wearables designed to require shared attention
and mutual awareness, with interdependent functionality
that encourages and rewards collocated interaction. ” [23]
Isbister et al. reported interesting social dynamics and
increased feeling of connection between players after the
game, attributed to the affordances of the controllers and
the key game design elements of collaboration through
physical contact. This resonates with other works in HCI
(e.g. [35,73]) and in games and play in particular (e.g.
[10,46]) that foreground social touch as an interesting
design resource that we explored with our designs.
METHOD
Following a Research through Design approach [17,74,75],
we engaged in a 6-month design process in which we iterated on three social wearables to augment a particular larp: Battlestar Galactica: Tales of the Rising Star (BSGtales for short), featured as a signature larp at the renowned American
games convention Dexcon20. Building on our previous
work that included co-design activities with external
experts (e.g. [24]), we actively engaged a co-designer:
Shoshana Kessock, a larper, larp-and-game designer, and
co-founder of Phoenix Outlaw Productions. They, together
with Michael Malecki and his team at Eleventh Hour
Productions, had run BSGtales since 2013. Shoshana’s role
was to ensure that we grasped and incorporated in our
designs the specific lore of the larp, and design values of
larp more broadly. On our end, we led the design of the
prototypes. We also worked to articulate both the values
from our co-designer, as well as the social affordances and
strategies that arose as we developed the exemplar designs.
Co-design Process and Materials Details
In the design process, we first focused on identifying
opportunities in the game mechanics behind BSGtales,
where technology could add to the players’ experience. To
do so, our team engaged in a thorough study of Shoshana’s
documentation, including postmortems of this larp (e.g.
[29]), images and videos shared in social media (e.g. [15]),
and the preparation package that players receive when they
register to participate in the larp. We also held several
interviews with Shoshana to fact check potential
assumptions and initial ideas. In subsequent sessions, we
brainstormed ideas, and discussed proofs of concept and the potential integration of the designs in the larp's backstory.
To enable flexible and frequent iterations, we made use of
Adafruit’s wearable electronics platforms: FLORA and
Circuit Playground. Both are small, lightweight, Arduino-compatible microcontroller boards specifically designed for building wearables (e.g. they are sewable), and they can communicate with a range of sensors and actuators. These iterations were tested in the lab and discussed with Shoshana. On her end, she
designed a backstory for these new pieces of technology,
which was discussed during our co-design sessions.
Deployment and Study
The design process culminated in deployment of 36 copies
of two of our social wearable designs, which were used by
19 larpers during the BSGtales larp at Dexcon20.
Data collection was planned with a primary sensitivity toward not disrupting the larp event. Two onsite researchers were
invited by the larp organizers to join their crew to record
the event with two mobile handheld cameras, while one
participated as an NPC (non-player character). After the
larp, we wrote down field observations, which were
complemented with notes from the video recordings (a more rigorous video analysis was not possible due to the recordings' lighting conditions and image quality), and
gathered feedback from larpers and organizers, including
Shoshana, Michael, and their teams. Information from all of
these sources was categorized into salient themes:
interesting, frequent and peculiar reactions and behaviors
related to how our technology impacted co-experience, and situations where larpers used the technology in intended as well as in interesting unintended ways. These themes were
then related to each other and also considered in light of the
social affordance constructs that guided the design work.
DESIGNING FOR BSGTALES LARP
BSGtales is a freeform larp experience that lets players get
immersed in the high-tension space drama of the Battlestar
Galactica (TV show) universe in an original game setting.
In BSGtales, players enter the world of a medical spaceship
that escaped the destruction of the twelve colonies by the
Cylons, a cybernetic civilization at war with humans. Those
onboard the ship are on the run and under constant threat from
Cylon forces. A premise of BSGtales was the creation of a
full-immersion experience within “the confines of a
ballroom setting” by carefully crafting an “immersive
atmosphere through prop-building” and a focus on more
freeform role playing styles [29]. To support the illusion of
inhabiting and acting in the Battlestar Galactica universe,
players are provided with props, such as fake weapons. The
larp organizers also designed the setting where the larp
takes place, including objects that used simple mechanisms
for dramatic effect, as well as “interactive technology”.
Opportunities for Design
In our pre-study, we identified three opportunities for
wearable designs to augment the play experience: one
related to a play rule, and two related to in-game score-
keeping. BSGtales uses a system of color stickers to indicate
the type of physical contact that players are comfortable
with. Wearing a red sticker means that light physical
contact is allowed; a white sticker means no physical
contact, so fights need to be talked through.
In-game challenges and tasks may require particular skills, and they always require a certain amount of time and mental energy (ME), the game's currency, for their fulfillment. The amount of ME a task takes depends on its level of difficulty, which is set by storytellers in the larp. For
example, treating injuries can take from 4 to 7 ME points.
Players start with 10 points of ME and need to keep this
score balanced to be able to play and participate in tasks.
ME can be recovered in different ways, such as staying off
duty, using some in-game items (e.g. foods, drinks, or
drugs), and socializing. Another key score is health. Players
play with a total of 10 points, which can be reduced during
gameplay due to fights, sickness, etc. Health points can be
regained over time, or through medical attention.
In prior BSGtales events, players would mentally keep
track of their own scores and act accordingly. For example, an
exhausted engineer would likely drag themselves to the bar
for a drink and chat with friends to recover ME. This would
both help one player connect with their character, as well as
help others understand what is going on with this character.
However, Shoshana commented that maintaining bodily
cues consistent with one’s scores and one’s character’s
story during long bits of the gameplay is challenging, and
these cues are not always legible during gameplay. Besides
looking for bodily cues and characters’ behavior, a way
players ascertain others’ state is by stepping out of
character to ask them directly. In BSGtales they use an
agreed-upon gesture of putting a hand over one’s head to
signal out-of-character communication. This is often useful
and necessary, but it means a momentary break in the
players’ immersion, which we saw as a design opportunity.
Design Goals
Our design team focused on two important co-experience
values for larp designers in general [41], and for this larp in
particular: supporting immersion and the experience of the
many. After our pre-study, we identified potential aspects
that hindered these co-experience values and discussed
technology roles that could address these issues. A clear
technology role that emerged during the pre-study was
tracking the characters’ scores. In addition, and in support
of a rich co-experience, we wanted to explore how to also
support in-person and in-character expression and
communication, support connection with one’s character,
and address the need for out of character communication.
Drawing upon the initial investigation of this design
context, relevant theory, and our prior work [23–
25,40,41,44], we generated a working set of social
affordance ideas to support the design, as follows:
• Supporting and augmenting the expressivity and
readability of verbal and non-verbal cues players used as
indicators of physical and mental states. We called this a
social signaling affordance.
• Augmenting players’ signaling mechanisms to regulate
the type and style of interaction they considered
acceptable. We called this a social appropriateness
affordance.
• Facilitating spectatorship, which we called a spectator
sensitivity affordance. We envisioned this to be a version
of the social signaling affordance specifically designed
for readability of social cues from a public social
distance.
• Supporting players to better connect with the physical
and emotional experiences of their characters. We called
this emotional resonance affordance.
These affordances were taken into consideration during the
design process, and here we will use them to describe our
designs and their in-game impact.
Final Designs
Health Prototype
This prototype would be worn by players like a dog tag,
similar to those used in the BSG world, which would keep
track of and represent the health score of the character (see
Figure 1). Each of the 10 LED lights represents one health
‘point’ (this in-game score ranged from 0 to 10), which
would be controlled manually by the player. The prototype
included an Adafruit Circuit Playground with an additional
Neopixel and Vibrating Mini Motor Disk attached to it. The
center ring of Neopixels displayed the user's current health
value on a 10-point scale: green communicates good health
(points>5), yellow fair health (2<points≤5), and red health
critically low (points≤2). If the health value reached zero,
all LEDs would flash red. The health value was
manipulated with two buttons on the face of the device: the
upper one to increase the number of LEDs lit, and the lower
one to decrease them. The motor had two functions, which
could be changed using the built-in switch: i) heartbeat
mode: it vibrated for 0.2s – 0.5s with an on and off interval
from 1.5s – 2.5s, which was mapped to the health of the
player: the less health, the more spaced the vibrations; ii)
navigation mode: the motor vibrated when the health value
was raised or lowered. In either mode, we used the onboard
speaker as an additional navigation feature, playing a tone
whose frequency corresponded with the health value (the
higher the frequency, the healthier). To represent the style
of physical interaction preferred by the player, we added an
external Neopixel that could be changed to either red (no
physical contact allowed) or blue (physical contact allowed)
by pressing both face buttons of the device.
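To make the above concrete, below is a minimal Arduino-style sketch of the health prototype's logic, assuming the Adafruit Circuit Playground and Adafruit NeoPixel libraries. The motor and external NeoPixel pin numbers, tone values, and exact timings are illustrative assumptions rather than the deployed firmware, and the flashing-at-zero behavior is omitted for brevity.

// A minimal sketch of the health prototype's behavior; pins, tones,
// and timings are illustrative assumptions, not the deployed firmware.
#include <Adafruit_CircuitPlayground.h>
#include <Adafruit_NeoPixel.h>

const int MOTOR_PIN = 6;    // assumed pin driving the vibrating motor disk
const int CONTACT_PIN = 9;  // assumed pin of the external "contact preference" NeoPixel
Adafruit_NeoPixel contactPixel(1, CONTACT_PIN, NEO_GRB + NEO_KHZ800);

int health = 10;            // 0..10, one onboard LED per point
bool contactOk = true;      // blue = physical contact allowed, red = not allowed

void showState() {
  CircuitPlayground.clearPixels();
  for (int i = 0; i < health; i++) {
    if (health > 5)      CircuitPlayground.setPixelColor(i, 0, 255, 0);   // good: green
    else if (health > 2) CircuitPlayground.setPixelColor(i, 255, 255, 0); // fair: yellow
    else                 CircuitPlayground.setPixelColor(i, 255, 0, 0);   // critical: red
  }
  contactPixel.setPixelColor(0, contactOk ? contactPixel.Color(0, 0, 255)
                                          : contactPixel.Color(255, 0, 0));
  contactPixel.show();
}

void navigationCue() {      // audio cue on every change; buzz only in navigation mode
  CircuitPlayground.playTone(200 + 100 * health, 80); // higher pitch = healthier
  if (!CircuitPlayground.slideSwitch()) {
    digitalWrite(MOTOR_PIN, HIGH);
    delay(100);
    digitalWrite(MOTOR_PIN, LOW);
  }
}

void setup() {
  CircuitPlayground.begin();
  contactPixel.begin();
  pinMode(MOTOR_PIN, OUTPUT);
  showState();
}

void loop() {
  bool up = CircuitPlayground.rightButton();
  bool down = CircuitPlayground.leftButton();
  if (up && down) {                      // both buttons: toggle contact preference
    contactOk = !contactOk;
  } else if (up && health < 10) {
    health++;
    navigationCue();
  } else if (down && health > 0) {
    health--;
    navigationCue();
  }
  showState();

  if (CircuitPlayground.slideSwitch()) { // heartbeat mode: pulse spaced by health
    digitalWrite(MOTOR_PIN, HIGH);
    delay(300);                          // pulse length within the 0.2-0.5 s range
    digitalWrite(MOTOR_PIN, LOW);
    delay(2500 - 100 * health);          // lower health, longer gap (roughly 1.5-2.5 s)
  } else {
    delay(150);                          // crude debounce in navigation mode
  }
}

Holding both buttons would re-toggle the contact light on every pass of the loop; a deployed version would need debouncing beyond this sketch.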
Mental Energy Prototype
The final mental energy (ME) design would keep track of and represent the ME score of the character (ranging from 0 to 10), and would be controlled manually by the player (see Figures 1 and 2). The prototype included two connected Neopixel strips with five LEDs each, affixed to the user's shoulders with Velcro. Each light would represent one ME point. The strips ran to a Circuit Playground attached to a flexible armband worn on the upper arm. There were two modes to raise and lower the number of LEDs, selected using the built-in switch on the Playground: i) two capacitive touch pads made with copper tape, connected to either the three top or three bottom pins of the Playground and sewn to the flexible armband; ii) the two buttons on the face of the Circuit Playground (see Figure 2). In either mode, the top pad/button increased the ME, lighting up one LED on each shoulder at a time, while the lower one decreased it in the opposite way. We also used the onboard speaker as an additional navigation feature, in the same way as with the Health prototype.

Figure 1. The pendant is the health prototype; the shoulder strips and armband are the mental energy prototype.
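As a rough illustration of the two input modes and the shoulder display described above, the sketch below again assumes the Adafruit Circuit Playground and NeoPixel libraries; the strip data pin, the capacitive pads wired to the copper-tape touch areas, the touch threshold, and the mapping of the slide switch to modes are illustrative assumptions.

// A minimal sketch of the ME prototype; pin, pad, and threshold
// values are illustrative assumptions.
#include <Adafruit_CircuitPlayground.h>
#include <Adafruit_NeoPixel.h>

const int STRIP_PIN = 6;              // assumed data pin for the shoulder strips
const int PAD_UP = 10, PAD_DOWN = 3;  // assumed pads wired to the copper-tape touch areas
const int TOUCH_THRESHOLD = 300;      // tune per costume and material

// The two 5-LED strips are assumed to be chained as one 10-pixel strand.
Adafruit_NeoPixel shoulders(10, STRIP_PIN, NEO_GRB + NEO_KHZ800);
int me = 10;                          // mental energy, 0..10, one LED per point

void setup() {
  CircuitPlayground.begin();
  shoulders.begin();
}

void showME() {
  shoulders.clear();
  for (int i = 0; i < me; i++)
    shoulders.setPixelColor(i, shoulders.Color(0, 80, 255)); // lit LEDs = current ME
  shoulders.show();
}

void loop() {
  bool up, down;
  if (CircuitPlayground.slideSwitch()) {            // mode i): copper-tape capacitive pads
    up   = CircuitPlayground.readCap(PAD_UP)   > TOUCH_THRESHOLD;
    down = CircuitPlayground.readCap(PAD_DOWN) > TOUCH_THRESHOLD;
  } else {                                          // mode ii): built-in face buttons
    up   = CircuitPlayground.rightButton();
    down = CircuitPlayground.leftButton();
  }
  if (up && me < 10)  me++;
  if (down && me > 0) me--;
  if (up || down)
    CircuitPlayground.playTone(200 + 100 * me, 80); // higher pitch = more energy
  showME();
  delay(200);                                       // crude debounce
}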
The Design Process
Here we focus on the design process, highlighting key
design variations and considerations.
Health Prototype Iteration Highlights
We wanted the effects of the manipulations of this
prototype to be visible mainly in face-to-face formations
from a personal distance zone [20,38] but also from a social
zone, in a similar way that health symptoms are manifested
in everyday life via physiological body cues (e.g. blood,
skin color, etc.). We decided to use lights and position them
on the chest –a good position for an expressive or magical
interface [56] that also fit well with the metaphor of the
heart as health indicator, since it was positioned over/near
the actual organ. The first iteration made use of an Adafruit
Flora board, an attached 16 LED RGB NeoPixel Ring, and
a custom-built pressure sensor to control the health value.
With this version we investigated appropriate score
mappings and symbolisms. We settled on a one-to-one
mapping (one score, one light), and a three-color traffic
light inspired system to represent good, acceptable, and bad
health. This was simple and intuitive, important qualities
since larpers would only get a quick training with the
prototypes in a short pre-larp briefing.
With the 6 remaining lights in the NeoPixel Ring, we explored integrating a color system to signal the type of physical contact preferred by players, which was appreciated by Shoshana and remained throughout subsequent design iterations. However, in line with larpers' preference
for using more robust technology to support essential game
aspects [41] and given the importance of the health score,
we decided to use the Playground microcontroller instead,
which had built-in functionality for lights and input. But
with this design decision, we lost the extra LEDs to signal
preferred style of physical contact (the Playground only had
10 built-in LEDs), which we patched with external lights.
We wanted this feature visible from intimate and personal
distance zones [20,38] in a face-to-face formation [28],
the likely orientation and formation before engaging in physical action, such as combat. For this close-range visibility
requirement, we decided to use only one external Neopixel.
We discussed creating a case to “camouflage” this design in
an object that belonged to the larp world, usually
appreciated by larpers [41]. But producing an easily
reproducible casing that accommodated slight variations of
the prototypes in time for the event was challenging.
Shoshana deemphasized the importance of the case to make
the wearable fit well in the BSG world; the raw look of the
Playground would likely work to support player immersion.
The bodily position of this prototype made us realize the
potential of the Motor Disk’s vibrations to trigger certain
emotions and associations. During in-lab testing sessions,
we tried the navigation mode of the Motor Disk (vibration
mode implemented first). With the prototype hanging over
our chest, a random rhythmic increase and decrease of the
health scores reminded us of our heartbeat. We related this
to insights from our previous work about how some larpers
use external sensorial stimuli non-diegetic to the larp
(which they would hide from others) to connect with their
characters’ emotions. We liked heart-like vibration for that
purpose for the health prototype and after testing several
vibration parameters, we settled on frequency (it could be
controlled well with the Playground’s built-in functions). A
debated design aspect was the vibration and score mapping.
We discussed: a) a direct mapping, i.e. the lower the health, the lower the vibration frequency; and b) an inverse mapping, i.e. the lower the health, the higher the frequency. Lower health scores in a) suggested to us an extinguishing heartbeat, while option b) triggered in us an urgent call for action. Option a) was interesting from an immersionist perspective (focused on immersion in character and in the larp world); option b) was interesting from a more gamist one (focused on solving game challenges) [6]. Given the immersive freeform play character of BSGtales, we settled on a).

Figure 2. Increasing and decreasing ME using the built-in buttons on the device.

Figure 3. ME prototype augmenting body postures of high (left) and low (center) energy. Exploring social touch as an interaction and input modality (right).
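For concreteness, the two candidate mappings can be written as functions from the health score (0 to 10) to the interval between heartbeat pulses; the constants below are illustrative, not the deployed values.

int directIntervalMs(int health) {    // option a): lower health, slower (fading) heartbeat
  return 1500 + (10 - health) * 100;  // 1.5 s at full health, 2.5 s near zero
}
int inverseIntervalMs(int health) {   // option b): lower health, faster (urgent) heartbeat
  return 1500 + health * 100;         // 2.5 s at full health, 1.5 s near zero
}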
However, we were reluctant to lose the haptic navigation
feedback we used the motor for initially. That cue was
important to avoid an artifact-focused type of interaction
[44,70] when using the device, which could be detrimental
to the physical and social engagement of players [44,70].
We decided to keep both modes, and allow players to
decide which one to use and when.
Mental Energy Iteration Highlights
Powerful images that shaped our design ideas were bodily
cues associated with high and low energy: for the former,
an upright and open position, with shoulders back and open
chest; for the latter, a more closed, bowed, and sunken
posture, with shoulders hunched or dropped (See Figure 3).
These would be visible from both social and public distance
zones [20,38]. The increased visibility need of this
prototype in comparison to the other one made us think
about lights visible in any type of formation [28] and from any direction in the room. Lights around the shoulders would fit
well with this, as well as with the bodily imagery described
before. We explored different light outputs, like the
NeoPixels, but settled on NeoPixel strips for affixing ease.
Input-wise, we considered designing a device that was
amenable to social interaction. To regain ME in BSGtales,
players often resorted to socialization, which made us think
about allowing others to interact with this device. This
informed our choice of input bodily placement: the arms
would potentially allow both personal and social access (arms are commonly understood as an acceptable touch area for friends and acquaintances [63]; see Figure 3). In principle, the lower arm, wrist, and hand were considered because these are more socially accepted touch areas [63] and had good personal access. However, this would have
required long wires, likely uncomfortable and unappealing,
or wireless communication, which would add a technical
complexity we were unable to tackle due to time
constraints. We settled on an upper arm location. Imagery
of rubbing one’s tired and sore shoulders, and social
gestures of reassurance (e.g. a squeeze/tap/press/caress) on
the shoulders drove the exploration of different input
sensors, including: sliders like the Adafruit ribbon sensor
(which would detect a caress-type gesture), shown in Figure 4; the Playground's built-in accelerometer, which
would detect a tap; as well as external conductive strips
(caressing, tapping, and pressing gestures). We liked the
flexibility of the strips, which allowed multiple gestures,
and with which we experienced the least noise issues, but
were concerned about their robustness. With limited time
for testing and refinement of this choice prior to
deployment, we designed a back-up input modality using
the buttons embedded in the Playground.
In-game Role of the Prototypes and Affordances
Following our previous technology in larps taxonomy [41],
the main roles of the designs were: i) tracking important
characters' scores, and ii) serving as communication devices between players3. We carefully designed the devices to augment
players’ connection with the physical and mental states of
their and others’ characters, and to support the expressivity
and readability of verbal and non-verbal cues, indicators of
these states. In the following, we describe design decisions
we made to better afford the end experience goals related to
the main roles of the designs:
Social signaling: The health prototype was designed to
provide a substitute visualization for physiological cues that
players could not fake during gameplay (e.g. injuries,
paleness due to fatigue). Both this and the ME prototypes
were intended to amplify bodily cues players would likely
use through role play to show health/energy and the lack
thereof. We hoped this would help players role play their
own state, and better understand other characters’ states.
Spectator sensitivity: Both designs made use of lights,
visible in a medium-sized room, provided that players were
unoccluded. In particular, we designed the ME (with its
position around the shoulders) to be visible from a public
distance zone, to give players a good readability of the
atmosphere of certain areas at a glance.
Emotional resonance: The constant and pulsating haptic
output of the Health prototype, felt against the chest, was
designed to support the player’s bodily and emotional
synchronization and resonance with their character’s health,
following the metaphor of a beating heart.
Social appropriateness: For the Health prototype, the
external NeoPixel was designed to signal the player’s
preferred physical and interactional style. This could be
changed at any time during the larp, to reflect players’
preference shifts. For the ME prototype, we designed the
location and input mechanism in a way that would likely be considered socially accessible, and could allow players to consider interacting with one another's devices.

3 An additional role emerged during the design process, simulating other technology [41], when Shoshana suggested using the technology diegetically as part of the story of some of the players: the technology would be implanted by the Cylons in captured humans to be aware of their health. She thought this was an interesting backstory for her NPCs.

Figure 4. ME prototype along the way to the final design using the Adafruit ribbon sensor.
OBSERVATIONS
There were ~100 larpers who signed up or were recruited
by the larp’s production teams to play BSGtales at
Dexcon20. We brought 15 prototypes of each design to be
used by 10 NPCs. However, larps can be extremely
improvisational and the BSGtales team ended up having
only 3 NPCs, so onsite they invited one of us who had some
experience playing larps to participate as an NPC. They also
offered new players the possibility of using the rest of our
devices. After the larp, Shoshana commented that in the
pre-larp briefing session she had asked them: "who wants to
be involved in this little medical technology plot?” and
almost everybody raised their hands: “everybody was super
excited to try new technology”. Finally, we deployed a total
of 36 prototypes (21 health, 15 ME), used by 19 people (15 players, 4 NPCs) during the larp, which lasted about 4 hours.
Shoshana presented new players with the following
backstory: they were wounded in the war with Cylons, and
got sick. The Cylons captured and cured them, implanting
these devices, which somehow seemed to keep track of
their health. She also told the players to decide how they
felt about the technology and the Cylons. After this brief,
these players came to an off-larp room where we affixed the
devices and briefly explained their functionality.
When the larp started, all these players were in a human
farm where Cylons were keeping them. The NPCs were in a
separate room, a cave. They were playing the roles of rebels
who were hiding from the Cylons and had rescued some
humans from that farm. The rest of the players were old
BSGtales larpers, all of whom were in a spaceship (another six rooms and a hall), which was about to crash into the
planet where the farm and cave were.
The BSGtales Larp Experience
In-game, players used the devices to track their characters’
scores and signal their preferred style of physical
interaction throughout the whole game. The devices were robust enough to work and, except for one case, they did not require onsite fixes. We realized, however, that the
capacitive touch pads of some of the ME prototypes peeled
off their fasteners, but we didn’t intervene since the players
didn’t come to the off-larp repair room.
In the following, and to give the reader a sense of how the
devices were used in-game, we describe several larp facets
with scenes experienced by the NPC author of this paper
(first and second scenes) and witnessed by the two
researchers in our team, tasked with video recording the
larp (second to fourth scenes). The scenes are selected to
illustrate a wide range of interesting behaviors that speak to the different social affordances. They are complemented with
comments from players after the larp, and quotes from post-
game interviews with Shoshana, Michael, and their teams.
Facet 1: Ice-breaker and Backstory
Many players commented on how the wearables functioned
as icebreakers. One described how people were reacting to
them because they were very visually evident. After the
NPC co-author of this paper came out of the cave room,
there were plenty of times larpers would approach and,
upon looking at or pointing to the wearables, they would
ask: “how are you feeling?” even when the devices didn’t
reflect any sign of bad health or wellbeing. Many would
care for the health of those who were “modified by the
Cylons.” Unlike in other larp experiences, there was no
need to explain essential parts of the character’s backstory
in order to engage in a casual conversation. Often players
with devices would come up to this NPC and ask: “are you
removing them?” or “why would/wouldn’t somebody want
to remove these things?” This was a conversation held often
and soon players knew one another’s feelings about the
implanted Cylon technology. Some players were suspicious of them because they weren't sure whether they put everyone at risk (e.g. if the Cylons were tracking them).
This example shows how our devices provided essentials
about the wearer’s backstory at a glance. Shoshana
commented that the devices made a big difference in the
game experience in this regard: "when you talk to someone
and you say yes I’ve been modified by the Cylons people are
not going to remember… there are a hundred people in that
space. But putting lights on someone so they can visually
see… it is literally between good role playing and bad role
playing […], and that’s a HUGE deal”.
Facet 2: Connecting with One’s Character
After the elaborately choreographed moment where a foam
wall separating the two groups of players, those with the
prototypes on the planet and those coming in the ship, is
knocked down, military representatives of the ship make
contact with a group of humans who seemed to be hiding.
In this dimly lit cave-like room, they spot them by the glow
of their medallions and shoulder pads. All the medallions
glow green, except one in yellow. This player is leaning
against a wall for support. The leader from the expedition
spots him: “What’s wrong with that one?” “He took a hit
during his escape from the Cylon base,” says the leader of
the group hiding in the cave. The players from the
expedition go back to report to the ship.
After a long while, a new military expedition enters the
cave: “What does that light on your chest mean?” asks one
of the new arrivals to the character, now standing in an
upright position. The player quickly adjusts his pose, and
slumps back against the wall reflecting his injured status.
Later, when they are walking through the ship, this player
drops to his knees, clutches his chest, and surreptitiously
presses down on the device to lower his health and change
the LEDs from yellow to red. Somebody shouts: “Get a
doctor, he won’t make it to sickbay.” This example shows
how the device was not only augmenting, but also making
up for a lack of bodily cues that communicated important
information about the character’s state of being. Shoshana
commented “in games like this it’s really hard to keep
people imagining” and said our devices “visually helped
them <players wearing the devices> get into the <right>
headspace of who they were and how they were feeling.”
Facet 3: Doctors Removing Cylon Implants
The first time the doctor crew tried to remove one of these
Cylon implants, the chief doctor turned to the nearest
storyteller and stepping out of character asked “What does
my character see?”4 The storyteller came up with the next
stage of the narrative: They were seeing a sensitive piece of
equipment that was keeping the person alive. They would
need the help of another character with technical skills to
assist in the extraction. The meaning of the LEDs suddenly
transformed: each would signify a pin inserted in the
player's body, which they would need to remove, sealing the subsequent wound, while making sure they didn't disable the device as they worked to remove it. Later the players
learned what this meant: the act of removing the device
involved some damage cost (decrease in health) since the
device produced an electric charge when extracted. This
made them include third players in following surgeries to
share the damage cost. This is how one of the ensuing
extractions unfolded: The Doctor uses the haptic motor as a
metronome to tie his actions to. The LEDs give characters
feedback for the in-game actions. As the LEDs decrease, the players move closer to their goal. The flashing red at the
end is used as a cue to the player receiving the fictional
electric shock, who would dramatically throw themselves
against the wall. This example shows how this device was
re-contextualized to serve a gameplay purpose by the
storyteller. The players in turn came up with and developed
an interesting new meaning of the outputs of the device to
time their actions and add depth and dramatic effect.
After the larp, a doctor told us that the devices supported
his role playing and in-game experience: they helped him
imagine a real implant he “had to get out of the person.” He
explained how he wasn’t sure if the devices were meant to
be used diegetically or not (did his character see that device
or not?), and about their functionality and interactivity. He
got some information from the storyteller and better
understood its interactivity as he was manipulating the
device, which he and his team used to role play.
4 Players not wearing the devices hadn't been briefed about their interactivity and only knew they were Cylon implants.

Facet 4: Expressing and Understanding Behaviors
As part of the climax of the narrative for this larp, there was a mysterious computer interface in the human farm. One of the players wearing our designs took point on working on this interface to try to understand its function and make use of it. This task required ME, and as time went by, the player's ME significantly dwindled. Somebody nearby asked this player to rest, and players around took the opportunity to socialize with this player and focused on other dramatic points in the narrative experience while that
character rested and regained ME. After a while, this
character resumed working on the task. This challenge was
a central point in the plot and took quite a lot of mental energy to solve, so several cycles like this one occurred.
This example shows how our device helped the players to
communicate their state without breaking out of character,
and how other players would show understanding and
support towards shifts of action. Michael observed that
there was a greater sense of urgency among people who
interacted with the larpers whose energy read low on their
devices. He commented how the devices helped create a
deeper sense of immersion in the game.
DISCUSSION
The experience excerpts above show how larp organizers
and larpers took our designs and used them in-game, both
in the ways we designed them for (first, second, and last
facet), and also in new creative ways (third facet), in the
service of their personal and collective experience. In the
following, we review how our designs were successful in fulfilling the roles they had been designed for (tracking the
characters’ scores and preferred physical interaction style,
supporting in-person and in-character expression and
communication, mitigating the need for out of character
communication, and connection with one’s character) in the
light of the social affordances that supported these roles
(social signaling, spectator sensitivity, social
appropriateness, and emotional resonance).
Tracking, often witnessed by our team, helped players to
offload onto the technology the task of remembering and
reporting scores to others, which presumably impacted their
in-game immersion positively. Players were not seen
interacting with others’ devices as we had hoped (ME
prototype), except in medical situations. Most players were
using the two buttons on the face of the Playground instead
of the capacitive pads. Players had not been briefed about
this possible function before the larp, which might have
contributed to this effect. However, we think that future
designs could better support the social affordance of
appropriateness and encourage others to interact with
one’s device, with bigger, more robust, and identifiable
surface areas on the elastic band. This social affordance was
well supported when conveying the preferred interaction
style. Michael commented that it helped him to know in
advance if he could or could not touch other players, and to
change his behavior accordingly. He praised how this sign
was well integrated, visible, yet camouflaged, in a
diegetic story-relevant object (health device) that players
would continuously check before and as they interacted.
The prototypes clearly supported the larpers’ expressivity
and communication capabilities (see e.g. facets 1, 2, 4),
which we attributed to the affordances of social signaling and spectator sensitivity that our designs supported. Regarding
connection with one’s character, the haptic feedback proved
to be an important feature, in particular the heartbeat mode
of the health prototype (facets 2, 3). Many players
commented that they used and liked that mode and had it on
continuously; some others told us they would switch modes
during the game, while a few others told us they only used
the navigation mode. Michael commented that stimuli like
smell or vibrations close to the body can have a strong
triggering power, which can help players connect with their
characters and role play, but can also bring up unintended
emotional reactions. He complimented the design decision
of including a switch that lets players control if and when to
use this powerfully evocative stimulus.
Our devices also decreased the need for out-of-character
communication to provide relevant backstories (facet 1),
explain their characters’ states (facets 2, 4) or in-game
actions (facet 4), and tell others their preferred style of
interaction. We relate this to the affordances of social
signaling, spectator sensitivity, and social appropriateness.
We did see a device-related break out of character that was
due to lack of pre-larp briefing (facet 3). However, with
minimal onsite backstory, the players figured out the
interactivity of the devices, and how to use them to help
them connect with their characters, which added depth and
dramatic effect to their and others’ actions.
Overall, through the design and in-larp use of our devices,
we observed that our social affordances were viable, and
that they worked in support of the larp designers’ co-
experience values of immersion and support of the many.
The wearables worked for the intended functions in support
of play, and were also flexible enough to support new
interesting functions, such as that of external stimuli to
allow other players to connect with their characters (facet 3).
Our design exemplars were robust enough and amenable to
these affordances and values. Like in [24], we argue that
this is the result of a design process in which important
design decisions were informed by relevant concepts in
HCI, but were also made through continuous reference to
larp designers’ co-experience values and continuous
communication with our co-designer. Finally, we found it valuable to test and further develop prototypes together with larpers while these are embedded in the larp's magic circle of play. We see live improvisations such as the one presented here (facet 3) as an interesting sort of onsite co-design iteration with players that resonates with previous
work in embodied co-design activities [42]. Play is
transformative and can be used to better understand how
players envision artifacts in support of the experiences they
want to have [2,42]. Designs that are fluid and open to
improvisation often surprise designers with unintended and
interesting uses [24] that can be further supported in the
design process [2,42]. Our designs were adaptable enough
and supportive of interesting improvisations. These
improvisations capitalized on the social affordances and
gave us ideas for future iterations, ranging from design
features to new technology roles.
CONCLUSION
We set out to explore the potential for wearables to
augment collocated social interaction, by engaging in a
Research through Design process to create wearables
embedded in a larp, working closely with a larp designer.
With the help of our larp design expert, we identified
promising roles for the technology to support larp
designers’ co-experience values of immersion and the
experience of the many: tracking the characters’ scores and
preferred physical interaction style, supporting in-person
and in-character expression and communication, supporting connection with one's character, and mitigating the need for out-of-character communication.
We took the concept of social affordances and articulated
some to guide the design process towards these roles and
co-experience values: social signaling, spectator sensitivity,
social appropriateness, and emotional resonance. To
translate these into design choices and features, we made
use of concepts in HCI for the design of technology in
collocated social spaces, such as the spectator experience
and concepts in proxemics. These choices were frequently
discussed with our expert larp designer, to make sure that
they would fit and support the larp’s magic circle of play.
The completed designs, novel in the domain of larps [41],
embodied well the values of immersion and the experience
of the many, performing functions including keeping track
of and displaying scores, indicating preferred physical
interaction styles, supporting players’ expressivity and
communication with others, and enhancing players’
connection with their own characters. Reflecting on the
positive reception of our designs, Michael commented that
“larpers want to use technology in their games as long as
the technology supports the immersion of the game.” In fact
players adapted the wearables to suit their own narrative
purposes, using them in unanticipated ways.
Many of the affordance patterns and principles we
developed and tested in this play situation have potential to
apply to the design of wearables outside the context of play.
We are currently drawing upon the lessons learned in this
project, to create wearables that support interaction in
everyday non-game contexts.
ACKNOWLEDGMENTS
Our deepest thanks to our social wearables co-designer,
Shoshana Kessock, who invited us onboard the BSGtales.
Special thanks to Michael Malecki for sharing his insights
about the technology within the BSGtales. Thanks to
Avonelle Wing and Vincent Salzillo, the people behind
Dexcon, as well as our host Elsa Sjunneson-Henry. Thanks
to the rest of the BSGtales crew and all the larpers who
played. Special thanks to those who used our designs and
gave us feedback. Thanks to Lola Segura for modeling for
us, to Edward Melcer for his support during Dexcon20, and
to Jared Duval for his feedback on this paper.
REFERENCES
1. Kaho Abe and Katherine Isbister. 2016. Hotaru: The
Lightning Bug Game. In Proceedings of the 2016 CHI
Conference Extended Abstracts on Human Factors in
Computing Systems (CHI EA ’16), 277–280.
https://doi.org/10.1145/2851581.2889472
2. Jon Back, Elena Márquez Segura, and Annika Waern.
2017. Designing for Transformative Play. ACM Trans.
Comput.-Hum. Interact. 24, 3: 18:1–18:28.
https://doi.org/10.1145/3057921
3. Till Ballendat, Nicolai Marquardt, and Saul Greenberg.
2010. Proxemic Interaction: Designing for a Proximity
and Orientation-aware Environment. In ACM
International Conference on Interactive Tabletops and
Surfaces (ITS ’10), 121–130.
https://doi.org/10.1145/1936652.1936676
4. Steve Benford, Gabriella Giannachi, Boriana Koleva,
and Tom Rodden. 2009. From Interaction to
Trajectories: Designing Coherent Journeys Through
User Experiences. In Proceedings of the SIGCHI
Conference on Human Factors in Computing Systems
(CHI ’09), 709–718.
https://doi.org/10.1145/1518701.1518812
5. Nadia Bianchi-Berthouze. 2013. Understanding the
Role of Body Movement in Player Engagement.
Human–Computer Interaction 28, 1: 40–75.
https://doi.org/10.1080/07370024.2012.688468
6. Petter Bøckman. 2003. The Three Way Model. In As Larp Grows Up: Theory and Methods in Larp, Morten Gade, Line Thorup and Mikkel Sander (eds.). Knudepunkt.
7. Leah Buechley and Hannah Perner-Wilson. 2012.
Crafting Technology: Reimagining the Processes,
Materials, and Cultures of Electronics. ACM Trans.
Comput.-Hum. Interact. 19, 3: 21:1–21:21.
https://doi.org/10.1145/2362364.2362369
8. Oğuz Turan Buruk, Ismet Melih Özbeyli, and Oğuzhan
Özcan. 2017. Augmented Table-Top Role-Playing
Game with Movement-Based Gameplay and Arm-
Worn Devices. In Proceedings of the 2017 ACM
Conference Companion Publication on Designing
Interactive Systems (DIS ’17 Companion), 289–292.
https://doi.org/10.1145/3064857.3079176
9. Oğuz Turan Buruk and Oğuzhan Özcan. 2016.
WEARPG: Game Design Implications for Movement-
based Play in Table-top Role-playing Games with
Arm-worn Devices. In Proceedings of the 20th
International Academic Mindtrek Conference
(AcademicMindtrek ’16), 403–412.
https://doi.org/10.1145/2994310.2994315
10. Mert Canat, Mustafa Ozan Tezcan, Celalettin
Yurdakul, Eran Tiza, Buğra Can Sefercik, Idil Bostan,
Oğuz Turan Buruk, Tilbe Göksun, and Oğuzhan
Özcan. 2016. Sensation: Measuring the Effects of a
Human-to-Human Social Touch Based Controller on
the Player Experience. In Proceedings of the 2016 CHI
Conference on Human Factors in Computing Systems
(CHI ’16), 3944–3955.
https://doi.org/10.1145/2858036.2858418
11. T. Matthew Ciolek and Adam Kendon. 1980.
Environment and the Spatial Arrangement of
Conversational Encounters. Sociological Inquiry 50,
3–4: 237–271. https://doi.org/10.1111/j.1475-
682X.1980.tb00022.x
12. Yvonne A. W. De Kort and Wijnand A. Ijsselsteijn.
2008. People, Places, and Play: Player Experience in a
Socio-spatial Context. Comput. Entertain. 6, 2: 18:1–
18:11. https://doi.org/10.1145/1371216.1371221
13. Bernard DeKoven. 2011. Coliberation. DeepFUN.
Retrieved February 1, 2016 from
http://www.deepfun.com/coliberation/
14. Paul Dourish. 2001. Where the Action Is: The
Foundations of Embodied Interaction. The MIT Press,
Cambridge, MA, USA.
15. Benjamin Ehrenreich. 2015. Battlestar Galactica.
Tales of the Rising Star LARP - To Those Lost.
Retrieved January 7, 2018 from
https://www.youtube.com/watch?v=W_rNP7L3SUU
16. Andrew Fluegelman. 1976. New Games Book. Main
Street Books, Garden City, NY, USA.
17. William Gaver. 2012. What Should We Expect from
Research Through Design? In Proceedings of the
SIGCHI Conference on Human Factors in Computing
Systems (CHI ’12), 937–946.
https://doi.org/10.1145/2207676.2208538
18. Jens Emil Grønbæk, Henrik Korsgaard, Marianne
Graves Petersen, Morten Henriksen Birk, and Peter
Gall Krogh. 2017. Proxemic Transitions: Designing
Shape-Changing Furniture for Informal Meetings. In
Proceedings of the 2017 CHI Conference on Human
Factors in Computing Systems (CHI ’17), 7029–7041.
https://doi.org/10.1145/3025453.3025487
19. Edward Twitchell Hall. 1963. A System for the
Notation of Proxemic Behavior. American
Anthropologist 65, 5: 1003–1026.
20. Edward Twitchell Hall. 1966. The Hidden Dimension. Doubleday, Garden City, NY, USA.
21. Eva Hornecker and Jacob Buur. 2006. Getting a Grip
on Tangible Interaction: A Framework on Physical
Space and Social Interaction. In Proceedings of the
SIGCHI Conference on Human Factors in Computing
Systems (CHI ’06), 437–446.
https://doi.org/10.1145/1124772.1124838
22. Johan Huizinga. 1955. Homo Ludens: A Study of the
Play-Element in Culture. Beacon Press, Boston, MA,
USA.
23. Katherine Isbister, Kaho Abe, and Michael Karlesky. 2017. Interdependent Wearables (for Play): A Strong Concept for Design. In Proceedings of the SIGCHI Conference on Human Factors in Computing Systems (CHI ’17).
24. Katherine Isbister, Elena Márquez Segura, Suzanne
Kirkpatrick, Xiaofeng Chen, Syed Salahuddin, Gang
Cao, and Raybit Tang. 2016. Yamove! A movement
synchrony game that choreographs social interaction.
Human Technology: An Interdisciplinary Journal on
Humans in ICT Environments 12, 1: 74–102.
http://dx.doi.org/10.17011/ht/urn.201605192621
25. Katherine Isbister, Elena Márquez Segura, and Edward
Melcer. 2018. Social Affordances at Play: Game
Design Toward Socio-Technical Innovation. In
Proceedings of the SIGCHI Conference on Human
Factors in Computing Systems (CHI ’18).
26. Esther Jakobs, Antony S. R. Manstead, and Agneta H.
Fischer. 1996. Social context and the experience of
emotion. Journal of Nonverbal Behavior 20, 2: 123–
142. https://doi.org/10.1007/BF02253073
27. Ke Jing, Natalie Nygaard, and Joshua Tanenbaum.
2017. Magia Transformo: Designing for Mixed Reality
Transformative Play. In Extended Abstracts
Publication of the Annual Symposium on Computer-
Human Interaction in Play (CHI PLAY ’17 Extended
Abstracts), 421–429.
https://doi.org/10.1145/3130859.3131339
28. Adam Kendon. 2010. Spacing and Orientation in Co-
present Interaction. In Proceedings of the Second
International Conference on Development of
Multimodal Interfaces: Active Listening and Synchrony
(COST’09), 1–15. https://doi.org/10.1007/978-3-642-
12397-9_1
29. Shoshana Kessock. 2013. So Say We All: DexCon 2013 Gets Some BSG. Shoshana Kessock: Writer, Game Designer, and Unreality Expert Since 1982. Retrieved January 7, 2018 from https://shoshanakessock.com/2013/07/11/so-say-we-all-dexcon-2013-gets-some-bsg/
30. Peter Gall Krogh, Marianne Graves Petersen, Kenton
O’Hara, and Jens Emil Groenbaek. 2017. Sensitizing
Concepts for Socio-spatial Literacy in HCI. In
Proceedings of the 2017 CHI Conference on Human
Factors in Computing Systems (CHI ’17), 6449–6460.
https://doi.org/10.1145/3025453.3025756
31. Stacey Kuznetsov and Eric Paulos. 2010. Rise of the
Expert Amateur: DIY Projects, Communities, and
Cultures. In Proceedings of the 6th Nordic Conference
on Human-Computer Interaction: Extending
Boundaries (NordiCHI ’10), 295–304.
https://doi.org/10.1145/1868914.1868950
32. LARPtronics. 2014. Gauntlet at Stage Three.
Retrieved from
https://www.youtube.com/watch?v=XZMgy1YSkmU
33. LARPtronics. 2016. LARPtronics Monocle. Retrieved
from
https://www.youtube.com/watch?v=3lOE2z_hW8w
34. LarpWiki. 2017. Live action role-playing game
(LARP). Retrieved September 18, 2017 from
http://larpwiki.labcats.org/index.php?title=Live_action
_role-playing_game_(LARP)
35. Kazuki Iida and Kenji Suzuki. 2011. Enhanced Touch:
A Wearable Device for Social Playware. In
Proceedings of the 8th International Conference on
Advances in Computer Entertainment Technology
(ACE ’11), 83:1–83:2.
https://doi.org/10.1145/2071423.2071524
36. Regan L. Mandryk, Kori M. Inkpen, and Thomas W.
Calvert. 2006. Using psychophysiological techniques
to measure user experience with entertainment
technologies. Behaviour & Information Technology 25,
2: 141–158.
https://doi.org/10.1080/01449290500331156
37. Tony Manstead. 2005. The social dimension of
emotion. The Psychologist 18, 8: 484–487.
38. Nicolai Marquardt and Saul Greenberg. 2015. Proxemic
Interactions: From Theory to Practice. Morgan &
Claypool Publishers, San Rafael, CA, USA. Retrieved
March 24, 2016 from
http://ieeexplore.ieee.org/xpl/articleDetails.jsp?arnumb
er=7056253
39. Elena Márquez Segura. 2013. Body Games: Designing
for movement-based play in co-located social settings.
Retrieved January 24, 2015 from http://www.diva-
portal.org/smash/record.jsf?pid=diva2%3A699066&ds
wid=-9066
40. Elena Márquez Segura and Katherine Isbister. 2015.
Enabling Co-Located Physical Social Play: A
Framework for Design and Evaluation. In Game User
Experience Evaluation, Regina Bernhaupt (ed.).
Springer International Publishing, Cham, Switzerland,
209–238. Retrieved June 15, 2015 from
http://link.springer.com/chapter/10.1007/978-3-319-
15985-0_10
41. Elena Márquez Segura, Katherine Isbister, Jon Back,
and Annika Waern. 2017. Design, Appropriation, and
Use of Technology in Larps. In Proceedings of the
12th International Conference on the Foundations of
Digital Games (FDG ’17), 53:1–53:4.
https://doi.org/10.1145/3102071.3106360
42. Elena Márquez Segura, Laia Turmo Vidal, Asreen
Rostami, and Annika Waern. 2016. Embodied
Sketching. In Proceedings of the SIGCHI Conference
on Human Factors in Computing Systems (CHI ’16),
6014–6027. https://doi.org/10.1145/2858036.2858486
43. Elena Márquez Segura, Annika Waern, Luis Márquez
Segura, and David López Recio. 2016. Playification:
The PhySeEar Case. In Proceedings of the 2016
Annual Symposium on Computer-Human Interaction in
Play (CHI PLAY ’16), 376–388.
https://doi.org/10.1145/2967934.2968099
44. Elena Márquez Segura, Annika Waern, Jin Moen, and
Carolina Johansson. 2013. The Design Space of Body
Games: Technological, Physical, and Social Design. In
Proceedings of the SIGCHI Conference on Human
Factors in Computing Systems (CHI ’13), 3365–3374.
https://doi.org/10.1145/2470654.2466461
45. Joe Marshall, Steve Benford, and Tony Pridmore.
2010. Deception and Magic in Collaborative
Interaction. In Proceedings of the SIGCHI Conference
on Human Factors in Computing Systems (CHI ’10),
567–576. https://doi.org/10.1145/1753326.1753410
46. Joe Marshall and Paul Tennent. 2017. Touchomatic:
Interpersonal Touch Gaming In The Wild. In
Proceedings of the 2017 Conference on Designing
Interactive Systems (DIS ’17), 417–428.
https://doi.org/10.1145/3064663.3064727
47. Tiago Martins, Teresa Romão, Christa Sommerer,
Laurent Mignonneau, and Nuno Correia. 2008.
Towards an Interface for Untethered Ubiquitous
Gaming. In Proceedings of the 2008 International
Conference on Advances in Computer Entertainment
Technology (ACE ’08), 26–33.
https://doi.org/10.1145/1501750.1501757
48. David A. Mellis and Leah Buechley. 2014. Do-it-
yourself Cellphones: An Investigation into the
Possibilities and Limits of High-tech DIY. In
Proceedings of the 32nd Annual ACM Conference on
Human Factors in Computing Systems (CHI ’14),
1723–1732. https://doi.org/10.1145/2556288.2557309
49. Markus Montola, Jaakko Stenros, and Annika Waern.
2009. Pervasive Games: Theory and Design. Morgan
Kaufmann Publishers, Burlington, MA, USA.
50. Florian ‘Floyd’ Mueller, Martin R. Gibbs, Frank
Vetere, and Darren Edge. 2017. Designing for Bodily
Interplay in Social Exertion Games. ACM Trans.
Comput.-Hum. Interact. 24, 3: 24:1–24:41.
https://doi.org/10.1145/3064938
51. Florian Mueller, Sophie Stellmach, Saul Greenberg,
Andreas Dippon, Susanne Boll, Jayden Garner, Rohit
Khot, Amani Naseem, and David Altimira. 2014.
Proxemics Play: Understanding Proxemics for
Designing Digital Play Experiences. In Proceedings of
the 2014 Conference on Designing Interactive Systems
(DIS ’14), 533–542.
https://doi.org/10.1145/2598510.2598532
52. Nordic Larp Wiki. 2014. Freeform. Nordic Larp Wiki.
Retrieved August 30, 2017 from
https://nordiclarp.org/wiki/Freeform
53. Don Norman. 2013. The Paradox of Wearable
Technologies. MIT Technology Review 116. Retrieved
September 17, 2017 from
https://www.technologyreview.com/s/517346/the-
paradox-of-wearable-technologies/
54. PricewaterhouseCoopers. Wearable Technology
Future is Ripe for Growth – Most Notably among
Millennials, Says PwC US. PwC. Retrieved January 6,
2015 from http://www.pwc.com/us/en/press-
releases/2014/wearable-technology-future.jhtml
55. Niklas Ravaja, Timo Saari, Marko Turpeinen, Jari
Laarni, Mikko Salminen, and Matias Kivikangas.
2006. Spatial Presence and Emotions During Video
Game Playing: Does It Matter with Whom You Play?
Presence: Teleoper. Virtual Environ. 15, 4: 381–392.
https://doi.org/10.1162/pres.15.4.381
56. Stuart Reeves, Steve Benford, Claire O’Malley, and
Mike Fraser. 2005. Designing the Spectator
Experience. In Proceedings of the SIGCHI Conference
on Human Factors in Computing Systems (CHI ’05),
741–750. https://doi.org/10.1145/1054972.1055074
57. David Roedl, Shaowen Bardzell, and Jeffrey Bardzell.
2015. Sustainable Making? Balancing Optimism and
Criticism in HCI Discourse. ACM Trans. Comput.-
Hum. Interact. 22, 3: 15:1–15:27.
https://doi.org/10.1145/2699742
58. Dan Miller Shroeder. 2013. Scientific Sci-fi Scanner
app for iPhone and Android. Retrieved from
https://www.youtube.com/watch?v=FwznllyxkQ8
59. Dan Miller Shroeder. Scientific Sci-Fi Scanner. Dan
MS. Retrieved from http://www.dan-
ms.com/portfolios/scientific-sci-fi-scanner/
60. David Simkins. 2014. The Arts of LARP: Design,
Literacy, Learning and Community in Live-Action Role
Play. McFarland.
61. Robert Sommer. 2002. Personal Space in a Digital
Age. In Handbook of Environmental Psychology,
Robert B. Bechtel and Arza Churchman (eds.). John
Wiley & Sons, New York, NY, USA.
62. Lizzie Stark. 2012. Leaving Mundania: Inside the
Transformative World of Live Action Role-playing
Games. Paw Prints. Retrieved August 21, 2017 from
http://lizziestark.com/books/leaving-mundania/
63. Juulia T. Suvilehto, Enrico Glerean, Robin I. M.
Dunbar, Riitta Hari, and Lauri Nummenmaa. 2015.
Topography of social touching depends on emotional
bonds between humans. Proceedings of the National
Academy of Sciences 112, 45: 13811–13816.
https://doi.org/10.1073/pnas.1519231112
64. Dean Takahashi. 2016. SuperSuit modernizes laser tag
with wearable gaming gear. VentureBeat. Retrieved
January 8, 2018 from
https://venturebeat.com/2016/09/27/supersuit-
modernizes-laser-tag-with-wearable-gaming-gear/
65. Joshua Tanenbaum and Karen Tanenbaum. 2015.
Envisioning the Future of Wearable Play: Conceptual
Models for Props and Costumes as Game Controllers.
66. Joshua Tanenbaum, Karen Tanenbaum, Katherine
Isbister, Kaho Abe, Anne Sullivan, and Luigi
Anzivino. 2015. Costumes and Wearables As Game
Controllers. In Proceedings of the Ninth International
Conference on Tangible, Embedded, and Embodied
Interaction (TEI ’15), 477–480.
https://doi.org/10.1145/2677199.2683584
67. Burak S. Tekin and Stuart Reeves. 2017. Ways of
Spectating: Unravelling Spectator Participation in
Kinect Play. In Proceedings of the 2017 CHI
Conference on Human Factors in Computing Systems
(CHI ’17), 1558–1570.
https://doi.org/10.1145/3025453.3025813
68. ThinkGeek. 2013. Technomancer Digital Wizard
Hoodie from ThinkGeek. Retrieved from
https://www.youtube.com/watch?v=Mo3jPjOblFQ
69. ThinkGeek. Electronic FPS Laser Battle Jacket.
ThinkGeek. Retrieved January 8, 2018 from
http://www.thinkgeek.com/product/16a6/?cpg=fbl_16a
6
70. Jakob Tholander and Carolina Johansson. 2010.
Design Qualities for Whole Body Interaction: Learning
from Golf, Skateboarding and BodyBugging. In
Proceedings of the 6th Nordic Conference on Human-
Computer Interaction: Extending Boundaries
(NordiCHI ’10), 493–502.
https://doi.org/10.1145/1868914.1868970
71. Annika Waern, Markus Montola, and Jaakko Stenros.
2009. The Three-sixty Illusion: Designing for
Immersion in Pervasive Games. In Proceedings of the
SIGCHI Conference on Human Factors in Computing
Systems (CHI ’09), 1549–1558.
https://doi.org/10.1145/1518701.1518939
72. Matthew Webb. 2016. Harnessing the Glowing
Rectangle: Using Mobile and Computer Technology in
LARP Play. Game Wrap 1, 64–69.
73. Y. Yamaguchi, H. Yanagi, and Y. Takegawa. 2013.
Touch-shake: Design and implementation of a physical
contact support device for face-to-face communication.
In 2013 IEEE 2nd Global Conference on Consumer
Electronics (GCCE), 170–174.
https://doi.org/10.1109/GCCE.2013.6664789
74. John Zimmerman, Jodi Forlizzi, and Shelley Evenson.
2007. Research Through Design As a Method for
Interaction Design Research in HCI. In Proceedings of
the SIGCHI Conference on Human Factors in
Computing Systems (CHI ’07), 493–502.
https://doi.org/10.1145/1240624.1240704
75. John Zimmerman, Erik Stolterman, and Jodi Forlizzi.
2010. An Analysis and Critique of Research Through
Design: Towards a Formalization of a Research
Approach. In Proceedings of the 8th ACM Conference
on Designing Interactive Systems (DIS ’10), 310–319.
https://doi.org/10.1145/1858171.1858228