Learning with the Body: A Design Framework for Embodied Learning Games and Simulations

Abstract

In recent years, embodiment—the notion that cognition arises not just from the mind, but also through our bodies’ physical and social interactions with the world around us—has become an important concept to enhance the design of educational games and simulations within communities such as Human–Computer Interaction (HCI), games, and learning science. However, the broad number of academic domains that define and operationalize embodiment differently has often led researchers and designers to employ a black box of design decisions—resulting in a substantial breadth of seemingly unrelated educational systems. This inconsistent application of embodiment is a substantial drawback when trying to design embodied technology to support particular use cases such as learning, and ultimately becomes problematic when trying to critically evaluate the usage and effectiveness of embodiment within existing educational designs. In this chapter, we present our work toward opening the black box of embodied design decisions by combining differing conceptual and design approaches for embodied learning games and simulations into a unified taxonomical design framework. We describe the motivation and creation process for the framework, explain its dimensions, and provide examples of its use. Ultimately, such a framework helps to explicate high-level design decisions by providing a unifying foundation for the description and categorization of embodied learning systems, and encourages new designs by identifying problematic design spaces and unexplored research/design terrain. Our design framework will benefit educational game designers and researchers by mapping out a more precise understanding of how to incorporate embodiment into the design of educational games and simulations.
Learning with the Body
A Design Framework for
Embodied Learning Games
and Simulations
Edward F. Melcer and Katherine Isbister
University of California, Santa Cruz, USA
CONTENTS
7.1 Introduction
7.2 The Embodiment Problem
7.2.1 A Human–Computer Interaction Perspective of Embodiment
7.2.2 A Cognitive Linguistics Perspective of Embodiment
7.2.3 A Cognitive Science Perspective of Embodiment
7.2.4 An Artificial Intelligence and Robotics Perspective of Embodiment
7.3 Embodiment in a Design Framework for Games and Simulations
7.4 Background and Related Work
7.4.1 Design Frameworks
7.4.2 Embodied Learning Taxonomies and Frameworks
7.5 Toward a Design Framework for Embodied Learning Games and Simulations
7.5.1 Surveying Existing Embodied Educational Games and Simulations
7.5.2 Creating the Design Framework
7.5.3 Design Space Dimensions of the Framework
7.5.3.1 Physicality
7.5.3.2 Transforms
7.5.3.3 Mapping
7.5.3.4 Correspondence
7.5.3.5 Mode of Play
7.5.3.6 Coordination
7.5.3.7 Environment
7.6 Applying the Design Framework
7.6.1 Direct Embodied Physicality Categories
162 Software Engineering Perspectives in Computer Game Development
7.6.1.1 Full-Body Congruency
7.6.1.2 Finger-Based Congruency
7.6.2 Enacted Physicality Categories
7.6.2.1 Whole-Body Position
7.6.2.2 Embedded Phenomena
7.6.3 Manipulated Physicality Categories
7.6.3.1 Tangible Blocks
7.6.3.2 Tangible Tabletops
7.6.3.3 Tangible Objects
7.6.4 Surrogate Physicality Categories
7.6.4.1 Tangible Spaces
7.6.5 Augmented Physicality Categories
7.6.5.1 "Touchless" Motion-Based
7.7 Conclusion
List of Categorized Embodied Learning Games and Simulations
References
7.1 INTRODUCTION
There are a number of reasons why physical interaction and embodiment—an emergent property from the interactions between brain, body, and the physical/social environment [57]—are important factors to consider in the design of educational technologies and games. From a technical perspective, the low cost, quantity, and ubiquity of sensors in recent years have made them easily accessible to incorporate into the construction of both commercial and custom hardware. For instance, modern smartphones commonly have more than half a dozen built-in sensors that allow for sensing user actions and the surrounding environment—i.e., proximity sensors for detecting distance, accelerometers and gyroscopes for detecting movement and rotation, GPS for detecting location, barometers for detecting atmospheric pressure, multiple cameras, and so forth. Additionally, the large-scale commercial success of hardware that utilizes physical sensing, such as the Wii [78], and software that employs physical interactions on such hardware, for example, Pokémon Go [64], demonstrates that there is a growing market and public interest for such products. Even commercial educational game systems utilizing augmented reality to incorporate physical interactivity, such as Osmo, have found notable success in recent years [68]. From the academic perspective, there is also a large body of work in the Human–Computer Interaction (HCI), games, and learning science disciplines demonstrating the potential benefits of incorporating physical interactions into learning, such as improved spatial recall and mental manipulation [24, 31], more intuitive interfaces and mappings [15, 84, 95, 129, 145], increased engagement [25, 37, 150], and greater positive feelings toward learning content and science in general [18, 80, 93, 139].
However, while there are a number of educational games and simulations that have incorporated physical interaction, very little is understood about what aspects of physicality actually result in beneficial learning outcomes from these designs. There is effectively a black box of design decisions that researchers and developers often employ when creating their educational systems. For instance, the physical interactions designed into a game teaching basic physics concepts could range from having players run around a room [99], slide down a large inflatable slide [87], play with tangible objects [42], move fingers on a tablet [34], or even swing limbs to enact learning concepts [7].
Looking deeper into the theoretical concepts underlying this black box reveals a commonly used set of terms: embodiment, embodied cognition, and embodied interaction, to name a few. Unfortunately, embodiment and related terms are remarkably broad and fuzzy constructs—with definitions and operationalizations that differ drastically depending upon the academic domain in which they are utilized [92, 94]. This large breadth of interpretation for "embodiment" is a likely cause of the seemingly ambiguous design choices underlying many educational games and simulations employing physical interactions. Furthermore, it becomes problematic when trying to understand where and how embodiment occurs in these systems, and which design elements help to facilitate embodied learning. Therefore, for designers seeking to utilize embodiment, the differences in approach to physicality, collaboration, and interaction pose a significant hurdle.
However, we posit that this fuzziness actually presents a new space of design possibility: one where, through careful exploration and synthesis of embodiment across its many domains, we can better understand the various ways physicality and embodiment can be employed to enhance different learning outcomes within educational technology. One particularly useful approach that can bridge conceptual differences between existing systems and domains is the creation of a design framework [38, 119]. In this chapter, we present the motivation, construction, and application of our taxonomical design framework for embodied learning games and simulations.
7.2 THE EMBODIMENT PROBLEM
As mentioned earlier, definitions and operationalizations of embodiment differ drastically depending upon the academic perspectives utilized during the design process—resulting in a large breadth of designs employing embodiment. Notably, the extensive range of interpretations for embodiment has been well documented across multiple disciplines, for example, [142, 154], and is a likely cause for the broad and seemingly ambiguous black box of design choices underlying many educational systems employing embodiment. In order to illustrate this point, the following subsections give a brief introduction to embodiment from several disciplinary perspectives, and point to examples of various educational systems that utilized one or more of these perspectives in their design.
7.2.1 A Human–Computer Interaction Perspective of Embodiment
In general, HCI takes a phenomenological approach to embodiment [50, 57, 89], expanding upon ideas from phenomenology that were first introduced into HCI through the work of Suchman [134] and Winograd et al. [143]. In a general sense, embodiment is centered around the notion that human reasoning and behavior are connected to, or influenced by, our bodies and their physical/social experience and interaction with the world around us [114]. Some of the recent seminal work on embodiment within HCI that stems from phenomenology is by Dourish [35], who coined the term embodied interaction to capture a number of research trends and ideas in HCI around tangible, social, and ubiquitous computing. The tangibles community within HCI has since built upon this interpretation of embodied interaction, additionally focusing on the use of space and movement [55]. In this way, embodied interaction refers to the creation, manipulation, and sharing of meaning through engaged interaction with artifacts [110, 116], and includes material objects and environments in the process of meaning making and action formation [133]. A number of systems within HCI have employed embodied interaction utilizing physical objects and the surrounding space, including Espaistory [86], MirrorFugue [147, 148], TASC [28], Archimedes [87], Mapping Place [29], Buildasound [117], and CoCensus [118]. Notably, these games and simulations that employ embodied interaction tend to place the player in a physical space where they can physically manipulate interactive tangible tabletops, blocks, and objects.
7.2.2 A Cognitive Linguistics Perspective of Embodiment
The cognitive linguistics perspective addresses embodiment through the notion of a conceptual/embodied metaphor [75, 122]. Both in language and cognition, metaphors help us to understand or experience one concept (the target domain) in terms of another (the source domain) [19]. When the source domain of a metaphor involves schemata that have arisen from experience relating to the body's movements, orientation in space, or its interaction with objects [75], it becomes a conceptual or embodied metaphor [62]. Two notable types of embodied metaphors are Ontological—where an abstract concept is represented as something concrete, such as an object, substance, container, or person—and Orientational—where concepts are spatially related to each other. Some systems that have employed embodied metaphors in the design of their objects and interactions include Sound Maker [12, 13], Springboard [10, 11, 84], and MoSo Tangibles [19, 20]. These games and simulations commonly have players utilize objects and their bodies to more intuitively enact the embodied metaphor.
7.2.3 A Cognitive Science Perspective of Embodiment
The cognitive science perspective addresses embodiment through the notion of embodied cognition, focusing on how sensorimotor activity can influence human learning, knowing, and reasoning [4]. However, embodied cognition is also a divided term for education, where Wilson [142] identified six distinct views: (1) cognition is situated; (2) cognition is time-pressured; (3) we off-load cognitive work onto the environment; (4) the environment is part of the cognitive system; (5) cognition is for action; and (6) off-line cognition is body-based.
Notably, when it comes to learning, embodied cognition primarily considers how human cognition is fundamentally grounded in sensory–motor processes and in our body's internal states [60]. In this way, the body is typically the main focus of embodied cognition (rather than the environment or objects) and serves as the central framework for our understanding of and interactions with the world [128]. As a result, much of the literature applying embodied cognition focuses primarily on the amount, type, and proper integration of bodily engagement, for example, [47, 63, 122, 131, 141]. This body-centric perspective of embodied cognition has also carried over into the design of educational technology, where games and simulations tend to focus on the utilization of sensors to either (1) map full-body interaction and congruency to learning content through the use of gestures or (2) track whole-body enactment of learning material. Some notable systems that have employed embodied cognition include the Mathematical Imagery Trainer [3, 56], SpatialEase [37], Math Propulsion [96], Fingu [21], MEteor [79–81], and Embodied Poetry [52]. However, it is also important to note that recent work in cognitive science has begun to develop theories of embodied cognition that utilize broader cognitive representations, moving toward more embedded, extended, and enactive perspectives (e.g., [12, 58, 89, 140]).
7.2.4 An Artificial Intelligence and Robotics Perspective of Embodiment
The robotics and artificial intelligence (AI) perspective of embodiment is another distinct approach within HCI and human–robot interaction (HRI), one that views embodiment as having a physical body in which software, sensors, etc., are housed, and which ultimately enables social/physical interactions [137]. Ziemke [154] examined different approaches to embodiment within the AI and cognitive science literature, identifying six different uses of embodiment across research domains in order to determine what kind of body is required for embodied cognition and AI. The six notions identified are: (1) structural coupling between agent (AI/robot) and environment; (2) historical embodiment, where embodiment is the reflection or result of a history of agent–environment interaction, and in many cases coadaptation; (3) physical embodiment, where systems/software have a physical instantiation; (4) organismoid embodiment, where cognition might be limited to physical bodies that have a similar form and sensorimotor capacities to living bodies; (5) organismic embodiment, where cognition is limited only to living bodies; and (6) social embodiment, where states of the body—such as postures, arm movements, and facial expressions—arise during social interaction and play central roles in social information processing [22]. Notably, there has been a wide range of research utilizing and exploring the implications of embodiment within HRI in recent years, for example, [45, 72, 124, 126, 138].
7.3 EMBODIMENT IN A DESIGN FRAMEWORK FOR GAMES AND SIMULATIONS
As discussed above, embodiment and related terms such as embodied cognition and embodied interaction have many different interpretations and applications across a wide range of academic domains. For instance, HCI tends to view embodiment from a phenomenological perspective, where embodiment is a physical and social phenomenon that unfolds in real time and space as a part of the world in which we are situated [35]. However, learning and cognitive science views tend to be oriented more purely on the body and bodily engagement as a central focus for embodiment [63, 128, 131]. In order to capture a large corpus of embodied designs in our design framework, we take a broad, encompassing perspective of embodiment: centering it around the notion that human reasoning and behavior are connected to, or influenced by, our bodies and their physical/social experience and interaction with the world [114]. This can be seen as an iterative relationship—reasoning and behavior can shape interaction as well as the other way round—yet also a complex one because of the context, time, space, emotion, etc., in which interaction is situated. In this way, cognition becomes less brain-based and instead more embodied, embedded, extended, and enactive [12, 30, 58, 89, 140].
7.4 BACKGROUND AND RELATED WORK
Our goal in providing a design framework for embodied learning games and simulations is to bridge conceptual gaps and the resulting design choices made from the differing uses of embodiment in various domains. In this section, we present an overview of the background and related work on design frameworks, their applications, and existing embodied learning taxonomies/frameworks.
7.4.1 Design Frameworks
A design framework is an important HCI tool that provides a common language for designers and researchers to discuss design knowledge, generate prototypes, formulate research questions, and conceptualize empirical studies [16]. In this way, design frameworks can help designers conceptualize the nuances of particular technologies and formalize the creative process [38]. In interface design, design frameworks have been used to provide terminology to categorize ideas [111], as well as to organize complex concepts into logical hierarchies [107]. Notably, there are different kinds of design frameworks. Taxonomical design frameworks, such as the one presented in this chapter, are created by treating a set of taxonomical terms as orthogonal dimensions in a design space. The resulting matrix of design choices provides structure for the classification and comparison of designs [119]. The completed design framework provides a means to compare existing systems and encourage new designs by providing a unifying foundation for the description and categorization of systems. Furthermore, the methodical filling-in of this structure helps to categorize existing concepts, identify problematic design spaces, differentiate ideas, and identify unexplored terrain [38]. Some examples of taxonomical design frameworks include [38, 46, 92, 94]. However, one notable drawback of taxonomical design frameworks is that they do not elucidate the relations between concepts as other types of frameworks can.
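The "methodical filling-in" described above can be made concrete: once taxonomical terms are treated as orthogonal dimensions, the design space is simply the Cartesian product of each dimension's values, and unexplored terrain is the set of cells that no surveyed system occupies. The sketch below illustrates this mechanic with two small made-up dimensions; the dimension names, value lists, and "coded systems" are placeholders for illustration, not the chapter's actual framework or survey data.

```python
from itertools import product

# Two illustrative dimensions (values are placeholders, not the
# chapter's full framework).
dimensions = {
    "physicality": ["direct embodied", "enacted", "manipulated"],
    "mode_of_play": ["individual", "collaborative"],
}

# The full design space is the Cartesian product of dimension values.
design_space = set(product(*dimensions.values()))

# Hypothetical cells occupied by surveyed systems.
coded_systems = {
    ("enacted", "individual"),
    ("manipulated", "collaborative"),
}

# "Unexplored terrain": cells of the matrix with no existing system.
unexplored = design_space - coded_systems

print(len(design_space))  # 6 cells in the 3 x 2 matrix
print(len(unexplored))    # 4 cells left to explore
```

Enumerating the matrix this way is what makes a taxonomical framework generative: each empty cell is a candidate design that has not yet been attempted.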
Other important types of design frameworks include descriptive frameworks and explanatory frameworks. Descriptive frameworks provide sensitizing concepts, design considerations, and heuristics to inform design, but do not specify details about how and why certain causes create their effects [16]. Some examples of existing descriptive frameworks include [8, 36, 88, 116, 120, 127]. Conversely, explanatory frameworks not only offer concepts, relations, and descriptions, but also provide explanatory accounts of framework relations [16]. They are among the more powerful types of frameworks, since explanatory frameworks specifically explicate the relations between concepts, which enables them to be utilized in the development of testable hypotheses linking learning constructions, interactional behaviors, and design features. A good exemplar of an explanatory design framework is by Antle and Wise [16].
7.4.2 Embodied Learning Taxonomies and Frameworks
Similar to the many interpretations of embodiment, embodied learning taxonomies and frameworks also have vastly different interpretations of physicality, motion, collaboration, and interaction. Johnson-Glenberg et al. [63] created an embodied learning taxonomy that specifies the strength of embodiment as a combination of the amount of motoric engagement, gestural congruency to learning content, and immersion. Black et al. [26] created the Instructional Embodiment Framework (IEF), which consists of various forms of physical embodiment (i.e., direct, surrogate, and augmented) as well as imagined embodiment (i.e., explicit and implicit), where the individual can embody action and perception through imagination. Skulmowski and Rey [131] developed a theoretical taxonomy of educational embodiment research in order to highlight the possibilities and challenges involved in translating basic embodiment research into application. Their taxonomy consists of a 2 × 2 grid with task integration (incidental vs. integrated) as one dimension and bodily engagement (low vs. high) as the other.
In the domain of tangibles, Fishkin [46] presented a taxonomy for the analysis of tangible interfaces which views embodiment as the distance between input and output, where embodiment can be full (the output device is the input device), nearby (output is directly proximate to the input device), environmental (output is "around" the user), or distant (output is on another screen or in another room). A related framework by Price et al. [116] for tangible learning environments focuses on different possible artifact–representation combinations and the role they play in shaping cognition. The physical–digital links of these combinations are conceptualized into four distinct dimensions:

Location—The different location couplings between physical artifacts and digital representations.
Dynamics—The flow of information during interaction (e.g., whether feedback is immediate or delayed).
Correspondence—The degree to which the physical properties of objects are closely mapped to the learning concepts.
Modality—The different representation modalities used in conjunction with artifact interaction.
Antle and Wise [16] also proposed an important framework to aid tangible user interface (TUI) design for learning through the lenses of theories of cognition and learning. Their Tangible Learning Design Framework provides 12 guidelines to inform the design of the five interrelated elements of TUI learning environments by drawing from five perspectives of cognition: Information Processing, Constructivism, Embodied Cognition, Distributed Cognition, and Computer-Supported Collaborative Learning.
168 Software Engineering Perspectives in Computer Game Development
There have also been a number of other taxonomies created to address various use cases and target populations with tangibles. Leong et al. [76] created a conceptual framework for constructive assemblies intended to facilitate systematic investigation and critical consideration of a special subset of TUIs that involve the interconnection of modular parts. Antle [8] created the CTI framework, a tool to conceptualize how the unique features of tangible and spatial interactive systems can be harnessed for children under the age of twelve. CTI consists of five themes: Space for Action, Perceptual Mappings, Behavioral Mappings, Semantic Mappings, and Space for Friends. Finally, focusing more on the collaborative aspects of tangibles, Tissenbaum et al. [135] developed the Divergent Collaboration Learning Mechanisms (DCLM) framework for recognizing and coding collaboration and divergent learning—i.e., where learners collaborate while seeking divergent goals, ideas, and conceptions—in tabletop learning environments.
7.5 TOWARD A DESIGN FRAMEWORK FOR EMBODIED LEARNING GAMES AND SIMULATIONS
In order to provide initial steps toward addressing the black box of design decisions for embodied learning systems, we created a taxonomical design framework for embodied learning games and simulations. A design framework is an important tool for bridging conceptual differences between diverse domains and existing systems [119], and in this instance it helps to clarify various key design decisions underlying existing embodied educational games and simulations. While the taxonomical nature of this framework does not explicate the relations between concepts, by making high-level design decisions more apparent it can be employed to categorize existing embodied educational games and simulations, identify problematic design spaces for embodied educational systems, and identify design gaps between different dimensions to generate novel embodied educational tools.
7.5.1 Surveying Existing Embodied Educational Games and Simulations
In order to create the design framework, we conducted an extensive literature review for published examples of embodied learning games and simulations in the following HCI, games, and learning science venues:

ACM Conference on Human Factors in Computing Systems (CHI)
International Conference on Tangible, Embedded, and Embodied Interactions (TEI)
Foundations of Digital Games Conference (FDG)
ACM Interaction Design and Children Conference (IDC)
International Journal of Game-Based Learning (IJGBL)
International Journal of Computer-Supported Collaborative Learning (ijCSCL)
International Journal of Arts and Technology (IJART)
International Journal of Child-Computer Interaction (IJCCI)
British Journal of Educational Technology (BJET)
Computers and Education
ACM Transactions on Computer–Human Interaction (TOCHI)
Notably, the core nature of all games is embodied to some extent, and leaving it to the authors' judgment whether a game was embodied or not could introduce significant personal bias. Therefore, for the purpose of this research, only papers that explicitly mentioned embodiment or related terms (e.g., embodied learning, embodied cognition, and embodied interaction) were collected for the literature review. We also performed a tree search of references and citations from the initial papers collected and included seminal papers concerning embodiment that were not in the aforementioned venues in order to increase the breadth of works collected—for example, [116, 118] were not in the initial list of surveyed papers, but were identified through the tree search. In addition, we examined related frameworks and taxonomies in subdomains and communities such as TEI [8, 16, 46, 76, 104, 112, 116], cognitive and learning science [26, 63, 131], and mixed reality [38, 121] in order to identify key taxonomical terms for our design framework. The final list contained over 90 papers concerning embodiment and descriptions of designs for a total of 66 distinct embodied learning games and simulations (see Table 7.1). This list is not intended to be exhaustive, but it does represent a diverse selection of designs that could be drawn upon when creating a design framework.
7.5.2 Creating the Design Framework
Bottom-up, open coding was then performed following the process described by Ens and Hincapié-Ramos [38] in order to distill a set of 25 candidate dimensions that fit concepts found in the reviewed literature and designs. Candidate dimensions were iteratively reduced and combined into a set small enough for a concise framework. Afterwards, we presented the framework to experts in HCI, game design, and learning science for feedback and additional refinements. The final design framework consists of the seven dimensions shown in Figure 7.1 (see Table 7.1 for all systems coded by their design dimensions). Dimensions were then further organized into three groups based on their overarching design themes within the construct of embodiment (i.e., the physical body and interactions, social interactions, and the world where interaction is situated).
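As a rough illustration of how a taxonomical framework like this can serve as a coding instrument, the seven dimensions can be modeled as fields of a record, one field per dimension. In the sketch below, only Physicality is given an enumerated type, since its five values are listed later in this chapter (Section 7.5.3.1); the other six dimensions are left as free-form strings because their value sets are described elsewhere. The example coding of MEteor is our own reading of its description in the text, not the chapter's actual Table 7.1 coding.

```python
from dataclasses import dataclass
from enum import Enum


class Physicality(Enum):
    """The five physicality values described in Section 7.5.3.1."""
    DIRECT_EMBODIED = "direct embodied"
    ENACTED = "enacted"
    MANIPULATED = "manipulated"
    SURROGATE = "surrogate"
    AUGMENTED = "augmented"


@dataclass(frozen=True)
class EmbodiedLearningSystem:
    """One row of a coding table: a system classified on each dimension.

    Only physicality is enumerated here; the remaining dimensions are
    free-form strings, as their value sets are defined elsewhere.
    """
    name: str
    physicality: Physicality
    transforms: str
    mapping: str
    correspondence: str
    mode_of_play: str
    coordination: str
    environment: str


# Illustrative coding of MEteor, based on its description in the text
# (players walk out an asteroid's trajectory, i.e., enact it).
meteor = EmbodiedLearningSystem(
    name="MEteor",
    physicality=Physicality.ENACTED,
    transforms="physical action => digital effect",
    mapping="unspecified here",
    correspondence="unspecified here",
    mode_of_play="unspecified here",
    coordination="unspecified here",
    environment="unspecified here",
)
```

Treating each coded system as such a record is what enables the comparisons described above: two systems can be compared dimension by dimension, and gaps in the coding table point to unexplored combinations.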
7.5.3 Design Space Dimensions of the Framework
The following subsections briefly describe each dimension of the design framework, the corresponding values/design choices of the dimension, and the underlying theoretical basis, with examples of existing educational games employing each design approach.
7.5.3.1 Physicality
This dimension ultimately describes how learning is physically embodied in a system. Building off of the above and other central perspectives of embodiment, we have identified five key forms of physical interaction that commonly arise from employing embodiment (see Figure 7.2).
1) The Direct Embodied value refers to an embodied cognition and learning sci-
ence approach where the body plays the primary constituent role in cognition
[128]. This form of embodiment and resulting physical interactions focus on
gestural congruency and how the body can physically represent learning con-
cepts [63]. For instance, a full-body interaction game where players contort
their bodies to match letters shown on a screen [105]
2) The Enacted value refers to Direct Embodiment from the Instructional
Embodiment Framework [26], and to enactivism which focuses on knowing as
physically doing [53, 77]. This form of embodiment, and resulting physical
interactions, centers more on acting/enacting out knowledge through physical
action of statements or sequences. For example, a gravitational physics game
where payers walk along (i.e., enact) the trajectory an asteroid would travel in
the vicinity of planets and their gravitational forces [80]
FIGURE 7.1 The design framework for embodied learning games and simulations. Similar
dimensions are clustered under a group based on an overarching design theme, and the differ-
ent values/design choices for each dimension are shown.
FIGURE 7.2 The ve distinct values for the physicality dimension of the design framework.
Consists of Direct Embodied, Enacted, Manipulated, Surrogate, and Augmented values.
Learning with the Body 171
3) The Manipulated value refers to the tangible embodied interactions of HCI/
TEI [90] and the use of manipulatives in learning science [110]. This form of
embodiment and resulting physical interactions arise from utilization of
embodied metaphors and interactions with physical objects [19], and the
objects’ physical embodiment of learning concepts in the material itself [61,
For example, a tabletop environment where the surface shows projections illustrating light reflection, absorption, transmission, and refraction of
colored blocks using a light source that can be moved around [113]
4) The Surrogate value refers to the Instructional Embodiment Framework con-
cept of Surrogate Embodiment, where learners manipulate a physical agent or
“surrogate” representative of themselves to enact learning concepts [26]. This
form of embodiment and resulting physical interactions are often used in sys-
tems with an interactive physical environment that is directly tied to a real-time
virtual simulation [44, 71]. For instance, a game where children use stuffed
animals to enact and better understand animal foraging behaviors [48]
5) The Augmented value refers to the Instructional Embodiment Framework
notion of Augmented Embodiment, where combined use of a representational
system (e.g., avatar) and augmented feedback system (e.g., Microsoft Kinect
and TV screen) embeds the learner within an augmented reality system. This
form of embodiment and resulting physical interactions are most commonly
found in systems where learners’ physical actions are mapped as input to con-
trol digital avatars in virtual environments [100]. For example, a game where
children control a polar bear’s movement in the virtual environment by rotat-
ing their accelerometer-equipped gloves in a “swimming” motion [82, 83]
7.5.3.2 Transforms
This dimension conceptualizes a space, describing the relationships between physi-
cal or digital actions and the resulting physical or digital effects in the environment.
We utilize the transform types of Physical action => Physical effect (PPt), Physical
action => Digital effect (PDt), and Digital action => Physical effect (DPt) from
Rogers et al. [121] to describe the many forms of existing systems. Notably, Digital
action => Digital effect (DDt) was excluded since these are the standard forms of
interacting with a system (e.g., clicking digital buttons to move a digital avatar) and
do not support embodied interactions.
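As an illustrative sketch (the function and its string encoding are our own, not from Rogers et al. [121]), the three embodied transform types can be expressed as a lookup over the modality of the action and the modality of its effect:

```python
from typing import Optional

# Classify a transform by the modality of the triggering action and of its
# effect. Digital action => Digital effect (DDt) is deliberately excluded,
# since it is the standard, non-embodied form of interaction.
def transform_type(action: str, effect: str) -> Optional[str]:
    embodied_transforms = {
        ("physical", "physical"): "PPt",
        ("physical", "digital"): "PDt",
        ("digital", "physical"): "DPt",
    }
    return embodied_transforms.get((action, effect))  # None for DDt

# A Kinect-style system maps physical movement to digital effects:
print(transform_type("physical", "digital"))  # PDt
```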
7.5.3.3 Mapping
This dimension borrows from the notion of Embodiment in the taxonomy for tangible interfaces [46] and Location from the tangible learning environment framework by Price et al. [116], which both describe the different spatial locations of
output in relation to the object or action triggering the effect (i.e., how input is spa-
tially mapped to output). Mappings can be Discrete—input and output are located
separately (e.g., an action triggers output on a nearby screen); Co-located—input and
output are contiguous (e.g., an action triggers output that is directly adjacent or over-
laid on the physical space); and Embedded—input and output are embedded in the
same object.
7.5.3.4 Correspondence
This dimension builds upon the notion of Physical Correspondence from the tangible
learning environment framework by Price et al. [116] which refers to the degree to
which the physical properties of objects are closely mapped to the learning concepts.
We expand this concept to also include physical actions (e.g., congruency of gestures
or physical manipulations to learning concepts [63, 131]). Correspondence can be
Symbolic—objects and actions act as common but abstract signifiers to the learning
concepts (e.g., arranging programming blocks to learn coding [145]); Indexical—
physical properties and actions only correlate with or imply the learning concept (e.g.,
throwing a ball and watching it fall to learn about gravity [6]); or Literal—physical
properties and actions are closely mapped to the learning concepts and metaphor of the
domain (e.g., playing an augmented physical guitar to learn finger positioning [66]).
7.5.3.5 Mode of Play
This dimension species how individuals socially interact and play within a system.
The system can facilitate Individual, Collaborative, or Competitive play for
learner(s). Plass et al. [109] found differing learning benefits for each mode of play,
suggesting it is also an important dimension to consider for learning outcomes.
7.5.3.6 Coordination
This dimension highlights how individuals in a system may have to socially coordi-
nate their actions [1] in order to successfully complete learning objectives. Social
coordination can occur with Other Player(s) and/or in a socio-collaborative experience with digital media, typically in the form of NPC(s) [136]. Conversely, a design may place little focus on social coordination, such that it does not occur or is not even supported—i.e., None.
7.5.3.7 Environment
This dimension refers to the learning environment in which the educational content
is situated. Environments can be either Physical, Mixed, or Virtual [121]. While
transforms conceptualize a space through the description of actions and effects, the
environment dimension focuses on the actual space where learning occurs. For
instance, a PDt transform can occur in drastically different learning environments
(Figure 7.3). In some systems, a player’s physical actions are tracked but only used
as input to control a virtual character in a virtual environment [82, 83]. In other sys-
tems, the player’s physical actions are tracked and mapped to control digital effects
overlaid on an augmented physical space or mixed reality environment [80]. Others
still have players situated in a completely physical environment where their physical
actions are tracked primarily to keep score or digitally maintain information related
to learning content that is displayed during the interaction [48].
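To make the seven dimensions concrete, they can be sketched as a small data model; the enum and field names are our own. The coding of Tangible Earth [71] below follows the chapter's own analysis for physicality, transform, correspondence, and environment, while mode of play and coordination are illustrative assumptions:

```python
from dataclasses import dataclass
from enum import Enum

class Physicality(Enum):
    DIRECT_EMBODIED = "Direct Embodied"
    ENACTED = "Enacted"
    MANIPULATED = "Manipulated"
    SURROGATE = "Surrogate"
    AUGMENTED = "Augmented"

class Transform(Enum):
    PPT = "Physical action => Physical effect"
    PDT = "Physical action => Digital effect"
    DPT = "Digital action => Physical effect"

class Mapping(Enum):
    DISCRETE = "Discrete"
    CO_LOCATED = "Co-located"
    EMBEDDED = "Embedded"

class Correspondence(Enum):
    SYMBOLIC = "Symbolic"
    INDEXICAL = "Indexical"
    LITERAL = "Literal"

class ModeOfPlay(Enum):
    INDIVIDUAL = "Individual"
    COLLABORATIVE = "Collaborative"
    COMPETITIVE = "Competitive"

class Coordination(Enum):
    OTHER_PLAYERS = "Other Player(s)"
    NPC = "NPC(s)"
    NONE = "None"

class Environment(Enum):
    PHYSICAL = "Physical"
    MIXED = "Mixed"
    VIRTUAL = "Virtual"

@dataclass
class EmbodiedDesign:
    """One surveyed system, coded on the seven framework dimensions."""
    name: str
    physicality: Physicality
    transform: Transform
    mapping: Mapping
    correspondence: Correspondence
    mode_of_play: ModeOfPlay
    coordination: Coordination
    environment: Environment

# Tangible Earth [71]: surrogate physicality, PDt transform, and literal
# correspondence in a virtual environment; mode of play and coordination
# here are illustrative assumptions rather than the chapter's coding.
tangible_earth = EmbodiedDesign(
    name="Tangible Earth",
    physicality=Physicality.SURROGATE,
    transform=Transform.PDT,
    mapping=Mapping.DISCRETE,
    correspondence=Correspondence.LITERAL,
    mode_of_play=ModeOfPlay.COLLABORATIVE,
    coordination=Coordination.OTHER_PLAYERS,
    environment=Environment.VIRTUAL,
)
```

Coding each surveyed system as one such record is what makes the systematic comparisons of the following section possible.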
7.6 APPLYING THE DESIGN FRAMEWORK
In order to demonstrate the potential application of this design framework, we present three use cases in the following subsections. Specifically, we highlight the
framework’s ability to (1) categorize existing embodied educational games and simu-
lations, (2) identify problematic design spaces for embodied educational systems,
and (3) identify design gaps between different dimensions to generate novel embod-
ied educational designs.
EXAMPLE 7.1 CATEGORIZING EXISTING EMBODIED GAMES
AND SIMULATIONS
One fundamental feature of any framework is its descriptive capability. To
exemplify how designs of existing embodied learning games and simulations
can be described using the framework, we applied it to the 66 systems sur-
veyed. For each design, we assigned dimensional values and cataloged the
results (see Table 7.1). This methodical approach provided a means to system-
atically compare and contrast the different designs [38]. One important point to
note is that the framework does not perfectly partition every design into dimen-
sional values. There were some cases where multiple values within a dimen-
sion would match a single design or the design description would leave a
chosen value open to interpretation. However, these minor discrepancies are
acceptable, since the intention of a design framework is to make the designer
aware of important design choices and help them weigh the potential benefits
of these choices, rather than provide a set of arbitrary sorting bins [38].
During the analysis and cataloging process, groups of similar designs
emerged and were reasonably described by nine distinct categories (see Figures
7.4 and 7.5). We found the majority of reviewed designs (58 of 66) to be a very
good fit for one of the categories, despite all nine categories representing only
a small portion of the full design space expressed by the framework. Similar to
the assignment of dimensional values, categories are not absolute. Therefore,
we include designs with minor variations in a category so long as they fit
closely to the overall characteristics of that group.
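The tallying step behind these counts can be sketched as a short script; the handful of system-to-category pairings below are drawn from Table 7.1:

```python
from collections import Counter

# A small sample of the coded systems and their assigned categories
# (pairings taken from Table 7.1).
coded = {
    "Word Out!": "Full-Body Congruency",
    "Fingu": "Finger-Based Congruency",
    "MEteor": "Whole-Body Position",
    "Roomquake": "Embedded Phenomena",
    "Electronic Blocks": "Tangible Blocks",
    "Tangible Earth": "Tangible Spaces",
    "Wiimote Gravity": "Miscellaneous",
}

counts = Counter(coded.values())
fitted = sum(n for category, n in counts.items() if category != "Miscellaneous")
print(f"{fitted} of {len(coded)} designs fit one of the named categories")
```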
FIGURE 7.3 Three systems illustrating PDt transforms in different learning environments.
Left—physical actions are mapped as input into a virtual environment [83]. Middle—physical
actions are mapped as input into a mixed reality environment that is overlaid on physical space
[80]. Right—physical actions occur in a physical learning environment and are only tracked to
digitally maintain and display information related to the physical interaction [48].
7.6.1 Direct Embodied Physicality Categories
7.6.1.1 Full-Body Congruency
This category describes designs that employ full-body interactions with all or a por-
tion of the body being utilized as input into a mixed reality environment (Figure 7.4).
The mapping of input to output is discrete and sensor-based (e.g., utilizing some
form of IR or computer vision tracking), where players see augmented video feed-
back of themselves moving to match virtual objects or actions depicted on a screen.
The educational focus of these systems is on mirroring a learning concept through
bodily or gestural congruency, and instances include using the body to match shapes
of alphabet letters [37, 105, 152] and geometric shapes [96].
7.6.1.2 Finger-Based Congruency
This category is conceptually similar to full-body congruency in that the educational
focus of designs is on mirroring a learning concept through physical or gestural congruency. However, the interaction focus is instead on usage of fingers to achieve this
congruency. This results in an embedded mapping of input to output on a physical
device (e.g., tablet) where gameplay is situated in a virtual environment. Examples
of this design category include usage of fingers to represent the numbers in a part–
whole relation [21] and the velocity of a moving object [34].
7.6.2 Enacted Physicality Categories
7.6.2.1 Whole-Body Position
This category is one of the largest sets of systems categorized (12 designs)—focusing
on tracking simple aspects of a player’s body, such as their location in physical space,
to enact learning concepts in a mixed reality environment (Figure 7.4). These
FIGURE 7.4 A parallel coordinates graph showing the categories and corresponding design
choices found during analysis of existing systems that utilize Direct Embodied and Enacted
physicality.
systems typically rely on augmenting the physical space with a co-located mapping
of input through motion tracking and output through top-down projections [67, 80,
81] or through different modalities such as sound [13].
7.6.2.2 Embedded Phenomena
This category is a class of simulations that embed imaginary dynamic phenomena—
scientic or otherwise—into the physical space of classrooms [97]. As a result of this
design approach, interaction revolves around enacting techniques performed by real-world professionals in order to measure the phenomena through usage of devices
embedded into the physical classroom environment that provide augmented feedback about a specific phenomenon or scientific activity. Examples of this design
category include simulations of earthquake trilateration [97] and subterranean water
flow [103].
7.6.3 Manipulated Physicality Categories
7.6.3.1 Tangible Blocks
This category describes designs that utilize notions of tangibility and embodied inter-
action from HCI and TEI communities combined with concepts of modularity from
Computer Science (Figure 7.5). Players physically manipulate/program a set of tan-
gible blocks with embedded sensing capabilities and feedback systems. These blocks
interact within the physical environment and are usually symbolically representative
of computing concepts [125, 145, 146].
7.6.3.2 Tangible Tabletops
This category describes designs that similarly utilize notions of tangibility and
embodied interaction as Tangible Blocks, but instead focus on the usage of symbolic
FIGURE 7.5 A parallel coordinates graph showing the categories and corresponding design
choices found during analysis of existing systems that utilize Manipulated, Surrogate, and
Augmented physicality.
tangibles or gestures in conjunction with a virtual world displayed on an interactive
tabletop. The setups are commonly found in public spaces such as museums and
typically facilitate large-scale social interactions. Tangible tabletop designs have
been employed to teach educational concepts around energy consumption [40], sus-
tainability [9], nanoscale [99], and African concepts for mapping history [29].
7.6.3.3 Tangible Objects
This category describes designs that utilize various tangible objects and embodied
interaction as input into virtual learning environments. Physical manipulation of the
tangible object results in a discrete and intuitive mapping to a virtual representation
of learning content. Tangible object designs have been utilized to teach a variety of
concepts such as urban planning [129] and heart anatomy [130].
7.6.4 Surrogate Physicality Categories
7.6.4.1 Tangible Spaces
This category builds upon a space-centered view of tangible embodied interaction
where interactive spaces rely on combining physical space and tangible objects with
digital displays (Figure 7.5) [55]. The design focus is on creating a tangible physical
environment for the player to actively manipulate—complete with a physical surro-
gate avatar that the player controls—and discretely mapping physical changes in that
space to a virtual world that either mirrors or augments the physical one. Tangible
spaces have been used to teach programming [44], animal foraging behavior [48],
and diurnal motion of the sun [71].
7.6.5 Augmented Physicality Categories
7.6.5.1 “Touchless” Motion-Based
This category is another large set of categorized designs (11 systems), employing a discrete mapping of players' physical actions as input into a virtual world
(Figure 7.5). The use of a “touchless” interaction paradigm exploits sensing devices
which capture, track, and decipher body movements and gestures so that players do
not need to wear additional aids [23]. Unlike full-body congruency, the focus is not
on mirroring a learning concept through the body, but instead a player’s physical
actions are mapped to control a digital avatar in the virtual world. As a result, rather
than seeing a video of themselves, players will see silhouettes, digital avatars, or a
first-person perspective. These systems have been utilized to teach concepts around
geometric shapes [73], climate change [83], and peer-directed social behaviors [25].
EXAMPLE 7.2 IDENTIFYING PROBLEMATIC DESIGN SPACES
Another benet of the design framework is that it allows for systematic exami-
nation of design elements within existing systems, identifying potential prob-
lematic design spaces. As an example of this usage, we examine the Tangible
Earth system (see Figure 7.6) where the authors had to create and use an
assessment framework to identify and understand problems the system encoun-
tered [71]. Tangible Earth is designed to support learning of the sun’s diurnal
motion and Earth’s rotation. It consists of a doll-like avatar, a globe, and a
rotating table to represent the Earth and its rotation, an electric light representing the Sun, and a laptop running a VR universe simulator. Learners would physically manipulate the rotation of the Earth and the position/rotation of the avatar to observe simulated changes in the Sun's position from the avatar's perspective.
One of the more signicant problems identied by Kuzuoka et al. [71] for
Tangible Earth was that learners spent very little time looking at the tangibles
themselves (e.g., globe, lamp, and avatar), instead focusing primarily on the
VR simulation in the laptop. This proved to be especially problematic for
manipulation of the avatar, where users would frequently forget the position of
its body and orientation of its head. This often caused the Sun to appear or
disappear unexpectedly in the simulation, confusing learners and muddling the learning concepts. By analyzing this issue with the design framework, we identified a potentially problematic design space (see Figure 7.7). That is, learners had difficulty remembering the position of a physical agent representative of themselves (surrogate embodiment) because all of their physical actions were
mapped to digital effects (PDt transform) in a simulated world (virtual envi-
ronment) that directly mirrored the physical one (literal correspondence).
This difculty makes sense considering remembering the physical position/
orientation of a surrogate avatar in both the real world and the virtual world
simultaneously would introduce a signicant amount of extraneous cognitive
load [108]. As a result, the design framework suggests that the intersection of
surrogate embodiment, PDt transforms, literal correspondence, and virtual
environments is a problematic design space that should be carefully considered
when designing future embodied learning systems.
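This intersection test can be sketched as a simple predicate over coded designs. The dict encoding is our own illustration, as is the coding of the second example system:

```python
# Flag designs whose coding falls in the problematic space identified for
# Tangible Earth [71]: surrogate embodiment, PDt transform, literal
# correspondence, and a virtual environment.
def in_problematic_space(design: dict) -> bool:
    return (design.get("physicality") == "surrogate"
            and design.get("transform") == "PDt"
            and design.get("correspondence") == "literal"
            and design.get("environment") == "virtual")

tangible_earth = {"physicality": "surrogate", "transform": "PDt",
                  "correspondence": "literal", "environment": "virtual"}
# An illustrative (not the chapter's) coding of a touchless motion-based game:
a_mile_in_my_paws = {"physicality": "augmented", "transform": "PDt",
                     "correspondence": "symbolic", "environment": "virtual"}

print(in_problematic_space(tangible_earth))     # True
print(in_problematic_space(a_mile_in_my_paws))  # False
```

Running such a predicate across a coded corpus would surface any other surveyed or future designs sitting in the same troublesome intersection.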
FIGURE 7.6 The Tangible Earth embodied learning system by Kuzuoka et al. [71].
FIGURE 7.7 A problematic design space identified in Tangible Earth [71].
EXAMPLE 7.3 IDENTIFYING DESIGN GAPS
Perhaps the most important benefit of the design framework is that it allows for
methodically plugging existing systems back into the different dimensions to
identify gaps and unexplored terrain [38]. As an illustration of this usage, we
fill in an example pairing between the two dimensions of Physicality and
Transforms (see Figure 7.8). This provides examples of relevant combinations
between these two dimensions in the embodied learning systems literature that
was surveyed.
Examining Figure 7.8, there are a number of design gaps for existing embodied
learning games and simulations. Notably, most of these gaps highlight either
unexplored or undervalued research directions within the literature base they
were drawn from. Some of the more potentially useful pairings in the identified
design gaps are Direct Embodied + PPt, Manipulated + DPt, Surrogate + PPt, and
Surrogate + DPt, where interesting future system designs could evolve from
utilizing one of these pairings. For instance, using a Surrogate + PPt pairing
could lead to the design and research of physically embodied educational board
games. Additionally, a Surrogate + DPt pairing could lead to an asymmetric
computational thinking game where one player controls and interacts with a
physical avatar while another player digitally designs the physical courses and
obstacles for the first player to complete.
7.7 CONCLUSION
Embodiment is a concept that has gained widespread usage within communities such
as HCI, games, and learning science in recent years. However, the broad number of
academic domains that define and operationalize embodiment has often resulted in a
black box of design decisions. This inconsistent application of embodiment is a sub-
stantial drawback when trying to design embodied technology to support particular
use cases such as learning, where understanding the “why” of outcomes is essential.
Our design framework for embodied learning games and simulations is an important
tool that begins to open the black box of design decisions for embodied learning
technologies—mapping out a more precise understanding of how to incorporate
embodiment into the design of educational games and simulations. Ultimately, such
a framework helps to explicate high-level design decisions by providing a unifying
foundation for the description and categorization of embodied learning systems, and
encourages new designs by identifying problematic design spaces and unexplored
research/design terrain [38].
In this chapter, we created a design framework by collecting 66 exemplars of
embodied learning games and simulations, followed by the application of a bottom-up,
open coding method to distill seven core design dimensions. We then demonstrated
the design framework's importance for a variety of use cases. First, we categorized
existing embodied educational games and simulations, identifying nine
distinct categories that encompassed the majority of existing designs. Then, we
employed the framework to identify a problematic design space for an existing
embodied educational system, Tangible Earth [71], that was found to be ineffective
for learners. Finally, we concluded by utilizing the framework to identify design gaps
between the Physicality and Transforms dimensions—highlighting some of the par-
ticularly interesting gaps that are yet to be explored within the literature base sur-
veyed (e.g., embodied educational board games).
LIST OF CATEGORIZED EMBODIED LEARNING GAMES AND
SIMULATIONS
All 66 embodied learning game designs identified through the literature survey,
coded for each of the design framework dimensions, and organized into categories of
common design patterns are shown in Table 7.1. Values/design choices that do not
cleanly t into dened categories are highlighted.
FIGURE 7.8 Example pairings between the Physicality and Transform dimensions for a random subset of 20 surveyed systems.
TABLE 7.1
List Of Categorized Embodied Learning Games and Simulations.
Full-Body Congruency Finger-Based Congruency
Design
Word
Out!
[105,
152]
Mathematical
Imagery
Trainer
[3, 56]
Math
Propulsion
[96]
Hand
Velocity
[7]
Spatial
Ease
[37]
Finger
Velocity
[34]
Mirror
Fugue
[147,
148]
Andante
[148]
Fingu
[21]
AR
Guitar
[66]
Physical
Interaction
Physicality
Direct Embodied X X X X X X X X X X
Enacted X X X X
Manipulated X
Surrogate
Augmented
Transforms
PPt
PDt X X X X X X X X X X
DPt
Mapping
Discrete X X X X X
Co-located
Embedded X X X X X
Correspondence
Symbolic
Indexical
Literal X X X X X X X X X X
Social
Interaction Mode of Play
Individual X X X X X X X
Collaborative X X X
Competitive
Coordination
Other Player X X X
NPC
None X X X X X X X
World Environment
Physical X
Mixed X X X X X X
Virtual X X X
(Continued)
Whole-Body Position
Design Archimedes
[87]
Sound
Maker
[12, 13]
Wobble
Board
Game
[73]
MEteor
[79–81]
Springboard
[10, 11, 84]
CoCensus
[118]
Music
Paint
Machine
[101,
102]
Embodied
Poetry
[52, 67]
Lands
of
Fog
[98]
Learning
Physics
through
Play
Project
[39]
Social
Robot
[136]
Der
Schwarm
[51]
Physical
Interaction
Physicality
Direct
Embodied X X X
Enacted X X X X X X X X X X X
Manipulated
Surrogate
Augmented X
Transforms
PPt
PDt X X X X X X X X X X X X
DPt
Mapping
Discrete X X X X X
Co-located X X X X X X X
Embedded
Correspondence
Symbolic X X X
Indexical X
Literal X X X X X X X X
Social
Interaction
Mode of Play
Individual X X X X X X
Collaborative X X X X X X
Competitive
Coordination
Other Player X X X X X X
NPC X X X
None X X X X
World Environment
Physical
Mixed X X X X X X X X X X X
Virtual X
TABLE 7.1 (Continued)
List Of Categorized Embodied Learning Games and Simulations.
(Continued)
Embedded Phenomena Tangible Blocks
Design
Roomquake
[97]
Espaistory
[86]
AquaRoom
[103]
Astronaut
Challenge
[65]
WallCology
[65]
Note
Code
[70]
Electronic
Blocks
[145, 146]
roBlocks
[125]
Buildasound
[117]
Physical
Interaction
Physicality
Direct
Embodied
Enacted X X X X X
Manipulated X X X X X X X
Surrogate
Augmented
Transforms
PPt X X X X
PDt X X X X X
DPt X
Mapping
Discrete
Co-located
Embedded X X X X X X X X X
Correspondence
Symbolic X X X X
Indexical
Literal X X X X X
Social
Interaction
Mode of Play
Individual X X
Collaborative X X X X X X X
Competitive
Coordination
Other Player X X X X X X
NPC X
None X X
World Environment
Physical X X X X X X X
Mixed X
Virtual X
TABLE 7.1 (Continued)
List Of Categorized Embodied Learning Games and Simulations.
(Continued)
Tangible Tabletops Tangible Objects
Design
Nano
Zoom
[99]
Mapping
Place
[29]
LightTable
[42, 113,
114]
Flow of
Electrons
[32, 33]
Youtopia
[17,
144]
Touch
Wire
[123]
Futura
[9,
14,
132]
Eco
Planner
[40]
MoSo
Tangibles
[19, 20]
Phono
Blocks
[43]
Paper-
Based
Urban
Planning
[129]
Kurio
[2]
TUI
Heart
[130]
Physical
Interaction
Physicality
Direct
Embodied X
Enacted
Manipulated X X X X X X X X X X X X X
Surrogate
Augmented
Transforms
PPt
PDt X X X X X X X X X X X X X
DPt
Mapping
Discrete X X X X X X
Co-located
Embedded X X X X X X X X X
Correspondence
Symbolic X X X X
Indexical
Literal X X X X X X X X X
Social
Interaction
Mode of Play
Individual X X X X
Collaborative X X X X X X X X X
Competitive
Coordination
Other Player X X X X X X X X X
NPC
None X X X X
World Environment
Physical X X
Mixed X X X
Virtual X X X X X X X X
TABLE 7.1 (Continued)
List Of Categorized Embodied Learning Games and Simulations.
(Continued)
Tangible Spaces Touchless Motion-Based
Design
Tangible
Programming
Space [44]
Tangible
Earth
[71]
Hunger
Games
[48]
Snark
[115]
Bump
Bash
[23]
A Mile
in My
Paws
[82,
83]
Target
Kick
[23]
Kinems
[69]
Children
Make
Terrible
Pets
[54]
Motion
Autism
Game
[25]
BESIDE
[153]
Kinect
Recycling
Game
[59]
TASC
[28]
Sorter
Game
[73]
Physical
Interaction
Physicality
Direct
Embodied
Enacted
Manipulated X
Surrogate X X X
Augmented X X X X X X X X X X X
Transforms
PPt
PDt X X X X X X X X X X X X X X
DPt
Mapping
Discrete X X X X X X X X X X X X X X
Co-located
Embedded
Correspondence
Symbolic X X X X X X X X X X X
Indexical
Literal X X X
Social
Interaction
Mode of Play
Individual X X X X X X
Collaborative X X X X X X X
Competitive X X X
Coordination
Other Player X X X X X X X
NPC X X X
None X X X X X
World Environment
Physical X
Mixed X
Virtual X X X X X X X X X X X X
TABLE 7.1 (Continued)
List Of Categorized Embodied Learning Games and Simulations.
(Continued)
Miscellaneous
Design
Scratch Direct
Embodiment
[41]
Wiimote
Gravity
[6]
EstimateIT!TM
[18]
Tangible
Landscape
[49, 106]
Embodied
Puppet
[31, 91]
EarthShake
[149–151]
ALERT [27,
28]
Human
SUGOROKU
[5, 100]
Direct
Embodied
Physical
Interaction
Physicality
Enacted X
Manipulated X X X
Surrogate X
Augmented X X X
Transforms
PPt X X X X
PDt X X X X X X X
DPt X
Mapping
Discrete X X X X X X
Co-located X
Embedded X X
Correspondence
Symbolic X X X X
Indexical X
Literal X X X
Social
Interaction
Mode of Play
Individual X X X
Collaborative X X X X X
Competitive X
Coordination
Other Player X X X X X X
NPC
None X X X
World Environment
Physical X X X
Mixed X X X X X
Virtual X
TABLE 7.1 (Continued)
List Of Categorized Embodied Learning Games and Simulations.
REFERENCES
[1] Olivier Oullier, Gonzalo C. De Guzman, Kelly J. Jantzen, Julien Lagarde, and J. A. Scott
Kelso. 2008. Social coordination dynamics: Measuring human bonding. Social Neuroscience
3, 2 (2008).
[2] Ron Wakkary, Marek Hatala, Kevin Muise, Karen Tanenbaum, Greg Corness, Bardia
Mohabbati, and Jim Budd. 2009. Kurio: A museum guide for families. In Proceedings of
the third international conference on Tangible, embedded, and embodied interaction -
TEI’09. 215–222.
[3] Dor Abrahamson. 2014. Building educational activities for understanding: An elabora-
tion on the embodied-design framework and its epistemic grounds. International
Journal of Child-Computer Interaction 2, 1, 1–16.
[4] Dor Abrahamson and Arthur Bakker. 2016. Making sense of movement in embodied
design for mathematics learning. Cognitive Research: Principles and Implications 1, 1, 33.
[5] Takayuki Adachi, M Goseki, K Muratsu, Hiroshi Mizoguchi, Miki Namatame, Masanori
Sugimoto, Fusako Kusunoki, Etsuji Yamaguchi, Shigenori Inagaki, and Yoshiaki
Takeda. 2013. Human SUGOROKU: Full-body interaction system for students to learn
vegetation succession. In Interaction Design and Children. 364–367.
[6] Zeynep Ahmet, Martin Jonsson, Saiful Islam Sumon, and Lars Erik Holmquist. 2011.
Supporting embodied exploration of physical concepts in mixed digital and physical
interactive settings. In Proceedings of TEI ’11.
[7] Stamatina Anastopoulou, Mike Sharples, and Chris Baber. 2011. An evaluation of mul-
timodal interactions with technology while learning science concepts. British Journal of
Educational Technology 42, 2, 266–290.
[8] Alissa N Antle. 2007. The CTI framework: Informing the design of tangible systems for
children. In Proceedings of the 1st International Conference on Tangible and Embedded
Interaction. ACM, 195–202.
[9] Alissa N Antle, Allen Bevans, Josh Tanenbaum, Katie Seaborn, and Sijie Wang. 2011.
Futura: Design for collaborative learning and game play on a multi-touch digital table-
top. In Proceedings of the fth international conference on Tangible, embedded, and
embodied interaction. ACM, 93–100.
[10] Alissa N Antle, Greg Corness, and Allen Bevans. 2011. Springboard: Designing image
schema based embodied interaction for an abstract domain. In Whole Body Interaction.
Springer, 7–18.
[11] Alissa N Antle, Greg Corness, and Allen Bevans. 2013. Balancing justice: Comparing
whole body and controller-based interaction for an abstract domain. International
Journal of Arts and Technology 6, 4, 388–409.
[12] Alissa N Antle, Greg Corness, and Milena Droumeva. 2008. What the body knows:
Exploring the benets of embodied metaphors in hybrid physical digital environments.
Interacting with Computers 21, 1–2, 66–75.
[13] Alissa N Antle, Milena Droumeva, and Greg Corness. 2008. Playing with the sound
maker: Do embodied metaphors help children learn?. In Proceedings of the 7th interna-
tional conference on Interaction design and children. ACM, 178–185.
[14] Alissa N Antle, Joshua Tanenbaum, Allen Bevans, Katie Seaborn, and Sijie Wang. 2011.
Balancing act: Enabling public engagement with sustainability issues through a multi-
touch tabletop collaborative game. In IFIP Conference on Human-Computer Interaction.
Springer, 194–211.
[15] Alissa N Antle and Sijie Wang. 2013. Comparing motor-cognitive strategies for spatial
problem solving with tangible and multi-touch interfaces. In Proceedings of the 7th
International Conference on Tangible, Embedded and Embodied Interaction. ACM,
65–72.
[16] Alissa N Antle and Alyssa F Wise. 2013. Getting down to details: Using theories of
cognition and learning to inform tangible user interface design. Interacting with
Computers 25, 1 (2013), 1–20.
[17] Alissa N Antle, Alyssa F Wise, Amanda Hall, Saba Nowroozi, Perry Tan, Jillian Warren,
Rachael Eckersley, and Michelle Fan. 2013. Youtopia: A collaborative, tangible, multi-
touch, sustainability learning activity. In Proceedings of the 12th International
Conference on Interaction Design and Children. ACM, 565–568.
[18] Ivon Arroyo, Matthew Micciollo, Jonathan Casano, Erin Ottmar, Taylyn Hulse, and Ma
Mercedes Rodrigo. 2017. Wearable learning: multiplayer embodied games for math. In
Proceedings of the Annual Symposium on Computer-Human Interaction in Play. ACM,
205–216.
[19] Saskia Bakker, Alissa N Antle, and Elise Van Den Hoven. 2012. Embodied metaphors
in tangible interaction design. Personal and Ubiquitous Computing 16, 4, 433–449.
[20] Saskia Bakker, Elise Van Den Hoven, and Alissa N Antle. 2011. MoSo tangibles:
Evaluating embodied learning. In Proceedings of the Fifth International Conference On
Tangible, Embedded, And Embodied Interaction. ACM, 85–92.
[21] Wolmet Barendregt and Berner Lindström. 2012. Development and evaluation of Fingu:
A mathematics iPad game using multi-touch interaction. In Proceedings of the 11th
International Conference on Interaction Design and Children. 204–207.
[22] Lawrence W Barsalou, Paula M Niedenthal, Aron K Barbey, and Jennifer A Ruppert.
2003. Social embodiment. Psychology of Learning and Motivation 43 (2003), 43–92.
[23] Laura Bartoli, Clara Corradi, Franca Garzotto, and Matteo Valoriani. 2013. Exploring
motion-based touchless games for autistic children’s learning. In Proceedings of the
12th International Conference on Interaction Design and Children. 102–111.
[24] G E Baykal, I Veryeri Alaca, A E Yantaç, and T Göksun. 2018. A review on complemen-
tary natures of tangible user interfaces (TUIs) and early spatial learning. International
Journal of Child-Computer Interaction (2018).
[25] Arpita Bhattacharya, Mirko Gelsomini, Patricia Pérez-Fuster, Gregory D Abowd, and
Agata Rozga. 2015. Designing motion-based activities to engage students with Autism
in classroom settings. In IDC 2015. 69–78.
[26] J. B. Black, A. Segal, J. Vitale, and C. L. Fadjo. 2012. Embodied cognition and learning
environment design. In Theoretical Foundations of Learning Environments. 198–223.
[27] Winslow S Burleson, Danielle B Harlow, Katherine J Nilsen, Ken Perlin, Natalie Freed,
Camilla Norgaard Jensen, Byron Lahey, Patrick Lu, and Kasia Muldner. 2018. Active
learning environments with robotic tangibles: Children's physical and virtual spaces for
science learning. IEEE Transactions on Learning Technologies 11, 1 (2018), 96–106.
[28] Jack Shen-Kuen Chang. 2017. The design and evaluation of embodied interfaces for
supporting spatial ability. In Proceedings of the Eleventh International Conference on
Tangible, Embedded, and Embodied Interaction. ACM. 681–684.
[29] Jean Ho Chu, Paul Clifton, Daniel Harley, Jordanne Pavao, and Ali Mazalek. 2015.
Mapping place: Supporting cultural learning through a lukasa-inspired tangible table-
top museum exhibit. In Proceedings of the 9th International Conference on Tangible,
Embedded, and Embodied Interaction - TEI ’15. 261–268.
[30] Andy Clark. 2008. Supersizing the mind: Embodiment, action, and cognitive extension.
Vol. 20. 286.
[31] Paul Clifton. 2014. Designing embodied interfaces to support spatial ability. In
Proceedings of TEI ’14. 309–312.
[32] Bettina Conradi, Martin Hommer, and Robert Kowalski. 2010. From digital to physical:
Learning physical computing on interactive surfaces. In ACM International Conference
on Interactive Tabletops and Surfaces. 249–250.
188 Software Engineering Perspectives in Computer Game Development
[33] Bettina Conradi, Verena Lerch, Martin Hommer, Robert Kowalski, Ioanna Vletsou, and
Heinrich Hussmann. 2011. Flow of electrons: An augmented workspace for learning
physical computing experientially. In Proceedings of the ACM International Conference
on Interactive Tabletops and Surfaces - ITS’11. 182–191.
[34] Mattias Davidsson. 2014. Finger Velocity–A Multimodal Touch Based Tablet
Application for Learning the Physics of Motion. In International Conference on Mobile
and Contextual Learning. Springer, 238–249.
[35] Paul Dourish. 2004. Where the Action is: The Foundations of Embodied Interaction.
MIT press.
[36] Darren Edge and Alan Blackwell. 2006. Correlates of the cognitive dimensions for tan-
gible user interface. Journal of Visual Languages & Computing 17, 4, 366–394.
[37] Darren Edge, Kai-yin Cheng, and Michael Whitney. 2013. SpatialEase: Learning lan-
guage through body motion. In Proceedings of the SIGCHI Conference on Human
Factors in Computing Systems (CHI’13). 469–472.
[38] Barrett Ens and Juan David Hincapié-Ramos. 2014. Ethereal planes: A design framework
for 2D information spaces in 3D mixed reality environments. In Proceedings of the
2nd ACM Symposium on Spatial User Interaction.
[39] Noel Enyedy, Joshua A Danish, Girlie Delacruz, and Melissa Kumar. 2012. Learning
physics through play in an augmented reality environment. International Journal of
Computer-Supported Collaborative Learning 7, 3, 347–378.
[40] Augusto Esteves and Ian Oakley. 2011. Design for interface consistency or embodied
facilitation?. In CHI 2011 Embodied Interaction: Theory and Practice in HCI Workshop.
1–4.
[41] Cameron L. Fadjo and John B. Black. 2012. You’re in the game: Direct embodiment and
computational artifact construction. In Proceedings of the International Conference of
the Learning Sciences: Future of Learning (Vol. 2: Symposia).
[42] Taciana Pontual Falcão and Sara Price. 2011. Interfering and resolving: How tabletop
interaction facilitates co-construction of argumentative knowledge. International
Journal of Computer-Supported Collaborative Learning 6, 4, 539–559.
[43] Min Fan, Alissa N Antle, Maureen Hoskyn, Carman Neustaedter, and Emily S Cramer.
2017. Why tangibility matters: A design case study of at-risk children learning to read
and spell. In Proceedings of the 2017 CHI Conference on Human Factors in Computing
Systems. ACM, 1805–1816.
[44] Ylva Fernaeus and Jakob Tholander. 2006. Finding design qualities in a Tangible pro-
gramming space. In CHI 2006 Proceedings’06. 447–456.
[45] Kerstin Fischer, Katrin Lohan, and Kilian Foth. 2012. Levels of embodiment: Linguistic
analyses of factors influencing HRI. In 7th ACM/IEEE International Conference on
Human-Robot Interaction (HRI), 2012. IEEE, 463–470.
[46] Kenneth P. Fishkin. 2004. A taxonomy for and analysis of tangible interfaces. Personal
and Ubiquitous Computing 8, 5, 347–358.
[47] Arthur M Glenberg. 2010. Embodiment as a unifying perspective for psychology. Wiley
Interdisciplinary Reviews: Cognitive Science 1, 4 (2010), 586–596.
[48] Alessandro Gnoli, Anthony Perritano, Paulo Guerra, Brenda Lopez, Joel Brown, and
Tom Moher. 2014. Back to the future: Embodied Classroom Simulations of Animal
Foraging. In Proceedings of the 8th International Conference on Tangible, Embedded
and Embodied Interaction - TEI’14. 275–282.
[49] Brendan Alexander Harmon. 2016. Embodied spatial thinking in tangible computing. In
Proceedings of the TEI’16: Tenth International Conference on Tangible, Embedded, and
Embodied Interaction. ACM, 693–696.
[50] Steve Harrison, Deborah Tatar, and Phoebe Sengers. 2007. The three paradigms of HCI.
In Alt. Chi. Session at the SIGCHI Conference on Human Factors in Computing Systems
San Jose, California, USA. 1–18.
[51] Anja Hashagen, Corinne Büching, and Heidi Schelhowe. 2009. Learning abstract con-
cepts through bodily engagement. In Proceedings of the 8th International Conference on
Interaction Design and Children - IDC’09. 234.
[52] Sarah Hatton, Ellen Campana, Andreea Danielescu, and David Birchfield. 2009.
Stratification: Embodied poetry works by high school students. In Proceedings of the 5th
ACM Conference on Creativity & Cognition. 463–464.
[53] Douglas L Holton. 2010. Constructivism + embodied cognition = enactivism: theoreti-
cal and practical implications for conceptual change. In AERA 2010 Conference.
[54] Bruce D Homer, Charles K Kinzer, Jan L Plass, Susan M Letourneau, Dan Hoffman,
Meagan Bromley, Elizabeth O Hayward, Selen Turkay, and Yolanta Kornak. 2014.
Moved to learn: The effects of interactivity in a Kinect-based literacy game for begin-
ning readers. Computers & Education 74 (2014), 37–49.
[55] Eva Hornecker and Jacob Buur. 2006. Getting a grip on tangible interaction: A frame-
work on physical space and social interaction. In Proceedings of the SIGCHI Conference
on Human Factors in Computing Systems. ACM, 437–446.
[56] Mark Howison, Dragan Trninic, Daniel Reinholz, and Dor Abrahamson. 2011. The
mathematical imagery trainer: From Embodied Interaction to Conceptual Learning. In
Proceedings of the 2011 annual conference on Human factors in computing systems
- CHI’11.
[57] Caroline Hummels and Jelle Van Dijk. 2015. Seven principles to design for embodied
sensemaking. In Proceedings of the Ninth International Conference on Tangible,
Embedded, and Embodied Interaction. ACM, 21–28.
[58] Jörn Hurtienne. 2009. Image schemas and design for intuitive use.
[59] José de Jesús Luis González Ibáñez and Alf Inge Wang. 2015. Learning recycling from
playing a Kinect game. International Journal of Game-Based Learning (IJGBL) 5, 3,
25–44.
[60] Thea Ionescu and Dermina Vasc. 2014. Embodied cognition: challenges for psychology
and education. Procedia-Social and Behavioral Sciences 128 (2014), 275–280.
[61] Hiroshi Ishii. 2008. Tangible bits: Beyond pixels. In Proceedings of the 2nd
International Conference on Tangible and Embedded Interaction (TEI'08).
[62] Mark Johnson. 2013. The Body in the Mind: The Bodily Basis of Meaning, Imagination,
and Reason. University of Chicago Press.
[63] Mina C Johnson-Glenberg, David A Birchfield, Lisa Tolentino, and Tatyana Koziupa.
2014. Collaborative embodied learning in mixed reality motion-capture environments:
Two science studies. Journal of Educational Psychology 106, 1 (2014), 86.
[64] E Kain. 2016. 'Pokémon Go' is the biggest mobile game in US history - and it's about to
top Snapchat. (2016). https://www.forbes.com/sites/erikkain/2016/07/13/pokemon-go-
is-the-biggest-mobile-game-in-us-history-and-its-about-to-top-snapchat/
[65] Fengfeng Ke and Peter Carafano. 2016. Collaborative science learning in an immersive
flight simulation. Computers & Education 103 (2016), 114–123.
[66] Joseph R. Keebler, Travis J. Wiltshire, Dustin C. Smith, and Stephen M. Fiore. 2013.
Picking up STEAM: Educational implications for teaching with an augmented reality
guitar learning system. Lecture Notes in Computer Science (including subseries Lecture
Notes in Articial Intelligence and Lecture Notes in Bioinformatics) 8022 LNCS, PART
2 (2013), 170–178.
[67] A Kelliher, D Birchfield, E Campana, S Hatton, M Johnson-Glenberg, C Martinez, L
Olson, P Savvides, L Tolentino, K Phillips, and S Uysal. 2009. SMALLab: A mixed-reality
environment for embodied and mediated learning. In MM’09 - Proceedings of the 2009
ACM Multimedia Conference, with Co-located Workshops and Symposiums. 1029–1031.
[68] Lora Kolodny. 2016. Osmo raises $24 million from Sesame, Mattel and other big names
in toys and education. (2016). https://techcrunch.com/2016/12/08/osmo-raises-
24-million-from-sesame-mattel-and-other-big-names-in-toys-and-education/
[69] Maria Kourakli, Ioannis Altanis, Symeon Retalis, Michail Boloudakis, Dimitrios
Zbainos, and Katerina Antonopoulou. 2017. Towards the improvement of the cognitive,
motoric and academic skills of students with special educational needs using Kinect
learning games. International Journal of Child-Computer Interaction 11 (2017),
28–39.
[70] Vishesh Kumar, Tuhina Dargan, Utkarsh Dwivedi, and Poorvi Vijay. 2015. Note code –
A tangible music programming puzzle tool. In Proceedings of the 10th International
Conference on Tangible, Embedded, and Embodied Interaction - TEI’15. 625–629.
[71] Hideaki Kuzuoka, Naomi Yamashita, Hiroshi Kato, Hideyuki Suzuki, and Yoshihiko
Kubota. 2014. Tangible earth: Tangible learning environment for astronomy education.
In Proceedings of the Second International Conference on Human-Agent Interaction.
23–27.
[72] Sonya S Kwak, Yunkyung Kim, Eunho Kim, Christine Shin, and Kwangsu Cho. 2013.
What makes people empathize with an emotional robot?: The impact of agency and
physical embodiment on human empathy for a robot. In 2013 IEEE RO-MAN. IEEE,
180–185.
[73] Chronis Kynigos, Zacharoula Smyrnaiou, and Maria Roussou. 2010. Exploring rules
and underlying concepts while engaged with collaborative full-body games. In
Proceedings of the 9th International Conference on Interaction Design and Children.
222.
[74] Byron Lahey, Winslow Burleson, Camilla Nørgaard Jensen, Natalie Freed, and Patrick
Lu. 2008. Integrating video games and robotic play in physical environments. In
Proceedings of the 2008 ACM SIGGRAPH Symposium on Video Games - Sandbox’08,
Vol. 1. 107.
[75] George Lakoff and Mark Johnson. 2008. Metaphors We Live By. University of Chicago
Press.
[76] Joanne Leong, Florian Perteneder, Hans-Christian Jetter, and Michael Haller. 2017.
What a life!: Building a framework for constructive assemblies. In Proceedings of the
Tenth International Conference on Tangible, Embedded, and Embodied Interaction.
ACM, 57–66.
[77] Qing Li. 2012. Understanding enactivism: a study of affordances and constraints of
engaging practicing teachers as digital game designers. Educational Technology
Research and Development 60, 5, 785–806. http://link.springer.com/10.1007/
s11423-012-9255-4
[78] William Lidwell and Gerry Manacsa. 2011. Deconstructing Product Design: Exploring
the Form, Function, Usability, Sustainability, and Commercial Success of 100 Amazing
Products. Rockport Pub.
[79] Robb Lindgren and J Michael Moshell. 2011. Supporting children’s learning with body-
based metaphors in a mixed reality environment. In Proceedings of the 10th International
Conference on Interaction Design and Children - IDC’11, Vol. Ann Arbor. 177–180.
[80] Robb Lindgren, Michael Tscholl, and J Michael Moshell. 2013. MEteor: Developing
physics concepts through body- based interaction with a mixed reality simulation. In
Physics Education Research Conference - PERC’13. 217–220.
[81] Robb Lindgren, Michael Tscholl, Shuai Wang, and Emily Johnson. 2016. Enhancing
learning and engagement through embodied interaction within a mixed reality simula-
tion. Computers & Education 95 (2016), 174–187.
[82] Leilah Lyons. 2014. Exhibiting data: Using body-as-interface designs to engage visitors
with data visualizations. In Learning Technologies and the Body: Integration and
Implementation in Formal and Informal Learning Environments. Taylor & Francis
Group
[83] Leilah Lyons, Brenda Lopez Silva, Tom Moher, Priscilla Jimenez Pazmino, and Brian
Slattery. 2013. Feel the burn: Exploring design parameters for effortful interaction for
educational games. In Proceedings of the 12th International Conference on Interaction
Design and Children- IDC’13. 400–403.
[84] Anna Macaranas, Alissa N Antle, and Bernhard E Riecke. 2015. What is intuitive inter-
action? balancing users’ performance and satisfaction with natural user interfaces.
Interacting with Computers 27, 3 (2015), 357–370.
[85] Peter Malcolm, Tom Moher, Darshan Bhatt, Brian Uphoff, and Brenda López-Silva.
2008. Embodying scientic concepts in the physical space of the classroom. In
Proceedings of the 7th International Conference on Interaction Design and Children.
234–241.
[86] Laura Malinverni, Julian Maya, Marie-Monique Schaper, and Narcis Pares. 2017. The
World-as-Support: Embodied exploration, understanding and meaning-making of the
augmented world. In Proceedings of the 2017 CHI Conference on Human Factors in
Computing Systems. ACM, 5132–5144.
[87] Laura Malinverni, Brenda López Silva, and Narcís Parés. 2012. Impact of embodied
interaction on learning processes: design and analysis of an educational application
based on physical activity. In Proceedings of the 11th International Conference on
Interaction Design and Children. ACM, 60–69.
[88] Paul Marshall. 2007. Do tangible interfaces enhance learning?. In Proceedings Of The
1st International Conference on Tangible and Embedded Interaction. ACM, 163–170.
[89] Paul Marshall, Alissa Antle, Elise Van Den Hoven, and Yvonne Rogers. 2013.
Introduction to the special issue on the theory and practice of embodied interaction in
HCI and interaction design. ACM Transactions on Computer-Human Interaction
(TOCHI) 20, 1, 1.
[90] Paul Marshall, Sara Price, and Yvonne Rogers. 2003. Conceptualising tangibles to sup-
port learning. In IDC ’03: Proceedings of the 2003 Conference on Interaction Design
and Children. 101–109.
[91] Ali Mazalek, Sanjay Chandrasekharan, Michael Nitsche, Tim Welsh, Paul Clifton,
Andrew Quitmeyer, Firaz Peer, Friedrich Kirschner, and Dilip Athreya. 2011. I’m in the
Game: Embodied Puppet Interface Improves Avatar Control. In Proceedings of the
Fifth International Conference on Tangible, Embedded, and Embodied Interaction -
Tei’11. 129–136.
[92] Edward Melcer and Katherine Isbister. 2016. Bridging the physical divide: A design
framework for embodied learning games and simulations. In Proceedings of the 2016
CHI Conference Extended Abstracts on Human Factors in Computing Systems.
doi:http://dx.doi.org/10.1145/2851581.2892455
[93] Edward F Melcer, Victoria Hollis, and Katherine Isbister. 2017. Tangibles vs. mouse in
educational programming games: Inuences on enjoyment and self-beliefs. In CHI’17
Extended Abstracts. ACM, Denver, CO, USA.
[94] Edward F Melcer and Katherine Isbister. 2016. Bridging the physical learning divides:
A design framework for embodied learning games and simulations. In Proceedings of
the 1st International Joint Conference of DiGRA and FDG.
[95] Edward F Melcer and Katherine Isbister. 2018. Bots & (Main)Frames: Exploring the
impact of tangible blocks and collaborative play in an educational programming game.
In Proceedings of the 2018 CHI Conference on Human Factors in Computing Systems.
ACM, Montreal, QC, Canada.
[96] Jason Mickelson and Wendy Ju. 2011. Math propulsion: Engaging math learners
through embodied performance & visualization. In Proceedings of the Fifth International
Conference on Tangible, Embedded, and Embodied Interaction - TEI’11. 101.
[97] Tom Moher, Syeda Hussain, Tim Halter, and Debi Kilb. 2005. Roomquake: Embedding
dynamic phenomena within the physical space of an elementary school classroom. In
Proceedings of ACM CHI 2005 Conference on Human Factors in Computing Systems,
Vol. 2. 1665–1668.
[98] Joan Mora-Guiard, Ciera Crowell, Narcis Pares, and Pamela Heaton. 2017. Sparking
social initiation behaviors in children with Autism through full-body Interaction.
International Journal of Child-Computer Interaction 11 (2017), 62–71.
[99] Joan Mora-Guiard and Narcis Pares. 2014. Child as the measure of all things: The body as a
referent in designing a museum exhibit to understand the nanoscale. In IDC’14.
[100] Tomohiro Nakayama, Takayuki Adachi, Keita Muratsu, Hiroshi Mizoguchi, Miki
Namatame, Masanori Sugimoto, Fusako Kusunoki, Etsuji Yamaguchi, Shigenori
Inagaki, and Yoshiaki Takeda. 2014. Human SUGOROKU: Learning support system of
vegetation succession with full-body interaction interface. In Proceedings of the SIGCHI
Conference on Human Factors in Computing Systems (CHI’14). 2227–2232.
[101] Luc Nijs and Marc Leman. 2014. Interactive technologies in the instrumental music
classroom: A longitudinal study with the Music Paint Machine. Computers & Education
73 (2014), 40–59.
[102] Luc Nijs, Bart Moens, Micheline Lesaffre, and Marc Leman. 2012. The Music Paint
Machine: stimulating self-monitoring through the generation of creative visual output
using a technology-enhanced learning tool. Journal of New Music Research 41, 1,
79–101.
[103] Francesco Novellis and Tom Moher. 2011. How real is ‘real enough’? designing arti-
facts and procedures for embodied simulations of science practices. In Proceedings of
the 10th International Conference on Interaction Design and Children. 90–98.
[104] C. O’Malley and S. Fraser. 2004. Literature review in learning with tangible technolo-
gies. Technical Report. 1–52 pages.
[105] Felicia Clare Paul, Christabel Goh, and Kelly Yap. 2015. Get creative with learning:
Word out! a full body interactive game. In Proceedings of the 33rd Annual ACM
Conference Extended Abstracts on Human Factors in Computing Systems - CHI EA’15.
[106] Anna Petrasova, Brendan Harmon, Vaclav Petras, and Helena Mitasova. 2014. GIS-
based environmental modeling with tangible interaction and dynamic visualization. In
Proceedings of the 7th International Congress on Environmental Modelling and
Software, San Diego, CA, USA. 15–19.
[107] Catherine Plaisant, David Carr, and Ben Shneiderman. 1995. Image-browser taxonomy
and guidelines for designers. IEEE Software 12, 2, 21–32. doi:http://dx.doi.
org/10.1109/52.368260
[108] Jan L. Plass, Roxana Moreno, and Roland Brünken. 2010. Cognitive Load Theory.
Cambridge University Press.
[109] Jan L. Plass, Paul A. O’Keefe, Bruce D. Homer, Jennifer Case, Elizabeth O. Hayward,
Murphy Stein, and Ken Perlin. 2013. The impact of individual, competitive, and col-
laborative mathematics game play on learning, performance, and motivation. Journal of
Educational Psychology 105, 4, 1050–1066.
[110] Wim TJL Pouw, Tamara Van Gog, and Fred Paas. 2014. An embedded and embodied
cognition review of instructional manipulatives. Educational Psychology Review 26, 1,
51–72.
[111] Blaine A. Price, Ronald M. Baecker, and Ian S. Small. 1993. A principled taxonomy of
software visualization. Journal of Visual Languages & Computing 4, 3, 211–266.
doi:http://dx.doi.org/10.1006/jvlc.1993.1015
[112] Sara Price. 2008. A representation approach to conceptualizing tangible learning
environments. In Proceedings of the 2nd International Conference on Tangible and
Embedded Interaction - TEI'08. 151. doi:http://dx.doi.org/10.1145/1347390.1347425
[113] Sara Price, Taciana Pontual Falcão, Jennifer G Sheridan, and George Roussos. 2009.
The effect of representation location on interaction in a tangible learning environment.
In Proceedings of the 3rd International Conference on Tangible and Embedded
Interaction. ACM, 85–92.
[114] Sara Price and Carey Jewitt. 2013. A multimodal approach to examining 'embodiment' in
tangible learning environments. In Proceedings of the 7th International Conference on
Tangible, Embedded and Embodied Interaction. ACM, 43–50.
[115] Sara Price, Yvonne Rogers, M. Scaife, D. Stanton, and H. Neale. 2003. Using tangibles
to promote novel forms of playful learning. Interacting with Computers 15, 2 (2003),
169–185.
[116] Sara Price, Jennifer G Sheridan, Taciana Pontual Falcao, and George Roussos. 2008.
Towards a framework for investigating tangible environments for learning. International
Journal of Arts and Technology 1, 3–4, 351–368.
[117] Mónica Rikić. 2013. Buildasound. In Proceedings of the 7th International Conference
on Tangible, Embedded, and Embodied Interaction - TEI ’13. 395–396.
[118] Jessica Roberts and Leilah Lyons. 2017. The value of learning talk: applying a novel
dialogue scoring method to inform interaction design in an open-ended, embodied
museum exhibit. International Journal of Computer-Supported Collaborative Learning
12, 4, 343–376.
[119] Warren Robinett. 1992. Synthetic Experience: A Taxonomy, Survey of Earlier Thought,
and Speculations on the Future. Technical Report. 1–30 pages.
[120] Yvonne Rogers and Henk Muller. 2006. A framework for designing sensor-based inter-
actions to promote exploration and reection in play. International Journal of Human-
Computer Studies 64, 1, 1–14.
[121] Yvonne Rogers, Mike Scaife, Silvia Gabrielli, Hilary Smith, and Eric Harris. 2002. A
conceptual framework for mixed reality environments: Designing novel learning activi-
ties for young children. Presence 11, 6, 677–686.
[122] Tim Rohrer. 2007. The body in space: Dimensions of embodiment. In Body, Language
and Mind. 339–378.
[123] Michael Saenz, Joshua Strunk, Sharon Lynn Chu, and Jinsil Hwaryoung Seo. 2015.
Touch wire: Interactive tangible electricity game for kids. In Proceedings of the 9th
International Conference on Tangible, Embedded, and Embodied Interaction-TEI’15.
655–659.
[124] Paul Schermerhorn and Matthias Scheutz. 2011. Disentangling the effects of robot
affect, embodiment, and autonomy on human team members in a mixed-initiative task.
In Proceedings from the International Conference on Advances in Computer-Human
Interactions. Citeseer, 236–241.
[125] Eric Schweikardt and Mark D Gross. 2008. The robot is the program: Interacting with
roBlocks. In Proceedings of the Second International Conference on Tangible, Embedded,
and Embodied Interaction - TEI'08. 167–168.
[126] Elena Márquez Segura, Michael Kriegel, Ruth Aylett, Amol Deshmukh, and Henriette
Cramer. 2012. How do you like me in this: user embodiment preferences for companion
agents. In International Conference on Intelligent Virtual Agents. Springer, 112–125.
[127] Orit Shaer, Nancy Leland, Eduardo H Calvillo-Gamez, and Robert J K Jacob. 2004. The
TAC paradigm: specifying tangible user interfaces. Personal and Ubiquitous Computing
8, 5 (2004), 359–369.
[128] Lawrence Shapiro. 2010. Embodied Cognition. Routledge.
[129] Tia Shelley, Leilah Lyons, Moira Zellner, and Emily Minor. 2011. Evaluating the
embodiment benets of a paper-based tui for spatially sensitive simulations. In Extended
Abstracts of the 2011 Conference on Human Factors in Computing Systems. 1375.
[130] Alexander Skulmowski, Simon Pradel, Tom Kühnert, Guido Brunnett, and Günter
Daniel Rey. 2016. Embodied learning using a tangible user interface: The effects of
haptic perception and selective pointing on a spatial learning task. Computers &
Education 92–93 (2016), 64–75.
[131] Alexander Skulmowski and Günter Daniel Rey. 2018. Embodied learning: introducing
a taxonomy based on bodily engagement and task integration. Cognitive Research:
Principles and Implications 3, 1, 6.
[132] Tess Speelpenning, Alissa N Antle, Tanja Doering, and Elise Van Den Hoven. 2011.
Exploring how tangible tools enable collaboration in a multi-touch tabletop game. In
IFIP Conference on Human-Computer Interaction. Springer, 605–621.
[133] Jürgen Streeck, Charles Goodwin, and Curtis LeBaron. 2011. Embodied interaction:
Language and body in the material world. Embodied Interaction Language and Body in
the Material World (2011), 1–28. http://books.google.com/books?id=vJlSewAACAAJ
[134] Lucy A Suchman. 1987. Plans and Situated Actions: The Problem of Human-Machine
Communication. Cambridge University Press.
[135] Mike Tissenbaum, Matthew Berland, and Leilah Lyons. 2017. DCLM framework:
Understanding collaboration in open-ended tabletop learning environments.
International Journal of Computer-Supported Collaborative Learning 12, 1, 35–64.
[136] Lisa Tolentino, Philippos Savvides, and David Birchfield. 2010. Applying game design
principles to social skills learning for students in special education. In Proceedings of
FDG ’10.
[137] Joshua Wainer, David J Feil-Seifer, Dylan A Shell, and Maja J Mataric. 2006. The role
of physical embodiment in human-robot interaction. In The 15th IEEE International
Symposium on Robot and Human Interactive Communication, 2006. IEEE, 117–122.
[138] Michael L Walters, Kheng Lee Koay, Dag Sverre Syrdal, Kerstin Dautenhahn, and René
Te Boekhorst. 2009. Preferences and perceptions of robot appearance and embodiment
in human-robot interaction trials. In Proceedings of New Frontiers in Human-Robot
Interaction.
[139] Chun-wang Wei, Hsin-hung Chen, and Nian-shing Chen. 2015. Effects of embodiment-
based learning on perceived cooperation process and social flow. In 7th World
Conference on Educational Sciences. Elsevier B.V., 608–613.
[140] Michael Wheeler. 2005. Reconstructing the Cognitive World: The Next Step. MIT press.
[141] Andrew D Wilson and Sabrina Golonka. 2013. Embodied cognition is not what you
think it is. Frontiers in Psychology 4 (2013), 58.
[142] Margaret Wilson. 2002. Six views of embodied cognition. Psychonomic Bulletin &
Review 9, 4, 625–636.
[143] Terry Winograd and Fernando Flores. 1986. Understanding Computers and Cognition:
A New Foundation for Design. Intellect Books.
[144] Alyssa Friend Wise, Alissa Nicole Antle, Jillian Warren, Aaron May, Min Fan, and Anna
Macaranas. 2015. What kind of world do you want to live in? Positive interdependence
and collaborative processes in the tangible tabletop land-use planning game Youtopia.
International Society of the Learning Sciences, Inc. [ISLS].
[145] Peta Wyeth. 2008. How young children learn to program with sensor, action, and logic
blocks. Journal of the Learning Sciences 17, 4, 517–550.
[146] Peta Wyeth and Helen C. Purchase. 2002. Tangible programming elements for young
children. In CHI'02 Extended Abstracts on Human Factors in Computing Systems - CHI
'02. 774.
[147] Xiao Xiao, Paula Aguilera, Jonathan Williams, and Hiroshi Ishii. 2013. MirrorFugue III:
Conjuring the recorded pianist. In Extended Abstracts of the 2013 CHI Conference on
Human Factors in Computing Systems. 2891–2892.
[148] Xiao Xiao and Hiroshi Ishii. 2016. Inspect, embody, invent: A design framework for
music learning and beyond. In Proceedings of the 2016 CHI Conference on Human
Factors in Computing Systems. ACM, 5397–5408.
[149] Nesra Yannier, Scott E Hudson, Eliane Stampfer Wiese, and Kenneth R Koedinger.
2016. Adding Physicality to an Interactive Game Improves Learning and Enjoyment:
Evidence from EarthShake. ACM Transactions on Computer-Human Interaction
(TOCHI) 23, 4, 1–31.
[150] Nesra Yannier, Kenneth R. Koedinger, and Scott E. Hudson. 2013. Tangible collaborative
learning with a mixed-reality game: EarthShake. Artificial Intelligence in Education
(2013).
[151] Nesra Yannier, Kenneth R. Koedinger, and Scott E. Hudson. 2015. Learning from mixed-
reality games: Is shaking a tablet as effective as physical observation? CHI’15.
[152] Kelly Yap, Clement Zheng, Angela Tay, Ching-Chiuan Yen, and Ellen Yi-Luen Do.
2015. Word out! learning the alphabet through full body interactions. In Proceedings of
the 6th Augmented Human International Conference on - AH’15. 101–108.
[153] Ryuichi Yoshida, Hiroshi Mizoguchi, Ryohei Egusa, Machi Saito, Miki Namatame,
Masanori Sugimoto, Fusako Kusunoki, Etsuji Yamaguchi, Shigenori Inagaki, and
Yoshiaki Takeda. 2015. BESIDE: Body experience and sense of immersion in digital
paleontological environment. In Proceedings of CHI’15. 1283–1288.
[154] Tom Ziemke. 2002. What’s that thing called embodiment?. In Proceedings of the 25th
Annual Meeting of the Cognitive Science Society. 1305–1310.
ResearchGate has not been able to resolve any citations for this publication.
Conference Paper
Full-text available
While recent work has begun to evaluate the efficacy of educational programming games, many common design decisions in these games (e.g., single player gameplay using touchpad or mouse) have not been explored for learning outcomes. For instance, alternative design approaches such as collaborative play and embodied interaction with tangibles may also provide important benefits to learners. To better understand how these design decisions impact learning and related factors, we created an educational programming game that allows for systematically varying input method and mode of play. In this paper, we describe design rationale for mouse and tangible versions of our game, and report a 2x2 factorial experiment comparing efficacy of mouse and tangible input methods with individual and collaborative modes of play. Results indicate tangibles have a greater positive impact on learning, situational interest, enjoyment, and programming self-beliefs. We also found collaborative play helps further reduce programming anxiety over individual play.
Article
Full-text available
Research on learning and education is increasingly influenced by theories of embodied cognition. Several embodiment-based interventions have been empirically investigated, including gesturing, interactive digital media, and bodily activity in general. This review aims to present the most important theoretical foundations of embodied cognition and their application to educational research. Furthermore, we critically review recent research concerning the effectiveness of embodiment interventions and develop a taxonomy to more properly characterize research on embodied cognition. The main dimensions of this taxonomy are bodily engagement (i.e. how much bodily activity is involved) and task integration (i.e. whether bodily activities are related to a learning task in a meaningful way or not). By locating studies on the 2 × 2 grid resulting from this taxonomy and assessing the corresponding learning outcomes, we identify opportunities, problems, and challenges of research on embodied learning.
Conference Paper
Full-text available
We present a new technology-based paradigm to support embodied mathematics educational games, using wearable devices in the form of SmartPhones and SmartWatches for math learning, for full classes of students in formal in-school education settings. The Wearable Learning Games Engine is web based infrastructure that enables students to carry one mobile device per child, as they embark on math team-based activities that require physical engagement with the environment. These Wearable Tutors serve as guides and assistants while students manipulate, measure, estimate, discern, discard and find mathematical objects that satisfy specified constraints. Multi-player math games that use this infrastructure have yielded both cognitive and affective benefits. Beyond math game play, the Wearable Games Engine Authoring Tool enables students to create games themselves for other students to play; in this process, students engage in computational thinking and learn about finite-state machines. We present the infrastructure, games, and results for a series of experiments on both game play and game creation.
Conference Paper
Tangibles may be effective for reading applications: letters can be represented as 3D physical objects, and words are spatially organized collections of letters. We explore how tangibility impacts reading and spelling acquisition for young Anglophone children who have dyslexia. We describe our theory-based design rationale and present a mixed-methods case study of eight children using our PhonoBlocks system. All children made significant gains in reading and spelling on trained and untrained (new) words, and could apply all spelling rules a month later. We discuss the design features of our system that contributed to effective learning processes and successful learning outcomes: dynamic colour cues embedded in the 3D letters, which can draw attention to how letters' positions change their sounds; and the form of the 3D tangible letters, which can enforce correct letter orientation and enable epistemic strategies in letter organization that simplify spelling tasks. We conclude with design guidelines for tangible reading systems.
Conference Paper
Constructive assemblies are tangible user interfaces (TUIs) built from interconnecting modular parts. As such, they offer a unique means of supporting a wide range of activities. However, little information is available on the intricacies of taking a modular, constructive-assembly approach to TUI design. Based on an analysis of extensive data collected from interviews with eight world-class TUI experts, we propose a descriptive, conceptual framework to facilitate systematic investigation and critical consideration of constructive assemblies. The paper presents a lifecycle model for constructive assemblies and discusses their main design qualities and associated parameters. We demonstrate how the framework can be used to structure critical discussions by applying its principles to existing works and to the design of our own constructive assembly.
Chapter
Research from a large number of fields has recently come together under the rubric of embodied cognitive science, which attempts to show specific ways in which the body shapes and constrains thought. I enumerate the standard variety of usages that the term "embodiment" currently receives in cognitive science and contrast notions of embodiment and experientialism at a variety of levels of investigation. The purpose is to develop a broad-based theoretical framework for embodiment that can serve as a bridge between different fields. I introduce the framework using examples that trace related research issues, such as mental imagery, mental rotation, spatial language, and conceptual metaphor, across several levels of investigation. As a survey piece, this chapter covers numerous conceptualizations of the body, ranging from the physiological and developmental to the mental and philosophical; theoretically, it focuses on whether and how all these different conceptualizations can form a cohesive research program.
Article
Spatial skills are essential for everyday tasks, and technology blends seamlessly into children's everyday environments. Since spatiality is ubiquitous in experience, this paper bridges literature in two fields: theories of early spatial learning in cognitive development, and the potential benefits of tangible user interfaces (TUIs) for supporting very young children's spatial skills. Studies suggest that the period between 2 and 4 years of age is critical for training spatial skills (e.g., mental rotation), which relate to later success in STEAM (science, technology, engineering, arts, and math) disciplines. We first present a review of the empirical findings on spatial skills, early interventions, and tools (i.e., narrative and gesture input) recommended for training preschool children's spatial skills. By situating the work within the use and benefits of manipulatives (e.g., building blocks, puzzles, shapes) combined with digital affordances in interaction design, we address the relevance of TUIs as complementary tools for spatial learning. We concentrate on the supporting properties of TUIs that enable playful learning, make storytelling more concrete, and provide embodiment effects through physicality. Through various products found in the market and literature that address physical-digital convergence, we invite designers and researchers to consider design practices and applicable technology that build on present efforts and paradigms in this area. We conclude with a discussion of the gaps in design methods for developing technologies for children younger than 4 years old, and propose directions for future work to leverage new tools that serve very young children's spatial learning, along with possible inquiries for dual payoff.
Article
Museum researchers have long acknowledged the importance of dialogue in informal learning, particularly for open-ended exploratory exhibits. Novel interaction techniques like full-body interaction are appealing for these exploratory exhibits, but designers have lacked a metric for determining how well their designs support productive learning talk. Moreover, with the incorporation of digital technologies into museums, researchers and designers can now conduct in situ A/B testing of multiple exhibit designs, which was not previously possible with traditionally constructed exhibits that, once installed, were difficult and expensive to iterate. Here we present a method called Scoring Qualitative Informal Learning Dialogue (SQuILD) for quantifying idiosyncratic social learning talk, in order to conduct in situ testing of group learning at interactive exhibits. We demonstrate how the method was applied to a 2 × 2 experiment varying the means of control (full-body vs. handheld tablet controller) and the distribution of control (single-user input vs. multi-user input) of an interactive data-map exhibit. Though pilot testing in the lab predicted that full-body and multi-input designs would best support learning talk, analysis of dialogue from 119 groups' interactions revealed surprising nuances in the affordances of each. Implications for embodied interaction design are discussed.
Conference Paper
As shown in many longitudinal studies, spatial ability is important to learning and career success. This paper, inspired by [2,14], presents the second generation of TASC (Tangibles for Augmenting Spatial Cognition) to illustrate how (re)design lessons can be learned, how existing evaluation methods can be applied, and how new evaluations may be generated or envisioned when a TEI (tangible and embodied interaction) system is built to study spatial ability.