Proceedings of 1st International Joint Conference of DiGRA and FDG
© 2016 Authors. Personal and educational classroom use of this paper is allowed, commercial use requires
specific permission from the author.
Bridging the Physical Learning
Divides: A Design Framework for
Embodied Learning Games and
Simulations
Edward Melcer
Game Innovation Lab, NYU Tandon School of Engineering
5 MetroTech Center
Brooklyn, NY 11201
eddie.melcer@nyu.edu
Katherine Isbister
Social Emotional Technologies Lab, Department of Computational Media
University of California, Santa Cruz
Santa Cruz, CA 95064
katherine.isbister@ucsc.edu
ABSTRACT
Due to broad conceptual usage of the term embodiment across diverse research
domains, existing embodied learning games and simulations utilize a large
breadth of design approaches that often result in seemingly unrelated systems. This
becomes problematic when trying to critically evaluate the usage and effectiveness of
embodiment within existing designs, as well as when trying to utilize embodiment in the
design of new games and simulations. In this paper, we present our work on combining
differing conceptual and design approaches for embodied learning systems into a unified
design framework. We describe the creation process for the framework, explain its
dimensions, and provide examples of its use. Our design framework will benefit
educational game researchers by providing a unifying foundation for the description,
categorization, and evaluation of designs for embodied learning games and simulations.
Keywords
Embodiment, Embodied Learning Games and Simulations, Design Framework
INTRODUCTION
Recent work on educational systems has shown the benefits of incorporating physicality,
motion, and embodiment into designs. Demonstrated benefits include improved spatial
recall and mental manipulation (Clifton, 2014; Rieser, Garing, & Young, 1994); more intuitive interfaces,
interactions, and mappings (Shelley, Lyons, Zellner, & Minor, 2011; Wyeth, 2008);
increased engagement (Bhattacharya, Gelsomini, Pérez-Fuster, Abowd, & Rozga, 2015;
Edge, Cheng, & Whitney, 2013; Yannier, Koedinger, & Hudson, 2013); greater positive
feelings towards learning content and science in general (Lindgren, Tscholl, & Moshell,
2013; Wei, Chen, & Chen, 2015; Yannier et al., 2013); and enhanced collaboration
(Ahmet, Jonsson, Sumon, & Holmquist, 2011; S. Price, Rogers, Scaife, Stanton, & Neale,
2003; Yannier et al., 2013). This direction stems from the concept that cognition does not
only occur in the mind but is also supported by bodily activity (Shapiro, 2010), situated
in and interacting with our physical and social environment (Clark, 2008; Dourish, 2001).
However, when examining existing embodied learning games and simulations closely,
we find a large breadth of designs that result in seemingly unrelated systems (see Figure
1). This becomes problematic when trying to understand where and how embodiment
occurs in these systems, and which design elements help to facilitate embodied learning.
The problem is further aggravated by limited empirical validation of many systems
(Zaman, Vanden Abeele, Markopoulos, & Marshall, 2012), and a broad conceptual usage
of embodiment and related terms in a diverse variety of domains such as Human-
Computer Interaction (HCI), learning science, neuroscience, linguistics, and philosophy
(Birchfield et al., 2008; Rohrer, 2007; Ziemke, 2002). Therefore, for designers seeking to
utilize embodiment (i.e., an emergent property from the interactions between brain, body,
and the physical/social environment [Hummels & van Dijk, 2014]), the differences in
approach to physicality, collaboration, and interaction pose a significant hurdle. One
approach that can bridge conceptual differences between existing systems and domains is
the creation of a design framework (Ens & Hincapié-ramos, 2014; Robinett, 1992).
Figure 1: A spectrum of different embodied learning
systems. Left to right - Interactive Slide (Malinverni,
López Silva, & Parés, 2012), Electronic Blocks (Wyeth,
2008), Embodied Poetry (Kelliher et al., 2009),
SpatialEase (Edge et al., 2013), Eco Planner (Esteves &
Oakley, 2011).
BACKGROUND
Our goal in providing an embodied learning design framework is to bridge conceptual
gaps and resulting design choices made from the differing uses of embodiment in various
domains. In this section we present an overview of design frameworks, embodiment and
its application in educational games and simulations, and embodied learning taxonomies.
Design Frameworks
Design frameworks can help designers conceptualize nuances of particular technologies
and formalize the creative process (Ens & Hincapié-ramos, 2014). In interface design,
design frameworks have been used to provide terminology to categorize ideas (B. A.
Price, Baecker, & Small, 1993) as well as organize complex concepts into logical
hierarchies (Plaisant, Carr, & Shneiderman, 1995). Design frameworks are created by
treating a set of taxonomical terms as orthogonal dimensions in a design space, and the
resulting matrix provides structure for classification and comparison of designs (Robinett,
1992). The completed design framework provides a means to critically examine designs
of existing systems and encourage new designs by providing a unifying foundation for
the description and categorization of systems. Furthermore, the methodical filling-in of
this structure helps to categorize existing concepts, differentiate ideas, and identify
unexplored terrain (Ens & Hincapié-ramos, 2014).
Embodiment, Embodied Cognition, and Embodied Interaction in
Educational Games and Simulations
Embodiment and related terms such as embodied cognition and embodied interaction
have many different interpretations and applications across a wide range of academic
domains. HCI tends to view embodiment from a phenomenological perspective where
embodiment is a physical and social phenomenon that unfolds in real time and space as a
part of the world in which we are situated (Dourish, 2001). However, learning science
views tend to center more narrowly on the body itself as the focus of embodiment
(Johnson-Glenberg, Birchfield, Tolentino, & Koziupa, 2014; Rohrer, 2007). Moreover,
Ziemke (2002) has noted this divide in their work identifying six different uses of
embodiment across research domains (i.e., structural coupling, historical embodiment,
physical embodiment, organismoid embodiment, organismic embodiment, and social
embodiment). In order to encompass a large corpus of embodied designs in our design
framework, we take a broad perspective of embodiment, centering it on the notion
that human reasoning and behavior are connected to, or influenced by, our bodies and
their physical/social experience and interaction with the world (S. Price & Jewitt, 2013). This
is seen as an iterative relationship, in which reasoning and behavior can shape interaction
and vice versa; it is also a complex one because of the context, time, space,
emotion, etc. in which interaction is situated.
Embodied cognition is a similarly important but divided term for education, with Wilson
(2002) identifying six distinct views of embodied cognition where 1) cognition is
situated; 2) cognition is time-pressured; 3) we off-load cognitive work onto the
environment; 4) the environment is part of the cognitive system; 5) cognition is for
action; and 6) off-line cognition is body-based. In learning science, embodied cognition
considers how human cognition is fundamentally grounded in sensory-motor processes
and in our body's internal states (Ionescu & Vasc, 2014). As a result of this body-centric
perspective, learning science games and simulations explicitly addressing embodied
cognition tend to focus on the utilization of sensors to map full-body interaction and
congruency to learning content through the use of gestures (Barendregt & Lindström,
2012; Howison, Trninic, Reinholz, & Abrahamson, 2011; Johnson-Glenberg et al., 2014),
or to track whole-body enactment of learning material (Hatton, Campana, Danielescu, &
Birchfield, 2009; Lindgren et al., 2013). Conversely, HCI and subdomains such as
Tangible Embodied Interaction (TEI) view embodied cognition from a body-in-action
perspective where cognition is a coordination achieved through our brain, our body, and
the dynamic relationships between our body and the physical- and social environment
(Clark, 1997; Hummels & van Dijk, 2014). The resulting embodied cognition oriented
games and simulations in HCI and TEI tend to focus on a more social and collaborative
design, with sensors utilizing physical action as input into virtual or mixed reality worlds
(Clifton, 2014; Mickelson & Ju, 2011; Nakayama et al., 2014).
Embodied interaction is a term coined by Dourish (2001) to capture a number of research
trends and ideas in HCI around tangible computing, social computing, and ubiquitous
computing. It refers to the creation, manipulation, and sharing of meaning through
engaged interaction with artifacts (Dourish, 2001), and includes material objects and
environments in the process of meaning making and action formation (Streeck, Goodwin,
& LeBaron, 2011). Games and simulations utilizing embodied interaction tend to place
the player in a physical space where they can physically manipulate interactive tangible
tabletops, blocks, and objects (Bakker, Hoven, & Antle, 2011; Chu, Clifton, Harley,
Pavao, & Mazalek, 2015; Esteves & Oakley, 2011; Rikić, 2013).
Embodied Learning Taxonomies
Similar to the many interpretations of embodiment, embodied learning frameworks and
taxonomies also have vastly different interpretations of physicality, motion,
collaboration, and interaction. Johnson-Glenberg et al (2014) created an embodied
learning taxonomy that specifies the strength of embodiment as a combination of the
amount of motoric engagement, gestural congruency to learning content, and immersion.
Black et al (2012) created the Instructional Embodiment Framework (IEF) which consists
of various forms of physical embodiment (i.e., direct, surrogate, and augmented) as well
as imagined embodiment (i.e., explicit and implicit) where the individual can embody
action and perception through imagination. In the TEI field, Fishkin's taxonomy (2004)
for the analysis of tangible interfaces views embodiment as the distance between input
and output where embodiment can be full (output device is input device), nearby (output
is directly proximate to input device), environmental (output is "around" the user), or
distant (output is on another screen or in another room). A related framework by Price
(2008) for tangible learning environments focuses on different possible
artifact-representation combinations and the role that they play in shaping cognition. The
physical-digital links of these combinations are conceptualized into four distinct
dimensions: location (the different location couplings between physical artifacts and
digital representations); dynamics (the flow of information during interaction, e.g.,
whether feedback is immediate or delayed); correspondence (the degree to which the
physical properties of objects are closely mapped to the learning concepts); and modality
(the different representation modalities in conjunction with artifact interaction).
TOWARDS A DESIGN FRAMEWORK FOR EMBODIED LEARNING
GAMES AND SIMULATIONS
Creating the Design Framework
To create our design framework, we conducted an extensive literature review for
published examples of embodied learning games and simulations in venues such as CHI,
TEI, FDG, and Interaction Design and Children (IDC). Notably, the core nature of all
games is embodied to some extent. Therefore, for the purpose of this research, only
papers that explicitly mentioned embodiment or related terms (e.g., embodied learning,
embodied cognition, embodied interaction, etc.) were collected and used in the literature
review. We also performed a tree search of references and citations from the initial papers
collected and seminal papers concerning embodiment. In addition, we examined related
frameworks and taxonomies in subdomains and communities such as TEI (Fishkin, 2004;
O’Malley & Fraser, 2004; S. Price, 2008), embodiment and embodied learning (Black et
al., 2012; Johnson-Glenberg et al., 2014), and mixed reality (Ens & Hincapié-ramos,
2014; Rogers, Scaife, Gabrielli, Smith, & Harris, 2002). Our final list contains papers
describing designs for a total of 48 different embodied learning games and simulations
(for the complete list of designs and their categorization within our design framework, go
to: http://edwardmelcer.net/research/supplementary_framework_table.pdf). This list is
not intended to be exhaustive, but does represent a diverse selection of designs that could
be drawn upon when creating a design framework. Bottom-up open coding was then
performed, following the process described by Ens & Hincapié-ramos (2014), to
distill a set of 25 candidate dimensions that fit concepts found in the reviewed literature
and designs. Candidate dimensions were iteratively reduced and combined into a set
small enough for a concise framework. Afterwards, we presented our framework to
experts in HCI, game design, and learning science for feedback and additional
refinements. The final design framework consists of 7 dimensions shown in Table 1. We
further organized the dimensions into three groups based on their overarching design
themes within the construct of embodiment (i.e., physical body and interactions, social
interactions, and the world where interaction is situated).
Group                  Dimension        Values
Physical Interaction   Physicality      Embodied, Enacted, Manipulated, Surrogate, Augmented
                       Transforms       PPt, PDt, DPt
                       Mapping          Discrete, Co-located, Embedded
                       Correspondence   Symbolic, Literal
Social Interaction     Mode of Play     Individual, Collaborative, Competitive
                       Coordination     Other Player(s), NPC(s), None
World                  Environment      Physical, Mixed, Virtual
Table 1: Our design framework for embodied learning systems. Similar dimensions are
clustered under a group based on an overarching design theme, and the different values
for each dimension are shown.
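The seven dimensions and their values lend themselves to a simple data schema. The sketch below is a hypothetical encoding (the class and field names are ours, not part of the published framework), with one example categorization approximated from the paper's description of MEteor:

```python
from dataclasses import dataclass
from enum import Enum

# Hypothetical encoding of the framework's seven dimensions; names are
# illustrative, not part of the published framework.
class Physicality(Enum):
    EMBODIED = "embodied"
    ENACTED = "enacted"
    MANIPULATED = "manipulated"
    SURROGATE = "surrogate"
    AUGMENTED = "augmented"

class Transform(Enum):
    PPT = "physical action -> physical effect"
    PDT = "physical action -> digital effect"
    DPT = "digital action -> physical effect"

class Mapping(Enum):
    DISCRETE = "discrete"
    CO_LOCATED = "co-located"
    EMBEDDED = "embedded"

class Correspondence(Enum):
    SYMBOLIC = "symbolic"
    LITERAL = "literal"

class ModeOfPlay(Enum):
    INDIVIDUAL = "individual"
    COLLABORATIVE = "collaborative"
    COMPETITIVE = "competitive"

class Coordination(Enum):
    OTHER_PLAYERS = "other player(s)"
    NPCS = "npc(s)"
    NONE = "none"

class Environment(Enum):
    PHYSICAL = "physical"
    MIXED = "mixed"
    VIRTUAL = "virtual"

@dataclass
class EmbodiedLearningDesign:
    name: str
    physicality: Physicality
    transform: Transform
    mapping: Mapping
    correspondence: Correspondence
    mode_of_play: ModeOfPlay
    coordination: Coordination
    environment: Environment

# Example categorization, approximated from the paper's description of MEteor
# (whole-body enactment of asteroid trajectories in a mixed reality space).
meteor = EmbodiedLearningDesign(
    name="MEteor",
    physicality=Physicality.ENACTED,
    transform=Transform.PDT,
    mapping=Mapping.CO_LOCATED,
    correspondence=Correspondence.LITERAL,
    mode_of_play=ModeOfPlay.INDIVIDUAL,
    coordination=Coordination.NONE,
    environment=Environment.MIXED,
)
```

Encoding each design as one such record makes the comparisons described in the examples below mechanical rather than ad hoc.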
Design Space Dimensions
Physicality describes how learning is physically embodied in a system and consists of
five distinct values. 1) The embodied value refers to an embodied cognition and learning
science approach where the body plays the primary constituent role in cognition (Shapiro,
2010). This form of embodiment focuses on gestural congruency and how the body can
physically represent learning concepts (Johnson-Glenberg et al., 2014). For instance, a
full body interaction game where players contort their bodies to match letters shown on a
screen (Paul, Goh, & Yap, 2015). 2) The enacted value refers to Direct Embodiment from
the IEF (Black et al., 2012), and to enactivism which focuses on knowing as physically
doing (Holton, 2010; Li, 2012). This form of embodiment focuses more on
acting/enacting out knowledge through physical action of statements or sequences. For
example, a gravitational physics game where players walk along (i.e., enact) the trajectory
an asteroid would travel in the vicinity of planets and their gravitational forces (Lindgren
et al., 2013). 3) The manipulated value refers to the tangible embodied interactions of
TEI (Marshall, Price, & Rogers, 2003) and the use of manipulatives in learning science
(Pouw, van Gog, & Paas, 2014). This form of embodiment arises from utilization of
embodied metaphors and interactions with physical objects (Bakker, Antle, & van den
Hoven, 2012), and the objects' physical embodiment of learning concepts (Ishii, 2008; S.
Price, 2008). 4) The surrogate value refers to the IEF concept of Surrogate Embodiment,
where learners manipulate a physical agent or "surrogate" representative of themselves to
enact learning concepts (Black et al., 2012). This form of embodiment is often used in
systems with an interactive physical environment that is directly tied to a real-time virtual
simulation (Gnoli et al., 2014; Kuzuoka, Yamashita, Kato, Suzuki, & Kubota, 2014). 5)
The augmented value refers to the IEF notion of Augmented Embodiment, where
combined use of a representational system (e.g., avatar) and augmented feedback system
(e.g., Microsoft Kinect and TV screen) embeds the learner within an augmented reality
system. This form of embodiment is most commonly found in systems where learners'
physical actions are mapped as input to control digital avatars in virtual environments
(Lyons, Silva, Moher, Pazmino, & Slattery, 2013; Nakayama et al., 2014).
Transforms conceptualize a space, describing the relationships between physical or
digital actions and the resulting physical or digital effects in the environment (Rogers et
al., 2002). We utilize the transform types of Physical action => Physical effect (PPt),
Physical action => Digital effect (PDt), and Digital action => Physical effect (DPt) from
Rogers et al (2002) to describe the many forms of existing systems.
Mapping borrows the notion of Embodiment from Fishkin's (2004) taxonomy and
Location from Price's (2008) tangible learning environment framework which describes
the different spatial locations of output in relation to the object or action triggering the
effect (i.e., how input is spatially mapped to output). Mappings can be discrete (input
and output are located separately, e.g., an action triggers output on a nearby screen);
co-located (input and output are contiguous, e.g., an action triggers output that is
directly adjacent to or overlaid on the physical space); or embedded (input and output
are embedded in the same object).
Correspondence builds upon the notion of Physical Correspondence from Price's (2008)
tangible learning environment framework which refers to the degree to which the
physical properties of objects are closely mapped to the learning concepts. We expand
this concept to also include physical actions (e.g., congruency of gestures or physical
manipulations to learning concepts). Correspondence can be symbolic (objects and
actions act as common signifiers for the learning concepts, e.g., arranging programming
blocks to learn coding); or literal (physical properties and actions are closely mapped to
the learning concepts and metaphor of the domain, e.g., playing an augmented guitar to
learn finger positioning).
Mode of Play specifies how individuals socially interact and play within a system. The
system can facilitate individual, collaborative, or competitive play for learner(s). Plass et
al (2013) found differing learning benefits for each mode of play, suggesting it is also an
important dimension to consider for learning outcomes.
Coordination highlights how individuals in a system may have to socially coordinate
their actions (Oullier, de Guzman, Jantzen, Lagarde, & Kelso, 2008) in order to
successfully complete learning objectives. Social coordination can occur with other
players and/or in a socio-collaborative experience with digital media, typically in the
form of NPCs (Tolentino, Savvides, & Birchfield, 2010). Conversely, social coordination
can also receive limited focus in a design, neither occurring nor being supported.
Environment refers to the learning environment in which the educational content is
situated. Environments can be physical, mixed, or virtual (Rogers et al., 2002).
While transforms conceptualize a space through the description of actions and effects, the
environment dimension focuses on the actual space where learning occurs. For instance, a
PDt transform can occur in drastically different learning environments (see Figure 2). In
some systems, a player's physical actions are tracked but only used as input to control a
virtual character in a virtual environment (Lyons et al., 2013). In other systems, the
player's physical actions are tracked and mapped to control digital effects overlaid on an
augmented physical space or mixed reality environment (Lindgren et al., 2013). Others
still have players situated in a completely physical environment where their physical
actions are tracked primarily to keep score or digitally maintain information related to
learning content that is displayed during the interaction (Gnoli et al., 2014).
Figure 2: Three systems illustrating PDt transforms in
different learning environments. Left - physical actions
are mapped as input into a virtual environment (Lyons et
al., 2013). Middle - physical actions are mapped as input
into a mixed reality environment that is overlaid on
physical space (Lindgren et al., 2013). Right - physical
actions occur in a physical learning environment and are
only tracked to digitally maintain and display
information related to the physical interaction (Gnoli et
al., 2014).
APPLYING THE DESIGN FRAMEWORK FOR EMBODIED LEARNING
GAMES AND SIMULATIONS
Example 1 - Categorizing Existing Games and Simulations
One fundamental feature of any framework is its descriptive capability. To exemplify
how designs of existing embodied learning games and simulations can be described using
our framework, we applied it to the 48 systems identified in our earlier literature review.
For each design, we assigned dimensional values and cataloged the results (see
http://edwardmelcer.net/research/supplementary_framework_table.pdf). This methodical
approach provided us with a means to systematically compare and contrast the different
designs (Ens & Hincapié-ramos, 2014). One important point to note is that our
framework does not perfectly partition every design into dimensional values. There were
some cases where multiple values within a dimension would match a single design or the
design description would leave a chosen value open to interpretation. However, we
believe these minor discrepancies are acceptable, since the intention of a design
framework is to make designers aware of important design choices and help them
weigh the potential benefits of those choices, rather than to provide a set of arbitrary
sorting bins (Ens & Hincapié-ramos, 2014).
During the analysis and cataloging process, clusters of similar designs emerged that were
reasonably described by 9 distinct categories (see Figures 3 & 4). We found the majority
of reviewed designs (42 of 48) to be a very good fit for one of the categories, despite the
9 categories representing only a small portion of the full design space expressed by the
framework. Similar to the assignment of dimensional values, categories are not absolute.
Therefore, we include designs with minor variations in a category so long as they fit
closely to the overall characteristics of that group.
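The clustering step behind these categories can be sketched as grouping designs by shared dimensional profiles. In this illustrative sketch the profiles are abbreviated triples of (physicality, mapping, environment), approximated from the category descriptions in the text rather than taken verbatim from the supplementary table:

```python
from collections import defaultdict

# Abbreviated (physicality, mapping, environment) profiles, approximated
# from the category descriptions; not verbatim from the supplementary table.
catalog = {
    "Word Out!": ("embodied", "discrete", "mixed"),
    "SpatialEase": ("embodied", "discrete", "mixed"),
    "MEteor": ("enacted", "co-located", "mixed"),
    "Embodied Poetry": ("enacted", "co-located", "mixed"),
    "Electronic Blocks": ("manipulated", "embedded", "physical"),
}

# Designs sharing a profile fall into the same candidate category.
clusters = defaultdict(list)
for name, profile in catalog.items():
    clusters[profile].append(name)

# e.g., clusters[("embodied", "discrete", "mixed")] -> ["Word Out!", "SpatialEase"]
```

With the full 48-system catalog, the same grouping surfaces the larger clusters that became the 9 categories.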
Figure 3: A parallel coordinates graph showing the
categories found during analysis of existing designs that
utilize embodied and enacted physicality.
Embodied Physicality Categories (Figure 3)
Full-Body Congruency describes designs that employ full-body interactions with all or a
portion of the body being utilized as input into a mixed reality environment. The mapping
of input to output is discrete and sensor-based (e.g., utilizing some form of IR or
computer vision tracking), where players see augmented video feedback of themselves
moving to match virtual objects or actions depicted on a screen. The educational focus of
these systems is on mirroring a learning concept through bodily or gestural congruency,
and instances include using the body to match shapes of alphabet letters (Edge et al.,
2013; Paul et al., 2015; Yap, Zheng, Tay, Yen, & Do, 2015) and geometric shapes
(Mickelson & Ju, 2011).
Finger-Based Congruency is conceptually similar to full-body congruency in that the
educational focus of designs is on mirroring a learning concept through physical or
gestural congruency. However, the interaction focus is instead on usage of fingers to
achieve this congruency. This results in an embedded mapping of input to output on a
physical device (e.g., tablet) where gameplay is situated in a virtual environment.
Examples of this design category include usage of fingers to represent the numbers in a
part-whole relation (Barendregt & Lindström, 2012) and the velocity of a moving object
(Davidsson, 2014).
Enacted Physicality Categories (Figure 3)
Whole-Body Position is one of the largest sets of systems categorized (7 designs) and
focuses on tracking simple aspects of a player's body, such as their location in physical
space, to enact learning concepts in a mixed reality environment. These systems typically
focus on augmenting the physical space with a co-located mapping of input through
motion tracking and output through top down projections (Kelliher et al., 2009; Lindgren
et al., 2013) or through different modalities such as sound (Antle, Droumeva, & Corness,
2008).
Embedded Phenomena is a class of simulations that embed imaginary dynamic
phenomena (scientific or otherwise) into the physical space of classrooms (Moher,
Hussain, Halter, & Kilb, 2005). As a result of this design approach, interaction revolves
around enacting techniques performed by real world professionals in order to measure
and utilize devices embedded into the physical classroom environment that provide
augmented feedback about a specific phenomenon. Examples of this design category
include simulations of earthquake trilateration (Moher et al., 2005) and subterranean
water flow (Novellis & Moher, 2011).
Figure 4: A parallel coordinates graph showing the
categories found during analysis of existing designs that
utilize manipulated, surrogate, and augmented
physicality.
Manipulated Physicality Categories (Figure 4)
Tangible Blocks describe designs that utilize notions of tangibility and embodied
interaction from HCI and TEI communities combined with concepts of modularity from
Computer Science. Players physically manipulate/program a set of tangible blocks with
embedded sensing capabilities and feedback systems. These blocks interact within the
physical environment and are usually symbolically representative of physical computing
concepts (Schweikardt & Gross, 2008; Wyeth & Purchase, 2002; Wyeth, 2008).
Tangible Tabletops describe designs that similarly utilize notions of tangibility and
embodied interaction, but instead focus on the usage of symbolic tangibles or gestures in
conjunction with a virtual world displayed on an interactive tabletop. The setups are
commonly found in public spaces such as museums and typically facilitate large scale
social interactions. Tangible tabletop designs have been employed to teach educational
concepts around energy consumption (Esteves & Oakley, 2011), nanoscale (MoraGuiard
& Pares, 2014), and African concepts for mapping history (Chu et al., 2015).
Tangible Objects describe designs that utilize tangibles and embodied interaction as
input into virtual learning environments. Physical manipulation of the tangible object
results in a discrete and intuitive mapping to a virtual representation of learning content.
Tangible object designs have been utilized to teach a variety of concepts such as urban
planning (Shelley et al., 2011) and heart anatomy (Skulmowski, Pradel, Kühnert,
Brunnett, & Rey, 2016).
Surrogate Physicality Categories (Figure 4)
Tangible Spaces build upon a space-centered view of tangible embodied interaction
where interactive spaces rely on combining physical space and tangible objects with
digital displays (Hornecker & Buur, 2006). The design focus is on creating a tangible
physical environment for the player to actively manipulate (complete with a physical
surrogate avatar that the player controls) and discretely mapping physical changes in that
space to a virtual world that either mirrors or augments the physical one. Tangible spaces
have been used to teach programming (Fernaeus & Tholander, 2006), animal foraging
behavior (Gnoli et al., 2014), and diurnal motion of the sun (Kuzuoka et al., 2014).
Augmented Physicality Categories (Figure 4)
"Touchless" Motion-Based designs employ a discrete mapping of players' physical
actions as input into a virtual world. The use of a "touchless" interaction paradigm
exploits sensing devices that capture, track, and decipher body movements and gestures
so that players do not need to wear additional aids (Bartoli, Corradi, Milano, &
Valoriani, 2013). Unlike full-body congruency, the focus is not on mirroring a learning
concept through the body, but instead that a player's physical actions are mapped to
control a digital avatar in the virtual world. As a result, rather than seeing a video of
themselves, players will see silhouettes, digital avatars, or a first-person perspective.
These systems have been utilized to teach concepts around geometric shapes (Kynigos,
Smyrnaiou, & Roussou, 2010), climate change (Lyons et al., 2013), and peer-directed
social behaviors (Bhattacharya et al., 2015).
Example 2 - Identifying Problematic Design Spaces
One benefit of our design framework is that it allows us to systematically examine design
elements of existing systems, identifying potential problematic design spaces. As an
example of this usage, we examine the Tangible Earth system (see Figure 5) where the
authors had to create and use an assessment framework to identify/understand problems
the system encountered (Kuzuoka et al., 2014). Tangible Earth is designed to support
learning of the sun's diurnal motion and earth's rotation. It consists of a doll-like avatar, a
globe and rotating table to represent the earth and its rotation, an electrical light
representing the sun, and a laptop running VR universe simulator. Learners would
physically manipulate the rotation of the earth and position/rotation of the avatar to
observe simulated changes in sun's position from the avatar's perspective.
Figure 5: The Tangible Earth embodied learning system
(Kuzuoka et al., 2014).
One of the more significant problems identified by Kuzuoka et al (2014) for Tangible
Earth was that learners spent very little time looking at the tangibles themselves (e.g.,
globe, lamp, and avatar), instead focusing primarily on the VR simulation in the laptop.
This proved to be especially problematic for manipulation of the avatar, where users
would frequently forget the position of its body and orientation of its head. This often
caused the sun to appear or disappear unexpectedly in the simulation, confusing learners
and learning concepts. By analyzing this issue with our design framework, we identified a
potential problematic design space (see Figure 6). Learners had difficulty remembering
the position of a physical agent representative of themselves (surrogate embodiment)
because all of their physical actions were mapped to digital effects (PDt) in a simulated
world (virtual environment). This difficulty makes sense considering that remembering
the physical position/orientation of a surrogate avatar in both the real world and the
virtual world simultaneously would introduce significant extraneous cognitive load
(Plass, Moreno, & Brünken, 2010). As a result, our design framework suggests that the
intersection of surrogate embodiment, PDt transforms, and virtual environments is a
problematic design space that should be carefully considered when designing future
embodied learning systems.
Figure 6: Problematic design space identified through
analysis of Tangible Earth (Kuzuoka et al., 2014).
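This kind of analysis can be made systematic: a system is described by its coordinates along the framework's dimensions, and a candidate design can be checked against combinations previously identified as problematic. A minimal sketch of the idea in Python; the PROBLEMATIC_SPACES data and the check_design helper are illustrative conveniences, not part of the framework itself:

```python
# Illustrative sketch: encode a system's position along three framework
# dimensions (physicality, transform, environment) as a tuple, and flag
# combinations previously identified as problematic design spaces.
PROBLEMATIC_SPACES = {
    # Surrogate embodiment + physical-to-digital transforms (PDt) +
    # virtual environment, identified through Tangible Earth.
    ("surrogate", "PDt", "virtual"),
}

def check_design(physicality, transform, environment):
    """Return True if the pairing falls in a known problematic design space."""
    return (physicality, transform, environment) in PROBLEMATIC_SPACES
```

For example, check_design("surrogate", "PDt", "virtual") would flag Tangible Earth's combination, while a pairing outside the identified space would pass.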
Example 3 - Identifying Design Gaps
Another benefit of our design framework is that it allows us to methodically fill in the
framework with existing systems to identify gaps and unexplored terrain (Ens &
Hincapié-Ramos, 2014). As an illustration of this usage, we fill in example pairings
between the two dimensions of Physicality and Transforms (see Table 2). This provides
examples of relevant combinations between these two dimensions in the embodied
learning systems literature.
Transforms \ Physicality | Embodied | Enacted | Manipulated | Surrogate | Augmented
PPt | - | Scratch Direct Embodiment (Fadjo & Black, 2012) | Electronic Blocks (Wyeth, 2008); roBlocks (Schweikardt & Gross, 2008) | - | -
PDt | SpatialEase (Edge et al., 2013); Word Out! (Paul et al., 2015); Mathematical Imagery Trainer (Howison et al., 2011) | Embodied Poetry (Hatton et al., 2009); AquaRoom (Novellis & Moher, 2011); MEteor (Lindgren et al., 2013) | Mapping Place (Chu et al., 2015); MoSo Tangibles (Bakker et al., 2011); Eco Planner (Esteves & Oakley, 2011) | Hunger Games (Gnoli et al., 2014); Tangible Programming Space (Fernaeus & Tholander, 2006); Tangible Earth (Kuzuoka et al., 2014) | Human SUGOROKU (Nakayama et al., 2014); Bump Bash (Bartoli et al., 2013); Sorter Game (Kynigos et al., 2010)
DPt | - | - | - | - | ALERT (Lahey, Burleson, Jensen, Freed, & Lu, 2008)

Table 2: Example pairings between the Physicality and Transform dimensions.
Examining Table 2, we find several design gaps for existing embodied learning games
and simulations. Some of the more potentially useful pairings among the identified design
gaps are Embodied + PPt, Manipulated + DPt, Surrogate + PPt, and Surrogate + DPt,
from which interesting future system designs could evolve.
For instance, using a Surrogate + PPt pairing could lead to the design of physically
embodied educational board games. Additionally, a Surrogate + DPt pairing could lead to
an asymmetric computational thinking game where one player controls and interacts with
a physical avatar while another player digitally designs the physical courses and obstacles
for the first player to complete.
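The gap-finding procedure described above amounts to taking the cross product of the two dimensions and subtracting the cells already occupied by existing systems. A minimal sketch, assuming the cell assignments read from Table 2; the occupied set and variable names are illustrative:

```python
from itertools import product

# Dimension values as used in Table 2.
PHYSICALITY = ["Embodied", "Enacted", "Manipulated", "Surrogate", "Augmented"]
TRANSFORMS = ["PPt", "PDt", "DPt"]

# Cells occupied by at least one existing system, per Table 2.
occupied = {
    ("Enacted", "PPt"), ("Manipulated", "PPt"),
    ("Embodied", "PDt"), ("Enacted", "PDt"), ("Manipulated", "PDt"),
    ("Surrogate", "PDt"), ("Augmented", "PDt"),
    ("Augmented", "DPt"),
}

# A design gap is any dimension pairing with no known system.
gaps = [pair for pair in product(PHYSICALITY, TRANSFORMS) if pair not in occupied]
```

Enumerating gaps this way recovers the pairings discussed above (e.g., Embodied + PPt and Surrogate + DPt) alongside the remaining unexplored cells.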
CONCLUSION AND FUTURE WORK
In this paper, we presented our design framework for embodied learning games and
simulations based on a detailed analysis of 48 existing embodied learning systems and
related frameworks/taxonomies from subdomains and communities such as TEI, HCI,
embodiment and embodied learning, and mixed reality. A design framework allows us to
systematically understand, analyze, and differentiate design elements of existing
embodied learning systems. This ultimately aids us in determining where and how
embodiment occurs in an educational system, and guides the application of specific
design choices in future systems. Future work will build games and simulations
addressing design gaps and problematic spaces identified by our framework, and test the
efficacy and learning outcomes of these systems. In broader application, this design
framework can also be used to guide the construction of systems that methodically examine
questions of when and how embodied learning should be used within games/simulations,
which will help to further ground the framework and clarify its interpretation.
BIBLIOGRAPHY
Ahmet, Z., Jonsson, M., Sumon, S. I., & Holmquist, L. E. (2011). Supporting embodied exploration of physical concepts in mixed digital and physical interactive settings. In Proceedings of TEI ’11.
Antle, A. N., Droumeva, M., & Corness, G. (2008). Playing with The Sound Maker: Do embodied metaphors help children learn? In Proceedings of the 7th International Conference on Interaction Design and Children - IDC ’08 (p. 178).
Bakker, S., Antle, A. N., & van den Hoven, E. (2012). Embodied metaphors in tangible interaction design. In Personal and Ubiquitous Computing (Vol. 16).
Bakker, S., van den Hoven, E., & Antle, A. N. (2011). MoSo Tangibles: Evaluating embodied learning. In Proceedings of the Fifth International Conference on Tangible, Embedded, and Embodied Interaction - TEI ’11 (pp. 85–92).
Barendregt, W., & Lindström, B. (2012). Development and evaluation of Fingu: A mathematics iPad game using multi-touch interaction. In Proceedings of the 11th International Conference on Interaction Design and Children (pp. 204–207).
Bartoli, L., Corradi, C., Garzotto, F., & Valoriani, M. (2013). Exploring motion-based touchless games for autistic children’s learning. In Proceedings of the 12th International Conference on Interaction Design and Children (pp. 102–111).
Bhattacharya, A., Gelsomini, M., Pérez-Fuster, P., Abowd, G. D., & Rozga, A. (2015). Designing motion-based activities to engage students with autism in classroom settings. In IDC 2015 (pp. 69–78).
Birchfield, D., Thornburg, H., Megowan-Romanowicz, M. C., Hatton, S., Mechtley, B., Dolgov, I., & Burleson, W. (2008). Embodiment, multimodality, and composition: Convergent themes across HCI and education for mixed-reality learning environments. Advances in Human-Computer Interaction, 2008, 1–19.
Black, J. B., Segal, A., Vitale, J., & Fadjo, C. L. (2012). Embodied cognition and learning environment design. In Theoretical Foundations of Learning Environments (pp. 198–223).
Chu, J. H., Clifton, P., Harley, D., Pavao, J., & Mazalek, A. (2015). Mapping Place: Supporting cultural learning through a Lukasa-inspired tangible tabletop museum exhibit. In Proceedings of the 9th International Conference on Tangible, Embedded, and Embodied Interaction - TEI ’15 (pp. 261–268).
Clark, A. (1997). Being There: Putting Brain, Body, and World Together Again. MIT Press.
Clark, A. (2008). Supersizing the Mind: Embodiment, Action, and Cognitive Extension. Oxford University Press.
Clifton, P. (2014). Designing embodied interfaces to support spatial ability. In Proceedings of TEI ’14 (pp. 309–312).
Davidsson, M. (2014). Finger Velocity: A multimodal touch-based tablet application for learning the physics of motion. In Mobile as a Mainstream - Towards Future Challenges in Mobile Learning (pp. 238–249).
Dourish, P. (2001). Where the Action Is: The Foundations of Embodied Interaction. MIT Press.
Edge, D., Cheng, K., & Whitney, M. (2013). SpatialEase: Learning language through body motion. In Proceedings of the SIGCHI Conference on Human Factors in Computing Systems (CHI ’13) (pp. 469–472).
Ens, B., & Hincapié-Ramos, J. D. (2014). Ethereal Planes: A design framework for 2D information spaces in 3D mixed reality environments. In Proceedings of the 2nd ACM Symposium on Spatial User Interaction.
Esteves, A., & Oakley, I. (2011). Design for interface consistency or embodied facilitation? In CHI 2011 Embodied Interaction: Theory and Practice in HCI Workshop (pp. 1–4).
Fadjo, C. L., & Black, J. B. (2012). You’re in the game: Direct embodiment and computational artifact construction. In Proceedings of the International Conference of the Learning Sciences: Future of Learning (Vol. 2: Symposia).
Fernaeus, Y., & Tholander, J. (2006). Finding design qualities in a tangible programming space. In CHI 2006 Proceedings (pp. 447–456).
Fishkin, K. P. (2004). A taxonomy for and analysis of tangible interfaces. Personal and Ubiquitous Computing, 8(5), 347–358.
Gnoli, A., Perritano, A., Guerra, P., Lopez, B., Brown, J., & Moher, T. (2014). Back to the future: Embodied classroom simulations of animal foraging. In Proceedings of the 8th International Conference on Tangible, Embedded and Embodied Interaction - TEI ’14 (pp. 275–282).
Hatton, S., Campana, E., Danielescu, A., & Birchfield, D. (2009). Stratification: Embodied poetry works by high school students. In Proceedings of the 5th ACM Conference on Creativity & Cognition (pp. 463–464).
Holton, D. L. (2010). Constructivism + embodied cognition = enactivism: Theoretical and practical implications for conceptual change. In AERA 2010 Conference.
Hornecker, E., & Buur, J. (2006). Getting a grip on tangible interaction. In Proceedings of the SIGCHI Conference on Human Factors in Computing Systems - CHI ’06.
Howison, M., Trninic, D., Reinholz, D., & Abrahamson, D. (2011). The Mathematical Imagery Trainer: From embodied interaction to conceptual learning. In Proceedings of the 2011 Annual Conference on Human Factors in Computing Systems - CHI ’11. doi:10.1145/1978942.1979230
Hummels, C., & van Dijk, J. (2015). Seven principles to design for embodied sensemaking. In Proceedings of the Ninth International Conference on Tangible, Embedded, and Embodied Interaction - TEI ’15. doi:10.1145/2677199.2680577
Ionescu, T., & Vasc, D. (2014). Embodied cognition: Challenges for psychology and education. Procedia - Social and Behavioral Sciences, 128, 275–280.
Ishii, H. (2008). Tangible bits: Beyond pixels. In Proceedings of the 2nd International Conference on Tangible and Embedded Interaction (TEI ’08).
Johnson-Glenberg, M. C., Birchfield, D. A., Tolentino, L., & Koziupa, T. (2014). Collaborative embodied learning in mixed reality motion-capture environments: Two science studies. Journal of Educational Psychology, 106(1), 86–104.
Kelliher, A., Birchfield, D., Campana, E., Hatton, S., Johnson-Glenberg, M., Martinez, C., … Uysal, S. (2009). SMALLab: A mixed-reality environment for embodied and mediated learning. In MM ’09 - Proceedings of the 2009 ACM Multimedia Conference, with Co-located Workshops and Symposiums (pp. 1029–1031).
Kuzuoka, H., Yamashita, N., Kato, H., Suzuki, H., & Kubota, Y. (2014). Tangible Earth: Tangible learning environment for astronomy education. In Proceedings of the Second International Conference on Human-Agent Interaction (pp. 23–27).
Kynigos, C., Smyrnaiou, Z., & Roussou, M. (2010). Exploring rules and underlying concepts while engaged with collaborative full-body games. In Proceedings of the 9th International Conference on Interaction Design and Children (p. 222).
Lahey, B., Burleson, W., Jensen, C. N., Freed, N., & Lu, P. (2008). Integrating video games and robotic play in physical environments. In Proceedings of the 2008 ACM SIGGRAPH Symposium on Video Games - Sandbox ’08 (Vol. 1, p. 107).
Li, Q. (2012). Understanding enactivism: A study of affordances and constraints of engaging practicing teachers as digital game designers. Educational Technology Research and Development, 60(5), 785–806. doi:10.1007/s11423-012-9255-4
Lindgren, R., Tscholl, M., & Moshell, J. M. (2013). MEteor: Developing physics concepts through body-based interaction with a mixed reality simulation. In Physics Education Research Conference - PERC ’13 (pp. 217–220).
Lyons, L., Silva, B. L., Moher, T., Pazmino, P. J., & Slattery, B. (2013). Feel the burn: Exploring design parameters for effortful interaction for educational games. In Proceedings of the 12th International Conference on Interaction Design and Children - IDC ’13 (pp. 400–403).
Malinverni, L., López Silva, B., & Parés, N. (2012). Impact of embodied interaction on learning processes: Design and analysis of an educational application based on physical activity. In Proceedings of IDC ’12 (pp. 60–69).
Marshall, P., Price, S., & Rogers, Y. (2003). Conceptualising tangibles to support learning. In IDC ’03: Proceedings of the 2003 Conference on Interaction Design and Children (pp. 101–109). doi:10.1145/953536.953551
Mickelson, J., & Ju, W. (2011). Math Propulsion: Engaging math learners through embodied performance & visualization. In Proceedings of the Fifth International Conference on Tangible, Embedded, and Embodied Interaction - TEI ’11 (p. 101).
Moher, T., Hussain, S., Halter, T., & Kilb, D. (2005). Roomquake: Embedding dynamic phenomena within the physical space of an elementary school classroom. In Proceedings of ACM CHI 2005 Conference on Human Factors in Computing Systems (Vol. 2, pp. 1665–1668). doi:10.1145/1056808.1056992
Mora-Guiard, J., & Pares, N. (2014). Child as the measure of all things: The body as a referent in designing a museum exhibit to understand the nanoscale. In IDC ’14.
Nakayama, T., Adachi, T., Muratsu, K., Mizoguchi, H., Namatame, M., Sugimoto, M., … Takeda, Y. (2014). Human SUGOROKU: Learning support system of vegetation succession with full-body interaction interface. In Proceedings of the SIGCHI Conference on Human Factors in Computing Systems (CHI ’14) (pp. 2227–2232).
Novellis, F., & Moher, T. (2011). How real is “real enough”? Designing artifacts and procedures for embodied simulations of science practices. In Proceedings of the 10th International Conference on Interaction Design and Children (pp. 90–98).
O’Malley, C., & Fraser, S. (2004). Literature Review in Learning with Tangible Technologies.
Oullier, O., de Guzman, G. C., Jantzen, K. J., Lagarde, J., & Kelso, J. A. S. (2008). Social coordination dynamics: Measuring human bonding. Social Neuroscience, 3(2).
Paul, F. C., Goh, C., & Yap, K. (2015). Get creative with learning: Word Out! A full body interactive game. In Proceedings of the 33rd Annual ACM Conference Extended Abstracts on Human Factors in Computing Systems - CHI EA ’15.
Plaisant, C., Carr, D., & Shneiderman, B. (1995). Image-browser taxonomy and guidelines for designers. IEEE Software, 12(2), 21–32.
Plass, J. L., Moreno, R., & Brünken, R. (2010). Cognitive Load Theory. Cambridge University Press.
Plass, J. L., O’Keefe, P. A., Homer, B. D., Case, J., Hayward, E. O., Stein, M., & Perlin, K. (2013). The impact of individual, competitive, and collaborative mathematics game play on learning, performance, and motivation. Journal of Educational Psychology, 105(4), 1050–1066. doi:10.1037/a0032688
Pouw, W. T. J. L., van Gog, T., & Paas, F. (2014). An embedded and embodied cognition review of instructional manipulatives. Educational Psychology Review, 26(1), 51–72. doi:10.1007/s10648-014-9255-5
Price, B. A., Baecker, R. M., & Small, I. S. (1993). A principled taxonomy of software visualization. Journal of Visual Languages & Computing, 4(3), 211–266.
Price, S. (2008). A representation approach to conceptualizing tangible learning environments. In Proceedings of the 2nd International Conference on Tangible and Embedded Interaction - TEI ’08 (p. 151). doi:10.1145/1347390.1347425
Price, S., & Jewitt, C. (2013). A multimodal approach to examining “embodiment” in tangible learning environments. In Proceedings of TEI ’13 (pp. 43–50).
Price, S., Rogers, Y., Scaife, M., Stanton, D., & Neale, H. (2003). Using tangibles to promote novel forms of playful learning. Interacting with Computers, 15(2), 169–185. doi:10.1016/S0953-5438(03)00006-7
Rieser, J. J., Garing, A. E., & Young, M. F. (1994). Imagery, action, and young children’s spatial orientation: It’s not being there that counts, it’s what one has in mind. Child Development, 65(5), 1262. doi:10.2307/1131498
Rikić, M. (2013). Buildasound. In Proceedings of the 7th International Conference on Tangible, Embedded, and Embodied Interaction - TEI ’13 (pp. 395–396).
Robinett, W. (1992). Synthetic Experience: A Taxonomy, Survey of Earlier Thought, and Speculations on the Future. Technical report.
Rogers, Y., Scaife, M., Gabrielli, S., Smith, H., & Harris, E. (2002). A conceptual framework for mixed reality environments: Designing novel learning activities for young children. Presence, 11(6), 677–686.
Rohrer, T. (2007). The body in space: Dimensions of embodiment. In Body, Language and Mind (pp. 339–378).
Schweikardt, E., & Gross, M. (2008). The robot is the program: Interacting with roBlocks. In Proceedings of the Second International Conference on Tangible, Embedded, and Embodied Interaction - TEI ’08 (pp. 167–168).
Shapiro, L. (2010). Embodied Cognition. Routledge.
Shelley, T., Lyons, L., Zellner, M., & Minor, E. (2011). Evaluating the embodiment benefits of a paper-based TUI for spatially sensitive simulations. In Extended Abstracts of the 2011 Conference on Human Factors in Computing Systems (p. 1375). doi:10.1145/1979742.1979777
Skulmowski, A., Pradel, S., Kühnert, T., Brunnett, G., & Rey, G. D. (2016). Embodied learning using a tangible user interface: The effects of haptic perception and selective pointing on a spatial learning task. Computers & Education, 92–93, 64–75.
Streeck, J., Goodwin, C., & LeBaron, C. (2011). Embodied Interaction: Language and Body in the Material World. Cambridge University Press.
Tolentino, L., Savvides, P., & Birchfield, D. (2010). Applying game design principles to social skills learning for students in special education. In Proceedings of FDG ’10.
Wei, C., Chen, H., & Chen, N. (2015). Effects of embodiment-based learning on perceived cooperation process and social flow. In 7th World Conference on Educational Sciences (pp. 608–613). Elsevier B.V.
Wilson, M. (2002). Six views of embodied cognition. Psychonomic Bulletin & Review, 9(4), 625–636. doi:10.3758/BF03196322
Wyeth, P. (2008). How young children learn to program with sensor, action, and logic blocks. Journal of the Learning Sciences, 17(4), 517–550.
Wyeth, P., & Purchase, H. C. (2002). Tangible programming elements for young children. In CHI ’02 Extended Abstracts on Human Factors in Computing Systems (p. 774). doi:10.1145/506443.506591
Yannier, N., Koedinger, K. R., & Hudson, S. E. (2013). Tangible collaborative learning with a mixed-reality game: EarthShake. Artificial Intelligence in Education.
Yap, K., Zheng, C., Tay, A., Yen, C.-C., & Do, E. Y.-L. (2015). Word Out! Learning the alphabet through full body interactions. In Proceedings of the 6th Augmented Human International Conference - AH ’15 (pp. 101–108).
Zaman, B., Vanden Abeele, V., Markopoulos, P., & Marshall, P. (2012). Editorial: The evolving field of tangible interaction for children: The challenge of empirical validation. Personal and Ubiquitous Computing, 16, 367–378.
Ziemke, T. (2003). What’s that thing called embodiment? In Proceedings of the 25th Annual Meeting of the Cognitive Science Society (pp. 1305–1310).
... Do they actually improve problem-solving ability, performance, and positive emotional responses needed for STEM learning and creativity, or do they simply function as chocolate-covered broccoli? Furthermore, the similar design choices of most programming puzzle games-which focus on single player touchpad/mouse and keyboard experiences [14,31]-may be neglecting recent research highlighting the efficacy of alternative body-centric, physically embodied theories and approaches (i.e., embodied cognition, embodied interaction, and enactment) in helping learners make meaning, problem-solve, and improve positive emotional responses [22,30,31,36,38]. Two robust and common physical embodiment approaches within HCI and Learning Science communities are tangibles/manipulatives [33,36] and augmented reality (AR) [6,22]. ...
... The goal of my research is to explore how the diverse affordances of various forms of physical embodiment differentially impact meaning-making processes, enhance positive emotional responses for learners, and improve performance in problem-solving tasks [28][29][30][31]. This will be done through controlled comparison of ...
... The goal of this research is to explore if applying physically embodied designs results in improved learning outcomes for core CT skills, increased problem-solving performance and ability, and positive emotional responses that are desirable for learning and creativity (e.g., enjoyment, motivation, situational interest, etc.). I have already laid the theoretical basis for this examination through creation of a design framework [29] and categorization of existing designs [30] for embodied learning games and simulations. Additionally, I conducted a preliminary study comparing tangible programming blocks to mouse input, looking at impacts upon players' programming self-beliefs and enjoyment [31]. ...
Conference Paper
Full-text available
One prominent aspect of programming skills/expertise is that it requires the use of many creative processes such as problem-solving, visualization, reflection, motivation and handling failure. While there have been a variety of puzzle-based educational programming games created to help teach learners these skills, few have been evaluated to assess their efficacy in developing programming and problem-solving skills or improving learners' positive emotional responses. Furthermore, most games are designed for a single player touchpad/mouse experience. This is problematic when trying to understand the validity of these designs, and whether there are alternative physically embodied design approaches that may prove more effective. My dissertation work helps address this problem. After creating a framework based on a meta-review that carefully dissects embodiment strategies in learning games, I am creating and evaluating tangible and augmented reality versions of a programming game. I plan to examine how these different forms of physical interaction help to facilitate and enhance meaning-making, problem-solving, and positive emotions.
... Furthermore, the methodical filling-in of this structure helps to categorize existing concepts, identify problematic design spaces, differentiate ideas, and identify unexplored terrain [61]. Some examples of taxonomic design frameworks include [61,72,172,173]. One notable drawback of taxonomical design frameworks is that they do not elucidate the relations between concepts as other types of frameworks can. ...
... Therefore, my programming interface (Figure 3-1C) similarly adopts usage of block-based manipulatives to convey programming concepts both digitally (e.g., commands are represented as digital blocks, and specific commands such as loops indicate syntax using an arrow with automatic indentation of the looped command) and physically (in the design of the physical blocks which are shown in Figure 3-3). The blocks themselves utilize a similar design approach to [101,107,137,266,279], taking advantage of many physical affordances that tangibles provide [172,173,204,210]. For instance, there are several aspects of the block's physical properties intended to eliminate syntax errors and more clearly illustrate programming concepts. ...
... The tangible programming blocks were created to employ embodied interaction and take advantage of many physical affordances that tangibles provide [172,173,204,210]. ...
Thesis
Full-text available
Embodiment is a concept that has gained widespread usage within the Human-Computer Interaction (HCI) community in recent years. In a general sense, embodiment is the notion that cognition arises not just from the mind, but also through our bodies‘ physical and social interactions with the world around us. HCI has employed this body-centric approach to the design of technology in a variety of domains, including interaction design, robotics, music systems, and education. However, due to the broad number of academic domains that define and operationalize embodiment within HCI (e.g., cognitive science, social science, learning science, neuroscience, AI, robotics, and so forth), it has become a remarkably fuzzy term with little understood about what designs result in desired outcomes. Essentially, HCI researchers and practitioners often employ a black box of design decisions when creating their embodied systems. Notably, the inconsistent framing and application of embodiment within HCI is a substantial drawback when trying to design embodied technology to support particular use cases such as learning, where understanding the 'why' of outcomes is essential. In this dissertation, I contribute work towards opening up the black box of embodied design to develop a more precise understanding of its proper application for the development of learning technology. This was done through the creation of a taxonomical design framework that outlines key methods for incorporating embodiment into the design of educational games and simulations. In order to create the design framework, I collected over 60 exemplars of embodied learning games and simulations, followed by the application of a bottom up, open coding method to distill seven core design dimensions. 
I then demonstrated the design framework‘s importance for a variety of HCI use cases including 1) categorizing existing embodied educational technologies, 2) identifying problematic design spaces, and 3) identifying design gaps for the generation of novel embodied learning systems. I also further employed the design framework to develop my own embodied learning system, Bots & (Main)Frames, which teaches basic programming and computational thinking skills through the use of tangibles. In order to better understand when and how embodied tangible technology can aid learning, I built two versions of Bots & (Main)Frames that only differed in input method (non-embodied mouse vs. embodied tangible programming blocks), while keeping all other game mechanics, aesthetics, and so forth identical. I then conducted two controlled experimental studies to compare differences between the two versions of Bots & (Main)Frames. My results show that an embodied tangible design had far greater positive impact for a number of key learning factors including programming self-beliefs, situational interest, enjoyment, and overall learning/performance outcomes. The quantitative and qualitative findings from these studies make key advances toward understanding when and how embodied tangible technology can aid in learning computational thinking skills.
... Looking deeper into the theoretical concepts underlying this black box reveals a commonly used set of terms: embodiment, embodied cognition, and embodied interaction to name a few. Unfortunately, embodiment and related terms are remarkably broad and fuzzy constructs-with definitions and operationalization that differs drastically depending upon the academic domain in which they are utilized [92,94]. This large breadth of interpretation for "embodiment" is a likely cause of the seemingly ambiguous design choices underlying many educational games and simulations employing physical interactions. ...
... Furthermore, the methodical filling-in of this structure helps to categorize existing concepts, identify problematic design spaces, differentiate ideas, and identify unexplored terrain [38]. Some examples of taxonomic design frameworks include [38,46,92,94]. However, one notable drawback of taxonomical design frameworks is that they do not elucidate the relations between concepts as other types of frameworks can. ...
Chapter
Full-text available
In recent years, embodiment—the notion that cognition arises not just from the mind, but also through our bodies’ physical and social interactions with the world around us—has become an important concept to enhance the design of educational games and simulations within communities such as Human–Computer Interaction (HCI), games, and learning science. However, the broad number of academic domains that define and operationalize embodiment differently has often led researchers and designers to employ a black box of design decisions—resulting in a substantial breadth of seemingly unrelated educational systems. This inconsistent application of embodiment is a substantial drawback when trying to design embodied technology to support particular use cases such as learning, and ultimately becomes problematic when trying to critically evaluate the usage and effectiveness of embodiment within existing educational designs. In this chapter, we present our work toward opening the black box of embodied design decisions by combining differing conceptual and design approaches for embodied learning games and simulations into a unified taxonomical design framework. We describe the motivation and creation process for the framework, explain its dimensions, and provide examples of its use. Ultimately, such a framework helps to explicate high-level design decisions by providing a unifying foundation for the description and categorization of embodied learning systems, and encourages new designs by identifying problematic design spaces and unexplored research/design terrain. Our design framework will benefit educational game designers and researchers by mapping out a more precise understanding of how to incorporate embodiment into the design of educational games and simulations.
... al, 2018). Several research programs have formed to investigate the embodied nature of mathematics (e.g., Abrahamson 2014; Alibali & Nathan, 2012;Arzarello et al., 2009;De Freitas & Sinclair, 2014;Edwards, Ferrara, & Moore-Russo, 2014;Lakoff & Núñez, 2000;Melcer & Isbister, 2016;Ottmar & Landy, 2016;Radford 2009;Nathan, Walkington, Boncoddo, Pier, Williams, & Alibali, 2014;Soto-Johnson & Troup, 2014;Soto-Johnson, Hancock, & Oehrtman, 2016), demonstrating a "critical mass" of projects, findings, senior and junior investigators, and conceptual frameworks to support an ongoing community of like minded scholars within the mathematics education research community. ...
... Inspired by the PME-NA 2018 theme, we will specifically focus on the ways in which the field of embodied cognition has developed and how new emerging technologies and innovative pedagogies can influence mathematics teaching and learning. This effort will be crafted to align with recent developments in the embodiment literature, and new theoretical frameworks tying various perspectives on embodiment to different forms of physicality in educational technology (Melcer & Isbister, 2016 Examples include: coding videos of pre-service teachers' distributed gestures to explore a mathematical conjecture (Walkington et al., 2018); exploring mathematical transformations while using a dynamic technology tool (Ottmar & Landy, 2016), having students and teachers play and create embodied technology games to teach mathematics and computational thinking (Arroyo et al., 2017;Melcer & Isbister, 2018); usng dual eye tracking (Shvarts & Abrahamson, 2018), and a teacher guiding the movements of a learner exploring ratios (Abrahamson & Sánchez-García, 2016). Through these examples, we will explore questions such as: what role does technology play on supporting the connections between the mind, body, and action? ...
Conference Paper
Full-text available
Embodied cognition is growing in theoretical importance and as a driving set of design principles for curriculum activities and technology innovations for mathematics education. The central aim of the EMIC (Embodied Mathematical Imagination and Cognition) Working Group is to attract engaged and inspired colleagues into a growing community of discourse around theoretical, technological, and methodological developments for advancing the study of embodied cognition for mathematics education. A thriving, informed, and interconnected community of scholars organized around embodied mathematical cognition will broaden the range of activities, practices, and emerging technologies that count as mathematical. EMIC builds upon our prior working groups with a specific focus on how we can leverage emerging technologies to study embodied cognition and mathematics learning. In particular, we aim to develop new theories and extend existing frameworks and perspectives from which EMIC collaboration and activities can emerge.
... Finally, from a strongly applied perspective, Deng et al. [20] and Melcer et al. [21] propose design frameworks oriented toward highlighting relevant concepts that designers need to take into account when designing EIEs for learning. Deng et al. [20] elaborate a collection of cards (Tango Cards) to design tangible learning games. ...
... Specifically, they describe a series of key concepts about learning and interaction design (e.g., multiple modalities, coherent mapping, well-ordered problems, intrinsic rewards, etc.). Melcer et al. [21], instead, propose a taxonomy that focuses on interaction design (e.g., definition of the physical interaction, of the mapping strategy and of the multiuser modality) to guide researchers in taking design decisions to create embodied learning games. ...
Article
Embodied Interaction confronts designers with the challenge of thinking about users and interaction from different viewpoints with respect to traditional technologies. This task is even more complex when designing non-task-oriented systems. We propose a framework to guide researchers in thinking about and designing non-task-oriented Embodied Interaction Environments or, in other words, embodied experiences that users can enjoy for their own sake and not as a means for accomplishing a task or achieving an extrinsic goal. The framework is grounded in experience-centered design approaches and presents four qualities ((1) Spatial, Corporeal and Material Consistency, (2) Contingent Enhancement, (3) Mindful Embodied Engagement and (4) Situated Reflexivity) aimed at providing critical lenses, strategies and techniques to guide the design and research processes. Finally, we discuss how designers can implement the proposed framework in different stages of the design process and paths for future research.
... In contrast to some of these frequent design decisions, there is notable work advocating for the importance of learning through collaboration [6] and physical embodiment/interactions with the body [15,16]. In the context of CS education, pair programming (two programmers working collaboratively at a shared computer on the same task [3]) has been shown to improve self-sufficiency, performance, and overall grades for students in an introductory CS course [18]. ...
... There has been a variety of different approaches, designs, and theories around the application of physical embodiment to learning in educational games. In a meta-review comparing and taxonomizing embodiment strategies, Melcer and Isbister [15,16] identified five forms of physical embodiment commonly utilized in educational games: Direct Embodied, Enacted, Manipulated, Surrogate, and Augmented. In the Manipulated form of embodiment, use of physical objects (e.g., tangibles and manipulatives) allow for learning concepts to be embedded directly into the physical material and design of the object, as well as through the embodied interactions learners have by manipulating these objects [19,20]. ...
Conference Paper
While there are common design decisions in existing games for teaching Computer Science (single-player puzzle-based games for the touchpad/keyboard and mouse), recent work has suggested that alternative approaches such as collaborative play and physically embodied designs may also provide important benefits to learners. In order to explore how making interactions with an educational programming game more physically embodied could impact collaborative play, we created an educational programming game called Bots & (Main)Frames. We then conducted a preliminary study to examine whether the level designs achieved the desired challenge and to explore how two versions of the game with different forms of physical embodiment/input (e.g., mouse vs. tangible programming blocks) impacted player interactions underlying collaboration. We found that game levels seem to provide the desired increasing challenge, and that players often used the mouse and tangible programming blocks to aid communication/collaboration in distinctly different ways.
... In contrast to the ubiquity of mouse/keyboard input, recent research suggests that body-based, physically embodied designs provide affordances that aid in the meaning-making process and offer greater learning benefits than traditional keyboard and mouse input [26,30,32,46]. One common physically embodied design approach utilized in HCI and Learning Science communities is the use of tangibles/physical manipulatives, which have demonstrated many potential benefits over traditional desktop applications. ...
... In a meta-review dissecting embodiment strategies, Melcer and Isbister [25] identified five forms of physical embodiment commonly used in educational games: Direct Embodied, Enacted, Manipulated, Surrogate, and Augmented. Notably, they also found that the Manipulated form of embodiment frequently uses tangibles to teach computational concepts [26]. We similarly employed the affordances of Manipulated embodiment (e.g., applying embodied metaphors and interactions to physical objects [3], and utilizing physical form to better represent learning concepts [18,31]) in the design of the blocks for our tangibles version of Bots & (Main)Frames. ...
Conference Paper
Computer Science (CS) and related skills such as programming and Computational Thinking (CT) have recently become topics of global interest, with a large number of programming games created to engage and educate players. However, there has been relatively limited work to assess 1) the efficacy of such games with respect to critical educational factors such as enjoyment and programming self-beliefs; and 2) whether there are advantages to alternative, physically embodied design approaches (e.g., tangibles as input). To better explore the benefits of a tangible approach, we built and tested two versions of an educational programming game that were identical in design except for the form of interaction (tangible programming blocks vs. mouse input). After testing 34 participants, results showed that while both game versions were successful at improving programming self-beliefs, the tangible version corresponded to higher self-reports of player enjoyment. Overall, this paper presents a comparison between the efficacy of tangible and mouse design approaches for improving key learning factors in educational programming games.
... Second, we aimed to specifically learn what game researchers in personalization and game design have done. While we acknowledge that game research is highly interdisciplinary and that scholars publish in a wide variety of outlets, both game-and disciplinary-based [113,125], most self-identified games researchers will contribute to game-specific outlets. Thus, such outlets should at the very least provide a sense of the state-of-the-art in both personalization and game design. ...
Conference Paper
Personalization has been explored in the context of games in many forms (e.g., dynamic difficulty adjustment, affective video games, adaptive systems, experience-driven PCG, etc.). The majority of techniques used in these fields have relied on data-driven or manual methods for identifying game components to modify for personalization. We propose a theoretical framework for identifying and categorizing low-level components of games that can be personalized. In this paper we first perform a review of game design frameworks and personalization approaches. We then systematically identify the aspects of games which have been utilized for personalization and which components have been identified in game design frameworks as building blocks of games. We synthesize the identified components into categories of game elements. We then propose PEAS, a theoretical framework for personalization through the adaptation of the Player, Environment, Agents, and System.
... The blocks themselves utilize a similar design approach to [26,29,41,88,92], taking advantage of many physical affordances that tangibles provide [50,51,61,63]. For instance, there are several aspects of the block's physical properties intended to eliminate syntax errors and more clearly illustrate programming concepts. ...
Conference Paper
While recent work has begun to evaluate the efficacy of educational programming games, many common design decisions in these games (e.g., single player gameplay using touchpad or mouse) have not been explored for learning outcomes. For instance, alternative design approaches such as collaborative play and embodied interaction with tangibles may also provide important benefits to learners. To better understand how these design decisions impact learning and related factors, we created an educational programming game that allows for systematically varying input method and mode of play. In this paper, we describe design rationale for mouse and tangible versions of our game, and report a 2x2 factorial experiment comparing efficacy of mouse and tangible input methods with individual and collaborative modes of play. Results indicate tangibles have a greater positive impact on learning, situational interest, enjoyment, and programming self-beliefs. We also found collaborative play helps further reduce programming anxiety over individual play.
Article
Materialist design is presented as an embodied perspective on educational design that can be applied to the redesign of classroom-based learning environments. Materialist design is informed by a framework of materialist epistemology, which positions material innovation on equal footing with symbol-based formal theory. Historical examples of Einstein’s conceptual reliance on trains for his Theory of Relativity and the Wright brothers’ use of wind tunnels in aeronautics illustrate how materialist design drives progress on complex design problems. A key aspect is the application of scale-down methodology, where complex systems are reconceptualized as interactions among nearly decomposable subsystems that can be redesigned and integrated back into the entire system. The application of materialist design is illustrated with the redesign of an embodied video game that uses real-time motion capture technology to promote high school geometry reasoning and proof, following its use in an ethnically and linguistically diverse classroom. Our embodied perspective offers particular insights for understanding and implementing designs of complex learning environments, and assessing their influences on educational practices and student outcomes.
Conference Paper
We report on a nine-month-long observational study with teachers and students with autism in a classroom setting. We explore the impact of motion-based activities on students' behavior. In particular, we examine how the playful gaming activity impacted students' engagement, peer-directed social behaviors, and motor skills. We document the effectiveness of a collaborative game in supporting initiation of social activities between peers, and in eliciting novel body movements that students were not observed to produce outside of game play. We further identify the positive impact of game play on overall classroom engagement. This includes an "audience effect" whereby non-playing peers direct initiations to those playing the game and vice versa, and a positive "spillover" effect of the activity on students' social behavior outside of game play. We identify key considerations for designing and deploying motion-based activities for children with autism in a classroom setting.
Article
Peer interaction plays an important role during the learning process. The currently predominant paradigm of digital learning materials limits learners’ interaction with their peers through traditional user interfaces. To cope with this problem, this study proposes an embodiment-based learning environment to facilitate more peer interactions. The effectiveness of the environment is evaluated by designing an electronic circuit learning activity and through a control-experimental group experiment, including 80 voluntary participants randomly assigned to the “embodiment-based learning group” and “traditional learning group.” Three variables were assessed: learning performance, perceived cooperative perception, and social flow. Results show that there is no significant difference between the two groups in learning performance; however, participants in the embodiment-based learning group reported higher perceived cooperation and social flow during the learning process than the traditional learning group.
Chapter
Research from a large number of fields has recently come together under the rubric of embodied cognitive science. Embodied cognitive science attempts to show specific ways in which the body shapes and constrains thought. I enumerate the standard variety of usages that the term "embodiment" currently receives in cognitive science and contrast notions of embodiment and experientialism at a variety of levels of investigation. The purpose is to develop a broad-based theoretic framework for embodiment which can serve as a bridge between different fields. I introduce the theoretic framework using examples that trace related research issues such as mental imagery, mental rotation, spatial language and conceptual metaphor across several levels of investigation. As a survey piece, this chapter covers numerous different conceptualizations of the body ranging from the physiological and developmental to the mental and philosophical; theoretically, it focuses on questions of whether and how all these different conceptualizations can form a cohesive research program.
Conference Paper
We explore the potential of bringing together the advantages of computer games and the physical world to increase engagement, collaboration and learning. We introduce EarthShake: a tangible interface and mixed-reality game consisting of an interactive multimodal earthquake table, block towers and a computer game synchronized with the physical world via depth camera sensing. EarthShake helps kids discover physics principles while experimenting with real blocks in a physical environment supported with audio and visual feedback. Students interactively make predictions, see results, grapple with disconfirming evidence and formulate explanations in the form of general principles. We report on a preliminary user study with 12 children, ages 4-8, indicating that EarthShake produces large and significant learning gains, improvement in explanation of physics concepts, and clear signs of productive collaboration and high engagement.
Book
Cognitive load theory (CLT) is one of the most important theories in educational psychology, a highly effective guide for the design of multimedia and other learning materials. This edited volume brings together the most prolific researchers from around the world who study various aspects of cognitive load to discuss its current theoretical as well as practical issues. The book is divided into three parts. The first part describes the theoretical foundations and assumptions of CLT, the second discusses the empirical findings about the application of CLT to the design of learning environments, and the third part concludes the book with discussions and suggestions for new directions for future research. It aims to become the standard handbook in CLT for researchers and graduate students in psychology, education, and educational technology.
Conference Paper
A prototype multimodal tablet application for learning the physics of motion has been developed, tested and evaluated. By moving a finger across the screen, the user can map its position and velocity in real time as graphs. The learning outcome of those test subjects using the application was compared to that of a group that did not use the application but had it demonstrated to them while receiving an explanation of all the physics involved. There was a small but not significant difference in performance between these groups on a post-test. However, a larger (arguably significant) difference was seen between the male and female test subjects for the subset of questions of a more analytical nature. These were the questions targeted in this paper.
Article
To support astronomy education, we developed a tangible learning environment called the tangible earth system. To clarify its problems, we defined an assessment framework from the aspects of curriculum guidelines, design guidelines of tangible learning environments, and epistemology of agency. Based on the analysis of our small-scale user study, we identified problems of the system in terms of location, dynamics, and correspondence parameters.
Article
Tangible User Interfaces offer new ways of interaction with virtual objects, yet little research has been conducted on their learner-friendly design in the context of spatial learning. Although frameworks such as Embodied Cognition stress the importance of sensory perception and movement, studies have found that high interactivity can be overwhelming and may lead to a lower learning performance. In a 2 × 2 factorial design participants (n = 96) learned heart anatomy using a 3D model that was either controlled using a mouse or a tangible object, i.e. a motion tracked plastic model of the virtual heart. Secondly, we varied the interaction mode featuring either a selective pointing mode in which only the label that the user currently activated was displayed or permanent display of all labels. Retention performance, cognitive load scores, and motivation measures indicate that the tangible object leads to significantly higher learning outcomes. The effect of the label display mode is different for the two input devices: The performance with selective pointing in the mouse condition is better than the performance with permanent display in the mouse condition; in the TUI condition this is exactly the other way around. Based on these results, we propose extensions for Embodied Cognition and Cognitive Load Theory.
Article
Museums are exploring new ways of using emerging digital technologies to enhance the visitor experience. In this context, our research focuses on designing, developing and studying interactions for museum exhibits that introduce cultural concepts in ways that are tangible and embodied. We introduce here a tangible tabletop installation piece that was designed for a museum exhibition contrasting Western and African notions of mapping history and place. Inspired by the Lukasa board, a mnemonic device used by the Luba peoples in Central Africa, the tabletop piece enables visitors to learn and understand symbolic and nonlinguistic mapping concepts that are central to the Lukasa by creating and sharing stories with each other. In this paper we share our design process, a user study focusing on children and learning, and design implications on how digital and tangible interaction technologies can be used for cultural learning in museum exhibits.