Bridging the Physical Divide: A Design
Framework for Embodied Learning
Games and Simulations
Abstract
Existing embodied learning games and simulations utilize a large breadth of design approaches that often result in the creation of seemingly unrelated systems.
This becomes problematic when trying to critically
evaluate the usage and effectiveness of embodiment
within embodied learning designs. In this paper, we
present our work on combining differing conceptual and
design approaches for embodied learning systems into
a unified design framework. We describe the creation
process for the framework, explain its dimensions, and
provide two examples of its use. Our embodied learning
games and simulations framework will benefit HCI
researchers by providing a unifying foundation for the
description, categorization, and evaluation of embodied
learning systems and designs.
Author Keywords
Embodiment; embodied learning; design framework.
ACM Classification Keywords
H.5.m. Information interfaces and presentation (e.g., HCI): Miscellaneous.
Edward F Melcer
New York University
Brooklyn, NY 11201, USA
University of California, Santa Cruz
Santa Cruz, CA 95064, USA

Permission to make digital or hard copies of part or all of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for third-party components of this work must be honored. For all other uses, contact the Owner/Author. Copyright is held by the owner/author(s).
CHI'16 Extended Abstracts, May 07-12, 2016, San Jose, CA, USA

Introduction
Recent work on educational systems has shown the benefits of incorporating physicality, motion, and embodiment into designs. This direction stems from the
concept that cognition does not only occur in the mind
but is also supported by bodily activity; situated in and
interacting with our physical and social environment [7,
8]. Embodied learning approaches suggest many potential benefits, such as improved spatial recall, intuitive interactions and mappings, increased engagement and positive feelings [47, 49], and enhanced collaboration [40, 49].
However, when examining existing embodied learning
systems closely, we find a large breadth of designs that
result in seemingly unrelated systems (see Figure 1).
This becomes problematic when trying to understand
where and how embodiment occurs in these systems,
and which design elements help to facilitate embodied
learning. The problem is further aggravated by the limited empirical validation of many systems, and by the broad conceptual usage of embodiment and related terms across diverse domains such as HCI, learning science, neuroscience, and philosophy [4, 44, 51].
For designers seeking to utilize embodiment, the
differences in approach to physicality, motion, and
interaction pose a significant hurdle. One solution that
can bridge conceptual differences between existing
systems and research domains is the creation of a
design framework [10, 42].
Our goal in providing an embodied learning design framework is to bridge these conceptual gaps, and the divergent design choices that result from the differing uses of embodiment across domains.

Background
In this section we present an overview of design frameworks, embodiment, and embodied learning taxonomies.
Design Frameworks
Design frameworks can help designers conceptualize the nuances of particular technologies and formalize the creative process. In interface design, design frameworks have been used to provide terminology for categorizing ideas as well as to organize complex concepts into logical hierarchies. By treating a set of taxonomical terms as orthogonal dimensions in a design space, the resulting matrix provides structure for the classification and comparison of designs. Such a framework offers a means to critically examine the designs of existing systems and to encourage new designs by providing a unifying foundation for the description and categorization of systems. Furthermore, methodically filling in this structure helps to categorize existing concepts, differentiate ideas, and identify unexplored terrain.
Embodiment and Embodied Learning Taxonomies
Embodiment and related terms such as embodied
cognition and embodied interaction have many different
interpretations and applications across a wide range of
academic domains. HCI tends to view embodiment as the way that physical and social phenomena unfold in real time and space as a part of the world in which we are situated [8]. However, learning science views tend to be more oriented on the body as the central focus of embodiment [19, 44]. Furthermore, Ziemke has noted this divide in his work identifying six different uses of embodiment across research domains [51].
Similar to the many interpretations of embodiment,
embodied learning frameworks and taxonomies also
have vastly different interpretations of physicality,
motion, and interaction. Johnson-Glenberg et al. created an embodied learning taxonomy that specifies the strength of embodiment as a combination of the amount of motoric engagement, gestural congruency to the learning content, and immersion [19].

Figure 1: A spectrum of different embodied learning systems. Top to bottom: Interactive Slide, Electronic Blocks, Embodied Poetry, SpatialEase, Eco…

Black et al. created the Instructional Embodiment Framework (IEF) [5], which consists of various forms of physical embodiment (e.g., direct, surrogate, and augmented) as well as imagined embodiment (e.g., explicit and implicit), where the individual can embody action and perception through imagination. In the Tangible Embodied Interaction (TEI) field, Fishkin's taxonomy for the analysis of tangible interfaces views embodiment as the distance between input and output, where embodiment can be full, nearby, environmental, or distant [14].
Towards A Design Framework for Embodied
Learning Games and Simulations
Creating the Design Framework
To create our design framework, we conducted an
extensive literature review for published examples of
embodied learning games and simulations in venues
such as CHI, TEI, and Interaction Design and Children
(IDC). We also examined related frameworks and
taxonomies in subdomains and communities such as
TEI [14, 31, 39], embodiment and embodied learning
[5, 19], and mixed reality [10, 43]. Our final list
contains a total of 42 papers describing designs for
different embodied learning games and simulations. We
then performed bottom-up, open coding, similar to prior work, in order to distill a set of 25 candidate dimensions that
fit concepts found in the reviewed literature. Next, we
iteratively reduced and combined candidate dimensions
into a set small enough for a concise framework. We
then presented our framework to experts in HCI, game design, and learning science for feedback and made additional refinements. The final design framework consists of seven dimensions, shown in Table 1.
We further organized the dimensions into three groups
based on their overarching themes.
Table 1: Our design framework for embodied learning systems. Similar dimensions are clustered under a group based on an overarching theme, and the different values for each dimension are shown.
Design Space Dimensions
Physicality describes how learning is physically embodied in a system and consists of five distinct values. 1) The embodied value refers to an embodied cognition and learning science approach where the body plays the primary constituent role in cognition. This form of embodiment focuses on gestural congruency and how the body can physically represent learning concepts; for instance, a full-body interaction game where players contort their bodies to match letters shown on a screen. 2) The enacted value refers to Direct Embodiment from the IEF [5], and to enactivism, which focuses on knowing as physically doing [16, 24]. This form of embodiment focuses more on acting/enacting out knowledge through the physical performance of statements or sequences; for example, a gravitational physics game where players walk along the trajectory an asteroid would travel in the vicinity of planets and their gravitational forces. 3) The manipulated value refers to the tangible embodied interactions of TEI and the use of manipulatives in learning science. This form of embodiment arises from embodied metaphors and interactions with physical objects, and from the objects' physical embodiment of learning concepts. 4) The surrogate value refers to the IEF concept of Surrogate Embodiment, where learners manipulate a physical agent or "surrogate" representative of themselves to enact learning concepts. This form of embodiment is often used in systems with an interactive physical environment that is directly tied to a real-time virtual simulation [15, 21]. 5) The augmented value refers to the IEF notion of Augmented Embodiment, where the combined use of a representational system (e.g., an avatar) and an augmented feedback system (e.g., a Microsoft Kinect and TV screen) embeds the learner within an augmented reality system. This form of embodiment is most commonly found in systems where learners physically control digital avatars in augmented environments [22, 29].
Transforms conceptualize a space by describing the relationships between physical or digital actions and physical or digital effects. We utilize the transform types Physical action => Physical effect (PPt), Physical action => Digital effect (PDt), and Digital action => Physical effect (DPt) from Rogers et al. [43] to describe the many forms of existing systems.
Mapping borrows the notion of Location from Price's tangible learning framework [39], which describes the different spatial locations of output in relation to the object or action triggering the effect (i.e., how input is spatially mapped to output). Mappings can be
discrete—input and output are located separately (e.g.,
an action triggers output on a nearby screen); co-
located—input and output are contiguous (e.g., an
action triggers output that is directly adjacent or
overlaid); and embedded—input and output are
embedded in the same object.
Correspondence builds upon the notion of Physical Correspondence from Price's tangible learning framework [39], which refers to the degree to which the physical properties of objects are closely mapped to the learning concepts. We expand this dimension to include physical actions as well as objects. Correspondence can
be symbolic—objects and actions act as common
signifiers to the learning concepts; or literal—physical
properties and actions are closely mapped to the
learning concepts and metaphor of the domain.
Mode of Play specifies how individuals socially interact
and play within a system. The system can facilitate
individual, collaborative, or competitive play for the
learner(s). Plass et al. [36] found different learning benefits for each mode of play, suggesting it is an important dimension when considering learning outcomes.
Coordination highlights how individuals in a system may have to socially coordinate their actions in order to successfully complete learning objectives. Social coordination can occur with other players or NPCs, or may be absent from the design entirely.
Environment refers to the learning environment in which the educational content is situated. Environments can be physical, mixed, or virtual. While transforms conceptualize a space through descriptions of actions and effects, the environment focuses on the actual space where learning occurs. For instance, a PDt transform can occur in drastically different learning environments (see Figure 2). In some systems, a player's physical actions are tracked but only used as input to control a virtual character in a virtual environment. In other systems, the player's physical actions are tracked and mapped to control digital effects overlaid on an augmented physical space or mixed reality environment.
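Taken together, the seven dimensions can be treated as a small classification scheme. The sketch below is our own illustrative encoding in Python, not part of the framework itself; the `EmbodiedSystem` record and the sample classification of the Tangible Earth system (discussed in Example 1) are hypothetical readings of the descriptions in the text.

```python
from dataclasses import dataclass
from enum import Enum

# Dimension values taken directly from the framework's descriptions above.
Physicality = Enum("Physicality", "EMBODIED ENACTED MANIPULATED SURROGATE AUGMENTED")
Transform = Enum("Transform", "PPT PDT DPT")  # action => effect types from Rogers et al.
Mapping = Enum("Mapping", "DISCRETE CO_LOCATED EMBEDDED")
Correspondence = Enum("Correspondence", "SYMBOLIC LITERAL")
ModeOfPlay = Enum("ModeOfPlay", "INDIVIDUAL COLLABORATIVE COMPETITIVE")
Coordination = Enum("Coordination", "OTHER_PLAYERS NPCS NONE")
Environment = Enum("Environment", "PHYSICAL MIXED VIRTUAL")

@dataclass
class EmbodiedSystem:
    """One system described as a point in the seven-dimension design space."""
    name: str
    physicality: Physicality
    transform: Transform
    mapping: Mapping
    correspondence: Correspondence
    mode_of_play: ModeOfPlay
    coordination: Coordination
    environment: Environment

# Hypothetical classification of Tangible Earth, inferred from Example 1:
# a surrogate avatar whose physical manipulation drives a VR simulation.
tangible_earth = EmbodiedSystem(
    "Tangible Earth", Physicality.SURROGATE, Transform.PDT, Mapping.DISCRETE,
    Correspondence.LITERAL, ModeOfPlay.INDIVIDUAL, Coordination.NONE,
    Environment.MIXED)
```

Encoding systems this way makes the comparisons in the next section mechanical: two systems differ exactly where their dimension values differ.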
Applying the Design Framework for
Embodied Learning Games and Simulations
Example 1 - Identifying Problematic Design Spaces
One benefit of our design framework is that it allows us
to systematically examine design elements of existing
systems, identifying potential problematic design
spaces. As an example of this usage, we examine the
Tangible Earth system (see Figure 3) where the authors
had to create an assessment framework to identify
problems the system encountered [21]. Tangible Earth is designed to support learning of the sun's diurnal motion and the earth's rotation. It consists of a doll-like avatar, a globe and rotating table representing the earth and its rotation, an electric light representing the sun, and a laptop running a VR universe simulator. Learners would physically manipulate the rotation of the earth and the position/rotation of the avatar to observe simulated changes in the sun's position from the avatar's perspective.

Figure 4: Problematic design space identified by Tangible Earth.

One of the more significant problems identified in [21] for Tangible Earth was that learners spent very little
time looking at the tangibles themselves (e.g., globe,
lamp, and avatar), instead focusing primarily on the VR
simulation in the laptop. This proved to be especially
problematic for manipulation of the avatar, where users
would frequently forget the orientation of its body or
head. This often caused the sun to appear or disappear unexpectedly in the simulation, confusing learners and obscuring the learning concepts. By analyzing this issue with our
design framework, we identified a potential problematic
design space (see Figure 4). Learners had difficulty
remembering the position of a physical agent
representative of themselves (surrogate embodiment)
because all of their physical actions were mapped to
digital effects (PDt) in a simulated virtual world. This
difficulty makes sense considering that remembering the physical position/orientation of a surrogate avatar in both the real world and the virtual world would introduce a significant amount of extraneous cognitive load [35]. As a result, our design framework suggests that the intersection of surrogate embodiment and PDt transforms is a problematic design space that should be carefully considered when designing future embodied learning systems.

Figure 2: Two systems illustrating PDt transforms in different learning environments. Top: physical actions are mapped as input into a virtual environment. Bottom: physical actions are mapped as input into a mixed reality environment overlaid on physical space.

Figure 3: Tangible Earth embodied learning system [21].
Example 2 - Identifying Design Gaps
Another benefit of our design framework is that it
allows us to methodically fill in the framework with
existing systems to identify gaps and unexplored
terrain. As an illustration of this usage, we fill in
example pairings between the two dimensions of
physicality and transforms. This provides examples of
relevant combinations between these two dimensions in
the embodied learning systems literature (see Table 2).
Table 2: Example pairings between the physicality and
transform categories of Physical Interaction.
Examining Table 2, we find several design gaps for existing embodied learning games and simulations. Some of the more potentially useful pairings in the identified design gaps are Embodied + PPt, Manipulated + DPt, Surrogate + PPt, and Surrogate + DPt; interesting future system designs could evolve from utilizing one of these pairings. For instance, using a Surrogate + PPt pairing could lead to the design of physical embodied learning board games. Additionally, a Surrogate + DPt pairing could lead to an asymmetric computational thinking game where one player controls and interacts with a physical avatar while another player digitally designs the physical courses and obstacles for the first player to complete.
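The gap-finding step behind Table 2 can be sketched mechanically: enumerate every physicality x transform pairing and subtract the pairings already covered by existing systems. The `observed` set below is a partial, assumed stand-in for Table 2's contents, chosen only so that the gaps named above fall out of the computation.

```python
from itertools import product

physicality = ["embodied", "enacted", "manipulated", "surrogate", "augmented"]
transforms = ["PPt", "PDt", "DPt"]

# Assumed stand-in for the pairings filled in from the literature (Table 2).
observed = {("embodied", "PDt"), ("enacted", "PDt"), ("manipulated", "PPt"),
            ("manipulated", "PDt"), ("surrogate", "PDt"), ("augmented", "PDt")}

# Design gaps are all dimension pairings not covered by an existing system.
gaps = sorted(set(product(physicality, transforms)) - observed)
for p, t in gaps:
    print(f"{p} + {t}")
```

The same subtraction generalizes to any pair of framework dimensions, which is how the "methodical filling-in" of the design space surfaces unexplored terrain.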
Conclusion and Future Work
In this paper, we presented our design framework for
embodied learning games and simulations. A design
framework allows us to systematically understand,
analyze, and differentiate design elements of existing
embodied learning systems. This ultimately aids us in
determining where and how embodiment occurs in an
educational system, and guides the application of
specific design choices in future work. Future work will
build systems addressing design gaps found by this
framework, and test the efficacy and learning outcomes
of these systems. In broader application, this design framework can also be used to guide the construction of systems that methodically examine questions of when and how embodied learning should be used within games and simulations, which will help to further ground the framework and clarify its interpretation.
1. Bakker, S. et al. 2012. Embodied metaphors in
tangible interaction design. Personal and Ubiquitous
2. Bakker, S. et al. 2011. MoSo Tangibles: Evaluating
Embodied Learning. Proceedings of the fifth
international conference on Tangible, embedded, and
embodied interaction - TEI ’11 (2011), 85–92.
3. Bartoli, L. et al. 2013. Exploring Motion-based
Touchless Games for Autistic Children's Learning.
Proceedings of the 12th International Conference on
Interaction Design and Children (2013), 102–111.
4. Birchfield, D. et al. 2008. Embodiment, Multimodality,
and Composition: Convergent Themes across HCI and
Education for Mixed-Reality Learning Environments.
Advances in Human-Computer Interaction. 2008,
5. Black, J.B. et al. 2012. Embodied cognition and
learning environment design. Theoretical foundations
of learning environments. 198–223.
6. Chu, J.H. et al. 2015. Mapping Place: Supporting
Cultural Learning through a Lukasa-inspired Tangible
Tabletop Museum Exhibit. Proceedings of the 9th
international conference on Tangible, embedded, and
embodied interaction - TEI ’15 (2015), 261–268.
7. Clark, A. 2008. Supersizing the mind: embodiment,
action, and cognitive extension.
8. Dourish, P. 2001. Where the Action Is: The
Foundations of Embodied Interaction.
9. Edge, D. et al. 2013. SpatialEase: Learning Language
through Body Motion. Proceedings of the SIGCHI
Conference on Human Factors in Computing Systems
(CHI ’13) (2013), 469–472.
10. Ens, B. and Hincapié-Ramos, J.D. 2014. Ethereal
Planes: A Design Framework for 2D Information
Spaces in 3D Mixed Reality Environments. Proceedings
of the 2nd ACM symposium on Spatial user interaction
11. Esteves, A. and Oakley, I. 2011. Design for interface
consistency or embodied facilitation? CHI 2011
Embodied Interaction: Theory and Practice in HCI
Workshop (2011), 1–4.
12. Fadjo, C.L. and Black, J.B. 2012. You’re In the Game:
Direct Embodiment and Computational Artifact
Construction. Proceedings of the International
Conference of the Learning Sciences: Future of
Learning (Vol. 2: Symposia) (2012), 99–109.
13. Fernaeus, Y. and Tholander, J. 2006. Finding Design
Qualities in a Tangible programming space. CHI 2006
Proceedings ’06 (2006), 447–456.
14. Fishkin, K.P. 2004. A taxonomy for and analysis of
tangible interfaces. Personal and Ubiquitous
Computing. 8, 5 (2004), 347–358.
15. Gnoli, A. et al. 2014. Back to the future: Embodied
Classroom Simulations of Animal Foraging.
Proceedings of the 8th International Conference on
Tangible, Embedded and Embodied Interaction - TEI
’14 (2014), 275–282.
16. Holton, D.L. 2010. Constructivism + embodied
cognition = enactivism: theoretical and practical
implications for conceptual change. AERA 2010
17. Howison, M. et al. 2011. The mathematical imagery
trainer: From Embodied Interaction to Conceptual
Learning. Proceedings of the 2011 annual conference
on Human factors in computing systems - CHI ’11
18. Ishii, H. 2008. Tangible bits: beyond pixels.
Proceedings of the 2nd international conference on
Tangible and Embedded Interaction (TEI '08) (2008).
19. Johnson-Glenberg, M.C. et al. 2014. Collaborative
embodied learning in mixed reality motion-capture
environments: Two science studies. Journal of
Educational Psychology. 106, 1 (2014), 86–104.
20. Kelliher, A. et al. 2009. SMALLab: A mixed-reality
environment for embodied and mediated learning.
MM’09 - Proceedings of the 2009 ACM Multimedia
Conference, with Co-located Workshops and
Symposiums (2009), 1029–1031.
21. Kuzuoka, H. et al. 2014. Tangible Earth: Tangible
Learning Environment for Astronomy Education.
Proceedings of the second international conference on
Human-agent interaction (2014), 23–27.
22. Kynigos, C. et al. 2010. Exploring rules and underlying
concepts while engaged with collaborative full-body
games. Proceedings of the 9th International
Conference on Interaction Design and Children (2010),
23. Lahey, B. et al. 2008. Integrating video games and
robotic play in physical environments. Proceedings of
the 2008 ACM SIGGRAPH symposium on Video games
- Sandbox ’08 (2008), 107.
24. Li, Q. 2012. Understanding enactivism: a study of
affordances and constraints of engaging practicing
teachers as digital game designers. Educational
Technology Research and Development. 60, 5 (2012),
25. Lindgren, R. et al. 2013. MEteor: Developing Physics
Concepts Through Body- Based Interaction With A
Mixed Reality Simulation. Physics Education Research
Conference - PERC ’13 (2013), 217–220.
26. Lyons, L. et al. 2013. Feel the burn: Exploring Design
Parameters for Effortful Interaction for Educational
Games. Proceedings of the 12th International
Conference on Interaction Design and Children - IDC
’13 (2013), 400–403.
27. Malinverni, L. et al. 2012. Impact of Embodied
Interaction on Learning Processes: Design and
Analysis of an Educational Application Based on
Physical Activity. Proceedings of the 11th International
Conference on Interaction Design and Children (2012),
28. Marshall, P. et al. 2003. Conceptualising tangibles to
support learning. IDC ’03: Proceedings of the 2003
conference on Interaction design and children (2003),
29. Nakayama, T. et al. 2014. Human SUGOROKU:
Learning Support System of Vegetation Succession
with Full-body Interaction Interface. Proceedings of
the SIGCHI Conference on Human Factors in
Computing Systems (CHI ’14) (2014), 2227–2232.
30. Novellis, F. and Moher, T. 2011. How Real is “Real
Enough”? Designing Artifacts and Procedures for
Embodied Simulations of Science Practices.
Proceedings of the 10th International Conference on
Interaction Design and Children (2011), 90–98.
31. O’Malley, C. and Fraser, S. 2004. Literature review in
learning with tangible technologies.
32. Oullier, O. et al. 2008. Social coordination dynamics:
measuring human bonding. Social neuroscience. 3, 2
33. Paul, F.C. et al. 2015. Get Creative With Learning:
Word Out! A Full Body Interactive Game. Proceedings
of the 33rd Annual ACM Conference Extended
Abstracts on Human Factors in Computing Systems -
CHI EA ’15 (2015), 81–84.
34. Plaisant, C. et al. 1995. Image-browser taxonomy and
guidelines for designers. IEEE Software. 12, 2 (1995),
35. Plass, J.L. et al. 2010. Cognitive Load Theory.
Cambridge University Press.
36. Plass, J.L. et al. 2013. The impact of individual,
competitive, and collaborative mathematics game play
on learning, performance, and motivation. Journal of
Educational Psychology. 105, 4 (2013), 1050–1066.
37. Pouw, W.T.J.L. et al. 2014. An Embedded and
Embodied Cognition Review of Instructional
Manipulatives. Educational Psychology Review. 26, 1
38. Price, B.A. et al. 1993. A Principled Taxonomy of
Software Visualization. Journal of Visual Languages &
Computing. 4, 3 (1993), 211–266.
39. Price, S. 2008. A representation approach to
conceptualizing tangible learning environments.
Proceedings of the 2nd international conference on
Tangible and embedded interaction TEI 08 (2008),
40. Price, S. et al. 2003. Using tangibles to promote novel
forms of playful learning. Interacting with Computers.
15, 2 (2003), 169–185.
41. Rieser, J.J. et al. 1994. Imagery, Action, and Young
Children’s Spatial Orientation: It's Not Being There
That Counts, It's What One Has in Mind. Child
Development. 65, 5 (1994), 1262.
42. Robinett, W. 1992. Synthetic Experience: A
Taxonomy, Survey of Earlier Thought, and
Speculations on the Future.
43. Rogers, Y. et al. 2002. A conceptual framework for
mixed reality environments: Designing novel learning
activities for young children. Presence. 11, 6 (2002),
44. Rohrer, T. 2007. The body in space: Dimensions of
embodiment. Body, language and mind. 339–378.
45. Schweikardt, E. and Gross, M. 2008. The robot is the
program: interacting with roBlocks. Proceedings of the
second international conference on Tangible,
embedded, and embodied interaction - TEI ’08 (2008),
46. Shapiro, L. 2010. Embodied Cognition. Routledge.
47. Wei, C. et al. 2015. Effects of Embodiment-Based
Learning on Perceived Cooperation Process and Social
Flow. 7th World Conference on Educational Sciences
48. Wyeth, P. 2008. How Young Children Learn to Program
With Sensor, Action, and Logic Blocks. Journal of the
Learning Sciences. 17, 4 (2008), 517–550.
49. Yannier, N. et al. 2013. Tangible collaborative learning
with a mixed-reality game: Earthshake. Lecture Notes
in Computer Science. 7926 LNAI, (2013), 131–140.
50. Zaman, B. et al. 2012. Editorial: The evolving field of
tangible interaction for children: The challenge of
empirical validation. Personal and Ubiquitous
Computing. 16, (2012), 367–378.
51. Ziemke, T. 2002. What’s that Thing Called
Embodiment? Proceedings of the 25th Annual meeting
of the Cognitive Science Society (2002), 1305–1310.