Bridging the Physical Divide: A Design
Framework for Embodied Learning
Games and Simulations
Abstract
Existing embodied learning games and simulations
utilize a large breadth of design approaches that often
result in the creation of seemingly unrelated systems.
This becomes problematic when trying to critically
evaluate the usage and effectiveness of embodiment
within embodied learning designs. In this paper, we
present our work on combining differing conceptual and
design approaches for embodied learning systems into
a unified design framework. We describe the creation
process for the framework, explain its dimensions, and
provide two examples of its use. Our embodied learning
games and simulations framework will benefit HCI
researchers by providing a unifying foundation for the
description, categorization, and evaluation of embodied
learning systems and designs.
Author Keywords
Embodiment; embodied learning; design framework.
ACM Classification Keywords
H.5.m. Information interfaces and presentation (e.g.,
HCI): Miscellaneous.
Edward F Melcer
New York University
Brooklyn, NY 11201, USA
eddie.melcer@nyu.edu

Katherine Isbister
University of California, Santa Cruz
Santa Cruz, CA 95064, USA
katherine.isbister@ucsc.edu

Permission to make digital or hard copies of part or all of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for third-party components of this work must be honored. For all other uses, contact the Owner/Author. Copyright is held by the owner/author(s).
CHI'16 Extended Abstracts, May 07-12, 2016, San Jose, CA, USA
ACM 978-1-4503-4082-3/16/05.
http://dx.doi.org/10.1145/2851581.2892455

Introduction
Recent work on educational systems has shown the
benefits of incorporating physicality, motion, and
embodiment into designs. This direction stems from the
concept that cognition does not only occur in the mind
but is also supported by bodily activity, situated in and
interacting with our physical and social environment [7,
8]. Embodied learning approaches suggest many
potential benefits such as improved spatial recall [41],
intuitive interactions and mappings [48], increased
engagement and positive feelings [47, 49], and
enhanced collaboration [40, 49].
However, when examining existing embodied learning
systems closely, we find a large breadth of designs that
result in seemingly unrelated systems (see Figure 1).
This becomes problematic when trying to understand
where and how embodiment occurs in these systems,
and which design elements help to facilitate embodied
learning. The problem is further aggravated by limited
empirical validation of many systems [50], and a broad
conceptual usage of embodiment and related terms in a
diverse variety of domains such as HCI, learning
science, neuroscience, and philosophy [4, 44, 51].
For designers seeking to utilize embodiment, the
differences in approach to physicality, motion, and
interaction pose a significant hurdle. One solution that
can bridge conceptual differences between existing
systems and research domains is the creation of a
design framework [10, 42].
Background
Our goal in providing an embodied learning design
framework is to bridge conceptual gaps and resulting
design choices made from the differing uses of
embodiment in various domains. In this section we
present an overview of design frameworks,
embodiment, and embodied learning taxonomies.
Design Frameworks
Design frameworks can help designers conceptualize
nuances of particular technologies and formalize the
creative process [10]. In interface design, design
frameworks have been used to provide terminology to
categorize ideas [38] as well as organize complex
concepts into logical hierarchies [34]. By treating a set
of taxonomical terms as orthogonal dimensions in a
design space, the resulting matrix provides structure
for classification and comparison of designs [42]. The
resulting design framework provides a means to
critically examine designs of existing systems and
encourage new designs by providing a unifying
foundation for the description and categorization of
systems. Furthermore, the methodical filling-in of this
structure helps to categorize existing concepts,
differentiate ideas, and identify unexplored terrain [10].
Embodiment and Embodied Learning Taxonomies
Embodiment and related terms such as embodied
cognition and embodied interaction have many different
interpretations and applications across a wide range of
academic domains. HCI tends to view embodiment as
the way that physical and social phenomena unfold in
real time and space as a part of the world in which we
are situated [8]. However, learning science views tend
to be more oriented toward the body as the central
focus of embodiment [19, 44]. Furthermore, Ziemke has
noted this divide in his work identifying six different
uses of embodiment across research domains [51].
Similar to the many interpretations of embodiment,
embodied learning frameworks and taxonomies also
have vastly different interpretations of physicality,
motion, and interaction. Johnson-Glenberg et al. created
an embodied learning taxonomy that specifies the
strength of embodiment as a combination of the
amount of motoric engagement, gestural congruency to
learning content, and immersion [19].

Figure 1: A spectrum of different embodied learning systems. Top to bottom - Interactive Slide [27], Electronic Blocks [48], Embodied Poetry [20], SpatialEase [9], Eco Planner [11].

Black et al.
created the Instructional Embodiment Framework (IEF)
which consists of various forms of physical embodiment
(e.g., direct, surrogate, and augmented) as well as
imagined embodiment (e.g., explicit and implicit) where
the individual can embody action and perception
through imagination [5]. In the Tangible Embodied
Interaction (TEI) field, Fishkin's taxonomy for the
analysis of tangible interfaces views embodiment as the
distance between input and output where embodiment
can be full, nearby, environmental, or distant [14].
Towards A Design Framework for Embodied
Learning Games and Simulations
Creating the Design Framework
To create our design framework, we conducted an
extensive literature review for published examples of
embodied learning games and simulations in venues
such as CHI, TEI, and Interaction Design and Children
(IDC). We also examined related frameworks and
taxonomies in subdomains and communities such as
TEI [14, 31, 39], embodiment and embodied learning
[5, 19], and mixed reality [10, 43]. Our final list
contains a total of 42 papers describing designs for
different embodied learning games and simulations. We
then performed bottom-up open coding similar to [10]
in order to distill a set of 25 candidate dimensions that
fit concepts found in the reviewed literature. Next, we
iteratively reduced and combined candidate dimensions
into a set small enough for a concise framework. We
then presented the framework to experts in HCI, game
design, and learning science for feedback and additional
refinement. The final design framework consists of 7
dimensions, shown in Table 1.
We further organized the dimensions into three groups
based on their overarching themes.
Group                  Dimension        Values
Physical Interaction   Physicality      Embodied, Enacted, Manipulated, Surrogate, Augmented
                       Transforms       PPt, PDt, DPt
                       Mapping          Discrete, Co-located, Embedded
                       Correspondence   Symbolic, Literal
Social Interaction     Mode of Play     Individual, Collaborative, Competitive
                       Coordination     Other Player(s), NPC(s), None
World                  Environment      Physical, Mixed, Virtual

Table 1: Our design framework for embodied learning systems. Similar dimensions are clustered under a group based on an overarching theme, and the different values for each dimension are shown.
Design Space Dimensions
Physicality describes how learning is physically
embodied in a system and consists of five distinct
values. 1) The embodied value refers to an embodied
cognition and learning science approach where the
body plays the primary constituent role in cognition
[46]. This form of embodiment focuses on gestural
congruency and how the body can physically represent
learning concepts [19]. For instance, a full body
interaction game where players contort their bodies to
match letters shown on a screen [33]. 2) The enacted
value refers to Direct Embodiment from the IEF [5],
and to enactivism, which focuses on knowing as
physically doing [16, 24]. This form of embodiment
focuses more on acting/enacting out knowledge
through physical action of statements or sequences. For
example, a gravitational physics game where players
walk along the trajectory an asteroid would travel in
the vicinity of planets and their gravitational forces
[25]. 3) The manipulated value refers to the tangible
embodied interactions of TEI [28] and the use of
manipulatives in learning science [37]. This form of
embodiment arises from embodied metaphors and
interactions with physical objects [1], and the objects'
physical embodiment of learning concepts [18]. 4) The
surrogate value refers to the IEF concept of Surrogate
Embodiment, where learners manipulate a physical
agent or "surrogate" representative of themselves to
enact learning concepts [5]. This form of embodiment
is often used in systems with an interactive physical
environment that is directly tied to a real-time virtual
simulation [15, 21]. 5) The augmented value refers to
the IEF notion of Augmented Embodiment, where
combined use of a representational system (e.g.,
avatar) and augmented feedback system (e.g.,
Microsoft Kinect and TV screen) embeds the learner
within an augmented reality system [5]. This form of
embodiment is most commonly found in systems where
learners physically control digital avatars in augmented
environments [22, 29].
Transforms conceptualize a space, describing the
relationships between physical or digital actions and
physical or digital effects [43]. We utilize the transform
types of Physical action => Physical effect (PPt),
Physical action => Digital effect (PDt), and Digital
action => Physical effect (DPt) from Rogers et al [43]
to describe the many forms of existing systems.
Mapping borrows the notion of Location from Price's
tangible learning framework [39], which describes the
different spatial locations of output in relation to the
object or action triggering the effect (i.e., how input is
spatially mapped to output). Mappings can be
discrete: input and output are located separately (e.g.,
an action triggers output on a nearby screen); co-located:
input and output are contiguous (e.g., an
action triggers output that is directly adjacent or
overlaid); or embedded: input and output are
embedded in the same object.
Correspondence builds upon the notion of Physical
Correspondence from Price's tangible learning
framework [39] which refers to the degree to which the
physical properties of objects are closely mapped to the
learning concepts. We expand this dimension to include
physical actions as well as objects. Correspondence can
be symbolic: objects and actions act as common
signifiers for the learning concepts; or literal: physical
properties and actions are closely mapped to the
learning concepts and metaphor of the domain.
Mode of Play specifies how individuals socially interact
and play within a system. The system can facilitate
individual, collaborative, or competitive play for the
learner(s). Plass et al [36] found different learning
benefits for each mode of play, suggesting it as an
important dimension for considering learning outcomes.
Coordination highlights how individuals in a system
may have to socially coordinate their actions [32] in
order to successfully complete learning objectives.
Social coordination can occur with other players or
NPCs, or may not occur at all (none).
Environment refers to the learning environment in
which the educational content is situated. Environments
can be physical, mixed, or virtual. While
transforms conceptualize a space through description of
actions and effects, the environment focuses on the
actual space where learning occurs. For instance, a PDt
transform can occur in drastically different learning
environments (see Figure 2). In some systems, a
player's physical actions are tracked but only used as
input to control a virtual character in a virtual
environment [26]. In other systems, the player's
physical actions are tracked and mapped to control
digital effects overlaid on an augmented physical space
or mixed reality environment [25].
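To make the dimension structure concrete, the seven dimensions and their values from Table 1 can be encoded as a small data model and used to classify a system. The sketch below is our own illustrative Python encoding, not part of the framework itself; the class and member names are our choices, and the full classification of Tangible Earth [21] beyond its surrogate embodiment, PDt transform, and mixed environment (all discussed later in Example 1) is an assumption for demonstration purposes.

```python
from dataclasses import dataclass
from enum import Enum

# One enum per framework dimension; values are taken from Table 1.
class Physicality(Enum):
    EMBODIED = "embodied"
    ENACTED = "enacted"
    MANIPULATED = "manipulated"
    SURROGATE = "surrogate"
    AUGMENTED = "augmented"

class Transform(Enum):
    PPT = "physical action => physical effect"
    PDT = "physical action => digital effect"
    DPT = "digital action => physical effect"

class Mapping(Enum):
    DISCRETE = "discrete"
    CO_LOCATED = "co-located"
    EMBEDDED = "embedded"

class Correspondence(Enum):
    SYMBOLIC = "symbolic"
    LITERAL = "literal"

class ModeOfPlay(Enum):
    INDIVIDUAL = "individual"
    COLLABORATIVE = "collaborative"
    COMPETITIVE = "competitive"

class Coordination(Enum):
    OTHER_PLAYERS = "other player(s)"
    NPCS = "npc(s)"
    NONE = "none"

class Environment(Enum):
    PHYSICAL = "physical"
    MIXED = "mixed"
    VIRTUAL = "virtual"

@dataclass
class EmbodiedLearningSystem:
    """A system described as one value per framework dimension."""
    name: str
    physicality: Physicality
    transform: Transform
    mapping: Mapping
    correspondence: Correspondence
    mode_of_play: ModeOfPlay
    coordination: Coordination
    environment: Environment

# Tangible Earth [21]: a surrogate avatar whose physical manipulation
# drives a digital simulation (PDt) on a separate laptop screen, in a
# mixed environment. The mapping, correspondence, mode-of-play, and
# coordination values below are our assumed readings of the system.
tangible_earth = EmbodiedLearningSystem(
    name="Tangible Earth",
    physicality=Physicality.SURROGATE,
    transform=Transform.PDT,
    mapping=Mapping.DISCRETE,
    correspondence=Correspondence.LITERAL,
    mode_of_play=ModeOfPlay.INDIVIDUAL,
    coordination=Coordination.NONE,
    environment=Environment.MIXED,
)
```

Encoding systems this way makes comparisons mechanical: two systems can be diffed dimension by dimension, and a corpus of classified systems can be queried for any combination of values.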
Applying the Design Framework for
Embodied Learning Games and Simulations
Example 1 - Identifying Problematic Design Spaces
One benefit of our design framework is that it allows us
to systematically examine design elements of existing
systems, identifying potentially problematic design
spaces. As an example of this usage, we examine the
Tangible Earth system (see Figure 3) where the authors
had to create an assessment framework to identify
problems the system encountered [21]. Tangible Earth
is designed to support learning of the sun's diurnal
motion and earth's rotation. It consists of a doll-like
avatar, a globe and rotating table to represent the
earth and its rotation, an electric light representing
the sun, and a laptop running a VR universe simulator.
Learners physically manipulate the rotation of
the earth and the position/rotation of the avatar to observe
simulated changes in the sun's position from the avatar's
perspective.
Figure 4: Problematic design space identified through analysis of Tangible Earth.
One of the more significant problems identified by [21]
for Tangible Earth was that learners spent very little
time looking at the tangibles themselves (e.g., globe,
lamp, and avatar), instead focusing primarily on the VR
simulation in the laptop. This proved to be especially
problematic for manipulation of the avatar, where users
would frequently forget the orientation of its body or
head. This often caused the sun to appear or disappear
unexpectedly in the simulation, confusing learners and
obscuring the learning concepts. By analyzing this issue with our
design framework, we identified a potentially problematic
design space (see Figure 4). Learners had difficulty
remembering the position of a physical agent
representative of themselves (surrogate embodiment)
because all of their physical actions were mapped to
digital effects (PDt) in a simulated virtual world. This
difficulty makes sense, considering that remembering the
physical position/orientation of a surrogate avatar in
both the real world and the virtual world would
introduce a significant amount of extraneous cognitive
load [35]. As a result, our design framework suggests
that the intersection of surrogate embodiment and PDt
transforms is a problematic design space that should be
carefully considered when designing future embodied
learning systems.

Figure 2: Two systems illustrating PDt transforms in different learning environments. Top - physical actions are mapped as input into a virtual environment [26]. Bottom - physical actions are mapped as input into a mixed reality environment overlaid on physical space [25].

Figure 3: Tangible Earth embodied learning system [21].
Example 2 - Identifying Design Gaps
Another benefit of our design framework is that it
allows us to methodically fill in the framework with
existing systems to identify gaps and unexplored
terrain [10]. As an illustration of this usage, we fill in
example pairings between the two dimensions of
physicality and transforms. This provides examples of
relevant combinations between these two dimensions in
the embodied learning systems literature (see Table 2).
PPt
  Enacted: Scratch Direct Embodiment [12]
  Manipulated: Electronic Blocks [48], roBlocks [45]
PDt
  Embodied: SpatialEase [9], Word Out! [33], Mathematical Imagery Trainer [17]
  Enacted: Embodied Poetry [20], AquaRoom [30], MEteor [25]
  Manipulated: Eco Planner [11], Mapping Place [6], MoSo Tangibles [2]
  Surrogate: Hunger Games [15], Tangible Earth [21], Tangible Programming Space [13]
  Augmented: Human SUGOROKU [29], Bump Bash [3], Sorter Game [22]
DPt
  ALERT [23]

Table 2: Example pairings between the physicality and transform categories of Physical Interaction. Each transform value labels a row; entries are grouped by physicality value, and pairings without entries are design gaps.
Examining Table 2 we find several design gaps for
existing embodied learning games and simulations.
Some of the more potentially useful pairings among the
identified design gaps are Embodied + PPt, Manipulated
+ DPt, Surrogate + PPt, and Surrogate + DPt, where
interesting future system designs could evolve from
utilizing one of these pairings. For instance, using a
Surrogate + PPt pairing could lead to the design of
physical embodied learning board games. Additionally,
a Surrogate + DPt pairing could lead to an asymmetric
computational thinking game where one player controls
and interacts with a physical avatar while another
player digitally designs the physical courses and
obstacles for the first player to complete.
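The methodical filling-in of the framework can also be automated: encode the observed pairings from Table 2 and take the complement of the full cross product of dimension values. The sketch below is a minimal illustration under our own encoding (the variable names and dictionary layout are our choices, not the paper's). ALERT [23] is omitted from the observed map because its physicality cell is ambiguous in our source, so the computed gaps slightly over-approximate the true set for the DPt row.

```python
from itertools import product

# Dimension values from Table 1 (Physical Interaction group).
physicality = ["embodied", "enacted", "manipulated", "surrogate", "augmented"]
transforms = ["PPt", "PDt", "DPt"]

# (physicality, transform) pairings with published examples, per Table 2.
observed = {
    ("enacted", "PPt"): ["Scratch Direct Embodiment [12]"],
    ("manipulated", "PPt"): ["Electronic Blocks [48]", "roBlocks [45]"],
    ("embodied", "PDt"): ["SpatialEase [9]", "Word Out! [33]",
                          "Mathematical Imagery Trainer [17]"],
    ("enacted", "PDt"): ["Embodied Poetry [20]", "AquaRoom [30]", "MEteor [25]"],
    ("manipulated", "PDt"): ["Eco Planner [11]", "Mapping Place [6]",
                             "MoSo Tangibles [2]"],
    ("surrogate", "PDt"): ["Hunger Games [15]", "Tangible Earth [21]",
                           "Tangible Programming Space [13]"],
    ("augmented", "PDt"): ["Human SUGOROKU [29]", "Bump Bash [3]",
                           "Sorter Game [22]"],
    # ALERT [23] occupies the DPt row but is omitted here (cell ambiguous).
}

# A design gap is any dimension pairing with no published example.
gaps = [pair for pair in product(physicality, transforms)
        if pair not in observed]
```

Running the comprehension over all 15 pairings flags the gaps discussed above, including Embodied + PPt, Surrogate + PPt, Manipulated + DPt, and Surrogate + DPt; the same pattern extends directly to any other pair of framework dimensions.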
Conclusion and Future Work
In this paper, we presented our design framework for
embodied learning games and simulations. A design
framework allows us to systematically understand,
analyze, and differentiate design elements of existing
embodied learning systems. This ultimately aids us in
determining where and how embodiment occurs in an
educational system, and guides the application of
specific design choices in future work. Future work will
build systems addressing design gaps found by this
framework, and test the efficacy and learning outcomes
of these systems. In broader application, this design
framework can also be used to guide construction of
systems that methodically examine questions of when
and how embodied learning should be used within
games/simulations, which will help to further ground
the framework and clarify its interpretation.
References
1. Bakker, S. et al. 2012. Embodied metaphors in tangible interaction design. Personal and Ubiquitous Computing (2012).
2. Bakker, S. et al. 2011. MoSo Tangibles: Evaluating Embodied Learning. Proceedings of the fifth international conference on Tangible, embedded, and embodied interaction - TEI '11 (2011), 85-92.
3. Bartoli, L. et al. 2013. Exploring Motion-based Touchless Games for Autistic Children's Learning. Proceedings of the 12th International Conference on Interaction Design and Children (2013), 102-111.
4. Birchfield, D. et al. 2008. Embodiment, Multimodality, and Composition: Convergent Themes across HCI and Education for Mixed-Reality Learning Environments. Advances in Human-Computer Interaction. 2008, (2008), 1-19.
5. Black, J.B. et al. 2012. Embodied cognition and learning environment design. Theoretical foundations of learning environments. 198-223.
6. Chu, J.H. et al. 2015. Mapping Place: Supporting Cultural Learning through a Lukasa-inspired Tangible Tabletop Museum Exhibit. Proceedings of the 9th international conference on Tangible, embedded, and embodied interaction - TEI '15 (2015), 261-268.
7. Clark, A. 2008. Supersizing the Mind: Embodiment, Action, and Cognitive Extension.
8. Dourish, P. 2001. Where the Action Is: The Foundations of Embodied Interaction.
9. Edge, D. et al. 2013. SpatialEase: Learning Language through Body Motion. Proceedings of the SIGCHI Conference on Human Factors in Computing Systems (CHI '13) (2013), 469-472.
10. Ens, B. and Hincapié-Ramos, J.D. 2014. Ethereal Planes: A Design Framework for 2D Information Spaces in 3D Mixed Reality Environments. Proceedings of the 2nd ACM symposium on Spatial user interaction (2014).
11. Esteves, A. and Oakley, I. 2011. Design for interface consistency or embodied facilitation? CHI 2011 Embodied Interaction: Theory and Practice in HCI Workshop (2011), 1-4.
12. Fadjo, C.L. and Black, J.B. 2012. You're In the Game: Direct Embodiment and Computational Artifact Construction. Proceedings of the International Conference of the Learning Sciences: Future of Learning (Vol. 2: Symposia) (2012), 99-109.
13. Fernaeus, Y. and Tholander, J. 2006. Finding Design Qualities in a Tangible programming space. Proceedings of CHI '06 (2006), 447-456.
14. Fishkin, K.P. 2004. A taxonomy for and analysis of tangible interfaces. Personal and Ubiquitous Computing. 8, 5 (2004), 347-358.
15. Gnoli, A. et al. 2014. Back to the future: Embodied Classroom Simulations of Animal Foraging. Proceedings of the 8th International Conference on Tangible, Embedded and Embodied Interaction - TEI '14 (2014), 275-282.
16. Holton, D.L. 2010. Constructivism + embodied cognition = enactivism: theoretical and practical implications for conceptual change. AERA 2010 Conference (2010).
17. Howison, M. et al. 2011. The Mathematical Imagery Trainer: From Embodied Interaction to Conceptual Learning. Proceedings of the 2011 annual conference on Human factors in computing systems - CHI '11 (2011).
18. Ishii, H. 2008. Tangible bits: beyond pixels. Proceedings of the 2nd international conference on Tangible and Embedded Interaction (TEI '08) (2008).
19. Johnson-Glenberg, M.C. et al. 2014. Collaborative embodied learning in mixed reality motion-capture environments: Two science studies. Journal of Educational Psychology. 106, 1 (2014), 86-104.
20. Kelliher, A. et al. 2009. SMALLab: A mixed-reality environment for embodied and mediated learning. MM '09 - Proceedings of the 2009 ACM Multimedia Conference, with Co-located Workshops and Symposiums (2009), 1029-1031.
21. Kuzuoka, H. et al. 2014. Tangible Earth: Tangible Learning Environment for Astronomy Education. Proceedings of the second international conference on Human-agent interaction (2014), 23-27.
22. Kynigos, C. et al. 2010. Exploring rules and underlying concepts while engaged with collaborative full-body games. Proceedings of the 9th International Conference on Interaction Design and Children (2010), 222.
23. Lahey, B. et al. 2008. Integrating video games and robotic play in physical environments. Proceedings of the 2008 ACM SIGGRAPH symposium on Video games - Sandbox '08 (2008), 107.
24. Li, Q. 2012. Understanding enactivism: a study of affordances and constraints of engaging practicing teachers as digital game designers. Educational Technology Research and Development. 60, 5 (2012), 785-806.
25. Lindgren, R. et al. 2013. MEteor: Developing Physics Concepts Through Body-Based Interaction With A Mixed Reality Simulation. Physics Education Research Conference - PERC '13 (2013), 217-220.
26. Lyons, L. et al. 2013. Feel the burn: Exploring Design Parameters for Effortful Interaction for Educational Games. Proceedings of the 12th International Conference on Interaction Design and Children - IDC '13 (2013), 400-403.
27. Malinverni, L. et al. 2012. Impact of Embodied Interaction on Learning Processes: Design and Analysis of an Educational Application Based on Physical Activity. Proceedings of the 11th International Conference on Interaction Design and Children (2012), 60-69.
28. Marshall, P. et al. 2003. Conceptualising tangibles to support learning. IDC '03: Proceedings of the 2003 conference on Interaction design and children (2003), 101-109.
29. Nakayama, T. et al. 2014. Human SUGOROKU: Learning Support System of Vegetation Succession with Full-body Interaction Interface. Proceedings of the SIGCHI Conference on Human Factors in Computing Systems (CHI '14) (2014), 2227-2232.
30. Novellis, F. and Moher, T. 2011. How Real is "Real Enough"? Designing Artifacts and Procedures for Embodied Simulations of Science Practices. Proceedings of the 10th International Conference on Interaction Design and Children (2011), 90-98.
31. O'Malley, C. and Fraser, S. 2004. Literature review in learning with tangible technologies.
32. Oullier, O. et al. 2008. Social coordination dynamics: measuring human bonding. Social Neuroscience. 3, 2 (2008), 178-192.
33. Paul, F.C. et al. 2015. Get Creative With Learning: Word Out! A Full Body Interactive Game. Proceedings of the 33rd Annual ACM Conference Extended Abstracts on Human Factors in Computing Systems - CHI EA '15 (2015), 81-84.
34. Plaisant, C. et al. 1995. Image-browser taxonomy and guidelines for designers. IEEE Software. 12, 2 (1995), 21-32.
35. Plass, J.L. et al. 2010. Cognitive Load Theory. Cambridge University Press.
36. Plass, J.L. et al. 2013. The impact of individual, competitive, and collaborative mathematics game play on learning, performance, and motivation. Journal of Educational Psychology. 105, 4 (2013), 1050-1066.
37. Pouw, W.T.J.L. et al. 2014. An Embedded and Embodied Cognition Review of Instructional Manipulatives. Educational Psychology Review. 26, 1 (2014), 51-72.
38. Price, B.A. et al. 1993. A Principled Taxonomy of Software Visualization. Journal of Visual Languages & Computing. 4, 3 (1993), 211-266.
39. Price, S. 2008. A representation approach to conceptualizing tangible learning environments. Proceedings of the 2nd international conference on Tangible and embedded interaction - TEI '08 (2008), 151.
40. Price, S. et al. 2003. Using tangibles to promote novel forms of playful learning. Interacting with Computers. 15, 2 (2003), 169-185.
41. Rieser, J.J. et al. 1994. Imagery, Action, and Young Children's Spatial Orientation: It's Not Being There That Counts, It's What One Has in Mind. Child Development. 65, 5 (1994), 1262.
42. Robinett, W. 1992. Synthetic Experience: A Taxonomy, Survey of Earlier Thought, and Speculations on the Future.
43. Rogers, Y. et al. 2002. A conceptual framework for mixed reality environments: Designing novel learning activities for young children. Presence. 11, 6 (2002), 677-686.
44. Rohrer, T. 2007. The body in space: Dimensions of embodiment. Body, Language and Mind. 339-378.
45. Schweikardt, E. and Gross, M. 2008. The robot is the program: interacting with roBlocks. Proceedings of the second international conference on Tangible, embedded, and embodied interaction - TEI '08 (2008), 167-168.
46. Shapiro, L. 2010. Embodied Cognition. Routledge.
47. Wei, C. et al. 2015. Effects of Embodiment-Based Learning on Perceived Cooperation Process and Social Flow. 7th World Conference on Educational Sciences (2015), 608-613.
48. Wyeth, P. 2008. How Young Children Learn to Program With Sensor, Action, and Logic Blocks. Journal of the Learning Sciences. 17, 4 (2008), 517-550.
49. Yannier, N. et al. 2013. Tangible collaborative learning with a mixed-reality game: Earthshake. Lecture Notes in Computer Science. 7926 LNAI, (2013), 131-140.
50. Zaman, B. et al. 2012. Editorial: The evolving field of tangible interaction for children: The challenge of empirical validation. Personal and Ubiquitous Computing. 16, (2012), 367-378.
51. Ziemke, T. 2002. What's that Thing Called Embodiment? Proceedings of the 25th Annual Meeting of the Cognitive Science Society (2002), 1305-1310.
... The integration of users' bodies and their physical context has been conceptualised during interaction with digital content. Towards that end, different physicalities of embodiment have been described: body-centred, object-centred, and environment-centred embodiment [90,91]. We offer examples of embodied DTs and FTs following this categorisation in Figure 6. ...
... We offer examples of embodied DTs and FTs following this categorisation in Figure 6. These categories themselves include direct-embodied, enacted, manipulated, surrogate, and augmented approaches [90,91]. The direct-embodied approach considers the users' bodies as primary constituents of cognition and therefore the body state is used to convey information. ...
... We observe the difference between non-situated AR (top row) and situated AR (bottom row) for anchoring content at the body of the user, an object, or the immediate environment. The distinction between body-, object-, and environment-centred AR is inspired by the classification of different physicalities of embodiment offered by [90,91]. With a body anchor, the DT is anchored on the users' bodies, similar to the directly embodied approach. ...
Article
Full-text available
Smart Cities already surround us, and yet they are still incomprehensibly far from directly impacting everyday life. While current Smart Cities are often inaccessible, the experience of everyday citizens may be enhanced with a combination of the emerging technologies Digital Twins (DTs) and Situated Analytics. DTs represent their Physical Twin (PT) in the real world via models, simulations, (remotely) sensed data, context awareness, and interactions. However, interaction requires appropriate interfaces to address the complexity of the city. Ultimately, leveraging the potential of Smart Cities requires going beyond assembling the DT to be comprehensive and accessible. Situated Analytics allows for the anchoring of city information in its spatial context. We advance the concept of embedding the DT into the PT through Situated Analytics to form Fused Twins (FTs). This fusion allows access to data in the location that it is generated in in an embodied context that can make the data more understandable. Prototypes of FTs are rapidly emerging from different domains, but Smart Cities represent the context with the most potential for FTs in the future. This paper reviews DTs, Situated Analytics, and Smart Cities as the foundations of FTs. Regarding DTs, we define five components (physical, data, analytical, virtual, and Connection Environments) that we relate to several cognates (i.e., similar but different terms) from existing literature. Regarding Situated Analytics, we review the effects of user embodiment on cognition and cognitive load. Finally, we classify existing partial examples of FTs from the literature and address their construction from Augmented Reality, Geographic Information Systems, Building/City Information Models, and DTs and provide an overview of future directions.
... In turn, Melcer and Ibister introduced a theoretical framework of embodied cognition [44,48]. The "physicality" dimension of the framework describes five main types of embodiment: directembodied (body-centered), enacted (body-in-action), manipulated (object-centered), surrogate (object-in-action), and augmented (environment-centered). ...
... Fourth, we highlighted different types of embodiment, in particular direct-embodied and enacted [44,48]. While direct-embodied interaction focuses on the body as "the primary constituent of cognition", an enacted approach emphasizes "learning by physically doing". ...
Conference Paper
Full-text available
Grasping mathematics can be difficult. Often, students struggle to connect mathematical concepts with their own experiences and even believe that math has nothing to do with the real world. To create more concreteness in mathematics education, we focus on the role of the body in learning, and more specifically, embodied interactions for learning derivatives. In this project, we designed an embodied game to teach derivatives, and validated our design with a panel of experts. We then used this prototype to explore different embodied interactions in terms of usability, sense of embodiment, and learning outcomes. In particular, we evaluated different degrees of embodied interactions, and different types of embodied interactions in Virtual Reality. We conclude with insights and recommendations for mathematics education with embodied interactions.
... In this section, we compare design choices of the identified related work using a taxonomical design framework developed by Melcer and Isbister [24,25] that outlines key methods for incorporating embodiment into the design of embodied learning systems. A design framework is an important HCI tool that provides a common language for designers and researchers to conceptualize variants of particular technologies and formalize the creative process [26]. ...
... Transforming physical actions into physical effects in the environment enables a concrete (and embodied) experience with elements of the knowledge domain, i.e., the actions happen in the physical environment (not exclusively in a digital environment). In our exploratory literature review, we identified two related works with the PPt transform dimension (physical action to physical effect) using the design framework of Melcer and Isbister [24]. Unlike TangiTime, they do not incorporate manipulation outside the tabletop display or communication between objects, as TangiTime makes possible. ...
Article
Full-text available
Contemporary computational technologies (tangible and ubiquitous) are still challenging the mainstream systems design methods, demanding new ways of considering interaction design and its evaluation. In this work, we draw on concepts of enactivism and enactive systems to investigate interaction and experience in the context of the ubiquity of computational systems. Our study is illustrated with the design and usage experience of TangiTime: a tangible tabletop system proposed for an educational exhibit. TangiTime was designed to enable a socioenactive experience of interaction with the concept of “deep time.” In this paper, we present the TangiTime design process and the artifacts designed and implemented, in their conceptual, interactional, and architectural aspects. In addition, we present and discuss results of an exploratory study within an exhibition context, observing how socioenactive aspects of the experience potentially emerge from the interaction. Overall, the paper contributes elements of design that should be considered when designing a socioenactive experience in environments constituted by contemporary computational technology.
... Looking deeper into the theoretical concepts underlying this black box reveals a commonly used set of terms: embodiment, embodied cognition, and embodied interaction, to name a few. Unfortunately, embodiment and related terms are remarkably broad and fuzzy constructs, with definitions and operationalizations that differ drastically depending upon the academic domain in which they are utilized [92,94]. This large breadth of interpretation for "embodiment" is a likely cause of the seemingly ambiguous design choices underlying many educational games and simulations employing physical interactions. ...
... Furthermore, the methodical filling-in of this structure helps to categorize existing concepts, identify problematic design spaces, differentiate ideas, and identify unexplored terrain [38]. Some examples of taxonomic design frameworks include [38,46,92,94]. However, one notable drawback of taxonomical design frameworks is that they do not elucidate the relations between concepts as other types of frameworks can. ...
Chapter
Full-text available
In recent years, embodiment—the notion that cognition arises not just from the mind, but also through our bodies’ physical and social interactions with the world around us—has become an important concept to enhance the design of educational games and simulations within communities such as Human–Computer Interaction (HCI), games, and learning science. However, the broad number of academic domains that define and operationalize embodiment differently has often led researchers and designers to employ a black box of design decisions—resulting in a substantial breadth of seemingly unrelated educational systems. This inconsistent application of embodiment is a substantial drawback when trying to design embodied technology to support particular use cases such as learning, and ultimately becomes problematic when trying to critically evaluate the usage and effectiveness of embodiment within existing educational designs. In this chapter, we present our work toward opening the black box of embodied design decisions by combining differing conceptual and design approaches for embodied learning games and simulations into a unified taxonomical design framework. We describe the motivation and creation process for the framework, explain its dimensions, and provide examples of its use. Ultimately, such a framework helps to explicate high-level design decisions by providing a unifying foundation for the description and categorization of embodied learning systems, and encourages new designs by identifying problematic design spaces and unexplored research/design terrain. Our design framework will benefit educational game designers and researchers by mapping out a more precise understanding of how to incorporate embodiment into the design of educational games and simulations.
... Tangibles are a powerful tool for engaging users [12], providing customized physical objects and interactions that deliver a more intuitive, embodied experience [3,5,13]. Notably, within the context of games, the embodied interactions that tangibles afford can even be applied to enhance player experience and learning outcomes [18,19,22]. However, we posit that the real-time capabilities of fabrication technologies used to create tangibles (such as 3D printers) have not been fully explored within the context of games and learning. ...
... As a result, their benefits to learning have been hypothesized and studied broadly [16,22]. For instance, incorporating tangibles and principles of embodiment into learning activities has been shown to elicit boosts to engagement, spatial recall and mental manipulation, intuition for physical interactions and mappings, and positive feelings towards learning science content [18]. There have been a number of embodied educational games that utilize tangibles to teach a variety of topics such as programming [10,19,33], music [4,5], reading and spelling [9], anatomy [25], and animal foraging behavior [11]. ...
Conference Paper
Full-text available
3D printers are becoming increasingly accessible to the average consumer; however, their potential utility within games has yet to be fully explored. Integrating 3D printer fabrication technology within game design presents a novel means for engaging players and providing them with tangible representations of gameplay elements. This in turn could be employed to increase embodied gameplay and even embodied learning for the player. In this paper, we present a novel "fabrication game" designed to teach basic evolutionary concepts. In the game, players take turns physically assembling components 3D printed in real-time to iteratively evolve their creatures and observe the impact of their evolutionary choices on a digital population simulation. We discuss the potential of this game's unique design in leveraging real-time fabrication of tangibles to enhance a player's understanding of principles of evolution and natural selection.
... With respect to academic explorations, this more detailed analysis has been done in a number of past works categorizing and defining games and their genres as a whole [3,27,91]. These game genres have included idle games [5], educational games [61,63], serious games [56], games for rehabilitation [74], games for dementia [19,59], exertion games [65], affective games [52], and games and simulations [48]. ...
Article
Full-text available
Visual Novel (VN) is a widely recognizable genre of narrative-focused games that has grown in popularity over the past decade. Surprisingly, despite being so widely recognizable, there is not a singular definition to help guide the design and analysis of such games---with academic definitions and implementations ranging from "interactive textbooks" to "adventure games with multi-ending stories". In this paper, we present a unified definition of VNs drawn from an analysis of 30 prior academic definitions. We also examined 54 existing VNs to further refine our definition and employ a deeper analysis of the interactivity within VNs. We highlight key features of VNs that arise in our definition, and discuss resulting implications for the design of VNs. This work is relevant for narrative game designers and researchers, affording a more unified structure and clearer guidelines to identify, analyze, and design future VN games.
... Within HCI, there has been an assortment of definitions for the idea of 'embodiment'. This is in part because the term is conceptualized differently across a number of fields from which HCI research heavily draws, including cognitive science, cognitive linguistics, and artificial intelligence [Melcer & Isbister, 2016]. In defining embodied interaction, this thesis looks to the work of Dourish [2004]. ...
Conference Paper
Interest in teaching children about computing is increasing apace, as evidenced by the recent redesign of the English computing curriculum, as well as the variety of new tools for learning about computing by making, tinkering and coding. The rapid emergence of the Internet of Things (IoT), through which billions of everyday objects are becoming embedded with the abilities to sense their environment, compute data, and wirelessly connect to other devices, introduces new topics to the scope of computing education. However, what these IoT topics are and how they can be taught to children is still ill defined. Simultaneously, new handheld and tangible physical computing toolkits offer much promise for promoting collaborative, discovery-based learning within classroom settings. These toolkits provide new opportunities for learning about electronics and IoT, by enabling children to connect the digital with the physical. This thesis investigates how IoT topics can be introduced to primary and secondary classrooms through discovery-based learning together with a physical computing toolkit. Specifically, this research addresses three core questions. First, what IoT concepts and topics are appropriate for children to learn about? Second, how can discovery-based learning be designed to facilitate IoT learning for beginners? Third, how can learning about IoT be made accessible and inclusive? This thesis describes the design and evaluation of novel learning approaches for teaching children about introductory IoT topics, especially understanding sensors, actuators and data, as well as critical thinking about their limitations and implications. The contribution is to provide a detailed, descriptive account of how children can first learn these topics in classroom settings through discovery-based activities, as well as of how discovery-based activities together with new types of tangible, physical computing interfaces can contribute to engagement, curiosity and collaborative interaction in computing classrooms and beyond.
Article
Materialist design is presented as an embodied perspective on educational design that can be applied to the redesign of classroom-based learning environments. Materialist design is informed by a framework of materialist epistemology, which positions material innovation on equal footing with symbol-based formal theory. Historical examples of Einstein’s conceptual reliance on trains for his Theory of Relativity and the Wright brothers’ use of wind tunnels in aeronautics illustrate how materialist design drives progress on complex design problems. A key aspect is the application of scale-down methodology, where complex systems are reconceptualized as interactions among nearly decomposable subsystems that can be redesigned and integrated back into the entire system. The application of materialist design is illustrated with the redesign of an embodied video game that uses real-time motion capture technology to promote high school geometry reasoning and proof, following its use in an ethnically and linguistically diverse classroom. Our embodied perspective offers particular insights for understanding and implementing designs of complex learning environments, and assessing their influences on educational practices and student outcomes.
Article
Full-text available
Peer interaction plays an important role in the learning process. The currently predominant paradigm of digital learning materials limits learners’ interaction with their peers through traditional user interfaces. To address this problem, this study proposes an embodiment-based learning environment to facilitate more peer interaction. The effectiveness of the environment is evaluated by designing an electronic circuit learning activity and conducting a control-experimental group experiment, with 80 voluntary participants randomly assigned to an “embodiment-based learning group” and a “traditional learning group.” Three variables were assessed: learning performance, perceived cooperation, and social flow. Results show no significant difference between the two groups in learning performance; however, participants in the embodiment-based learning group reported higher perceived cooperation and social flow during the learning process than the traditional learning group.
Conference Paper
Full-text available
We describe the design and rationale for a project in which a room-sized mixed reality simulation was created to develop middle school students' knowledge and intuitions about how objects move in space. The simulation environment, called MEteor, uses laser-based motion tracking and both floor- and wall-projected imagery to encourage students to use their bodies to enact the trajectory of an asteroid as it travels in the vicinity of planets and their gravitational forces. By embedding students within an immersive simulation and offering novel perspectives on scientific phenomena, the intent is to engage learners in physics education at both an embodied and affective level. We describe a study showing improved attitudes towards science and feelings of engagement and learning for participants who used the whole-body MEteor simulation compared to a desktop computer version of the same simulation. We also discuss general implications for the design of technology-enhanced physics education environments.
Conference Paper
Full-text available
Information spaces are virtual workspaces that help us manage information by mapping it to the physical environment. This widely influential concept has been interpreted in a variety of forms, often in conjunction with mixed reality. We present Ethereal Planes, a design framework that ties together many existing variations of 2D information spaces. Ethereal Planes is aimed at assisting the design of user interfaces for next-generation technologies such as head-worn displays. From an extensive literature review, we encapsulated the common attributes of existing novel designs in seven design dimensions. Mapping the reviewed designs to the framework dimensions reveals a set of common usage patterns. We discuss how the Ethereal Planes framework can be methodically applied to help inspire new designs. We provide a concrete example of the framework's utility during the design of the Personal Cockpit, a window management system for head-worn displays.
Chapter
Research from a large number of fields has recently come together under the rubric of embodied cognitive science. Embodied cognitive science attempts to show specific ways in which the body shapes and constrains thought. I enumerate the standard variety of usages that the term "embodiment" currently receives in cognitive science and contrast notions of embodiment and experientialism at a variety of levels of investigation. The purpose is to develop a broad-based theoretic framework for embodiment which can serve as a bridge between different fields. I introduce the theoretic framework using examples that trace related research issues such as mental imagery, mental rotation, spatial language and conceptual metaphor across several levels of investigation. As a survey piece, this chapter covers numerous different conceptualizations of the body ranging from the physiological and developmental to the mental and philosophical; theoretically, it focuses on questions of whether and how all these different conceptualizations can form a cohesive research program.
Conference Paper
This paper presents Word Out, an interactive game to train a child's creative thinking and problem solving capabilities through full body interaction. Targeted at children 4-7 years old, Word Out employs the Microsoft Kinect to detect the silhouettes of players. Players are tasked to use their bodies to match the shapes of letters of the alphabet displayed on the screen. Being explorative in nature, the game promotes creative learning through play, as well as encourages collaboration and kinesthetic learning for children. Over two months, more than 15,000 children played Word Out, installed in two different museums. This paper presents the design and implementation of the Word Out game and preliminary analyses of data collected at the museums to share insights on the potential of meaningful and engaging educational games that promote freedom and encourage creativity in the child.
Conference Paper
We explore the potential of bringing together the advantages of computer games and the physical world to increase engagement, collaboration and learning. We introduce EarthShake: a tangible interface and mixed-reality game consisting of an interactive multimodal earthquake table, block towers, and a computer game synchronized with the physical world via depth camera sensing. EarthShake helps kids discover physics principles while experimenting with real blocks in a physical environment supported with audio and visual feedback. Students interactively make predictions, see results, grapple with disconfirming evidence, and formulate explanations in the form of general principles. We report on a preliminary user study with 12 children, ages 4-8, indicating that EarthShake produces large and significant learning gains, improvement in explanation of physics concepts, and clear signs of productive collaboration and high engagement.
Book
Cognitive load theory (CLT) is one of the most important theories in educational psychology, a highly effective guide for the design of multimedia and other learning materials. This edited volume brings together the most prolific researchers from around the world who study various aspects of cognitive load to discuss its current theoretical as well as practical issues. The book is divided into three parts. The first part describes the theoretical foundations and assumptions of CLT, the second discusses the empirical findings about the application of CLT to the design of learning environments, and the third part concludes the book with discussions and suggestions for new directions for future research. It aims to become the standard handbook in CLT for researchers and graduate students in psychology, education, and educational technology.
Article
To support astronomy education, we developed a tangible learning environment called the tangible earth system. To clarify its problems, we defined an assessment framework from the aspects of curriculum guidelines, design guidelines of tangible learning environments, and epistemology of agency. Based on the analysis of our small-scale user study, we identified problems of the system in terms of location, dynamics, and correspondence parameters.
Article
Museums are exploring new ways of using emerging digital technologies to enhance the visitor experience. In this context, our research focuses on designing, developing and studying interactions for museum exhibits that introduce cultural concepts in ways that are tangible and embodied. We introduce here a tangible tabletop installation piece that was designed for a museum exhibition contrasting Western and African notions of mapping history and place. Inspired by the Lukasa board, a mnemonic device used by the Luba peoples in Central Africa, the tabletop piece enables visitors to learn and understand symbolic and nonlinguistic mapping concepts that are central to the Lukasa by creating and sharing stories with each other. In this paper we share our design process, a user study focusing on children and learning, and design implications on how digital and tangible interaction technologies can be used for cultural learning in museum exhibits.