Moving to Learn: Exploring the Impact
of Physical Embodiment in Educational
Programming Games
Abstract
There has been increasing attention paid to the
necessity of Computational Thinking (CT) and CS
education in recent years. To address this need, a
broad spectrum of animation programming
environments and games has been created to engage
learners. However, most of these tools are designed for
the touchpad/mouse and keyboard, and few have been
evaluated to assess their efficacy in developing
CT/programming skills. This is problematic when trying
to understand the validity of such designs for CS
education, and whether there are alternative
approaches that may prove more effective. My
dissertation work helps address this problem. After
creating a framework based on a meta-review that
carefully dissects embodiment strategies in learning
games, I am building and evaluating tangible and
augmented reality versions of a CT game. I plan to
examine how these different forms of physical
interaction help to facilitate and enhance meaning-
making during the learning process, and whether/how
they improve related learning factors such as self-
beliefs and enjoyment.
Author Keywords
Physical Embodiment; Educational Games; Embodied
Interaction; Embodied Cognition; Programming;
Computational Thinking.
Permission to make digital or hard copies of part or all of this work for
personal or classroom use is granted without fee provided that copies are
not made or distributed for profit or commercial advantage and that
copies bear this notice and the full citation on the first page. Copyrights
for third-party components of this work must be honored. For all other
uses, contact the owner/author(s). Copyright is held by the
author/owner(s).
CHI'17 Extended Abstracts, May 06-11, 2017, Denver, CO, USA
ACM 978-1-4503-4656-6/17/05.
http://dx.doi.org/10.1145/3027063.3027129
Edward Melcer
New York University
Brooklyn, NY 11201, USA
eddie.melcer@nyu.edu
ACM Classification Keywords
H.5.m. Information interfaces and presentation (e.g.,
HCI): Miscellaneous.
Background and Motivation
In recent years, there has been substantial public attention to the necessity of Computational Thinking (CT) and CS education, with notable calls from the National Science Foundation and the President of the United States [3, 10]. A broad spectrum of animation programming environments (e.g., Logo [26], Scratch [32], and Blockly [9]) and puzzle programming games (e.g., Mazzy [18] and Machineers [22]) has been created to teach these crucial CT skills.
However, a recent survey reveals that most CT education tools, whether developed commercially or academically, are designed almost exclusively for the touchpad/mouse and keyboard [13]. Additionally, few of these systems have been evaluated to assess their efficacy in developing CT/programming skills. This is problematic when trying to understand the validity of such designs for CS education and whether there are alternative approaches that may prove more effective.
Furthermore, little is known about whether CT-focused
games actually improve other important educational
factors for STEM learning (such as engagement,
enjoyment, and programming self-beliefs [1, 35]), or if
they simply function as chocolate-covered broccoli.
Conversely, recent work has suggested that body-
based, physically embodied designs provide affordances
that aid in the meaning-making process and offer
greater learning benefits than traditional keyboard and
mouse games [24, 27, 29]. Two physical approaches of
particular relevance within the HCI and Learning
Science communities are tangibles/manipulatives [25,
27] and augmented reality (AR) [7, 19]. The primary
advantage of tangibles over traditional desktop
applications is that they allow for learning concepts to
be embedded directly into the physical material and
design of an object, as well as through the embodied
interactions learners have by manipulating these
objects [30]. AR's primary advantage is that it leverages embodied cognition, helping learners develop understanding by mirroring or enacting learning concepts with their bodies [19]. These physical design
approaches have also shown beneficial effects on key
learning factors such as engagement [6], enjoyment
[39], and positive feelings towards learning content and
science in general [21].
The goal of my research is to explore how the diverse affordances of these various forms of physical embodiment differ in their impact on the meaning-making process and related factors for learners [23, 24]. This will be done through the creation, evaluation, and
comparison of educational programming games utilizing
different forms of physical embodiment.
Related Work
Physical Embodiment
In my research, I take a broad perspective on embodiment, centering it on the notion that human reasoning and behavior are connected to, or influenced by, our bodies and their physical/social experience of and interaction with the world [31]. This relationship is seen as iterative, since reasoning and behavior can shape interaction and vice versa, and as complex, because interaction is situated within context, time, space, emotion, and so on. Applying this perspective in a survey of related work conducted while constructing a design framework [23], I identified
five different forms of physical embodiment:
1. Direct Embodied focuses on gestural congruency and how the body can physically represent learning concepts [16].
2. Enacted focuses on acting out/enacting knowledge through physical action (i.e., knowledge-as-action) [14].
3. Manipulated focuses on the use of embodied metaphors and interactions with physical objects [2], and on the objects' physical embodiment of learning concepts [15, 28].
4. Surrogate focuses on learners manipulating a physical agent or "surrogate" representative of themselves to enact learning concepts [8].
5. Augmented focuses on the combined use of a representational system (e.g., an avatar) and an augmented feedback system (e.g., a Microsoft Kinect and TV screen) to embed the learner within an augmented reality system [8].
Computational Thinking
CT is a complex construct with a wide variety of
definitions. However, [4, 5] have identified a core set of
CT skills commonly utilized in the literature as: 1)
Conditional Logic - the use of an “if-then-else”
construct; 2) Algorithm Building - a data “recipe” or set
of instructions; 3) Simulation - modeling or testing of
algorithms or logic; 4) Debugging - the act of identifying problems in order to fix malfunctioning rules; and 5) Abstraction - the use of procedures to encapsulate a set of often-repeated commands.
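To ground these skills, the short Python sketch below (an illustrative example of my own, not code from any of the cited tools) shows how even a tiny robot-movement program can exercise conditional logic, algorithm building, abstraction, simulation, and debugging:

```python
# Illustrative sketch only (not code from any cited tool): a tiny robot
# program on a one-dimensional corridor that exercises core CT skills.

def step_forward(position):
    # Primitive command: advance the robot by one tile.
    return position + 1

def walk(position, tiles):
    # Abstraction: a procedure encapsulating an often-repeated command.
    for _ in range(tiles):
        position = step_forward(position)
    return position

def run_program(start, goal, obstacles):
    # Algorithm building: a step-by-step "recipe" for reaching the goal,
    # using conditional logic (if-then-else) to decide each move.
    position = start
    while position < goal:
        if position + 1 in obstacles:
            position += 2          # hop over the obstacle
        else:
            position = walk(position, 1)
    return position

# Simulation: model the algorithm on sample data before relying on it.
# Debugging: if this check fails, inspect and fix the rules above.
assert run_program(start=0, goal=5, obstacles={2}) == 5
```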
Tangibles and Computational Concepts
There has been some work in the tangible and
embodied interaction community on the creation of
tangibles to teach computing concepts such as roBlocks
[34], Note Code [20], Thingy Oriented Programming
[12], TanProRobot 2.0 [37], and Electronic Blocks [38].
However, the concepts covered by these tools focus on physical computing, electronics, and music rather than core computational thinking, and none of them are games.
Problem Statement
The primary question addressed by my research is:
How do different forms of physical embodiment and
interaction impact learning in educational games? I am
working towards answering this question in the context
of educational programming games. From this, there
are three main sub-questions guiding my work:
1. What affordances do different forms of physical
embodiment and interaction provide to facilitate
meaning-making during the learning process?
2. Which forms of physical embodiment prove more effective for learning particular Computational Thinking skills, and why?
3. Do different forms of physical embodiment and
interaction have differing outcomes on related
learning factors such as self-beliefs, cognitive
load, enjoyment, and engagement?
Research Goals and Methods
Based on the above questions, the goal of this research is to explore whether applying physically embodied designs results in improved learning outcomes for core CT skills (i.e., Algorithm Building, Abstraction, Simulation, and Debugging) and related learning factors. I have already
laid the theoretical groundwork for this examination
through the creation of a design framework for
embodied learning games and simulations [23, 24].
Using the design framework, my aim is to create
different versions of a CT game called Bots &
(Main)Frames based on common forms of physical
embodiment and evaluate/compare/refine them across
three studies with novice programmers.
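To make the core mechanic of such a game concrete, the following minimal sketch (my own illustration, not the actual Bots & (Main)Frames code) interprets a player-assembled sequence of movement commands on a small grid, the kind of instruction sequencing the game asks players to perform:

```python
# Minimal sketch (not the game's actual implementation): a robot executes a
# player-assembled sequence of movement commands on a small grid.

MOVES = {
    "UP": (0, -1),
    "DOWN": (0, 1),
    "LEFT": (-1, 0),
    "RIGHT": (1, 0),
}

def execute(program, start, goal, walls):
    # Step through the program; stop early if the robot would hit a wall.
    x, y = start
    for command in program:
        dx, dy = MOVES[command]
        nx, ny = x + dx, y + dy
        if (nx, ny) in walls:
            return False, (x, y)   # the program has a bug -- time to debug
        x, y = nx, ny
    return (x, y) == goal, (x, y)

# Example puzzle: reach (2, 0) from (0, 0) while avoiding a wall at (1, 0).
solved, end = execute(["UP", "RIGHT", "RIGHT", "DOWN"],
                      start=(0, 0), goal=(2, 0), walls={(1, 0)})
print(solved, end)  # True (2, 0)
```

The embodied versions change how such a program is assembled and executed (physical blocks, or the player's own movement) rather than the underlying puzzle logic.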
Figure 1: The prototypical keyboard and mouse CT game.
Figure 2: The tangible programming blocks version of the CT game.
Figure 3: The proposed AR version of the CT game.
The first study will compare the prototypical CT puzzle game version for mouse (see Figure 1) with a tangible programming blocks version that uses fiducial tracking from the reacTIVision framework [17] for programming (see Figure 2). The second study will compare these against an AR version in which programming is touch-based on a tablet and players instead enact execution of their code by walking through physical space (see Figure 3). I
plan to analyze learning outcomes for these studies
using a between-subjects design with video recording
and qualitative coding/analysis [33] to identify
occurrences of CT and physical embodiment during
play. This will be done in conjunction with assessments
of programming self-beliefs [36], cognitive load [11],
and enjoyment to compare improvements in key
learning factors.
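To give a sense of how the tangible version could turn tracked blocks into such a program, the sketch below is hypothetical: reacTIVision reports fiducial marker IDs and positions (typically over the TUIO protocol), and the ID-to-command table and marker dictionaries here are invented for illustration rather than taken from my implementation.

```python
# Hypothetical sketch of how tracked tangible blocks could become a program.
# reacTIVision reports fiducial marker IDs and positions (typically over the
# TUIO protocol); the ID-to-command mapping below is purely illustrative.

FIDUCIAL_COMMANDS = {
    0: "UP",
    1: "DOWN",
    2: "LEFT",
    3: "RIGHT",
}

def blocks_to_program(tracked_markers):
    # Read blocks left to right across the table surface and map each
    # marker ID to a movement command for the game's interpreter.
    ordered = sorted(tracked_markers, key=lambda m: m["x"])
    return [FIDUCIAL_COMMANDS[m["id"]] for m in ordered
            if m["id"] in FIDUCIAL_COMMANDS]

# e.g., three blocks laid out on the table from left to right:
markers = [{"id": 3, "x": 0.2}, {"id": 3, "x": 0.5}, {"id": 0, "x": 0.8}]
print(blocks_to_program(markers))  # ['RIGHT', 'RIGHT', 'UP']
```

In the AR version, the analogous step would instead compare the path a player walks through physical space against the path their program prescribes.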
For the third study, I plan to use prior findings to iterate on and refine the existing designs of the tangible and AR games, enhancing their efficacy before re-evaluating them with a K-12 population. The doctoral consortium will be especially beneficial for this aspect of my work, since I will be able to present both of the original designs, and the resulting feedback will directly inform the iteration process.
Expected Contributions
Through this dissertation work, I expect to make the
following contributions:
1. Empirical and artifact-based contributions
towards understanding the design space of
physically embodied educational games, in the
form of a design framework [23, 24] and
evaluated physical computational thinking
games.
2. New understanding and evidence concerning how
physical embodiment and interaction can impact
meaning-making during the learning process.
3. Design suggestions for creating engaging and
enjoyable educational programming games.
References
1. Ainley, M. and Ainley, J. 2011. Student
engagement with science in early adolescence: The
contribution of enjoyment to students’ continuing
interest in learning about science. Contemporary
Educational Psychology. 36, 1 (2011), 4–12.
2. Bakker, S. et al. 2012. Embodied metaphors in
tangible interaction design. Personal and Ubiquitous
Computing (2012).
3. Barr, D. et al. 2011. Computational Thinking: A
Digital Age Skill for Everyone. Learning & Leading
with Technology. 38, 6 (2011), 20–23.
4. Barr, V. and Stephenson, C. 2011. Bringing
computational thinking to K-12: what is Involved
and what is the role of the computer science
education community? ACM Inroads.
5. Berland, M. and Lee, V.R. 2011. Collaborative Strategic Board Games as a Site for Distributed Computational Thinking. International Journal of Game-Based Learning. 1, 2 (2011), 65–81.
6. Bhattacharya, A. et al. 2015. Designing Motion-
Based Activities to Engage Students with Autism in
Classroom Settings. IDC 2015 (2015), 69–78.
7. Birchfield, D. et al. 2008. Embodiment,
Multimodality, and Composition: Convergent
Themes across HCI and Education for Mixed-Reality
Learning Environments. Advances in Human-
Computer Interaction. 2008, (2008), 1–19.
8. Black, J.B. et al. 2012. Embodied cognition and
learning environment design. Theoretical
foundations of learning environments. 198–223.
9. Blockly: A visual programming editor:
https://developers.google.com/blockly/. Accessed:
2016-10-09.
10. Computer Science For All: 2016. https://www.whitehouse.gov/blog/2016/01/30/computer-science-all. Accessed: 2016-09-21.
11. Eysink, T.H.S. et al. 2009. Learner Performance in
Multimedia Learning Arrangements: An Analysis
Across Instructional Approaches. American
Educational Research Journal. 46, 4 (2009), 1107–
1149.
12. Güldenpfennig, F. et al. 2016. Toward Thingy
Oriented Programming: Recording Macros With
Tangibles. Proceedings of the TEI’16: Tenth
International Conference on Tangible, Embedded,
and Embodied Interaction (2016), 455–461.
13. Harteveld, C. et al. 2014. A Design-Focused
Analysis of Games Teaching Computer Science.
Proceedings of Games+Learning+Society 10
(2014).
14. Holton, D.L. 2010. Constructivism + embodied
cognition = enactivism: theoretical and practical
implications for conceptual change. AERA 2010
Conference (2010).
15. Ishii, H. 2008. Tangible bits: beyond pixels.
Proceedings of the 2nd international conference on
Tangible and Embedded Interaction (TEI ’08)
(2008).
16. Johnson-Glenberg, M.C. et al. 2014. Collaborative
embodied learning in mixed reality motion-capture
environments: Two science studies. Journal of
Educational Psychology. 106, 1 (2014), 86–104.
17. Kaltenbrunner, M. and Bencina, R. 2007.
reacTIVision: a computer-vision framework for
table-based tangible interaction. Proceedings of the
1st international conference on Tangible and
embedded interaction. (2007), 69–74.
18. Kao, D. and Harrell, D.F. 2015. Mazzy: A STEM
Learning Game. Foundations of Digital Games
(2015).
19. Kelliher, A. et al. 2009. SMALLab: A mixed-reality
environment for embodied and mediated learning.
MM’09 - Proceedings of the 2009 ACM Multimedia
Conference, with Co-located Workshops and
Symposiums (2009), 1029–1031.
20. Kumar, V. et al. 2015. Note Code – A Tangible
Music Programming Puzzle Tool. Proceedings of the
10th International Conference on Tangible,
Embedded, and Embodied Interaction - TEI ’15
(2015), 625–629.
21. Lindgren, R. et al. 2013. MEteor: Developing
Physics Concepts Through Body-Based Interaction
With A Mixed Reality Simulation. Physics Education
Research Conference - PERC ’13 (2013), 217–220.
22. Lode, H. et al. 2013. Machineers: playfully
introducing programming to children. CHI ’13
Human Factors in Computing Systems (2013),
2639–2642.
23. Melcer, E. and Isbister, K. 2016. Bridging the
Physical Divide: A Design Framework for Embodied
Learning Games and Simulations. CHI’16 Extended
Abstracts (2016), 2225–2233.
24. Melcer, E. and Isbister, K. 2016. Bridging the
Physical Learning Divides: A Design Framework for
Embodied Learning Games and Simulations.
Proceedings of the 1st International Joint
Conference of DiGRA and FDG (2016).
25. O’Malley, C. and Fraser, S. 2004. Literature review
in learning with tangible technologies.
26. Papert, S. 1980. Mindstorms: Children, computers,
and powerful ideas. Basic Books, Inc.
27. Pouw, W.T.J.L. et al. 2014. An Embedded and
Embodied Cognition Review of Instructional
Manipulatives. Educational Psychology Review. 26,
1 (2014), 51–72.
28. Price, S. 2008. A representation approach to
conceptualizing tangible learning environments.
Proceedings of the 2nd international conference on
Tangible and embedded interaction TEI 08 (2008),
151.
29. Price, S. et al. 2010. Action and representation in
tangible systems: implications for design of
learning interactions. Proceedings of the fourth
international conference on Tangible, embedded,
and embodied interaction - TEI ’10 (2010), 145–
152.
30. Price, S. et al. 2008. Towards a framework for
investigating tangible environments for learning.
International Journal of Arts and Technology. 1, 3/4
(2008), 351–368.
31. Price, S. and Jewitt, C. 2013. A multimodal
approach to examining “embodiment” in tangible
learning environments. Proceedings of TEI ’13
(2013), 43–50.
32. Resnick, M. et al. 2009. Scratch: Programming for
All. Communications of the ACM. 52, 11 (2009), 60–
67.
33. Saldaña, J. 2015. The coding manual for qualitative
researchers. Sage.
34. Schweikardt, E. and Gross, M. 2008. The robot is
the program: interacting with roBlocks.
Proceedings of the second international conference
on Tangible, embedded, and embodied interaction -
TEI ’08 (2008), 167–168.
35. Scott, M.J. and Ghinea, G. 2013. Educating
programmers: A reflection on barriers to deliberate
practice. Proceedings of the 2nd Annual HEA STEM
Conference (2013).
36. Scott, M.J. and Ghinea, G. 2014. Measuring
enrichment: the assembly and validation of an
instrument to assess student self-beliefs in CS1.
Proceedings of the tenth annual conference on
International computing education research (2014),
123–130.
37. Wang, D. et al. 2016. A Tangible Embedded
Programming System to Convey Event-Handling
Concept. Proceedings of the TEI’16: Tenth
International Conference on Tangible, Embedded,
and Embodied Interaction (2016), 133–140.
38. Wyeth, P. 2008. How Young Children Learn to
Program With Sensor, Action, and Logic Blocks.
Journal of the Learning Sciences. 17, 4 (2008),
517–550.
39. Yannier, N. et al. 2016. Adding Physicality to an
Interactive Game Improves Learning and
Enjoyment: Evidence from EarthShake. ACM
Transactions on Computer-Human Interaction
(TOCHI). 23, 4 (2016), 1–31.