Moving to Learn: Exploring the Impact
of Physical Embodiment in Educational
Programming Games
Abstract
There has been increasing attention paid to the
necessity of Computational Thinking (CT) and CS
education in recent years. To address this need, a
broad spectrum of animation programming
environments and games have been created to engage
learners. However, most of these tools are designed for
the touchpad/mouse and keyboard, and few have been
evaluated to assess their efficacy in developing
CT/programming skills. This is problematic when trying
to understand the validity of such designs for CS
education, and whether there are alternative
approaches that may prove more effective. My
dissertation work helps address this problem. After
creating a framework based on a meta-review that
carefully dissects embodiment strategies in learning
games, I am building and evaluating tangible and
augmented reality versions of a CT game. I plan to
examine how these different forms of physical
interaction help to facilitate and enhance meaning-
making during the learning process, and whether/how
they improve related learning factors such as self-
beliefs and enjoyment.
Author Keywords
Physical Embodiment; Educational Games; Embodied
Interaction; Embodied Cognition; Programming;
Computational Thinking.
Permission to make digital or hard copies of part or all of this work for
personal or classroom use is granted without fee provided that copies are
not made or distributed for profit or commercial advantage and that
copies bear this notice and the full citation on the first page. Copyrights
for third-party components of this work must be honored. For all other
uses, contact the owner/author(s). Copyright is held by the
author/owner(s).
CHI'17 Extended Abstracts, May 06-11, 2017, Denver, CO, USA
ACM 978-1-4503-4656-6/17/05.
http://dx.doi.org/10.1145/3027063.3027129
Edward Melcer
New York University
Brooklyn, NY 11201, USA
eddie.melcer@nyu.edu
ACM Classification Keywords
H.5.m. Information interfaces and presentation (e.g.,
HCI): Miscellaneous.
Background and Motivation
In recent years, there has been substantial public
attention to the necessity of Computational Thinking
(CT) and CS education, with notable calls from the
National Science Foundation and the President of the
United States [3, 10]. A broad spectrum of
animation programming environments (e.g., Logo [26],
Scratch [32], and Blockly [9]) as well as puzzle
programming games (e.g., Mazzy [18] and Machineers
[22]) have been created to teach these crucial CT skills.
However, a recent survey reveals that most CT education
tools, whether created commercially or academically, are
designed almost exclusively for the touchpad/mouse
and keyboard [13]. Additionally, few of these systems
have been evaluated to assess their efficacy in
developing CT/programming skills. This is problematic
when trying to understand the validity of such designs
for CS education and whether there are alternative
approaches that may prove more effective.
Furthermore, little is known about whether CT-focused
games actually improve other important educational
factors for STEM learning (such as engagement,
enjoyment, and programming self-beliefs [1, 35]), or if
they simply function as chocolate-covered broccoli.
In contrast, recent work suggests that body-based,
physically embodied designs provide affordances
that aid the meaning-making process and offer
greater learning benefits than traditional keyboard-and-
mouse games [24, 27, 29]. Two physical approaches of
particular relevance within the HCI and Learning
Science communities are tangibles/manipulatives [25,
27] and augmented reality (AR) [7, 19]. The primary
advantage of tangibles over traditional desktop
applications is that they allow for learning concepts to
be embedded directly into the physical material and
design of an object, as well as through the embodied
interactions learners have by manipulating these
objects [30]. AR’s primary advantage is utilizing
embodied cognition to help learners develop
understanding through mirroring or enacting learning
concepts with their body [19]. These physical design
approaches have also shown beneficial effects on key
learning factors such as engagement [6], enjoyment
[39], and positive feelings towards learning content and
science in general [21].
The goal of my research is to explore how the diverse
affordances of these forms of physical embodiment
differ in their impact on the meaning-making process
and on related learning factors [23, 24]. This will be
done through the creation, evaluation, and comparison
of educational programming games utilizing different
forms of physical embodiment.
Related Work
Physical Embodiment
In my research, I take a broad perspective on
embodiment, centering it on the notion that human
reasoning and behavior are connected to, or influenced
by, our bodies and their physical/social experience and
interaction with the world [31]. This relationship is
iterative, in that reasoning and behavior can shape
interaction just as interaction shapes them, and it is
complex because interaction is situated in context,
time, space, emotion, and so forth.
Applying this perspective in a survey of related work
conducted while constructing a design framework [23],
I identified five different forms of physical embodiment:
1. Direct Embodied focuses on gestural congruency
and how the body can physically represent learning
concepts [16].
2. Enacted focuses on acting out/enacting knowledge
through physical action (i.e., knowledge-as-action) [14].
3. Manipulated focuses on the use of embodied
metaphors and interactions with physical objects [2],
and on the objects' physical embodiment of learning
concepts [15, 28].
4. Surrogate focuses on learners manipulating a
physical agent or "surrogate" representative of
themselves to enact learning concepts [8].
5. Augmented focuses on the combined use of a
representational system (e.g., an avatar) and an
augmented feedback system (e.g., Microsoft Kinect
and a TV screen) to embed the learner within an
augmented reality system [8].
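As a purely illustrative sketch of how this taxonomy can be treated as data (the class, tags, and example systems below are hypothetical and not part of the framework's implementation), the five embodiment forms could be encoded so that existing systems are tagged and compared:

    # Hypothetical sketch: encoding the five embodiment forms as data so that
    # example systems can be tagged and compared; not part of the framework itself.
    from enum import Enum, auto

    class Embodiment(Enum):
        DIRECT_EMBODIED = auto()  # the body physically represents the concept
        ENACTED = auto()          # knowledge acted out through physical action
        MANIPULATED = auto()      # embodied metaphors via physical objects
        SURROGATE = auto()        # a physical agent stands in for the learner
        AUGMENTED = auto()        # learner embedded in an AR feedback system

    # Illustrative tags only; actual categorizations belong to the framework [23, 24].
    examples = {
        "tangible programming blocks": {Embodiment.MANIPULATED},
        "walk-your-code AR game": {Embodiment.ENACTED, Embodiment.AUGMENTED},
    }

    for system, forms in examples.items():
        print(system, "->", ", ".join(e.name for e in sorted(forms, key=lambda e: e.value)))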
Computational Thinking
CT is a complex construct with a wide variety of
definitions. However, [4, 5] have identified a core set of
CT skills commonly utilized in the literature as: 1)
Conditional Logic - the use of an “if-then-else”
construct; 2) Algorithm Building - a data “recipe” or set
of instructions; 3) Simulation - modeling or testing of
algorithms or logic; 4) Debugging - the act of
determining problems in order to fix rules that are
malfunctioning; and 5) Abstraction - use of procedures
to encapsulate a set of often repeated commands.
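To make these skills concrete, the toy sketch below shows how each one might surface in a robot-on-a-grid program of the kind such puzzle games typically teach (the grid world, instruction names, and functions are illustrative assumptions, not code from any of the cited systems):

    # Illustrative toy example of the five core CT skills; the grid world and
    # instruction set are hypothetical, not taken from an actual CT game.
    GOAL = (2, 3)        # target cell the robot must reach
    WALLS = {(1, 3)}     # cells the robot cannot enter

    def step(pos, move):
        """Simulation: model one instruction without running a physical robot."""
        x, y = pos
        nxt = {"up": (x, y + 1), "down": (x, y - 1),
               "left": (x - 1, y), "right": (x + 1, y)}[move]
        # Conditional logic: only move if the next cell is not a wall.
        return nxt if nxt not in WALLS else pos

    def repeat(move, times):
        """Abstraction: encapsulate an often-repeated command."""
        return [move] * times

    def run(program, start=(0, 0)):
        """Algorithm building: execute an ordered "recipe" of instructions."""
        pos, trace = start, [start]
        for move in program:
            pos = step(pos, move)
            trace.append(pos)
        return pos, trace

    program = repeat("right", 2) + repeat("up", 3)
    final, trace = run(program)
    # Debugging: inspect the trace to find where a plan went wrong.
    print("reached goal" if final == GOAL else "bug: stopped at %s" % (final,))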
Tangibles and Computational Concepts
There has been some work in the tangible and
embodied interaction community on the creation of
tangibles to teach computing concepts such as roBlocks
[34], Note Code [20], Thingy Oriented Programming
[12], TanProRobot 2.0 [37], and Electronic Blocks [38].
However, the concepts covered by these tools focus on
physical computing, electronics, and music rather than
on computational thinking itself or on games.
Problem Statement
The primary question addressed by my research is:
How do different forms of physical embodiment and
interaction impact learning in educational games? I am
working towards answering this question in the context
of educational programming games. From this, there
are three main sub-questions guiding my work:
1. What affordances do different forms of physical
embodiment and interaction provide to facilitate
meaning-making during the learning process?
2. What forms of physical embodiment prove more
effective for learning certain Computational
Thinking skills and why?
3. Do different forms of physical embodiment and
interaction have differing outcomes on related
learning factors such as self-beliefs, cognitive
load, enjoyment, and engagement?
Research Goals and Methods
Based on the above questions, the goal of this research
is to explore whether applying physically embodied designs
results in improved learning outcomes for core CT skills
(i.e., Algorithm Building, Abstraction, Simulation, and
Debugging) and related learning factors. I have already
laid the theoretical groundwork for this examination
through the creation of a design framework for
embodied learning games and simulations [23, 24].
Using the design framework, my aim is to create
different versions of a CT game called Bots &
(Main)Frames based on common forms of physical
embodiment, and to evaluate, compare, and refine them
across three studies with novice programmers.
Figure 2: The tangible programming blocks version of the CT
game.
The first study will compare the prototypical CT puzzle
game version for mouse (see Figure 1) with a tangible
programming blocks version that uses fiducial tracking
from the reacTIVision framework [17] for programming
(see Figure 2). The second study will compare these
against an AR version where programming is touch-
based on a tablet and players instead enact execution of
their code by walking through physical space (see
Figure 3). I
plan to analyze learning outcomes for these studies
using a between-subjects design with video recording
and qualitative coding/analysis [33] to identify
occurrences of CT and physical embodiment during
play. This will be done in conjunction with assessments
of programming self-beliefs [36], cognitive load [11],
and enjoyment to compare improvements in key
learning factors.
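As a rough sketch of how the tangible version's input layer might translate tracked blocks into code (the event format, marker-to-instruction mapping, and function below are assumptions for illustration, not the actual implementation or the reacTIVision API), fiducial markers detected on the table can be ordered by position and read off as a program:

    # Hypothetical sketch of turning tracked fiducial markers into a program.
    # Assumes an upstream tracking layer (e.g., reacTIVision via TUIO) has already
    # been parsed into (symbol_id, x, y) tuples; IDs and mapping are illustrative.
    INSTRUCTION_FOR_SYMBOL = {
        0: "move_forward",
        1: "turn_left",
        2: "turn_right",
        3: "if_wall_ahead",
    }

    def blocks_to_program(markers):
        """Order detected blocks top-to-bottom (by y) into an instruction list."""
        ordered = sorted(markers, key=lambda m: m[2])
        return [INSTRUCTION_FOR_SYMBOL[sym] for sym, _x, _y in ordered
                if sym in INSTRUCTION_FOR_SYMBOL]

    # Example frame of tracked markers: (symbol_id, x, y) in table coordinates.
    frame = [(2, 0.40, 0.72), (0, 0.41, 0.31), (0, 0.39, 0.52)]
    print(blocks_to_program(frame))  # ['move_forward', 'move_forward', 'turn_right']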
For the third study, I plan to use prior findings to
iterate and refine existing designs of the tangible and
AR games to enhance their efficacy before reevaluation
with a K-12 population. The doctoral consortium will
be especially valuable for this aspect of my work, since
I will have both original designs to present and the
resulting feedback will inform the iteration process.
Expected Contributions
Through this dissertation work, I expect to make the
following contributions:
1. Empirical and artifact-based contributions
towards understanding the design space of
physically embodied educational games, in the
form of a design framework [23, 24] and
evaluated physical computational thinking
games.
2. New understanding and evidence concerning how
physical embodiment and interaction can impact
meaning-making during the learning process.
3. Design suggestions for creating engaging and
enjoyable educational programming games.
References
1. Ainley, M. and Ainley, J. 2011. Student
engagement with science in early adolescence: The
contribution of enjoyment to students’ continuing
interest in learning about science. Contemporary
Educational Psychology. 36, 1 (2011), 4–12.
2. Bakker, S. et al. 2012. Embodied metaphors in
tangible interaction design. Personal and Ubiquitous
Computing (2012).
3. Barr, D. et al. 2011. Computational Thinking: A
Digital Age Skill for Everyone. Learning & Leading
with Technology. 38, 6 (2011), 20–23.
4. Barr, V. and Stephenson, C. 2011. Bringing
computational thinking to K-12: what is Involved
and what is the role of the computer science
education community? ACM Inroads.
5. Berland, M. and Lee, V.R. 2011. Collaborative
Strategic Board Games as a Site for Distributed
Computational Thinking. International Journal of
Game-Based Learning. 1, 2 (2011), 65–81.
Figure 1: The prototypical
keyboard and mouse CT game.
Figure 3: The proposed AR
version of the CT game.
6. Bhattacharya, A. et al. 2015. Designing Motion-
Based Activities to Engage Students with Autism in
Classroom Settings. IDC 2015 (2015), 69–78.
7. Birchfield, D. et al. 2008. Embodiment,
Multimodality, and Composition: Convergent
Themes across HCI and Education for Mixed-Reality
Learning Environments. Advances in Human-
Computer Interaction. 2008, (2008), 1–19.
8. Black, J.B. et al. 2012. Embodied cognition and
learning environment design. Theoretical
foundations of learning environments. 198–223.
9. Blockly: A visual programming editor:
https://developers.google.com/blockly/. Accessed:
2016-10-09.
10. Computer Science For All: 2016.
https://www.whitehouse.gov/blog/2016/01/30/co
mputer-science-all. Accessed: 2016-09-21.
11. Eysink, T.H.S. et al. 2009. Learner Performance in
Multimedia Learning Arrangements: An Analysis
Across Instructional Approaches. American
Educational Research Journal. 46, 4 (2009),
1107–1149.
12. Güldenpfennig, F. et al. 2016. Toward Thingy
Oriented Programming: Recording Macros With
Tangibles. Proceedings of the TEI’16: Tenth
International Conference on Tangible, Embedded,
and Embodied Interaction (2016), 455–461.
13. Harteveld, C. et al. 2014. A Design-Focused
Analysis of Games Teaching Computer Science.
Proceedings of Games+ Learning+ Society 10
(2014).
14. Holton, D.L. 2010. Constructivism + embodied
cognition = enactivism: theoretical and practical
implications for conceptual change. AERA 2010
Conference (2010).
15. Ishii, H. 2008. Tangible bits: beyond pixels.
Proceedings of the 2nd international conference on
Tangible and Embedded Interaction (TEI ’08)
(2008).
16. Johnson-Glenberg, M.C. et al. 2014. Collaborative
embodied learning in mixed reality motion-capture
environments: Two science studies. Journal of
Educational Psychology. 106, 1 (2014), 86–104.
17. Kaltenbrunner, M. and Bencina, R. 2007.
reacTIVision: a computer-vision framework for
table-based tangible interaction. Proceedings of the
1st international conference on Tangible and
embedded interaction. (2007), 69–74.
18. Kao, D. and Harrell, D.F. 2015. Mazzy: A STEM
Learning Game. Foundations of Digital Games
(2015).
19. Kelliher, A. et al. 2009. SMALLab: A mixed-reality
environment for embodied and mediated learning.
MM’09 - Proceedings of the 2009 ACM Multimedia
Conference, with Co-located Workshops and
Symposiums (2009), 1029–1031.
20. Kumar, V. et al. 2015. Note Code: A Tangible
Music Programming Puzzle Tool. Proceedings of the
10th International Conference on Tangible,
Embedded, and Embodied Interaction - TEI ’15
(2015), 625–629.
21. Lindgren, R. et al. 2013. MEteor: Developing
Physics Concepts Through Body-Based Interaction
With A Mixed Reality Simulation. Physics Education
Research Conference - PERC ’13 (2013), 217–220.
22. Lode, H. et al. 2013. Machineers: playfully
introducing programming to children. CHI ’13
Human Factors in Computing Systems (2013),
2639–2642.
23. Melcer, E. and Isbister, K. 2016. Bridging the
Physical Divide: A Design Framework for Embodied
Learning Games and Simulations. CHI’16 Extended
Abstracts (2016), 2225–2233.
24. Melcer, E. and Isbister, K. 2016. Bridging the
Physical Learning Divides: A Design Framework for
Embodied Learning Games and Simulations.
Proceedings of the 1st International Joint
Conference of DiGRA and FDG (2016).
25. O’Malley, C. and Fraser, S. 2004. Literature review
in learning with tangible technologies.
26. Papert, S. 1980. Mindstorms: Children, computers,
and powerful ideas. Basic Books, Inc.
27. Pouw, W.T.J.L. et al. 2014. An Embedded and
Embodied Cognition Review of Instructional
Manipulatives. Educational Psychology Review. 26,
1 (2014), 51–72.
28. Price, S. 2008. A representation approach to
conceptualizing tangible learning environments.
Proceedings of the 2nd international conference on
Tangible and embedded interaction (TEI ’08) (2008),
151.
29. Price, S. et al. 2010. Action and representation in
tangible systems: implications for design of
learning interactions. Proceedings of the fourth
international conference on Tangible, embedded,
and embodied interaction - TEI ’10 (2010),
145–152.
30. Price, S. et al. 2008. Towards a framework for
investigating tangible environments for learning.
International Journal of Arts and Technology. 1, 3/4
(2008), 351–368.
31. Price, S. and Jewitt, C. 2013. A multimodal
approach to examining “embodiment” in tangible
learning environments. Proceedings of TEI ’13
(2013), 43–50.
32. Resnick, M. et al. 2009. Scratch: Programming for
All. Communications of the ACM. 52, (2009),
60–67.
33. Saldaña, J. 2015. The coding manual for qualitative
researchers. Sage.
34. Schweikardt, E. and Gross, M. 2008. The robot is
the program: interacting with roBlocks.
Proceedings of the second international conference
on Tangible, embedded, and embodied interaction -
TEI ’08 (2008), 167–168.
35. Scott, M.J. and Ghinea, G. 2013. Educating
programmers: A reflection on barriers to deliberate
practice. Proceedings of the 2nd Annual HEA STEM
Conference (2013).
36. Scott, M.J. and Ghinea, G. 2014. Measuring
enrichment: the assembly and validation of an
instrument to assess student self-beliefs in CS1.
Proceedings of the tenth annual conference on
International computing education research (2014),
123–130.
37. Wang, D. et al. 2016. A Tangible Embedded
Programming System to Convey Event-Handling
Concept. Proceedings of the TEI’16: Tenth
International Conference on Tangible, Embedded,
and Embodied Interaction (2016), 133–140.
38. Wyeth, P. 2008. How Young Children Learn to
Program With Sensor, Action, and Logic Blocks.
Journal of the Learning Sciences. 17, 4 (2008),
517–550.
39. Yannier, N. et al. 2016. Adding Physicality to an
Interactive Game Improves Learning and
Enjoyment: Evidence from EarthShake. ACM
Transactions on Computer-Human Interaction
(TOCHI). 23, 4 (2016), 1–31.