
Hack.VR: A Programming Game in Virtual Reality


Abstract

Dominic Kao*, Christos Mousas*, Alejandra J. Magana*, D. Fox Harrell†, Rabindra Ratan‡, Edward F. Melcer§, Brett Sherrick*, Paul Parsons*, Dmitri A. Gusev*
*Purdue University, †MIT, ‡Michigan State University, §UC Santa Cruz
Figure 1: Existing environments for programming in VR: A) Block-based programming (Cubely), B) Multiple-choice code snippet selection (Imikode), C) Block-based programming (VR-OCKS), D) World-building (NeosVR), E) Code visualization (Primitive), F) Visual programming (Glitchspace). All figures reproduced with permission.
In this article we describe Hack.VR, an object-oriented programming game in virtual reality. Hack.VR uses a VR programming language in which nodes represent functions and node connections represent data flow. Using this programming framework, players reprogram VR objects such as elevators, robots, and switches. Hack.VR has been designed to be highly interactable both physically and semantically.
Keywords: VR; virtual reality; programming; object-oriented; game.
ACM Reference Format:
Dominic Kao, Christos Mousas, Alejandra J. Magana, D. Fox Harrell, Rabindra Ratan, Edward F. Melcer, Brett Sherrick, Paul Parsons, and Dmitri A. Gusev. 2020. Hack.VR: A Programming Game in Virtual Reality. In International Conference on the Foundations of Digital Games (FDG '20), September 15–18, 2020, Bugibba, Malta. ACM, New York, NY, USA, 6 pages.
In the entire history of computing, programming has been a largely physically static activity. But technologies previously inaccessible to most users are now growing rapidly. Today, 78% of Americans are familiar with VR (from 45% in 2015) [9]. As a result, experiences traditionally created for desktops are now appearing in VR, e.g., training [2] and automobile design [25]. Researchers argue that VR increases immersion [11], which in turn increases engagement and learning [5]. VR might be especially useful for teaching programming because spatial navigation in VR helps reduce extraneous cognitive load and increase germane cognitive focus on learning content compared to text on a screen [4]. Further, VR allows users to experience a sense of self-presence in the environment [33], which facilitates an embodied-cognitive learning experience [27] through which users interact with the learning content more intuitively [35], potentially augmenting learning outcomes [26]. Nonetheless, only a handful of environments for programming exist in VR. In this article, we describe a programming game in virtual reality that we created called Hack.VR.¹,²
¹ Trailer video:
² Walkthrough video:
Figure 2: In Hack.VR, programs are sets of nodes. Nodes are interconnected by blue tubes, which represent data flow.
Existing environments for programming in VR can be seen in Figure 1. These include VR-OCKS and Cubely, block-based VR programming environments [34, 38], and Imikode, a multiple-choice code snippet selection environment in VR [31]. Other significant projects include NeosVR [29], a shared social universe that features powerful programming tools for VR world creation, and Primitive [32], an "immersive development environment" enabling 3D code visualization. In the indie game Glitchspace [8], players use a visual programming language to solve puzzles. These environments for programming in VR have been developed for education (Cubely, Imikode, VR-OCKS), for modifying virtual worlds (NeosVR), for code visualization (Primitive), and for entertainment (Glitchspace).
Importantly, Hack.VR was created specifically to teach object-oriented programming (OOP), in contrast to the highly procedural approaches in the systems above. OOP encapsulates logic into objects. This paradigm has a natural translation to VR, where 3D objects can each contain internal programming that is abstracted from observers. Hack.VR is the first system for learning OOP in VR, and will serve as a testbed to perform research studies. This testbed may also be useful for studying other aspects, e.g., help facilities, embellishment, the player avatar, feedback, and the resulting effects on VR programming [3, 7, 10, 13–24, 30].
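To make this translation concrete, here is a minimal sketch (in Python; all names are hypothetical, as Hack.VR's engine code is not published) of a VR entity that exposes only its inputs and outputs while hiding its internal logic, mirroring the abstraction Hack.VR presents to players:

```python
class Door:
    """A VR entity: internal state and behavior are hidden from observers."""

    def __init__(self):
        self._is_open = False  # private state; players never see this

    def set_open(self, value: bool) -> None:
        """Public input: the only handle a player's program gets."""
        self._is_open = value
        self._animate()

    def _animate(self) -> None:
        # Internal workings (animation, sound, physics) are abstracted away.
        print("door opening" if self._is_open else "door closing")


door = Door()
door.set_open(True)  # a node's output feeding the entity's input
```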
3.1 Engine
In Hack.VR, a program is a set of nodes (see Figure 2). Nodes contain typical programming constructs, e.g., primitive values, objects, arithmetic operators, conditionals, event handlers, and function calls. Nodes communicate through data flow, and may have both inputs and outputs depending on the node type (see Figure 3). Nodes can also represent entire method calls, the details of which are abstracted from the player except input and output. Because the goal of Hack.VR is to teach the player OOP, the inner workings of the methods themselves are intentionally abstracted away (and players cannot see the code) so that the player can concentrate on higher-level representations. The engine also supports extensions: for example, once a new function has been defined in the engine, a node can call it. To reduce complexity, players in Hack.VR use designer-embedded nodes to solve each puzzle instead of creating their own nodes. While the engine supports on-the-fly node creation, the UI does not currently support this.

Figure 3: Left: An arithmetic node that performs addition. Right: An entity node contains a miniaturized representation of a virtual world object, e.g., a door, a robot, an elevator. Programs attached to an entity node will then operate on the actual virtual world object, e.g., open the door, turn the robot, operate the elevator.
Node-based programming, like any other type of programming, can lead to execution errors. For example, a NOT node expects a boolean input (true or false) and outputs the inversion of its input. If a numerical value is instead used as input to a NOT node, this results in an error. While this is valid in some procedural languages (in C, 0 is false and anything else is true), implicit conversion from numerical values to boolean data types is not allowed in many object-oriented languages (e.g., Java, C#). When an error is detected in the program, it is indicated by a red tube. See Figure 4 for examples.
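Continuing the hypothetical sketch above, the strict-typing behavior of a NOT node can be illustrated as a check that runs before the node executes; a failed check corresponds to Hack.VR's red tube:

```python
def not_node(value):
    """Inverts a boolean input; rejects anything else."""
    # Strict typing, as in Java or C#: numbers are not implicitly
    # converted to booleans, unlike in C where 0 is false.
    if not isinstance(value, bool):
        raise TypeError(f"NOT expects bool, got {type(value).__name__}")
    return not value


print(not_node(True))  # -> False
try:
    not_node(0)        # invalid input: the tube would turn red
except TypeError as err:
    print(err)         # -> NOT expects bool, got int
```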
Figure 4: Example program errors. Top: A feedback loop.
Middle: An invalid input. Bottom: Another invalid input. Er-
rors are automatically displayed in the node after the error.
Figure 5: Left: The left gun allows the player to freely tele-
port. Right: The right gun allows the player to manipu-
late objects, modify programs, and inspect nodes. Inspecting
shows a node’s output value on the square display.
Figure 6: Left: Puzzle 1 is opening a door using a boolean.
Right: Puzzle 2 is setting the height of an elevator so you
can reach the next puzzle.
Figure 7: Puzzle 3 is four mini-puzzles where the player ma-
nipulates arithmetic nodes to create a password.
Figure 8: Puzzles 4–8 teach logical operators. Left: A puzzle
using the and operator. Right: Programming a robot to drop
a cube only on the column marked 3.
Figure 9: Left: Puzzle 9 requires programming a robot to
move based on color. Arrows indicate in which direction the
robot should move. Right: Puzzle 10 is an extension of Puz-
zle 9, and now the robot moves based on a combination of
column number and color.
Figure 10: Puzzles 11, 12, and 13 explore constructors. Left:
The robot in Puzzle 11 must be constructed with the hover
movement type to get past the lava. Right: Robots in Puzzle
13 must be constructed with correct movement types and
body types matching each lane.
Figure 11: Puzzles 14–17 explore classes and inheritance. In Puzzle 14, the player modifies a class to modify all objects that are created from that class.
Similarities can be drawn between Hack.VR and other programming paradigms inherently predisposed to visualization, e.g., flow-based programming [28]. Hack.VR is inspired by node graph systems: Unreal Engine 4's Blueprints [6], Unity Playmaker [12], and shader editors (Shader Graph [37] and Amplify Shader Editor [1]).
3.2 Art Style and Gameplay
Hack.VR is set in a sci-fi futuristic setting (see Figure 12). In Hack.VR, the player holds two "guns" that take the place of their VR controllers (see Figure 5). Hack.VR is compatible with both the HTC Vive and Oculus. Hack.VR's controls are found in Appendix A. Using these controls, Hack.VR challenges players with object-oriented programming puzzles. Hack.VR consists of 17 different puzzles, each building upon concepts from prior puzzles. See Figures 6 through 11 for short descriptions of the puzzles.
3.3 Design Process
The design process followed a spiral approach of design and evaluation. Iterations grew in complexity and refinement over several cycles for each part of the game. Feedback was solicited from designers, developers, and playtesters. Comments affirmed positive design choices (e.g., "I like that you can see the physical buttons and physical door miniaturized in the node tree") and highlighted potential improvements (e.g., "When I'm connecting things, it's hard to tell what connector I'm working with; maybe highlight the current selected connector?"). A typical early prototype can be seen in Figure 13.
In this article we described Hack.VR, a programming game in virtual reality. We created a VR programming language that is highly visual, while being highly semantic. Hack.VR is inspired by the possibilities of programming in VR. Imagine these highly evocative scenarios:

Programming an infinite stairwell taking you into the clouds.

Programming a robot carrying you across vast deserts, rolling hills, and tundras.

Reconfiguring and reprogramming the mechanical parts in your gun to enhance your capabilities.

Figure 12: Hack.VR's sci-fi futuristic setting.

Figure 13: Prototyping Puzzles 9 and 10.
Given the great potential for VR to enhance learning outcomes [11, 26], we expect that Hack.VR might help teach programming concepts more effectively than similar, non-immersive tools. Although assessment research should be conducted to confirm this expectation empirically, from a perspective that spans research, design, and play, there is reason to be excited about what the coming decade will bring for programming in VR.
References

[1] Amplify Creations. 2020. Amplify Shader Editor. (2020). https://assetstore.unity.com/packages/tools/visual-scripting/amplify-shader-editor-68570
[2] Johanna Bertram, Johannes Moskaliuk, and Ulrike Cress. 2015. Virtual training: Making reality work? Computers in Human Behavior 43 (2015), 284–292.
[3] Max V. Birk, Cheralyn Atkins, Jason T. Bowey, and Regan L. Mandryk. 2016. Fostering Intrinsic Motivation through Avatar Identification in Digital Games. CHI (2016).
[4] Jongpil Cheon and Michael M. Grant. 2012. The effects of metaphorical interface on germane cognitive load in web-based instruction. Educational Technology Research and Development 60, 3 (2012), 399–420.
[5] Chris Dede. 2009. Immersive interfaces for engagement and learning. (2009).
[6] Epic Games. 2020. Blueprints Visual Scripting. (2020). https://docs.unrealengine.
[7] Julian Frommel, Kim Fahlbusch, Julia Brich, and Michael Weber. 2017. The Effects of Context-Sensitive Tutorials in Virtual Reality Games. (2017), 367–375.
[8] Glitchspace. 2016. Glitchspace. (2016).
[9] Greenlight Insights. 2018. New Consumer Data Finds VR Headset Usage Expected To Increase In 2019, According To Greenlight Insights. (2018).
[10] Kieran Hicks, Kathrin Gerling, Graham Richardson, Tom Pike, Oliver Burman, and Patrick Dickinson. 2019. Understanding the effects of gamification and juiciness on players. IEEE Conference on Computational Intelligence and Games (CIG) (2019).
[11] Mustafa Hussein and Carl Nätterdal. 2015. The benefits of virtual reality in education – A comparison study. (2015).
[12] Hutong Games LLC. 2019. Playmaker. (2019). https://assetstore.unity.com/packages/tools/visual-scripting/playmaker-368
[13] Dominic Kao. 2019. Exploring the Effects of Growth Mindset Usernames in STEM Games. American Education Research Association (2019).
[14] Dominic Kao. 2019. Infinite Loot Box: A Platform for Simulating Video Game Loot Boxes. IEEE Transactions on Games (2019).
[15] Dominic Kao. 2019. The Effects of Anthropomorphic Avatars vs. Non-Anthropomorphic Avatars in a Jumping Game. In The Fourteenth International Conference on the Foundations of Digital Games.
[16] Dominic Kao. 2020. Exploring Help Facilities in Game-Making Software. In ACM Foundations of Digital Games.
[17] Dominic Kao. 2020. The effects of juiciness in an action RPG. Entertainment Computing 34 (2020), 100359.
[18] Dominic Kao and D. Fox Harrell. 2015. Exploring the Impact of Role Model Avatars on Game Experience in Educational Games. The ACM SIGCHI Annual Symposium on Computer-Human Interaction in Play (CHI PLAY) (2015).
[19] Dominic Kao and D. Fox Harrell. 2015. Mazzy: A STEM Learning Game. Foundations of Digital Games (2015).
[20] Dominic Kao and D. Fox Harrell. 2016. Exploring the Effects of Encouragement in Educational Games. Proceedings of the 34th Annual ACM Conference Extended Abstracts on Human Factors in Computing Systems (CHI 2016) (2016).
[21] Dominic Kao and D. Fox Harrell. 2016. Exploring the Impact of Avatar Color on Game Experience in Educational Games. Proceedings of the 34th Annual ACM Conference Extended Abstracts on Human Factors in Computing Systems (CHI 2016) (2016).
[22] Dominic Kao and D. Fox Harrell. 2017. MazeStar: A Platform for Studying Virtual Identity and Computer Science Education. In Foundations of Digital Games.
[23] Dominic Kao and D. Fox Harrell. 2017. Toward Understanding the Impact of Visual Themes and Embellishment on Performance, Engagement, and Self-Efficacy in Educational Games. The annual meeting of the American Educational Research Association (AERA) (2017).
[24] Dominic Kao and D. Fox Harrell. 2018. The Effects of Badges and Avatar Identification on Play and Making in Educational Games. In Proceedings of the SIGCHI Conference on Human Factors in Computing Systems (CHI '18).
[25] Glyn Lawson, Davide Salanitri, and Brian Waterfield. 2016. Future directions for the development of virtual reality within an automotive manufacturer. Applied Ergonomics 53 (2016), 323–330.
[26] Elinda Ai-Lim Lee and Kok Wai Wong. 2014. Learning with desktop virtual reality: Low spatial ability learners are more positively affected. Computers and Education (2014).
[27] Edward F. Melcer. 2018. Learning with the body: Understanding the Design Space of Embodied Educational Technology. Ph.D. Dissertation. New York University Tandon School of Engineering.
[28] J. Paul Morrison. 1994. Flow-based programming. In Proc. 1st International Workshop on Software Engineering for Parallel and Distributed Systems. 25–29.
[29] NeosVR. 2020. NeosVR. (2020).
[30] E. O'Rourke, Kyla Haimovitz, Christy Ballweber, Carol S. Dweck, and Zoran Popović. 2014. Brain points: a growth mindset incentive structure boosts persistence in an educational game. Proceedings of the 32nd Annual ACM Conference on Human Factors in Computing Systems (CHI '14) (2014), 3339–3348.
[31] Solomon Sunday Oyelere and Violetta Cavalli-Sforza. 2019. Imikode: A VR Game to Introduce OOP Concepts. Proceedings of the 19th Koli Calling International Conference on Computing Education Research (2019).
[32] Primitive. 2020. Primitive. (2020).
[33] Rabindra Ratan. 2013. Self-presence, explicated: Body, emotion, and identity extension into the virtual self. In Handbook of Research on Technoself: Identity in a Technological Society. IGI Global, 322–336.
[34] Rafael J. Segura, Francisco J. del Pino, Carlos J. Ogáyar, and Antonio J. Rueda. 2019. VR-OCKS: A virtual reality game for learning the basic concepts of programming. Computer Applications in Engineering Education (2019).
[35] Dong Hee Shin. 2017. The role of affordance in the experience of virtual reality learning: Technological and affective affordances in virtual reality. Telematics and Informatics (2017).
[36] Anthony Steed, Ye Pan, Fiona Zisch, and William Steptoe. 2016. The impact of a self-avatar on cognitive load in immersive virtual reality. In Proceedings of IEEE Virtual Reality.
[37] Unity Technologies. 2020. Shader Graph. (2020).
[38] Juraj Vincur, Martin Konopka, Jozef Tvarozek, Martin Hoang, and Pavol Navrat. 2017. Cubely: Virtual reality block-based programming environment. In Proceedings of the ACM Symposium on Virtual Reality Software and Technology (VRST) (2017).
Appendix A: Hack.VR Controls
Figure 14: Movement.
Figure 15: Grabbing an object. Once an object is grabbed, it can be rotated and moved.
Grabbing a node will inspect it, displaying its output on the right gun.
Figure 16: Switch between physical manipulation and program modification mode. A blue laser indicates physical manipulation, while a red laser indicates modification.
Figure 17: Program modication.
Figure 18: Modifying node attachments.
ResearchGate has not been able to resolve any citations for this publication.
Full-text available
“Juiciness” is a term that has been widely used to describe the positive feedback (both visual/audial) present in digital games. However, few empirical investigations have looked at how juiciness concretely impacts players. In this paper, we perform a study (N = 3018) in which we compare four identical versions of an action role-playing game with varying amounts of juiciness: (1) None; (2) Medium; (3) High; and (4) Extreme. We find that both None and Extreme amounts of juiciness lead to significantly decreased play time, significantly decreased player experience, significantly decreased intrinsic motivation, and significantly decreased performance relative to both Medium and High. This is, to the best of our knowledge, the largest study to date on juiciness. Our results have implications for designers, developers, and researchers.
Full-text available
Embodiment is a concept that has gained widespread usage within the Human-Computer Interaction (HCI) community in recent years. In a general sense, embodiment is the notion that cognition arises not just from the mind, but also through our bodies‘ physical and social interactions with the world around us. HCI has employed this body-centric approach to the design of technology in a variety of domains, including interaction design, robotics, music systems, and education. However, due to the broad number of academic domains that define and operationalize embodiment within HCI (e.g., cognitive science, social science, learning science, neuroscience, AI, robotics, and so forth), it has become a remarkably fuzzy term with little understood about what designs result in desired outcomes. Essentially, HCI researchers and practitioners often employ a black box of design decisions when creating their embodied systems. Notably, the inconsistent framing and application of embodiment within HCI is a substantial drawback when trying to design embodied technology to support particular use cases such as learning, where understanding the 'why' of outcomes is essential. In this dissertation, I contribute work towards opening up the black box of embodied design to develop a more precise understanding of its proper application for the development of learning technology. This was done through the creation of a taxonomical design framework that outlines key methods for incorporating embodiment into the design of educational games and simulations. In order to create the design framework, I collected over 60 exemplars of embodied learning games and simulations, followed by the application of a bottom up, open coding method to distill seven core design dimensions. I then demonstrated the design framework‘s importance for a variety of HCI use cases including 1) categorizing existing embodied educational technologies, 2) identifying problematic design spaces, and 3) identifying design gaps for the generation of novel embodied learning systems. I also further employed the design framework to develop my own embodied learning system, Bots & (Main)Frames, which teaches basic programming and computational thinking skills through the use of tangibles. In order to better understand when and how embodied tangible technology can aid learning, I built two versions of Bots & (Main)Frames that only differed in input method (non-embodied mouse vs. embodied tangible programming blocks), while keeping all other game mechanics, aesthetics, and so forth identical. I then conducted two controlled experimental studies to compare differences between the two versions of Bots & (Main)Frames. My results show that an embodied tangible design had far greater positive impact for a number of key learning factors including programming self-beliefs, situational interest, enjoyment, and overall learning/performance outcomes. The quantitative and qualitative findings from these studies make key advances toward understanding when and how embodied tangible technology can aid in learning computational thinking skills.
Full-text available
This paper presents a prototype of a virtual reality system to teach the basic concepts of programming called VR‐OCKS. The system is inspired by other visual languages such as Scratch or Kodu, and it works by proposing to the user the resolution of simple puzzles in a 3D environment. Several basic commands to a humanoid character, such as advance or turn, together with control flow structures like iteration and conditional selections, are needed to provide a solution for increasingly difficult challenges. Our aim is to attract users, usually children and teenagers, into the world of programming by taking advantage of the appeal and potential of Virtual Reality. The use of VR‐OCKS strengthened the spatial orientation and autonomy of the users, in addition to enhancing common sense, creative thinking and systematic reasoning. In our experiments, VR‐OCKS was accepted by adults and children alike and it showed great potential as an educational tool. This article is protected by copyright. All rights reserved.
Conference Paper
Full-text available
Avatar identification is a topic of increasingly intense interest. This is largely because avatar identification can promote a wide variety of outcomes: game enjoyment, intrinsic motivation, quality of made artifacts, and more. Yet we still understand very little about how different avatar types affect users. Here, we contribute one of the few highly controlled studies of this nature (N=1074). Specifically, we compare three avatar types in a jumping game: 1) Human (high anthropomorphism), 2) Block-like (low anthropomorphism), and 3) Robot (high anthropomorphism). We find that players randomly assigned to the Robot condition have significantly higher player experience. We find that both Robot and Human conditions lead to higher avatar identification. Finally, using linear hierarchical regression, we find that avatar identification significantly promotes player experience (29.8% variance) and time played (3.5% variance). Our study demonstrates the importance of considering avatar type in designing virtual systems.
Conference Paper
Full-text available
In our study (N=2189), we divided participants into 6 badge conditions: 1) Role model badges (e.g., Einstein), 2) Personal interest badges (e.g., Movies), 3) Achievement badges (e.g., "Code King"), 4) Choice, 5) Choice with badges always visible, and 6) No badges. Participants played a CS programming game, then used an editor to create their own level. Badges promoted avatar identification (personal interest, role model), player experience (achievement, role model), intrinsic motivation (achievement, role model), and self-efficacy (role model) during both the game and the editor. Independent of badges, avatar identification promoted player experience, intrinsic motivation, and self-efficacy. Additionally, avatar identification promoted greater overall time spent in both the game and the editor, and led to significantly higher overall quality of the completed game levels (as rated by 3 independent externally trained QA testers). Our study has implications for the design of badge systems and sheds new light on the effects of avatar identification on play and making.
Conference Paper
Full-text available
Virtual reality (VR) devices have become popular in recent time due to the release of several consumer grade VR devices. Currently games are considered one of the primary use cases for VR. Game mechanics in VR games frequently work differently compared to non-VR games. However, there is no prevalent way how to teach game mechanics and game interaction to players in VR games. In this work we implemented a VR wave shooter game with two variants of a tutorial. We conducted a user study (n = 39) examining the effects of a context-sensitive tutorial on players' experience compared to a traditional instruction screen tutorial. The results show that the context-sensitive tutorial elicited higher positive emotions, lower negative emotions, and higher motivation while immersion and performance were comparable. These findings highlight that tutorials should not be seen as a separate introduction to a game but part of the overall experience as they can directly influence the players' experience.
Loot boxes are garnering increased attention in both the industry and media. One focal point of the discussion is whether loot boxes should be considered a form of gambling [1]. While parallels can be drawn between loot boxes and random reward schedules, researchers have argued that the “glorification” aspect of loot boxes that have heightened player awareness (e.g., opening a box, a pack of cards, or spinning a wheel) of randomness is a relatively new trend in games [2]. However, there is currently a dearth of empirical research on loot boxes. We make two contributions in this paper: 1) Infinite Loot Box, an open-source Unity platform for experimenting with loot boxes created from scratch, and 2) a 2×2 experiment (high/low visual effects x high/low audial effects; N=1235). We find that high audial effects significantly increase the number of loot boxes opened. Neither audial nor visual effects were found to significantly impact other variables. These contributions push forward our understanding of loot boxes and their contextual factors.
Conference Paper
Block-based programming languages are successfully being used as an alternative way of teaching introductory programming concepts. The success is in part due to the low barrier of entry and the visual game-like appeal fostering experimentation and creativity. Virtual reality (VR) presents a step further to an even more immersive and engaging experience. In this demo, we showcase our project Cubely, an immersive VR programming environment in which novice programmers solve programming puzzles within a virtual world. The puzzles are similar to exercises and solutions to the exercises are assembled by the programmer within the same virtual world using the cubes representing program instructions. The whole environment is templated to a theme of the popular Minecraft video game.