Game-Based Learning with the
Leap Motion Controller
Martin Ebner, Norbert Spot
Graz University of Technology, Austria
Learning through games is a promising field for the future of education. In this research study the
authors describe the development process of an educational application, a game targeting P-12 students.
This application is meant to be used with the Leap Motion Controller, a small 3D infrared camera which,
while in use, tracks the user's hand and finger movements in 3D space. The chapter describes the
application itself and presents the outcomes of a field study, which was carried out with a small group of
students at an elementary school. The results point to great potential for using innovative input devices
in school education.
Computers have evolved rapidly and become part of people's everyday life. They have spread into
every field of industry and entertainment. People use computers almost everywhere: at work, at home, at
schools and universities. Nowadays, at least in central Europe, more than 90% of our youth (age 12-18)
own a smartphone with mobile Internet access (Ebner et al, 2013). This of course also affects the field of
education, and therefore a debate arose over whether today's students are still the people our educational
system was designed to teach (Prensky, 2001). Since then several studies have been carried out addressing
a new generation of learners called digital natives (Prensky, 2001), the net generation (Oblinger & Oblinger,
2005) or generation@ (Opaschowski, 1999). More or less all of these studies concluded that there has been a
change in the ownership of digital devices and in overall digital literacy, but that this did not affect
learning outcomes (Conole et al, 2006) (Bullen et al, 2008) (Margaryan et al, 2011) (Nagler & Ebner, 2009).
Usually when people talk about using a computer, we imagine someone sitting at a table, typing on a
keyboard, moving around with the mouse or tapping the touchpad, and staring at the computer's screen.
The way we interact with computers has not changed significantly since the 1960s, when these peripherals
were invented (Altman, 2013). This is slowly changing, though. Today, more and more innovative (input)
devices are hitting the market. On the one hand they are different, and on the other hand they aim to
change the way we interact with different kinds of computers. These peripherals, however, do not intend
to replace the traditional keyboard-mouse setup. They act as an addition, useful for diverse applications.
Some of them also represent a more natural way of human-computer interaction. With the help of such
peripherals, the computer can sense motion in front of the screen, or touch, instead of getting input from
the user via keyboard and mouse. This more natural way of human-computer interaction enables a
so-called Natural User Interface (Altman, 2013).
We conducted thorough research on currently available innovative input devices, aiming to find
different approaches for educational games. One of the devices found is the MaKey MaKey1, seen in Figure 1.
The device's name is a play on the words "make" and "key". Essentially it is a printed circuit board with a
microcontroller running Arduino Leonardo firmware. It uses the Human Interface Device (HID) protocol
to communicate with the computer, and it can send key presses, mouse clicks, and mouse movements. For
sensing closed switches on the digital input pins, its engineers use high-resistance switching, so that a
switch can be closed even through materials like human skin, leaves, and play-doh. That means
we can create buttons from play-doh, or even just draw a joystick with a pencil, and use our "do it
yourself" controllers to play a game. Or we can load up piano software and hook the MaKey MaKey up
to bananas instead of keyboard keys, so that the bananas become the piano keys. On the front of the board
there are six inputs, which can be attached to via alligator clips, soldering to the pads, or any other
method. There are another 12 inputs on the back, 6 for keyboard keys and 6 for mouse motion, which are
accessible with jumpers via the female headers, paper clips, or alligator clips attached creatively around the
headers. By reprogramming the board in the Arduino environment, a different set of keys can be used or
the behavior of the device can be changed. The biggest drawback of this device lies in its design, which
requires closed switches to trigger an action. This means the player has to be "connected" to the device
with a wire so that it can sense the closed switch. While the device has some potential and is interesting,
it did not meet our requirements regarding collaboration and free, real movements.
Another device on the list was the Touch Board from Bare Conductive.2 It is a device similar to
the MaKey MaKey, capable of turning almost any material or surface into a sensor. It is designed as an
easy-to-use platform for a huge range of projects, whether painting a light switch on the wall, making
a paper piano or creating a custom interactive surface. It works together with Bare Conductive's Electric
Paint, which is essentially a non-toxic, air-drying, water-soluble conductive paint. It works well on many
materials, including paper, plastic, textiles and conventional electronics. With the Electric Paint, we can
draw our own circuit, our own light switch or keypads, and use those as sensors. This device also utilizes
the real world and is a great tool for tinkering, but we decided on a virtual solution. All these
devices are well known in the Maker Movement, which follows the idea of "learning by making"
something (Schön et al, 2014). Papert described it as "learning by doing", "technology as building
material", "big idea is hard fun", and "learning to learn" (Papert, 1980).
The device chosen for the purpose of this study is the Leap Motion Controller. The Leap Motion is
a consumer-grade sensor developed by a company of the same name. This sensor is somewhat like the
Microsoft Kinect3 sensor, which is more widely known and popular, but as Weichert et al. (2013) point
out, it is much more precise. It was designed to sense natural hand movements with high precision,
instead of tracking the movements of the whole body. The device lets people use the computer in a
completely new way: to point, wave, reach, grab, to pick something up and move it in the virtual world.
However, it does not replace the keyboard, mouse, stylus, or trackpad. It works with them, and without
further special adapters. The Leap Motion is typically used as seen in Figure 2, lying on the table in front
of the computer's monitor. According to the company, with the Leap Motion software running, we just
need to plug it into the USB port of a computer. But to make use of it, applications specifically developed
for the device have to be used. As of September 2013 there were already 95 applications available through
Airspace, the Leap Motion's application store, which has since been renamed the App Store (Potter et al.,
2013). These apps belong to the broad fields of games and educational and scientific apps, as well as apps
for music and art.
1 http://www.makeymakey.com (last access December 2014)
2 http://www.bareconductive.com/shop/touch-board/ (last access December 2014)
3 http://www.microsoft.com/en-us/kinectforwindows/ (last access December 2014)
The Leap Motion Controller is capable of sensing almost every little movement made with our hands
and fingers, or even with a tool (a pen etc.) in our hand. More precisely, it observes 8 cubic feet
(ca. 0.23 m³) of interactive, three-dimensional space. This space is called the interaction box and is
illustrated in Figure 2. The device tracks all 10 fingers to within 1/100th of a millimeter, and movements
at a rate of over 200 frames per second. The company states that it is dramatically more sensitive than
existing motion control technology. It has a super-wide 150° field of view and a Z-axis for depth. The
effective range of the Leap Motion Controller extends from approximately 25 to 600 millimeters above
the device. This range is limited by how the IR light travels through space: beyond a certain distance, it
becomes much harder to detect the hand's position in 3D, and the maximum current provided by the USB
connection limits the intensity of the LEDs. With the applications written for the controller, we can reach
out, grab objects and move them around in 3D.4
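The tracking volume described above can be captured in a short sketch. The following Python snippet is our own illustration, not part of the Leap Motion SDK; it merely checks whether a point lies within the effective volume implied by the figures quoted in the text (roughly 25 to 600 millimeters above the device, inside a 150° field of view).

```python
# Hypothetical sketch: is a tracked point inside the Leap Motion's
# effective volume? Constants are taken from the specs quoted above;
# function and constant names are our own, not from the Leap SDK.
import math

MIN_HEIGHT_MM = 25.0        # lower bound of the effective range
MAX_HEIGHT_MM = 600.0       # upper bound of the effective range
FIELD_OF_VIEW_DEG = 150.0   # total field of view

def in_effective_range(x_mm, y_mm, z_mm):
    """Point in device-centred coordinates, y pointing up from the sensor."""
    if not (MIN_HEIGHT_MM <= y_mm <= MAX_HEIGHT_MM):
        return False
    # Horizontal distance from the device's vertical axis.
    radial = math.hypot(x_mm, z_mm)
    # A 150 degree total field of view allows up to 75 degrees off-axis.
    max_radial = y_mm * math.tan(math.radians(FIELD_OF_VIEW_DEG / 2))
    return radial <= max_radial

print(in_effective_range(0, 300, 0))   # directly above, mid-range
print(in_effective_range(0, 10, 0))    # too close to the device
```

The widening cone explains why balloons near the edge of the play area are harder to reach: the usable horizontal area grows with height, but so does the chance of the hand leaving the cone.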
“The Leap Motion Controller introduces a new gesture and position tracking system with sub-
millimeter accuracy. In contrast to standard multi-touch solutions, this above-surface sensor is discussed
for use in realistic stereo 3D interaction systems, especially concerning direct selection of stereoscopically
displayed objects.” (Weichert et al., 2013) The device is intended to be used with a minimal setup. It is
designed for hand gesture and finger movement detection in interactive software applications. The sensor
works by projecting infrared (IR) light upwards from the device and detecting reflections using
monochromatic infrared cameras. Essentially it is an infrared stereoscopic 3D camera.
In this research study a prototype is described, which has been developed for use with the Leap
Motion with a special focus on the P-12 education field. Finally, the outcomes of a small-scale field study
at an elementary school are presented.
The objective of this research study is to explore how game-based learning (GBL), in cooperation
with innovative controller devices, might make the learning process more enjoyable (Shneiderman, 1998)
and more motivating (Logan & Gordon, 1981) (Holzinger, 1997) in P-12 education. GBL is close to
problem-based learning, wherein specific problem scenarios are placed in a play framework
(Barrows & Tamblyn, 1980).
For our purpose, a prototype of an educational application was developed, strictly following the
prototyping approach (Alavi, 1984). Afterwards the application was evaluated in a small-scale field study
with third grade students at an elementary school. Finally, the students' feedback was recorded and
evaluated.
The prototype built for the purpose of this research study is, as mentioned before, a game. The
requirements were to develop a game that should be easy to use, but whose gameplay should be
challenging and arouse curiosity, as well as evoke the players' fantasy (Malone, 1980), in order to achieve
didactic effects. In other words, children should be able to learn by simply playing the game
(Hannak et al, 2012). Furthermore, the game should offer an appealing, colorful user interface, and it
should be fun to play so that kids do not get bored easily.
When designing for a motion-controlled experience, one of the first things developers have to
think about is the environment in which the application is going to be used. People are quite adept at
using their hands, but the movements and gestures are sometimes subtle, sometimes not that precise, and
we cannot hold our hands perfectly steady. And when it comes to holding up the arms for extended
periods of time, it becomes quite exhausting.
4 https://www.leapmotion.com/product (last access December 2014)
Furthermore, depending on the application, different ergonomic factors have to be considered. In our
case the environment is a classroom, and kids use the application while standing or sitting at a desk. The
interactions were designed with small rest periods so that the hands and arms do not tire so fast.
We decided to create a game which helps children practice addition and subtraction between 1
and 100, and which makes getting to know and practicing the small multiplication table more interesting.
The current version of the prototype contains ten levels of varying difficulty, with exercises for practicing
the multiplication table.
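As an illustration of the exercise types just described, the following Python sketch generates addition and subtraction exercises with results between 1 and 100, plus small multiplication table drills. It is a hypothetical reconstruction for this chapter, not the prototype's actual C#/Unity code; all names are invented.

```python
# Illustrative exercise generator (not the authors' actual code):
# addition and subtraction kept within 1..100, and drills from the
# small multiplication table (factors 1..10).
import random

def make_exercise(kind, rng=random):
    if kind == "add":
        a = rng.randint(1, 99)
        b = rng.randint(1, 100 - a)      # keep the sum within 100
        return f"{a} + {b}", a + b
    if kind == "sub":
        a = rng.randint(2, 100)
        b = rng.randint(1, a - 1)        # keep the difference positive
        return f"{a} - {b}", a - b
    if kind == "mul":
        a, b = rng.randint(1, 10), rng.randint(1, 10)
        return f"{a} x {b}", a * b
    raise ValueError(f"unknown exercise kind: {kind}")

text, answer = make_exercise("mul")
print(text, "=", answer)
```

In the game, the exercise text would be shown at the bottom of the screen and the answer printed on exactly one of the floating balloons.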
The game is set in a 3D cartoon-like virtual world, as shown in Figure 3. In this world, colorful
balloons with numbers on them float in the air. The player gets a mathematical exercise to solve, which is
shown at the bottom of the screen. Solving means that the player works out the exercise and then has to
destroy the balloon carrying the result. When the player puts his or her hand above the sensor, a virtual
robotic hand appears on the screen, as seen in Figure 4, which replicates the movements of the player's
hand. Punching a balloon to the ground with the virtual robotic hand pops and destroys it. To make the
game more challenging, the player is put under time pressure. Sometimes the player even has to navigate
through the world and search for the right balloon; the balloons are spread around and might be hidden
behind trees or bushes. If the player pops the right balloon, he or she can continue to the next exercise,
which is presented in the same way. After solving all the exercises of a given level there is a little pause
for a short rest, and then the next level is presented. If the player does not manage to solve the exercises
within the given time, or pops a balloon that does not carry the result of the exercise, he or she loses one
of the lives. At the start the player owns three hearts, which represent the lives. If the player manages to
get through all the levels, or loses all the lives, the game ends and the score is presented.
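The game flow described above (three lives, a life lost on a wrong balloon or a timeout, the game ending when all levels are cleared or all lives are gone) can be summarized in a small state sketch. This is our own minimal Python illustration, not the prototype's actual Unity/C# implementation; class and method names are invented for this example.

```python
# Minimal sketch of the described game logic. Levels are lists of
# (exercise text, correct answer) pairs; popping a balloon sends its
# number to pop(), a timeout counts like a wrong balloon.
class BalloonGame:
    def __init__(self, levels):
        self.levels = levels      # list of lists of (exercise, answer)
        self.lives = 3            # three hearts at the start
        self.score = 0
        self.level = 0
        self.index = 0            # current exercise within the level

    @property
    def over(self):
        return self.lives == 0 or self.level >= len(self.levels)

    def pop(self, value, timed_out=False):
        """Player popped a balloon carrying `value`, or ran out of time."""
        if self.over:
            return
        _, answer = self.levels[self.level][self.index]
        if timed_out or value != answer:
            self.lives -= 1       # wrong balloon or timeout costs a life
            return
        self.score += 1
        self.index += 1
        if self.index >= len(self.levels[self.level]):
            self.level += 1       # short rest, then the next level
            self.index = 0

game = BalloonGame([[("3 + 4", 7), ("9 - 5", 4)]])
game.pop(7)   # right balloon: score increases
game.pop(2)   # wrong balloon: one life lost
```

The score shown at the end is exactly what motivated the children to replay, as the field study below describes.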
The power of the Leap Motion is unleashed when it is used in a 3D environment. To create our 3D world
we used Unity5, a 3D game development ecosystem: a powerful rendering engine fully integrated with a
complete set of intuitive tools and rapid workflows for creating interactive 3D and 2D content.
To test our prototype for game-based learning with the Leap Motion in a real-world scenario, it
was given to 12 third grade children at an elementary school. The kids played with the application one by
one. We were curious how the Leap Motion and the application would perform in a school class
environment in general, as well as what the kids' feedback would be; the prototype was built for them,
after all. The multi-level evaluation consists of observing the children, with photo and audio
documentation in parallel during their use of the application. Furthermore, we asked for feedback after the
experimental session.
First of all, we have to mention that there were a few performance issues during the test. The
application sometimes behaved unstably or became unplayable, and the sensor did not work well. This
might have been due to the strong lights in the classroom, the fact that the application was run inside
Unity each time rather than exported as a standalone app, or the fact that the SDK was still in beta at the
time of the trial. In the majority of cases, however, it worked really well. The sensor also had no problem
tracking the kids' small hands.
5 http://unity3d.com (last access December 2014)
Before the kids started to play, they were given instructions on how to play. This is because the
prototype does not yet include any graphics or other help files which would show how to play. All the
kids reported that they had never seen an application like this before, so a short introduction was
necessary. After they had seen the game in action, they were eagerly waiting for their turn.
The kids really liked the interaction with the game menu. Some said they felt like magicians
while starting the game. After the game had started, we realized that the choice of the robotic hand was a
good one, as it did not scare the kids. The majority of children quickly learned how to move around the
world with the "virtual joystick" and searched for the balloon with the right number on it. The most
interesting part of the trial, however, was to observe how differently the kids played the game. Although
our test group was small, after the trial we could categorize the kids based on their performance in the
game.
There were basically four groups based on our observation:
• The first group was basically made up of kids who perform well in class. They were also
good at solving the exercises in the game, had good hand-eye coordination skills, and had
no problems with moving around in the virtual world, finding balloons and popping
them. They really enjoyed the game and wanted to play it again.
• In another group there were children who were still good at solving the exercises, but
who had problems with coordination and finding the right balloons. Some of them had so
much trouble playing the game that the experience was so frustrating they did not really
want to play it again. For those kids from this group who found the game a bit hard to
play, but had fun playing it, practicing with similar games, as seen in the current store,
might have a positive influence on hand-eye coordination skills. They performed a bit
better when they played the game a second time.
• Those students who were somewhat slower at solving the exercises made up the third
group. They had great coordination skills and no problems with the game interaction.
Playing was fun for them, and they wanted to play the game over and over again to
achieve a better score. This game element was their biggest motivation. After each play,
the kids talked to each other and showed off their scores. Those with higher scores were
satisfied, while those who had earned a lower score wanted to try again to improve it.
This form of virtual reward seems to work really well: it is motivating enough to make
kids want to play a math game.
• In the last group were the kids who did not perform that well. They had issues with
solving the exercises. For them, solving math exercises and searching for balloons was
not that much fun. They quickly realized that the sensor can track two or more hands,
that it is fun to play with the balloons using the virtual hands, and how cool it is when the
virtual hands do the same as their own hands when they clap, and so on. They got bored
really quickly.
The trial showed that the majority of kids loved the game. We asked them whether they wanted to
play it again, and the response was positive: almost all of them wanted to play the game again. They
asked what the name of the game was, and how and where they could play it. They were amazed by the
sensor, by how small it is and what it is capable of. The kids enjoyed playing the game and quickly forgot
about math; all they wanted to do was find the right balloon, which obviously was possible only after
solving the given exercise, and they wanted to perform better to earn a higher score. This means they
were practicing math while playing a game, and not thinking about school or exercises. We asked them
how the game compares to the games they play day by day on their iPads. The answer was common: the
iPad is nothing special anymore, and "this is much-much better" or "the best game I have ever played in
my life". Yes, kids tend to overrate things and change their minds quickly over time. But they grew up in
a world with smartphones and tablets everywhere, so they are used to them. They use iPads at school,
where they have a couple of different apps for learning. They said that using a touch screen is not a big
deal: "there you have to tap and touch it all the time, here you only swipe with your hands in the air and
you play the game".
After the students had played the game, we interviewed the children with the help of the so-called
cut-off technique (Ebner et al, 2014). Groups of up to 4 children got 5 statements and different smileys
(from sad to happy) and were asked to rank the statements as a group. Fischer (2007) mentioned that this
ranking technique is highly interesting because it makes children discuss facts, brings them to reflect on
the circumstances, and finally lets them agree on one final decision. These are the statements which were
given to them:
1. I would like to play the game again.
2. It was easy to pop the balloons.
3. It was easy to find the balloons.
4. I think the exercises were easy.
5. I think the game was easy to play.
The majority of students gave the best mark for all the statements. Some, who had a little trouble
popping the balloons, gave somewhat lower marks. Those kids, however, who had problems with the
navigation and the gameplay gave the lowest mark for all the statements. This clearly shows that for them
it was quite hard to play the game, as it was frustrating not to be able to find and pop the balloons. Table 1
describes how the kids ranked the statements and what their decisions might mean.
Table 1.
"I would like to play the game again": The best overall mark. The kids enjoyed playing the game, and
they would love to play it again.
"It was easy to pop the balloons": The majority of students gave the best mark; however, some of them,
who had problems popping the balloons, gave lower marks. This might indicate that another, easier
balloon-popping technique would be more appropriate.
"It was easy to find the balloons": The majority of students gave the best mark, thus it seems that the
navigation design works well.
"I think the exercises were easy": The majority of students gave the best mark. This might mean that the
exercises were too easy and more challenging exercises would probably be appropriate.
"I think the game was easy to play": The majority of students gave the best mark. This indicates that the
game prototype is intuitive.
The work on the prototype gave us a chance to learn to develop with Unity and the programming
language it uses, C#. The learning curve of Unity was steep. It provided all the tools and features needed
in the development process of the prototype. The prototype has not yet been brought to perfection and
still has some reserves. No sounds have been added to the game yet, so an essential part of possible future
work would be to add sounds to the game menu as well as to the game itself, for example environment
sounds or music, or sound effects when popping the balloons. The addition of different physical forces,
such as wind, would also make the gameplay more interesting: the balloons might float in the wind and
the player would need to catch and pop them.
As the Leap Motion is able to track two or even more hands at the same time, a multiplayer mode
would also be a good addition, which would make the gameplay even more fun. The evaluation thus
clearly pointed out that there are some useful additions that would make the prototype a good game.
The findings also show that such games might not be ideal for all types of students, particularly
students with special educational needs or those whose hand-eye coordination skills are not that well
developed. Those kids may have problems playing such games. Similar to other studies (Zechner &
Ebner, 2011), it must be mentioned that the game did not improve the learning behavior of those students
who perform less well in the educational field the game was designed for. Nevertheless, the majority of
children tend to get used to things quickly. They can successfully use widespread technology in their
learning process, and our prototype has shown that bringing in fresh air, something new to them, can
boost their motivation.
The goal of this work was to make use of a new, innovative controller device and to develop a
prototype of a learning application for it, which would show a possible way to improve game-based
learning at elementary schools.
After the field study at the elementary school, we would like to point out that game-based
learning might not be for everyone, just as Whitton (2007) states. Especially students needing special
education had trouble succeeding in the game. For other students, however, such games utilizing
innovative devices that create a Natural User Interface might, apart from letting students practice the
school subject, help to improve their hand-eye coordination skills. For well-performing students with
great coordination skills, such games can be a fun and novel way to practice the school subject.
To declare that game-based learning with innovative input devices such as the Leap Motion can
significantly improve the learning process of some school subjects, provided the applications are done
right, would require a longer and more rigorous scientific research study, which should be done as the
next step. However, as we have seen in the classroom, such games can be motivational, engaging,
challenging and fun enough to make students forget about the subject while still practicing it.
Alavi, M. (1984). An assessment of the prototyping approach to information systems development.
Commun. ACM 27, 6 (June 1984), 556-563.
Altman, P. (2013). Using MS Kinect Device for Natural User Interface, Pilsen
Barrows, H. S., & Tamblyn, R. M. (1980). Problem-based learning: an approach to medical education
(Springer series on medical education). New York: Springer.
Bullen, M., Morgan, T., Belfer, K. & Qayyum, A. (2008). The digital learner at BCIT and implications for
an e-strategy. EDEN, Paris, France.
Conole, G.; de Laat, M., Dillon, T. & Darby, J. (2006). LXP: Student experiences of technologies. Final
Report: JISC UK, http://www.jisc.ac.uk/whatwedo/programmes/elearningpedagogy/learneroutcomes (last
visited: December 2014)
Ebner, M., Nagler, W. & Schön, M. (2013). “Architecture Students Hate Twitter and Love Dropbox” or
Does the Field of Study Correlate with Web 2.0 Behavior? In: World Conference on Educational
Multimedia, Hypermedia and Telecommunications 2013, pp. 43-53. Chesapeake, VA: AACE.
Ebner, M., Schönhart, J., Schön, S. (2015). Experiences with iPads in Primary School. RESPUESTA
EVALUACIÓN. Revista PROFESORADO. accepted, in print
Fischer, J. (2007). Detektivische Methode - Legetechnik.
http://wissensreise.de/Intranet/Aufgabenkultur/forschen/Seiten/dMundLegetechnik.html (last access
December 2014)
Hannak, C., Pilz, M. & Ebner, M. (2012). Fun - A Prerequisite for Learning Games. In Proceedings of
World Conference on Educational Multimedia, Hypermedia and Telecommunications 2012, pp. 1292-
1299, Chesapeake, VA: AACE
Holzinger, A. (1997). Computer-aided mathematics instruction with mathematica 3.0. Mathematica in
Education and Research, 6(4), 37–40.
Logan, F. A.; Gordon, W. C. (1981). Fundamentals of learning and motivation (3rd ed.). Dubuque, IA:
Malone, T. W. (1980). What makes things fun to learn? Heuristics for designing instructional computer
games. In Proceedings of: 3rd ACM SIGSMALL symposium and the First SIGPC symposium on small
systems, pp. 162–169.
Margaryan, A., Littlejohn, A. & Vojt, G. (2011) Are digital natives a myth or reality? University students’
use of digital technologies, Computers & Education, Volume 56, Issue 2, pp. 429-440
Metz R. (2013) Leap Motion’s Struggles Reveal Problems with 3-D Interfaces. interfaces/ (last access
December 2014)
Nagler, W. & Ebner, M. (2009). Is Your University Ready For the Ne(x)t-Generation? In: World
Conference on Educational Multimedia, Hypermedia and Telecommunications 2009, pp. 4344-4351.
Chesapeake, VA: AACE.
Oblinger, D. D. & Oblinger, J. L. (Eds.). (2005). Educating the Net Generation. Available at:
http://www.educause.edu/educatingthenetgen (last access December 2014)
Opaschowski, H. W. (1999). Generation @, Die Medienrevolution entläßt ihre Kinder: Leben im
Informationszeitalter. Hamburg/Ostfildern: Kurt Mair Verlag
Papert, S. (1980). Mindstorms: Children, Computers, And Powerful Ideas. New York: Basic Books.
Potter L. E., Araullo J., Carter L. (2013) The Leap Motion controller: A view on sign language. Adelaide,
Australia.
Prensky, M. (2001), Digital Natives, Digital Immigrants. On the Horizon, 9 (5), pp. 1-6.
Shneiderman, B. (1998). Relate-create-donate: a teaching/learning philosophy for the cyber-generation.
Computers and Education, 31(1), 25–39.
Schön, S., Ebner, M., Kumar, S. (2014). The Maker Movement. Implications of new digital gadgets,
fabrication tools and spaces for creative learning and teaching. In: eLearning Papers, Special edition 2014
“Transforming Education through Innovation and Technology”, September 2014, pp.
Weichert F., Bachmann D., Rudak B., Fisseler D. (2013) Analysis of the Accuracy and Robustness of the
Leap Motion Controller. Department of Computer Science VII, Technical University Dortmund,
Germany.
Whitton, N. (2007) Motivation and computer game based learning. In ICT: Providing choices for learners
and learning. Proceedings ascilite Singapore 2007.
http://www.ascilite.org.au/conferences/singapore07/procs/whitton.pdf (last access December 2014)
Zechner, J.; Ebner, M. (2011). Playing a Game in Civil Engineering. In: 14th International Conference
on Interactive Collaborative Learning (ICL2011) / 11th International Conference Virtual University
(vu'11), pp. 417-422.