Visualizing Educational Game Data: A Case Study of Visualizations to Support Teachers
Pedro A. Martínez1*, Manuel J. Gómez1*, Jose A. Ruipérez-Valiente1,2, Gregorio Martínez Pérez1, Yoon Jeon Kim2
1 University of Murcia, Faculty of Computer Science, Murcia (Spain)
2 Massachusetts Institute of Technology, Playful Journey Lab, Cambridge (USA)
* Both first and second authors contributed equally to this work.
Copyright © 2020 for this paper by its authors. Use permitted under Creative Commons License Attribution 4.0 International (CC BY 4.0).
Abstract. Video games have become one of the most popular mediums across cultures and ages. There is ample evidence that supports the benefits of using games for learning and assessment, and educators are largely supportive of using games in classrooms. However, the implementation of educational games as part of the curriculum and classroom practices has been rather scarce. One of the main barriers is that teachers struggle to know how their students are actually using the game, which prevents them from properly analyzing the effect of the activity and the interaction of students. Therefore, to support teachers in fully leveraging the potential benefits of games in classrooms and making data-based decisions, educational games should incorporate learning analytics by transforming the click-stream data generated from gameplay into meaningful metrics and presenting visualizations of those metrics, so that teachers can receive the information in an interactive and friendly way. For this work, we use data collected in a case study where teachers used the Shadowspect geometry puzzle game in their classrooms. We apply learning analytics techniques to generate a series of metrics and visualizations that seek to help teachers understand the interaction of students with the game. In this way, teachers can be more aware of the global progress of the class and of each of their students at an individual level, and intervene and adapt their classes when necessary.
Keywords: Educational games · Learning analytics · Game-based assessment · Technology-enhanced learning.
1 Introduction
Playing games is one of the most popular activities for people of all ages around the world, and its value as a learning opportunity has been widely accepted. In the U.S., for example, nearly three-quarters (74%) of parents believe video games can serve an educational purpose for their children [7]. This has prompted a rapidly
increasing interest in using games in educational settings, not simply because
“it is what kids are paying attention to,” but because well-designed games are
very closely aligned with the design of good educational experiences [16, 8]. That
is, well-designed games pose cognitively complex and challenging problems that
deeply engage learners, thus helping them to learn more [20].
The benefits of games to support student learning have been well documented over the past 10 years. In a recent meta-analysis, Clark and colleagues [6] report that, compared to nongame conditions, games had a moderate to strong effect on improving overall learning outcomes, including cognitive and interpersonal skills. Another review [4] similarly reports that games are beneficial for various learning outcomes such as knowledge acquisition, affect, behavior change, perception and cognition, as well as so-called 21st Century Skills [21].
While ample evidence shows that games, in general, have great potential to support learning, games can be successfully used to support learning in classrooms only when combined with a thoughtful curriculum [26]. Successful and meaningful integration of game-based learning in classrooms largely depends on teachers' practices and classroom contexts [15]. Several studies have pointed out the difficulties that teachers face when implementing games in the classroom, including the rigidity of the curriculum, the perceived negative effects of gaming, unprepared students, the lack of supportive materials, fixed class schedules and limited budgets. Time is another limited resource: class time spent on learning complex controls or watching long tutorials, and teachers' planning time spent familiarizing themselves with the educational components of the game, can be hard to afford [14]. Moreover, teachers often do not have the proper tools to understand how students interact with the game environment and in what areas students might need support. There are several options here, but one common approach has been to provide learning analytics dashboards that represent the low-level interactions in simple visualizations. This can bring opportunities for awareness, reflection, sense-making and, above all, the potential to improve learning, that is, to get better at getting better [28].
For this educational purpose, we use Shadowspect, an interactive computer game where students create geometric primitives such as cones or spheres to solve 3D puzzles, developing their geometric, dimensional and spatial reasoning skills. Shadowspect features 30 levels of gradually increasing difficulty where students need to move and rotate shapes in order to find solutions to modeling problems. Students' interaction with the game is collected in order to compute different metrics such as active time, the number of puzzles completed, or the total number of events performed while playing. These metrics can be presented graphically so that teachers can observe each student's progress in the game and find problems that the whole classroom could be having with a specific puzzle. More specifically, for this research we have the following objectives:
1. To present a proposal of metrics that can help teachers to understand the
interaction of students with Shadowspect.
2. To present a case study with two use cases from data collected in K12
schools across the US using Shadowspect:
(a) A first use case using these metrics to understand the global progress in
an entire classroom.
(b) A second use case using these metrics to understand students’ progress
in a classroom at an individual level.
The rest of the paper is organized as follows. Section 2 reviews background
literature on student engagement and learning analytics. Section 3 describes the
methods, including Shadowspect, as well as the educational context and the
data collection. Section 4 presents our proposal of metrics, and in Section 5 we
introduce and describe both use cases using the previous metrics and visualiza-
tions. Then we finalize the paper with discussion, conclusions and future work
in Section 6.
2 Related Work
With current data collection technologies, we can collect large datasets from
students’ interaction with educational games that need to be treated in order to
be understood [1]. Learning Analytics (LA) is a field of research and practice that
aims to collect and analyze data generated by a learner in a given environment,
which in this case can be applied to educational game data. This data analysis
is not only useful for the evaluation of the students [13], but it can also be used
for future improvements in the design of educational games, to personalize the
difficulty of the scenarios according to the student’s abilities [2] or as in our case
study, to identify strange behaviors or difficulties of the student when facing a
task. Finally, one of the main advantages of data analysis is to increase student
engagement and improve learning, as engagement and learning are linked [5, 9].
This personalized adaptation of scenarios and difficulty per student goes one step
further with multimodal learning analytics, which aims to collect data external
to the game such as the student’s heart rate [18], and can be used for multiple
purposes, such as to adjust the game difficulty based on the identified problems
and levels of concentration [27].
Researchers often implement metrics to make sense of the collected raw data through a feature engineering process [17, 19]; these metrics are higher-level information measures extracted from the data according to the specificities of each metric's purpose. In our case, the algorithms seek to obtain metrics related to the levels of activity, difficulty and other patterns, as previous works have attempted [11], so that we can predict behavior or assess the student based on what the instructor can learn from those metrics. Each environment might have specific metrics; however, there are some that are often implemented, such as the number of events or the time spent within the educational game.
In some previous studies, the main goal has been to measure the engagement of each student. The authors in [22] differentiated four dimensions (general activity, social, exploration and quests) and found four profiles of engagement: fully engaged, lone achiever, social explorer and non-engaged. In this study we implement similar metrics: for the general activity dimension, we implement a series of levels of activity, and for the exploration dimension we analyze the funnel in the game puzzles. Another example of the importance of metrics for evaluation is described in [12], which deals with the application of game-based learning to mathematics content. Its aim is to study improvements with previous training through play and to see if those metrics can be indicators of success. The study was carried out with students of about 10 years old who completed mathematical tasks about rational numbers using the game Semideus School. The behavior and performance of the students was recorded with different metrics, similar to our case but adapted to their type of game; the authors proposed parameters such as time spent, the maximum level reached, the number of games played or general performance. In conclusion, the use of metrics in game-based learning as part of the evaluation process stands out.
Despite the proven benefits of educational games for learning [21], their implementation in schools remains rather limited. Problems such as the lack of both computational and additional human resources, teachers' rejection of new teaching methods, and the fact that some teachers still believe that implementing educational games is a complex process beyond their reach make it very difficult to systematically expand the implementation of games in educational settings. That is why providing guidelines and facilitating a simplified deployment of these games is so important, so that their implementation can greatly benefit teachers and students [3]. One of the reported methods to facilitate the adoption of educational games in the classroom is to provide visualization dashboards that represent the interactions of the student with the environment through easy, interactive and intuitive visualizations. Previous studies have made this proposal in other types of learning environments, such as massive open online courses [23] or intelligent tutoring systems [10]. In our paper, we propose a set of visualizations of the data collected in the educational game Shadowspect [25], as a tool for teachers to detect problems within a class or with a particular student, as proposed by previous work [24]. With a visualization dashboard system in place, the teacher can monitor what students are doing with the game during the class period and intervene during the development of the activity when appropriate, or even use these metrics as part of formative assessment.
3 Methods
3.1 Shadowspect Tool
In this case study we use Shadowspect, a game-based assessment tool that aims
to provide metrics related to geometry content and other behavioral and cog-
nitive constructs. Shadowspect has been designed explicitly as a formative as-
sessment tool to measure math content standards (e.g. visualize relationships
between 2D and 3D objects), thus teachers can use it in their core math cur-
riculum.
When students begin a puzzle, they receive a set of silhouettes from different
views that represent the figure they need to create, which will be composed
of other primitive shapes the student can put into the scenario. The primitive
shapes that students can create are cubes, pyramids, ramps, cylinders, cones
and spheres. Depending on the level and difficulty, the puzzle may restrict the
quantity or type of shapes they can create. After putting these shapes in the
scenario, they can also scale, move and rotate the shapes in order to build a figure
that solves the puzzle. Students can move the camera to see the figure they are
building from different perspectives and then use the ‘Snapshot’ functionality to
generate the silhouette and see how close they are to the objective. Finally they
can submit the puzzle and the game will evaluate the solution and provide them
with feedback.
Fig. 1: Two puzzle examples in Shadowspect
In the version of Shadowspect used in this case study, there are 9 tutorial levels, 9 intermediate levels and 12 advanced levels. The tutorial levels aim to teach the basic functionality of the game, so students can learn how to build different primitives, scale and rotate them, change the perspective, take snapshots and so on. The intermediate levels allow students more freedom, so they do not receive as much help to solve the puzzles, and the advanced levels are intended to be a real challenge for students who have gained experience with the previous levels.
3.2 Educational Context and Data Collection
The data used (N = 300) for this paper was collected as part of the development of the assessment machinery that will later be implemented in Shadowspect. To ensure sufficient data points to create robust assessment models, the team recruited 7 teachers who would use Shadowspect for at least two hours in their 7th grade and 10th grade math and geometry classes. In this paper, we focus on a single class with 31 students to represent a realistic scenario of how a teacher could use these visualizations to monitor the progress of their students in the classroom. All student interactions with the game were collected and stored in a MySQL database; we do not collect any identifiable or personal data from the users except for a nickname. The data collection of the selected class includes around 54,829 events (an average of 1,768 events per user); students were active in the game environment for 33 hours (an average of 65 active minutes per student), and students solved a total of 448 puzzles (an average of 14 puzzles per student).
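As a simple illustration of how such aggregate statistics can be derived, the following Python sketch computes the average number of events and completed puzzles per student. This is a minimal sketch rather than our actual analysis code: the file name, the column names (nickname, type, timestamp) and the puzzle_complete event type are assumptions made for illustration.

    import pandas as pd

    # Hypothetical export of the Shadowspect event log: one row per event,
    # with the student's nickname, the event type and a timestamp.
    events = pd.read_csv("shadowspect_events.csv", parse_dates=["timestamp"])

    n_students = events["nickname"].nunique()
    print("Average events per student:", len(events) / n_students)

    # Completed puzzles, assuming a dedicated event type is logged whenever
    # a submitted solution is evaluated as correct.
    completed = events[events["type"] == "puzzle_complete"]
    print("Average puzzles completed per student:", len(completed) / n_students)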
4 Metrics Proposal
For the analysis of the data within our game we have defined four different metrics: levels of activity, levels of difficulty, puzzle funnel, and sequence between puzzles. With these metrics we obtain a detailed analysis of the students' interaction with the game, so the teacher can receive detailed information that can be used to evaluate and analyze students' interactions. With this analysis we cover the most important aspects needed to monitor and analyze students' activity in a simple way. In addition, we want to explore the affordances of combining several metrics together to augment their significance. Next, we explain the implementation details of each metric; a code sketch illustrating how these metrics can be computed follows the list.
– Levels of activity: This metric implements a set of parameters that describe the levels of activity of the user with Shadowspect. These are straightforward metrics to compute based on a feature engineering process, such as the active time, the number of events, and the number of each different type of event (snapshots, rotations, movements, scaling, shape creations and deletions, among several others). For this case study we focus only on the first two, because these are the most important to look at when analyzing students' interaction with the game; however, all of them are available to the teacher.
  • active_time: Amount of active time in minutes, establishing an inactivity threshold of 60 seconds (i.e. if the time between two events is above 60 seconds, the user is considered to be inactive during that time and this slot is omitted from the computation).
  • n_events: Total number of events triggered within the game (every action performed by a student in Shadowspect is recorded as an event).
– Levels of difficulty: This metric provides a set of parameters that are related to the difficulty of the puzzles:
  • completed_time: This parameter is computed by dividing the amount of time invested in the game (active_time) by the number of completed puzzles.
  • actions_completed: This parameter is computed by dividing the number of actions (n_events) by the number of completed puzzles.
  • p_incorrect: This parameter is calculated by dividing the number of incorrect attempts by the total number of attempts (n_attempts), multiplied by 100.
  • p_abandoned: This parameter is computed by dividing the number of started puzzles by the number of completed puzzles.
  • norm_all_measures: First, the four aforementioned parameters are standardized by computing their z-scores, i.e. $z = \frac{x - \mu}{\sigma}$, where $\mu$ is the mean and $\sigma$ the standard deviation of $x$. Then we sum the standardized parameters:

$$z_{all\_measures} = z_{comp\_time} + z_{actions\_comp} + z_{p\_incorrect} + z_{p\_abandoned} \qquad (1)$$

Finally, the calculated parameter is normalized between 0 and 1, where $\min(z_{all\_measures})$ and $\max(z_{all\_measures})$ are the minimum and maximum of $z_{all\_measures}$:

$$norm_{all\_measures} = \frac{z_{all\_measures} - \min(z_{all\_measures})}{\max(z_{all\_measures}) - \min(z_{all\_measures})} \qquad (2)$$
– Puzzle funnel: A conversion funnel is an e-commerce term that describes the different stages in a buyer's journey leading up to a purchase. We use this same metaphor to illustrate the stages that a student can go through in order to solve a puzzle. We define the following four stages for the funnel: started (the student has opened the puzzle), create_shape (the student has placed an object into this particular puzzle), submitted (the student checked the puzzle solution) and completed (the student has submitted the puzzle and the solution is correct). Then, we analyze the data to count the stages reached for each one of the puzzles and by each student. This funnel seeks to provide a quick overview of the current status of each student and puzzle in a class.
– Sequence between puzzles: Although Shadowspect puzzles are divided into three categories based on their difficulty, they do not have to be completed in a linear sequence. Therefore, students can jump from any puzzle to another, regardless of difficulty, pursuing their own interests and exploring the game. With this metric our objective is to analyze the temporal interaction of students with the puzzles in chronological order, so that we can reconstruct the sequence of puzzles for a given student. We provide an output with the sequence of puzzle attempts in chronological order and the funnel stage the student reached in each attempt.
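To make the computation of these metrics concrete, the following Python sketch implements the levels of activity and levels of difficulty metrics, including the standardization and normalization of Equations (1) and (2). It is an illustrative sketch rather than our actual implementation: the event-log columns (nickname, timestamp) and the per-puzzle summary columns (active_time, n_events, n_completed, n_incorrect, n_attempts, n_started) are assumed schemas.

    import pandas as pd

    INACTIVITY_THRESHOLD = 60  # seconds, as defined for active_time

    def activity_metrics(events):
        """Levels of activity per student: active_time (minutes) and n_events."""
        def active_minutes(timestamps):
            gaps = timestamps.sort_values().diff().dt.total_seconds().dropna()
            # Gaps longer than the inactivity threshold are omitted.
            return gaps[gaps <= INACTIVITY_THRESHOLD].sum() / 60
        grouped = events.groupby("nickname")
        return pd.DataFrame({
            "active_time": grouped["timestamp"].apply(active_minutes),
            "n_events": grouped.size(),
        })

    def difficulty_metrics(puzzles):
        """Levels of difficulty from a per-puzzle summary table."""
        d = pd.DataFrame({
            "completed_time": puzzles["active_time"] / puzzles["n_completed"],
            "actions_completed": puzzles["n_events"] / puzzles["n_completed"],
            "p_incorrect": 100 * puzzles["n_incorrect"] / puzzles["n_attempts"],
            "p_abandoned": puzzles["n_started"] / puzzles["n_completed"],
        })
        z = (d - d.mean()) / d.std()  # z-scores of each parameter
        z_all = z.sum(axis=1)         # Equation (1)
        d["norm_all_measures"] = (
            (z_all - z_all.min()) / (z_all.max() - z_all.min())  # Equation (2)
        )
        return d

The resulting norm_all_measures column is the composite difficulty score visualized later in Figure 3.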
5 Case Study
This section presents the case study that exemplifies how teachers can use these learning analytics visualizations in two different situations: the first use case, in Subsection 5.1, applies the metrics and visualizations to understand the global progress of a classroom, whereas the second use case, in Subsection 5.2, focuses on the individual progress of students in a classroom.
5.1 Use Case 1: Classroom Analysis
The first analysis implements visualizations at the classroom level, so that the
teacher can see the overall progress of the class for each particular puzzle. This
can be useful to detect which puzzles are harder or which contents areas re-
quire extra additional explanations or support from the teacher. Figure 2 rep-
resents the first visualization with the puzzle funnel metric with its four stages
as described in Section 4. We represent started (blue), create shape (yellow),
submitted (red) and completed (green) in a circular chart for each one of the
puzzles, which are arranged following the sequence order of appearance in Shad-
owspect game. For each one of the puzzles, the first sentence on the top indicates
its name and the second sentence the category of the puzzle.
[Figure: one circular funnel chart per puzzle, arranged in the game's sequence order and labeled with the puzzle's category (Tutorial, Intermediate or Advanced Levels) and name; each chart shows the percentage of students in the class who reached the started, create_shape, submitted and completed stages.]
Fig. 2: Funnel for each puzzle in a classroom.
With this overview of the entire class, the teacher can quickly detect strange behaviours or problems in a particular puzzle. For this specific class, at the tutorial levels we see that the puzzle funnel is good: in most cases the percentage of puzzles in the completed stage is nearly identical to that in the started stage, which means that most puzzles that were started were also completed. For the intermediate levels, it is more frequent to find puzzles with a higher percentage of started than completed, which is indicative of the increased difficulty. Once we review the advanced puzzles, we identify several puzzles suggesting that students are facing issues solving them; for example, for "Orange Dance" (46.9% started, 9.4% completed) and "Bear Market" (43.8% started, 3.1% completed), the percentage of started puzzles is much higher than the percentage of completed puzzles. Once these puzzles have been identified, the next step is to check whether this is due to the high difficulty of the selected puzzles, conceptual issues or other factors. To do this, we analyze the puzzles with the levels of difficulty metric.
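A minimal sketch of this classroom-level funnel check is shown below. It assumes a table with one row per student-puzzle pair recording the furthest funnel stage reached, and the 30-point gap used to flag problematic puzzles is an arbitrary threshold chosen for illustration.

    import pandas as pd

    STAGES = ["started", "create_shape", "submitted", "completed"]

    def puzzle_funnel(rows, n_students):
        """Percentage of students reaching at least each stage, per puzzle."""
        ranks = rows["stage"].map(STAGES.index)
        out = {}
        for puzzle, grp in rows.assign(rank=ranks).groupby("puzzle"):
            out[puzzle] = [100 * (grp["rank"] >= i).sum() / n_students
                           for i in range(len(STAGES))]
        return pd.DataFrame.from_dict(out, orient="index", columns=STAGES)

    # Toy example: furthest stage reached by each student on each puzzle.
    rows = pd.DataFrame({
        "puzzle": ["Bear Market", "Bear Market", "One Box"],
        "stage": ["started", "submitted", "completed"],
    })
    funnel = puzzle_funnel(rows, n_students=31)
    # Flag puzzles where far more students started than completed.
    flagged = funnel[funnel["started"] - funnel["completed"] > 30]

With this threshold, "Orange Dance" (a 37.5-point gap) and "Bear Market" (a 40.7-point gap) would be among the flagged puzzles in this class.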
[Figure: five panels plotting, for each of the 30 puzzles in sequence order, the parameters completed_time (minutes), completed_actions (number of actions), p_incorrect (percentage), p_abandoned (percentage) and the composite norm_all_measures difficulty score.]
Fig. 3: Levels of difficulty by puzzle in a classroom.
In Figure 3, we represent the levels of difficulty metric with its four parameters and the composite measure. The x-axis lists each of the puzzles following the sequence order of Shadowspect, and the y-axis represents the difficulty parameter as explained in Section 4. For both puzzles that we identified as having a problematic funnel in Figure 2, "Orange Dance" and "Bear Market", we see that their difficulty parameters are also high. Regarding the time required to complete the puzzle, "Orange Dance" has a lower time, with 2 minutes on average, while "Bear Market" goes up to 8 minutes, so there is quite a difference between the two; the same holds for the number of actions needed to complete the puzzle, which is around 50 and 200 actions respectively. However, for the percentage of incorrect attempts and of abandoned puzzles we get almost equal values for both, with percentages close to 100%. Finally, the composite difficulty measure confirms what already appeared in the individual parameters: for "Orange Dance" the difficulty is around 0.8 and for "Bear Market" it is 1, the most difficult puzzle. With this last parameter of the metric, teachers can get a quick overview of the puzzles that are posing a major challenge for their students.
5.2 Use Case 2: Individual Student Analysis
In this second use case we select a particular group of students in the classroom to see how they are progressing in the completion of the different game puzzles. We will see how a teacher can observe students' progress and difficulties at the individual level based on three of the metrics we defined in Section 4.
[Figure: one funnel chart per student for the 15 selected students, showing the percentage of the game's puzzles each student has in the started, create_shape, submitted and completed stages.]
Fig. 4: Individual funnel for each student in a classroom.
As we can see in Figure 4, we have selected 15 students from the group. Using the funnel metric, the teacher can easily see the relationship between the number of started puzzles and the number of completed puzzles for every student. While some students show good progress with high completion rates (Student 188 has completed 96.7% of the puzzles available in the game), others reveal that they have been struggling with some puzzles. For example, Student 155 has started every puzzle in the game but only completed 46.7% of them. We are going to focus on this last student and figure out whether the student is struggling to solve the tasks or simply did not put in enough effort.
[Figure 5.1: the chronological sequence of puzzles attempted by Student 155, with each dot colored by the funnel stage reached and positioned on the y-axis by the puzzle's difficulty. Figure 5.2: active_time (seconds) and n_events per puzzle for the same student.]
Fig. 5: Sequence of actions and levels of activity for Student 155.
In Figure 5.1, we combine three different metrics at the same time: the dots along the x-axis represent the sequence of puzzles attempted by the student, the color of each dot represents the funnel stage reached in that puzzle, and the position on the y-axis incorporates the difficulty metric of each puzzle. The puzzles completed by the student are, in most cases, tutorial levels and levels with low difficulty. Another thing we can see from this plot is that there are many puzzles that the student started without even placing a shape. We might suspect that the student has been entering and exiting the different puzzles without doing anything else, but we cannot be sure from this plot alone. To know what the student has done in each task, we introduce the levels of activity metric so the teacher can review the actions of the student across puzzles.
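The following sketch shows how such a per-student sequence can be reconstructed from the event log. As before, the schema (columns nickname, timestamp, puzzle, stage) is an assumption for illustration, and repeated visits to the same puzzle are collapsed into a single attempt for simplicity.

    STAGE_ORDER = {"started": 0, "create_shape": 1, "submitted": 2, "completed": 3}

    def puzzle_sequence(events, nickname):
        """Chronological sequence of puzzle attempts for one student, with the
        furthest funnel stage reached in each puzzle."""
        ev = events[events["nickname"] == nickname].sort_values("timestamp")
        sequence = []
        # groupby(sort=False) keeps puzzles in order of first appearance.
        for puzzle, grp in ev.groupby("puzzle", sort=False):
            stage = max(grp["stage"], key=STAGE_ORDER.get)
            sequence.append((puzzle, stage))
        return sequence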
From Figure 5.2, we can draw some conclusions about the student's interaction. From the previous metric we know that the puzzle named "Zzz" was submitted, and here we see that the active_time and n_events in this puzzle are high. So we know the student spent some time trying to solve it, and we can now say that the student experienced difficulties with this puzzle. As we might expect, most tasks that remained in the started stage have negligible values of active_time and n_events, so it can be assumed that the student entered those puzzles but did not even try to solve them.
[Figure 6.1: the chronological sequence of puzzles attempted by Student 247, with dot color encoding the funnel stage reached and y-axis position the puzzle's difficulty. Figure 6.2: active_time (seconds) and n_events per puzzle for the same student.]
Fig. 6: Sequence of actions and levels of activity for Student 247.
Now that we have analysed a student experiencing certain issues with the game, let us focus on a student with good progress to see their sequence of puzzles and levels of activity. In Figure 6.1, we see a very linear progression, where the student did not enter a level until the previous one was already completed; the student has been working fine but may need more time than other students to solve the same number of tasks. In Figure 6.2 we see that Student 247 spent more than 300 seconds on the puzzle "45-Degree Rotations" but finally completed it, which shows high resilience. The time spent by this student is evenly distributed across the different levels that have been solved.
6 Conclusions and Future Work
The objective of this study was twofold: first, to propose a series of metrics that can provide comprehensive information regarding the progress of students with the puzzles in Shadowspect, and second, to provide simple but detailed visualizations of these metrics that allow teachers to track the students within their class, so that they can evaluate or detect problems quickly and effectively. In Section 4, we proposed four metrics that can generate explainable insights regarding the interaction of students with puzzles; the main goal was to provide easy-to-understand and actionable information for teachers. In Subsection 5.1, we reported a set of visualizations of those metrics at the classroom level, so that teachers can evaluate the learning process of the class in a global way. Within the overall analysis, the teacher can identify the puzzles where students have found problems and analyse the cause with the difficulty metric. This approach can help alleviate one of the main problems when implementing educational games in the classroom, namely the difficulty of understanding how students interact with the game and their overall progress. Then, in Subsection 5.2 we presented a use case where teachers can monitor the individual activity of a student in Shadowspect using visualizations of the metrics. We analysed two different individual students: one with poor performance in the resolution of puzzles, and another showing a better puzzle funnel but a low percentage of completed puzzles in relation to the total number of levels available in the game. These two analyses exemplify how a teacher can assess the status of a student in the class and address the possible problems a student might be having during a session. This represents an opportunity for educators to provide personalized attention to their students and help them in their learning process.
The next stage is to use this opportunity to implement just-in-time interventions that aim to provide support at the right time by adapting to the needs of each individual. One of the main limitations of this work is that these are offline static visualizations, and thus inflexible and not scalable. Therefore, as part of future work we will be working on the co-creation of a dashboard for teachers that can provide greater speed and interactivity when displaying data from a class or individual students, hence enabling just-in-time interventions during the sessions. We will also be working on obtaining evidence of the interpretability of these visualizations and on making them explainable so that teachers can easily intervene. Shadowspect is designed as a formative assessment tool, and thus we can also use these visualizations for students so that they can receive feedback and improve their self-awareness. More nuanced metrics and visualizations will allow students to visualize their mistakes and areas of improvement. In this way we can use Shadowspect as a robust learning tool that can be easily implemented by teachers in the classroom and that emphasizes formative feedback to the student. This study has proposed new dynamic approaches that can be helpful to facilitate the systematic implementation of educational games in the classrooms of the future.
Acknowledgements
We want to acknowledge support from the MIT-SPAIN "la Caixa" Foundation SEED FUND and the Spanish Ministry of Economy and Competitiveness through the Juan de la Cierva Formación program (FJCI-2017-34926).
References
1. Alonso-Fernandez, C., Calvo, A., Freire, M., Martinez-Ortiz, I., Fernandez-Manjon,
B.: Systematizing game learning analytics for serious games. In: 2017 IEEE Global
Engineering Education Conference (EDUCON). pp. 1111–1118. IEEE (2017)
2. Alonso-Fernández, C., Cano, A.R., Calvo-Morata, A., Freire, M., Martínez-Ortiz, I., Fernández-Manjón, B.: Lessons learned applying learning analytics to assess serious games. Computers in Human Behavior 99, 301–309 (2019)
3. Alonso-Fernández, C., Perez-Colado, I., Freire, M., Martínez-Ortiz, I., Fernández-Manjón, B.: Improving serious games analyzing learning analytics data: lessons learned. In: International Conference on Games and Learning Alliance. pp. 287–296. Springer (2018)
4. Boyle, E.A., MacArthur, E.W., Connolly, T.M., Hainey, T., Manea, M., Kärki, A., Van Rosmalen, P.: A narrative literature review of games, animations and simulations to teach research methods and statistics. Computers & Education 74, 1–14 (2014)
5. Chew, B.S.: An efficient framework for game-based learning activity. In: 2017 IEEE
6th International Conference on Teaching, Assessment, and Learning for Engineer-
ing (TALE). pp. 147–150. IEEE (2017)
6. Clark, D.B., Tanner-Smith, E.E., Killingsworth, S.S.: Digital games, design, and
learning: A systematic review and meta-analysis. Review of educational research
86(1), 79–122 (2016)
7. ESA: 2019 essential facts about the computer and video game industry. Tech. rep.,
Entertainment Software Association (2019)
8. Gee, J.P.: Are video games good for learning? Nordic Journal of Digital Literacy
1(03), 172–183 (2006)
9. Hamari, J., Shernoff, D.J., Rowe, E., Coller, B., Asbell-Clarke, J., Edwards, T.:
Challenging games help students learn: An empirical study on engagement, flow
and immersion in game-based learning. Computers in human behavior 54, 170–179
(2016)
10. Holstein, K., McLaren, B.M., Aleven, V.: Intelligent tutors as teachers’ aides: ex-
ploring teacher needs for real-time analytics in blended classrooms. In: Proceedings
of the Seventh International Learning Analytics & Knowledge Conference. pp. 257–
266 (2017)
11. Kang, J., Liu, M., Qu, W.: Using gameplay data to examine learning behavior
patterns in a serious game. Computers in Human Behavior 72, 757–770 (2017)
12. Kiili, K., Moeller, K., Ninaus, M.: Evaluating the effectiveness of a game-based
rational number training-in-game metrics as learning indicators. Computers & Ed-
ucation 120, 13–28 (2018)
13. Kim, Y.J., Ifenthaler, D.: Game-based assessment: The past ten years and moving
forward. In: Game-Based Assessment Revisited, pp. 3–11. Springer (2019)
14. Kirriemuir, J., McFarlane, A.: Use of computer and video games in the classroom.
In: DiGRA Conference (2003)
15. Klopfer, E., Osterweil, S., Groff, J., Haas, J.: Using the technology of today in
the classroom today: The instructional power of digital games, social networking,
simulations and how teachers can leverage them. The Education Arcade 1, 20
(2009)
16. Koster, R.: Theory of Fun for Game Design. O'Reilly Media, Inc. (2013)
17. Mäses, S., Hallaq, B., Maennel, O.: Obtaining better metrics for complex serious games within virtualised simulation environments. In: European Conference on Games Based Learning. pp. 428–434. Academic Conferences International Limited (2017)
18. Ninaus, M., Tsarava, K., Moeller, K.: A pilot study on the feasibility of dynamic
difficulty adjustment in game-based learning using heart-rate. In: International
Conference on Games and Learning Alliance. pp. 117–128. Springer (2019)
19. Plass, J.L., Homer, B.D., Kinzer, C.K., Chang, Y.K., Frye, J., Kaczetow, W.,
Isbister, K., Perlin, K.: Metrics in simulations and games for learning. In: Game
analytics, pp. 697–729. Springer (2013)
20. Prensky, M.: Digital game-based learning. Computers in Entertainment (CIE) 1(1),
21–21 (2003)
21. Qian, M., Clark, K.R.: Game-based learning and 21st century skills: A review of
recent research. Computers in Human Behavior 63, 50–58 (2016)
22. Ruiperez-Valiente, J.A., Gaydos, M., Rosenheck, L., Kim, Y.J., Klopfer, E.: Pat-
terns of engagement in an educational massive multiplayer online game: A multi-
dimensional view. IEEE Transactions on Learning Technologies (2020)
23. Ruiperez-Valiente, J.A., Munoz-Merino, P.J., Gascon-Pinedo, J.A., Kloos, C.D.: Scaling to massiveness with ANALYSE: A learning analytics tool for Open edX. IEEE Transactions on Human-Machine Systems 47(6), 909–914 (2016)
24. Serrano-Laguna, Á., Martínez-Ortiz, I., Haag, J., Regan, D., Johnson, A., Fernández-Manjón, B.: Applying standards to systematize learning analytics in serious games. Computer Standards & Interfaces 50, 116–123 (2017)
25. Smith, S.P., Hickmott, D., Southgate, E., Bille, R., Stephens, L.: Exploring play-
learners’ analytics in a serious game for literacy improvement. In: Joint Interna-
tional Conference on Serious Games. pp. 13–24. Springer (2016)
26. Squire, K.: Video games and learning. Teaching and participatory culture in the
digital age (2011)
27. Tsai, M.J., Huang, L.J., Hou, H.T., Hsu, C.Y., Chiou, G.L.: Visual behavior, flow
and achievement in game-based learning. Computers & Education 98, 115–129
(2016)
28. Verbert, K., Duval, E., Klerkx, J., Govaerts, S., Santos, J.L.: Learning analytics
dashboard applications. American Behavioral Scientist 57(10), 1500–1509 (2013)