A Playful Game Changer: Fostering Student Retention
in Online Education with Social Gamification
Markus Krause
Leibniz University
Hannover, Germany
Marc Mogalle
Leibniz University
Hannover, Germany
Henning Pohl
Leibniz University
Hannover, Germany
Joseph Jay Williams
Harvard University
Cambridge, MA, USA
{markus, marc, henning}@hci.uni-hannover.de, joseph_jay_williams@harvard.edu
ABSTRACT
Many MOOCs report high dropout rates for their students. Among the factors reportedly contributing to this picture are lack of motivation, feelings of isolation, and lack of interactivity in MOOCs. This paper investigates the potential of gamification with social game elements for increasing retention and learning success. Students in our experiment showed a significant increase of 25% in retention period (videos watched) and 23% higher average scores when the course interface was gamified. Social game elements amplify this effect significantly: students in this condition showed an increase of 50% in retention period and 40% higher average test scores.
Author Keywords
Massive Open Online Courses; MOOC; Gamification; Social Engagement
ACM Classification Keywords
J.1 [Administrative Data Processing]: Education
K.3.1 [Computer Uses in Education]: Distance Learning
INTRODUCTION
Low retention rates are a widely discussed issue of learning at scale. Students of online courses report various inconveniences that lead to their dropout. Gamification is a promising method to strengthen student engagement and ease some of the disadvantages connected to online education [27]. High dropout rates are often attributed to feelings of isolation and lack of interactivity [15], reasons directly related to missing social engagement. We therefore use social game elements to strengthen social engagement in students of online courses.
We conducted a controlled experiment with 213 students majoring in psychology or computer science. To discriminate the effects of gamification from effects induced through social elements, we compare three conditions: (1) a baseline condition without explicit gamification, (2) a version with game elements but no social elements, and (3) a gamified version with social elements. We hypothesize that gamification has a stronger effect when it includes social elements. We found that students in our gamification condition had 23% higher average scores and an increase of 25% in retention. With social elements, the increase was almost 40% for final scores and 50% for retention period.
RELATED WORK
Retention remains a major challenge in MOOCs. Jordan reported that although completion rates occasionally exceed 40%, the average rate is below 13% and sometimes even below 1% [13]. When analyzing retention it is important to note that student motivation is more diverse in online courses than in traditional courses, and motivation also varies significantly between courses [17]. Wilkowski et al. [32] asked students about their motivation to take part in the MOOC Mapping with Google; only 10% of students reported that they wanted to earn a certificate. Kizilcec et al. [17] investigate this diversity in student motivation more closely. Their findings also suggest that earning a certificate is not the primary intention of students in MOOCs. Kizilcec et al. [17] further illustrate that the retention rate of students actually taking a course for a certificate is still low. According to Wilkowski et al. [32], only 25% of those students aiming for a certificate actually achieved their goal. These results show that even students with the goal to complete the course struggle to achieve it.
Figure 1. We investigate the impact of social gaming elements on student retention in online learning. Shown here is the interface of our system on a tablet; in particular, the lesson panel of the social game condition.
There has been work on using games or game-like elements for education [1,27,30] and training [31], as well as explorative studies on the effects of game elements in online courses [10]. Domínguez et al. [8] investigated the effects of gamification on adult learners. Their study indicates that gamification may have an effect on students. In contrast to these explorative studies, we investigate a more specific question:
RQ1: Does gamification have a positive effect on retention?
Khalil and Ebner [15] investigate why students drop out of online courses and propose strategies for increasing retention. They report that feelings of isolation and a lack of interactivity are both factors in the high dropout rates of MOOCs. This indicates that social engagement is an important aspect of students' success. This is also consistent with other studies [7,17,25], which hypothesize that social engagement is a relevant factor for student retention, success, and satisfaction. Research on the social factors of learning is a well-established field [3,28,34], and there are attempts to strengthen social interactions in MOOCs [29] as well as various endeavors to understand social elements in online and distance education [4,18,19]. Simões et al. [25] use social game elements to support younger students (K-6) in an offline environment. Thus, there is evidence that social elements may amplify the effect of gamification. Therefore, we extend our initial research question and ask:
RQ2: Is gamification with social elements more efficient?
STUDY DESIGN
Our study follows a between-subjects design with three different conditions. The first condition has no game elements (plain condition), the second condition has game elements (game condition), and the third condition has social game elements (social condition). Apart from the main factor (condition) we also collect information on two more fixed factors: gender and major. As we required verified information on these factors, we integrated our experiment with offline courses at a university. Student retention is valuable in itself, and even more so when it correlates with learning success. Therefore, we also analyze how much students can recall from a lecture. We hypothesize that:
H1: Gamification increases retention.
H1a: Gamification increases learning success.
H2: Social game elements amplify retention gain.
H2a: Social game elements amplify learning success.
The course used for the experiment introduces Python as a programming language for statistical analysis and has an expected duration of four weeks with an average workload of 4 hours per week. We organized the course into four lectures, each containing 15 video lessons on average. No video lesson was longer than 4 minutes, and videos showed either tablet drawings or code examples. Each video also features a short quiz. Quizzes are either multiple-choice questionnaires or free-text input.
The online course was not a requirement in any university course, but lecturers of graduate and undergraduate courses in computer science (human-computer interaction) and psychology promoted the course in their lectures as a valuable addition.
PARTICIPANTS
We collected data from 71 students in the plain condition, 67 students in the game condition, and 68 students in the social condition. Students were studying either computer science (113) or psychology (93) at the undergraduate (87), master's (116), or Ph.D. (3) level. Only 12 participants reported prior knowledge of Python. Many students, however, had basic knowledge of statistics (85) and some had experience with R (56), Java (95), or C/C++ (56).
PROCEDURE
Students had to sign up for the course via an online system. As we had verified information about students' gender and major, we were able to balance condition assignment based on these two factors. Table 1 shows the final distribution of students in each condition.
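The paper does not detail the assignment procedure; the following is a minimal sketch of one way to balance assignment across gender and major (the field names and the stratified round-robin scheme are assumptions, not the authors' implementation):

```python
import random
from collections import defaultdict

CONDITIONS = ["plain", "game", "social"]

def assign_conditions(students, seed=42):
    """Balance condition assignment across gender x major strata.

    `students` is a list of dicts with 'id', 'gender', and 'major' keys
    (hypothetical field names). Within each stratum, students are shuffled
    and dealt to conditions round-robin, so every stratum is spread as
    evenly as possible over the three conditions.
    """
    rng = random.Random(seed)
    strata = defaultdict(list)
    for s in students:
        strata[(s["gender"], s["major"])].append(s)

    assignment = {}
    for members in strata.values():
        rng.shuffle(members)
        for i, s in enumerate(members):
            assignment[s["id"]] = CONDITIONS[i % len(CONDITIONS)]
    return assignment
```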
Plain Condition
The plain condition does not have explicit game elements. After logging in with their university credentials, students can choose one of the four lectures from a dashboard. The dashboard also provides an overview of the progress in each course and lesson. Figure 2 shows a screenshot of the dashboard. Selecting a lecture in the dashboard opens the respective lesson panel.
The lesson panel has four central elements: a video in the center, an element to navigate the video, and a quiz panel on the left. Quizzes take one of two forms: either a multiple-choice quiz or a free-text question. Students only need to type one or two words to answer free-text questions.
Condition   Female   Male   Psych.   CS
Plain       30       41     32       39
Game        28       39     30       37
Social      28       40     31       37
Table 1. Demographics of students in each condition.
Figure 2. The dashboard of the plain condition. Students
get an overview of their progress in courses and lessons.
The system is able to deal with common spelling errors. The
lesson panel shows a discussion board below the video as
seen in Figure 2.
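The paper does not say how spelling errors are handled; a minimal sketch of one possible approach using only the standard library (fuzzy matching against accepted answers, with an arbitrarily chosen similarity threshold):

```python
import difflib

def is_answer_correct(student_answer, accepted_answers, threshold=0.8):
    """Return True if a free-text answer matches any accepted answer,
    tolerating common spelling errors via fuzzy string similarity."""
    normalized = student_answer.strip().lower()
    for accepted in accepted_answers:
        ratio = difflib.SequenceMatcher(None, normalized, accepted.lower()).ratio()
        if ratio >= threshold:
            return True
    return False

# Example: a typo such as "varaince" still counts as correct.
print(is_answer_correct("varaince", ["variance"]))  # True
```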
Game Condition
We designed the game condition with all features of the social condition, omitting only those that incorporate social elements. We use only gamification rather than more sophisticated game concepts, which allows an easier interpretation of effects. We deliberately chose to leave out complex game mechanics, as complex game mechanics have a learning curve of their own and thereby introduce noise.
We restructured the dashboard. A panel on the left side shows players an overview of their achievements. In the social condition, this panel also contains a customizable avatar; we show the details in Figure 5. In the game condition, the panel does not show this avatar but is otherwise identical. The full lesson panel is depicted in Figure 5. As the game condition is already very close to the social condition, the figure shows only the final version of the panel. The lesson panels of the game and the social condition differ only in the design of their quiz elements. We show these differences in more detail in Figure 7.
For the game condition, we use basic gamification mechanics [33]. Achievements or badges are widespread elements in gamification; they are a representation of an accomplishment. In the game condition, students can acquire achievements for answering a number of quizzes correctly, taking a number of lessons, or being ranked among the top ten students of a lesson or the entire course. We award five achievements with three levels each.
Players also earn score points for each correct answer to a quiz question. We use these score points to place players in a leaderboard for each course and each lesson in a course. We are aware that leaderboards already constitute a social element and that this decision might tone down the differences between the game and the social condition.
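As a rough illustration only (not the authors' implementation), per-course and per-lesson leaderboards can be kept as simple score tallies:

```python
from collections import defaultdict

class Leaderboard:
    """Tracks score points per student for one course and its lessons."""

    def __init__(self):
        self.course_scores = defaultdict(int)
        self.lesson_scores = defaultdict(lambda: defaultdict(int))

    def award(self, student, lesson, points):
        """Add points for a correct quiz answer."""
        self.course_scores[student] += points
        self.lesson_scores[lesson][student] += points

    def top(self, n=10, lesson=None):
        """Return the top-n (student, score) pairs for the course or one lesson."""
        scores = self.course_scores if lesson is None else self.lesson_scores[lesson]
        return sorted(scores.items(), key=lambda kv: kv[1], reverse=True)[:n]
```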
We deliberately chose to integrate leaderboards in the game condition as they are an essential element in almost every gamification approach and do not connect players directly with one another. Aesthetics also play a relevant role in gamification: using visually appealing graphics and representations, one can create a sense of pleasurable satisfaction [9,14,23,24]. For the experiment, we chose a neutral, comic-like design. Players can design their own avatar. Later during the course, students can turn in their achievements and score points to acquire different visual add-ons to place on their avatar, such as hats and other items. Figure 4 shows a screenshot of the dashboard of the game condition showing an avatar and a list of achievements. The dashboard also provides an overview of the progress in each course and lesson. Besides the integration of a customizable avatar, we overhauled the general appearance of the interface to fit the theme laid out with the avatar. Figure 6 shows the customization interface for the avatar and some of the avatars created by students during the course.
Finally, we added a countdown to each quiz in a lesson. This restricts students to a certain amount of time when solving a quiz, effectively inducing tension through the time limit. To allow students to watch the video without time pressure, quizzes have to be explicitly activated by clicking. Figure 7 shows two screenshots of the quiz.
Figure 5. Main elements of the lesson panel of the social and game condition. The only difference between the social and game condition is the presence of the quiz (Figure 7 shows this difference in more detail).
Figure 4. Dashboard of the social condition. Players can customize their avatar and get an overview of their progress in courses and lessons.
Figure 3. Main components of the lesson panel of the baseline condition (plain). We omitted the discussion board under the video, as we did not change it.
Social Game Condition
The social condition is identical to the game condition except
for an additional set of social game elements. Prior to starting
a new lecture, participants were able to choose an opponent.
Students could choose a person they know from the list of
participants, or play against a randomly chosen opponent.
Although typical for games that highlight social aspects, we
do not allow participants to invite friends via social networks
or e-mail at this point. When a player starts a lesson with a
random opponent, a screen illustrates the search for another
player (see Figure 8).
The system does not require two players to be online at the same time. A student can play against pre-recorded actions of another student. To ensure that the system always has enough recordings we pre-recorded some sessions.
The system pairs a student with a recorded session if it cannot find another player within 30 seconds of the student choosing a random opponent. Students can also choose a specific player from a list of students currently online or with recorded play sessions. If the second student is online, the system waits until both students start the lesson. If the second student is not online but has a recorded session for this lesson, the first student plays against this recording.
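A minimal sketch of the pairing rule described above; the lobby callback and the recording store are hypothetical, only the 30-second timeout comes from the text:

```python
import time

MATCH_TIMEOUT_S = 30  # timeout taken from the text

def find_opponent(student, lesson, get_waiting_player, recorded_sessions):
    """Pair a student with a live opponent or fall back to a recording.

    `get_waiting_player(lesson, exclude)` is an assumed callback returning a
    waiting student or None; `recorded_sessions` maps lessons to lists of
    pre-recorded play sessions (both are hypothetical, not the authors' API).
    """
    deadline = time.time() + MATCH_TIMEOUT_S
    while time.time() < deadline:
        opponent = get_waiting_player(lesson, exclude=student)
        if opponent is not None:
            return ("live", opponent)
        time.sleep(1)
    # No live player found within 30 seconds: use a recorded session.
    return ("recording", recorded_sessions[lesson][0])
```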
To highlight the social interaction we add a status bar to the quiz panel. This bar illustrates the progress of one's opponent on the quiz, as seen in Figure 7. Students get a notification when their opponent solves a quiz, provided they also have an active quiz. Finally, we added a summary screen that compares the results of both students (see Figure 9).
MEASUREMENTS
To estimate student retention we measure the number of video lessons a student watches. We refer to this measure as retention period. For this experiment we consider a video as watched if it played completely and the continue button was clicked afterwards. As basic knowledge of statistics and programming languages was present in our user group, we assumed that some participants would skip lectures due to prior knowledge. For the experiment, we define skipping a lecture as starting the video but moving on to the next one before the video has finished. We asked participants to indicate whether they skipped a lecture because of prior knowledge. If a student skips a video because of prior knowledge, we still consider the video lesson as completed.
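For illustration, a minimal sketch of this counting rule over a hypothetical per-student event log (the field names are assumptions, not the authors' logging format):

```python
def retention_period(events):
    """Count the video lessons a student completed (the paper's retention period).

    `events` is a hypothetical per-student log: a list of dicts with keys
    'lesson', 'played_to_end', 'continue_clicked', and
    'skipped_due_to_prior_knowledge'. A lesson counts as watched if the video
    played completely and the continue button was clicked, or if the student
    marked it as skipped because of prior knowledge.
    """
    completed = set()
    for e in events:
        watched = e["played_to_end"] and e["continue_clicked"]
        if watched or e["skipped_due_to_prior_knowledge"]:
            completed.add(e["lesson"])
    return len(completed)
```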
To measure learning success we conducted an exam with some of our students. We invited students who took part in our experiment to an offline test, scheduled three months after the start of the course. 101 students took part in this test.
Figure 7. We changed the appearance and the behavior of the quiz from the plain condition. The quiz in the game condition (center) features a countdown (green bar) and students activate the quiz manually (left). In the social condition, students see the status of their opponent (right).
Figure 8. To highlight the social aspect we visualize the search process for a competitor (top). Students can skip this process (button in the center); the system then pairs students with a recorded session of another student. Otherwise, a notification window shows that the system found another student (lower part of the figure).
Figure 9. The summary screen of the social condition illustrates the player's performance compared to her opponent. On the left is a leaderboard showing ranking and badges.
Figure 6. Students can customize their avatar. Later during the course, students can collect visual gadgets, e.g., hats or other props. Some possible avatars are shown on the right.
On average, students took the test one week after the course (self-assessed). We asked students to write a Python script that calculates the 95% confidence interval of the distribution of means using only standard functions.
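The paper does not include a reference solution; one possible answer to this exam task, using only the standard library and the same bootstrap approach the paper itself uses for its confidence intervals, might look as follows:

```python
import random
import statistics

def mean_ci_95(data, n_boot=10_000, seed=0):
    """95% confidence interval of the distribution of means,
    estimated by bootstrap resampling with replacement."""
    rng = random.Random(seed)
    means = []
    for _ in range(n_boot):
        sample = [rng.choice(data) for _ in data]
        means.append(statistics.mean(sample))
    means.sort()
    lower = means[int(0.025 * n_boot)]
    upper = means[int(0.975 * n_boot)]
    return lower, upper

print(mean_ci_95([12, 15, 9, 14, 11, 13, 10, 16]))
```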
We reviewed and graded all submissions in a blind review process. At least three different reviewers graded each submission on a range from 5 (excellent) to 1 (a lot of room for improvement). We averaged the scores of all reviewers to measure learning success for each participant.
Besides test scores, we also use a second measure to estimate
learning success. Students solve quizzes throughout the
course; the average ratio of correctly solved quizzes is our
last measurement: quiz accuracy.
RESULTS
In accordance with Cramer and Bock [5], we performed a MANOVA on the means to help protect against inflating our Type-I error rate in the follow-up ANOVAs and post-hoc comparisons. Prior to conducting the MANOVA, we calculated a series of Pearson correlations between all dependent variables in order to test the MANOVA assumption that the dependent variables correlate. We found correlations ranging between 0.2 and 0.4; all correlations are significant at an alpha level of 0.01 or lower. These values are within acceptable ranges according to Meyers et al. [26].
Additionally, the Box's M value of 16.25 was associated with a p value of 0.012, which we interpreted as non-significant in accordance with Huberty and Petoskey's guideline (i.e., p < .005) [12]. Thus, the covariance matrices between the groups were assumed equal for the purpose of the MANOVA. We conducted a three-way multivariate analysis of variance (MANOVA) to test the hypothesis that there would be one or more mean differences between our conditions (plain, game, and social) on our measurements (retention period, quiz accuracy, and test score). A significant MANOVA effect was obtained, Pillai's trace = 0.22, F(6, 176) = 3.38, p = 0.003, with an estimated multivariate effect size of 0.112.
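As an illustration of how such an analysis could be reproduced, a sketch using pandas and statsmodels (the column names and CSV file are assumptions; the paper does not state which software was used):

```python
import pandas as pd
from statsmodels.multivariate.manova import MANOVA

# Hypothetical per-student data frame with one row per participant.
df = pd.read_csv("students.csv")  # columns: condition, gender, major,
                                  # retention, quiz_accuracy, test_score

# Pairwise Pearson correlations between the dependent variables.
print(df[["retention", "quiz_accuracy", "test_score"]].corr(method="pearson"))

# MANOVA of the three dependent variables on condition, gender, and major;
# the summary reports Pillai's trace among other multivariate statistics.
manova = MANOVA.from_formula(
    "retention + quiz_accuracy + test_score ~ condition + gender + major",
    data=df,
)
print(manova.mv_test())
```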
Retention Period
We calculate the average retention period for each condition
and the 95% confidence interval of the distribution of means
as shown in Table 2. To estimate confidence intervals we
draw 10,000 bootstrap samples from each condition using
sampling with replacement.
All three conditions differ by their mean but show a slight overlap of their confidence intervals. As hypothesized, the game and social conditions have a higher average retention period than the version without game elements. The game condition shows a 25% increase in retention period compared to the plain condition. The social condition more than doubles this effect, showing an increase of the average retention period of more than 55% compared to plain.
Before testing our results for significance, we ensured that our data is suitable for parametric tests. We used an omnibus test for normality [6] for each condition and did not find significant departures from a normal distribution. As we have different numbers of participants in our conditions, we also verified that our conditions have equal variance in the dependent variable prior to executing an analysis of variance (ANOVA). As the distributions do not differ significantly from normal distributions, we use Bartlett's test for homoscedasticity (equal variance) [2]. We found, as assumed, that the variance does not differ significantly between our conditions, t(2) = 2.649, p = 0.265. As our data does not hold evidence that it violates the assumptions of the ANOVA, we analyze main and interaction effects with a three-way independent analysis of variance. Table 3 shows the results of this test.
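A sketch of these assumption checks and the three-way ANOVA with scipy and statsmodels (again with assumed column names; not necessarily the authors' tooling):

```python
import pandas as pd
from scipy import stats
import statsmodels.api as sm
from statsmodels.formula.api import ols

df = pd.read_csv("students.csv")  # hypothetical columns as above

# D'Agostino-Pearson omnibus test for normality, per condition.
for name, group in df.groupby("condition"):
    stat, p = stats.normaltest(group["retention"])
    print(name, round(stat, 2), round(p, 3))

# Bartlett's test for equal variances across the three conditions.
samples = [g["retention"].values for _, g in df.groupby("condition")]
print(stats.bartlett(*samples))

# Three-way independent ANOVA with all interactions.
model = ols("retention ~ condition * gender * major", data=df).fit()
print(sm.stats.anova_lm(model, typ=2))
```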
We found that condition is the only significant factor in our experiment. We use post-hoc one-tailed t-tests to test for significance between individual levels of this factor and use the Holm-Bonferroni method [11] to control the family-wise error rate. Figure 10 shows a violin plot of the retention periods found in our experiment. From the results we can support hypothesis H1, as the game condition has a significantly higher average retention period (M = 14.9, SD = 8.2) than the plain condition (M = 11.9, SD = 6.6), with t(137) = 2.22, p = 0.02, d = 0.38. We can also support H2, as the social condition has a significantly higher average retention period (M = 18.5, SD = 8.7) than the game condition (M = 14.9, SD = 8.2), with t(134) = 2.48, p = 0.01, d = 0.42. Accordingly, the social condition also has a significantly higher average retention period (M = 18.5, SD = 8.7) than the plain condition (M = 11.9, SD = 6.6), with t(138) = 4.82, p < 0.001, d = 0.79.
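A sketch of such a post-hoc procedure (pairwise one-tailed t-tests, Cohen's d, and Holm-Bonferroni adjustment); the grouping of per-student values is assumed, and this is only one way to implement the correction:

```python
from itertools import combinations
import numpy as np
from scipy import stats

def cohens_d(a, b):
    """Cohen's d with a pooled standard deviation."""
    na, nb = len(a), len(b)
    pooled = np.sqrt(((na - 1) * np.var(a, ddof=1) + (nb - 1) * np.var(b, ddof=1))
                     / (na + nb - 2))
    return (np.mean(a) - np.mean(b)) / pooled

def posthoc_holm(groups):
    """One-tailed pairwise t-tests with Holm-Bonferroni adjusted p-values.

    `groups` maps a condition name to an array of per-student values,
    ordered so that the first-named group is expected to score higher.
    """
    results = []
    for (name_a, a), (name_b, b) in combinations(groups.items(), 2):
        t, p_two = stats.ttest_ind(a, b)
        p_one = p_two / 2 if t > 0 else 1 - p_two / 2  # one-tailed: a > b
        results.append([f"{name_a} > {name_b}", t, p_one, cohens_d(a, b)])
    # Holm-Bonferroni: the k-th smallest p-value is multiplied by (m - k),
    # with a running maximum to keep adjusted p-values monotone.
    results.sort(key=lambda r: r[2])
    m = len(results)
    running_max = 0.0
    for k, r in enumerate(results):
        adj = min((m - k) * r[2], 1.0)
        running_max = max(running_max, adj)
        r.append(running_max)
    return results
```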
Quiz Accuracy
In order to assess the influence of our conditions on students' success, we calculate the average quiz accuracy for each condition and the 95% confidence interval of the distribution of means as shown in Table 4.
Source        df   SS        MS      F      p
(C)ondition    2   1559.1    779.5   11.53  <0.001
(G)ender       1   194.3     194.3   3.07   0.081
(M)ajor        1   52.6      52.6    0.83   0.362
CG             2   187.5     187.5   1.48   0.229
CM             2   14.1      14.1    0.11   0.894
CGM            2   133.9     133.9   1.05   0.348
Residual     194   12270.0   63.24
Table 3. Interaction and main effects on retention period. Abbreviations: df (degrees of freedom), SS (sum of squares), MS (mean squares).
Cond.    n    M      95% CI
plain    71   11.9   [10.2, 13.6]
game     67   14.9   [12.9, 16.9]
social   68   18.5   [16.5, 20.7]
Table 2. Number of participants (n), mean retention period (M), and the 95% confidence interval of the distribution of retention period means. The CI and the boxplot of means are calculated from 10,000 bootstrap samples.
To estimate confidence intervals we draw 10,000 bootstrap samples from each condition using sampling with replacement. All three conditions differ by their mean and overlap in their confidence intervals. The game and social conditions have a higher average quiz accuracy than the plain condition. The game condition shows a 12.5% increase in quiz accuracy compared to the plain condition. The social condition again doubles this effect, showing an increase in quiz accuracy of more than 31%. As for the previous measures, we conducted an omnibus test for normality and Bartlett's test for homoscedasticity. We did not find our distributions to violate the ANOVA assumptions. We therefore analyze main and interaction effects again with a three-way independent ANOVA. Table 5 shows the results of this test.
Condition is the only significant factor for quiz accuracy. Figure 11 shows a violin plot of the quiz accuracy found in our experiment. From the results we cannot entirely support hypothesis H1a, as the difference between the plain (M = 0.46, SD = 0.25) and the game condition (M = 0.52, SD = 0.25) is not significant, t(137) = 1.34, p = 0.09, d = 0.23. We can, however, support our hypothesis H2a, as the social condition has a significantly higher average quiz accuracy (M = 0.61, SD = 0.21) than the game condition, with t(134) = 2.19, p = 0.031, d = 0.37, and the plain condition, with t(138) = 3.63, p < 0.001, d = 0.59.
Final Test Performance
Our final measurement of student success is test performance in our offline exam. We again calculate the mean and the 95% confidence interval, as shown in Table 6. Again, all three conditions differ by their mean and overlap in their confidence intervals. Students in the game and social conditions have higher test scores than students in the plain condition. Students in the game condition have a 22.5% higher test performance than those in the plain condition. Students in the social condition show 40% higher scores on average.
We conducted an omnibus test for normality and Bartlett’s
test for homoscedasticity and analyzed main and interaction
effects with a three-way ANOVA. Table 6 shows the results
of this test. Condition is, as expected, the only significant
factor for quiz accuracy. Figure 12 shows a violin plot of the
quiz accuracies found in our experiment.
From the results we can support hypothesis H1a, as the difference between the plain (M = 2.5, SD = 1.12) and the game condition (M = 3.06, SD = 1.10) is significant, t(64) = 2.01, p = 0.049, d = 0.49. We cannot directly support our hypothesis H2a, as the social condition (M = 3.50, SD = 1.12) does not show significantly higher average test scores than the game condition, with t(68) = 2.19, p = 0.055, d = 0.39. The social condition does, however, score significantly higher than the plain condition, with t(67) = 3.63, p < 0.01, d = 0.81.
Cond.    n    M      95% CI
plain    71   0.46   [0.40, 0.52]
game     67   0.52   [0.46, 0.58]
social   68   0.62   [0.55, 0.66]
Table 4. Number of participants (n), average quiz accuracy (M), and the 95% confidence interval of the distribution of means. The CI and the boxplot are calculated from 10,000 bootstrap samples.
Figure 10. The social condition shows a significantly higher retention period than the game and plain conditions. The game condition also shows significantly higher retention than the plain condition. The grey lines in the violin plot indicate min and max values; the grey lines on top indicate significant differences: ∗∗∗: p < 0.001, ∗: p < 0.05.
Figure 11. Student success measured with quiz accuracy. The grey lines in the plot indicate min and max values; the grey lines on top indicate significant differences: ∗∗: p < 0.01, ∗: p < 0.05, ': p < 0.1.
Source        df   SS       MS      F     p
(C)ondition    2   76.75    76.75   6.54  0.002
(G)ender       1   5.36     5.36    0.91  0.340
(M)ajor        1   15.92    15.92   2.71  0.101
CG             2   4.87     2.43    0.41  0.661
CM             2   27.74    13.87   2.36  0.097
CGM            2   28.06    14.03   2.39  0.094
Residual     194   113.81   5.86
Table 5. Interaction and main effects on quiz accuracy. Abbreviations: df (degrees of freedom), SS (sum of squares), MS (mean squares).
CONCLUSION
This work presented a systematic analysis of the impact of social gamification on student retention and learning success in online courses. We posed two research questions: RQ1, does gamification support students of our online course, and RQ2, do social elements amplify possible positive effects. We hypothesized that social gamification increases retention as well as learning success.
In order to assess the impact of social gamification, we analyzed three dependent variables (DVs: retention period, quiz accuracy, and test scores) across three independent variables (IVs: condition, gender, and major). All DVs showed a significant increase for the factor condition between the plain and social levels. For quiz accuracy and test scores, we found two differences that were not significant at a level of 0.05 but were at a level of 0.1. However, given the p-values from both ANOVA and MANOVA, a Type-I error seems unlikely in both cases. In response to our hypothesis H1 we analyzed retention period and observed a significant increase of 25% between our plain and our game condition and a significant increase of 55% from plain to social. This lends strong support to our initial hypothesis that gamification can increase retention and that social gamification amplifies this effect.
For quiz accuracy, we found an increase of 12.5% between our plain and game conditions and an increase of 31% between plain and social. In the final test, students in the game condition had a 22.5% higher test score compared to students in the plain condition. Students in the social condition showed an even stronger increase of almost 40% compared to students in the plain condition. This again lends support to our hypothesis H2 that we can amplify beneficial effects with social gamification.
FUTURE WORK
In our experiment, social elements showed a significant impact. To control our population variables and to reduce noise we restricted our experiment in terms of the user pool and the game mechanics used. Based on our findings and previous experiments we expect positive effects to be much stronger when we apply more sophisticated design concepts. Badges, achievements, and leaderboards are visually pleasing and provide a certain engagement. However, we do not expect these basic mechanics to uphold student motivation for a complete online curriculum.
In the future, we plan to investigate different social game mechanics and their impact on student success. In this paper we explored the effects of a competitive setting where students challenge each other. In the future, we also want to investigate the differences between playful elements that foster collaboration instead of competition, and both methods combined. We also aim at providing sophisticated feedback to support students. However, complex game mechanics may require more advanced interaction techniques. For instance, the game could allow students who earned a certain status to add questions to the system, thereby expanding the learning materials and allowing advanced students to compete on another level. Tools [20] and methods [16,21] that support such games already exist [22].
ACKNOWLEDGEMENTS
Many thanks to Rene Kizilcec and Gabriel Barata for sharing
their research results and papers early on. Their work helped
us a lot in shaping our paper. Special thanks go to Claus
Brenner and Daniel Eggert from the Institute of Cartography
and Geoinformatics for an insight into their MOOC platform.
REFERENCES
1. Barata, G., Gama, S., Jorge, J., and Gonçalves, D. Relating Gaming Habits with Student Performance in a Gamified Learning Experience. Proc. CHI PLAY '14.
Figure 12. We measured learning success with a final offline test. In this test, students wrote a short Python script to estimate confidence intervals of means. Three independent reviewers graded each submission on a scale from 5 (excellent) to 1 (underperformed). The grey lines indicate min and max values; the grey lines on top indicate significant differences: ∗∗: p < 0.01, ∗: p < 0.05, ': p < 0.1.
Cond.    n    M      95% CI
plain    32   2.50   [2.12, 2.87]
game     33   3.06   [2.69, 3.45]
social   36   3.50   [3.13, 3.86]
Table 6. Number of participants (n), average test score (M), and the 95% confidence interval of the distribution of means (CI is estimated from 10,000 bootstrap samples).
Source        df   SS       MS      F     p
(C)ondition    2   76.75    76.75   6.54  0.002
(G)ender       1   5.36     5.36    0.91  0.340
(M)ajor        1   15.92    15.92   2.71  0.101
CG             2   4.87     2.43    0.41  0.661
CM             2   27.74    13.87   2.36  0.097
CGM            2   28.06    14.03   2.39  0.094
Residual     194   113.81   5.86
Table 7. Interaction and main effects on final test performance. Abbreviations: df (degrees of freedom), SS (sum of squares), MS (mean squares).
2. Bartlett, M. Properties of sufficiency and statistical tests.
Proceedings of the Royal Statistical Society Series A,
160 (1937), 268–282.
3. Blumenfeld, P.C. Classroom learning and motivation: Clarifying and expanding goal theory. Journal of Educational Psychology 84, 3 (1992), 272–281.
4. Brinton, C., Chiang, M., Jain, S., Lam, H., Liu, Z., and
Wong, F. Learning about social learning in MOOCs:
From statistical analysis to generative model. IEEE
Trans. on Learning Technologies PP, 99 (2014), 1–14.
5. Cramer, E.M. and Bock, R.D. Multivariate Analysis.
Review of Educational Research 36, (1966), 604–617.
6. D’Agostino, R. and Pearson, E.S. Testing for departures
from normality. Biometrika 60, (1973), 613–622.
7. Dietz-Uhler, B., Fisher, A., and Han, A. Designing online courses to promote student retention. Journal of Educational Technology Systems 36, 1 (2007), 105–112.
8. Domínguez, A., Saenz-de-Navarrete, J., De-Marcos, L., Fernández-Sanz, L., Pagés, C., and Martínez-Herráiz, J.-J. Gamifying learning experiences: Practical implications and outcomes. Computers & Education 63, (2013), 380–392.
9. De Freitas, A.A. and de Freitas, M.M. Classroom Live: a software-assisted gamification tool. Computer Science Education 23, 2 (2013), 186–206.
10. Grünewald, F., Meinel, C., Totschnig, M., and Willems,
C. Designing MOOCs for the Support of Multiple
Learning Styles. In Scaling up Learning for Sustained
Impact. 2013, 371–382.
11. Holm, S. A simple sequentially rejective multiple test
procedure. Scandinavian Journal of Statistics 6, 2
(1979), 65–70.
12. Huberty, C.J. and Petoskey, M.D. Multivariate Analysis
of Variance and Covariance. In H. Tinsley and S.
Brown, eds., Handbook of applied multivariate statistics
and mathematical modeling. 2000.
13. Jordan, K. MOOC Completion Rates: The Data, 2013. http://www.katyjordan.com/MOOCproject.html.
14. Kapp, K.M. The Gamification of Learning and Instruction: Game-based Methods and Strategies for Training and Education. Pfeiffer, 2012.
15. Khalil, H. and Ebner, M. MOOCs Completion Rates and Possible Methods to Improve Retention - A Literature Review. World Conference on Educational Multimedia, Hypermedia and Telecommunications 2014, 1236–1244.
16. Kilian, N., Krause, M., Runge, N., and Smeddinck, J. Predicting Crowd-based Translation Quality with Language-independent Feature Vectors. Proc. HComp '12, 114–115.
17. Kizilcec, R. et al. Motivation as a Lens for Understanding Online Learners: Introducing and Applying the Online Learner Enrollment Intentions (OLEI) Scale. In press, 2014.
18. Kizilcec, R.F., Piech, C., and Schneider, E. Deconstructing disengagement. Proc. LAK '13, 170.
19. Koutropoulos, A., Gallagher, M.S., Abajian, S.C., et al. Emotive Vocabulary in MOOCs: Context & Participant Retention. European Journal of Open, Distance, and E-Learning, (2012), 1–22.
20. Krause, M. and Porzel, R. It is about time. Proc. CHI
EA ’13, 163–168.
21. Krause, M. A behavioral biometrics based authentication method for MOOC's that is robust against imitation attempts. Proc. L@S '14 (Work in Progress), 201–202.
22. Krause, M. Homo Ludens in the Loop: Playful Human Computation Systems. tredition GmbH, Hamburg, Germany, 2014.
23. Liu, Y., Alexandrova, T., and Nakajima, T. Gamifying intelligent environments. Proc. Ubi-MUI '11, 7–12.
24. Marache-Francisco, C. and Brangier, E. Perception of Gamification: Between Graphical Design. (2013), 558–567.
25. Martz, W.B., Reddy, V.K., and Sangermano, K. Looking for indicators of success for distance education. In Distance learning and university effectiveness: changing education paradigms for online learning. Information Science Pub, Hershey, PA, 2004.
26. Meyers, L., Gamst, G., and Guarino, A. Applied multivariate research: Design and interpretation. Sage Publishers, Thousand Oaks, CA, USA, 2006.
27. Oblinger, D. The next generation of educational engagement. Journal of Interactive Media in Education 8, (2004), 1–18.
28. Ryan, A.M. and Patrick, H. The Classroom Social Environment and Changes in Adolescents' Motivation and Engagement During Middle School. American Educational Research Journal 38, 2 (2001), 437–460.
29. Staubitz, T. and Renz, J. Supporting Social Interaction
and Collaboration on an xMOOC Platform. Proc.
EDULEARN14, 6667–6677.
30. Steinkuehler, C.A. Learning in Massively Multiplayer
Online Games. Proc. ICLS ’04, 521–528.
31. Stone, B. Serious gaming. Education and Training, (2001), 142–144.
32. Wilkowski, J., Deutsch, A., and Russell, D. Student skill and goal achievement in the Mapping with Google MOOC. Proc. L@S '14, 3–9.
33. Zichermann, G. and Cunningham, C. Gamification by
Design. 2011.
34. Zimmerman, B.J. A social cognitive view of self-regulated academic learning. Journal of Educational Psychology 81, 3 (1989), 329–339.