Scientific Contribution
ZFHE Vol. 12 / Issue 1 (March 2017), pp. 101-122
Mohammad KHALIL & Martin EBNER
Driving Student Motivation in MOOCs through
a Conceptual Activity-Motivation Framework
Massive Open Online Courses (MOOCs) require students’ commitment and
engagement to earn completion, certified, or passing status. This study presents
a conceptual Learning Analytics Activity-Motivation framework that aims at
increasing students’ activity in MOOCs. The proposed framework follows an
empirical data analysis of MOOC variables using different case studies. The
results of this analysis show that students who are more active within the offered
environment are more likely to complete MOOCs. The framework relies strongly on
direct gamified feedback that seeks to drive students’ inner motivation for
competency.

Keywords: Learning Analytics, Massive Open Online Courses (MOOCs), Motivation, Activity,
1 Introduction
Distance learning in the form of Massive Open Online Courses (MOOCs) has ex-
perienced a quantum leap in the development of open educational resources and
educational technology. The main advantage of such courses is that hundreds or
even thousands of students can enrol in one course, which is normally impossible in a
regular classroom setting. The story of MOOCs started with Siemens and Downes’
first course in 2008 and reached an early peak with Sebastian Thrun’s “Introduc-
tion to Artificial Intelligence” course, which attracted more than 160,000 students
from all over the world (YUAN & POWELL, 2013). Since then, MOOCs have
received significant attention from the media and educationalists for their potential
to extend the reach of education via technology.
MOOCs allow anyone to learn and interact through available technology-enhanced
learning materials and tools such as video lectures, recommended articles, content
downloads, discussion forums, assessments, etc. The two popular
MOOC models, cMOOCs (connectivist MOOCs) and xMOOCs (extended
MOOCs), deliver courses at a remarkably large scale in terms of enrollments, di-
versity of topics and geographical reach. In addition, the characteristics of the open
environment of MOOCs bring a distinct range of motivations and beliefs among
students (LITTLEJOHN et al., 2016). Because the direct interaction between
teachers and students does not reach the level of traditional face-to-face classroom
lectures, students must organize their own learning. Furthermore, MOOCs
differ from traditional settings in that students’ engagement objectives vary.
As a result, the need for students to self-regulate their learning, maintain their
personal motivation (ZIMMERMAN, 2000), and actively interact with online
learning objects (KHALIL, KASTL & EBNER, 2016) becomes crucial.
To foster such learning activities, newly adopted technologies in MOOCs allow
researchers to track students’ behavior (i.e. what they do and how they learn) by
examining stored information about student engagement with different digital
learning activities (e.g. videos, discussion forums, and quizzes). The collection of
such a high volume of student data transforms the conventional data analytics into
so-called “Big Data” analytics. Examining large data sets of student interactions
and activities with online learning platforms provides a valuable opportunity to
discover patterns and understand student behavior. To that end, there are two ap-
proaches for examining data in educational settings: Educational Data Mining
(EDM) and Learning Analytics (PAPAMITSIOU & ECONOMIDES, 2014). Rely-
ing on data analytics, both approaches share common goals of improving education
and optimizing learning environments.
Despite their great benefits, MOOCs face corresponding challenges that affect their
growth, especially in higher education courses: keeping students engaged and
motivated (KHALIL, TARAGHI & EBNER, 2016; XU & YANG, 2016), high
attrition rates, and unengaging pedagogical design (STACEY, 2014). Along the
same lines, Learning Analytics shows great potential when it meets MOOCs
(KNOX, 2014). The key benefits of Learning Analytics in
connection with MOOCs are embodied in predicting, visualizing, recommending,
personalizing, saving costs, and improving students’ engagement (SIEMENS &
BAKER, 2012).
In this research study, we aim at preserving students’ engagement (and participa-
tion) within the MOOC sphere. Based on the empirical data from the Austrian
MOOC platform iMooX, this study proposes a framework
to improve students’ activity by examining various MOOC indicators. The article
is outlined along the following hypothesis and research question:
Hypothesis: There is a relation between MOOC learning activities a stu-
dent performs and his/her retention till the end of the MOOC.
RQ: How could we motivate MOOC students to stay active during the
course?
The paper begins with a literature review on Learning Analytics and Learning Ana-
lytics of MOOCs as well as the related topics of students’ motivation and activity.
After that, we describe the methodology used to validate the hypothesis and answer
the research question with a description of the investigated dataset and its analysis.
Further, the proposed Activity-Motivation module that describes our scheme of
motivating students is discussed. At the end of the article, the key findings are
summarized.
2 Related Work
In order to provide the context of this study, this section gives a brief literature
review of Learning Analytics and Learning Analytics of MOOCs, and surveys
previous studies of motivation and activity in online learning environments.
2.1 Learning Analytics
The prominent field of Learning Analytics has been widely discussed since the first
international conference on Learning Analytics and Knowledge in 2011 (LAK’11).
While a plethora of definitions describe its objectives, the Learning
Analytics community has agreed to define it as “…the measurement, collec-
tion, analysis and reporting of data about learners and their contexts, for purposes
of understanding and optimizing learning and the environments in which it occurs”
(SIEMENS et al., 2011). Learning Analytics uses the data generated by
students in order to discover patterns and develop insights into their behaviors. The
borderless Internet and the increasing demand to reduce attrition as well as enhance
learning environments and learning are considered to be the major driving factors
behind the emergence and expansion of this field (SIEMENS et al., 2011;
PAPAMITSIOU & ECONOMIDES, 2014).
Acquiring and analyzing students’ data are no more than two stages that open an
iteration loop in the holistic Learning Analytics lifecycle. According to Doug Clow’s
(2012) paper “The learning analytics cycle: closing the loop effectively”, the Learn-
ing Analytics loop should be closed by turning the outcome of the analysis phase
into proper action(s). Accordingly, Learning Analytics encompasses four main
phases (KHALIL & EBNER, 2015): a) learners generate data, b) data gets pro-
cessed, c) results get interpreted, and finally d) actions are optimized.
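The four-phase loop can be sketched in a few lines of Python. This is only an illustrative sketch of the cycle described above, not the iLAP implementation; all function names, record fields, and the activity threshold are hypothetical:

```python
# Illustrative sketch of the four-phase Learning Analytics loop
# (KHALIL & EBNER, 2015). All names and thresholds are hypothetical.

def collect(learners):
    """a) Learners generate data: gather raw activity events."""
    return [{"learner": l, "events": l.get("events", [])} for l in learners]

def process(raw):
    """b) Data gets processed: aggregate events per learner."""
    return {r["learner"]["id"]: len(r["events"]) for r in raw}

def interpret(metrics, threshold=3):
    """c) Results get interpreted: flag learners with low activity."""
    return [lid for lid, count in metrics.items() if count < threshold]

def act(at_risk):
    """d) Actions are optimized: e.g. trigger feedback for flagged learners."""
    return {lid: "send motivational feedback" for lid in at_risk}

learners = [
    {"id": "s1", "events": ["login", "video", "quiz"]},
    {"id": "s2", "events": ["login"]},
]
actions = act(interpret(process(collect(learners))))
```

Closing the loop, in Clow's sense, means that the output of `act` feeds back into the environment the learners generate data in, so the cycle starts again.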
2.2 Learning Analytics of MOOCs
Software and online platforms usually record events in so-called log files. MOOC
platforms are web applications built with standard web technologies such as
HTML and JavaScript. MOOCs’ log files record
every event happening on the platform and on that basis, Learning Analytics appli-
cations of MOOCs are developed to mine and interpret such data in order to inter-
vene or predict actions (KHALIL & EBNER, 2016a).
Despite their great advantages in online learning, MOOCs face challenges of drop-
out, disengagement, and lack of motivation (KHALIL & EBNER, 2016b). For such
purposes, researchers have looked into finding adequate algorithms, tools, or ele-
ments to overcome these issues. There are various research studies and topics re-
garding Learning Analytics and MOOCs. For example, Kizilcec and his colleagues
focused on engagement among students in MOOCs (KIZILCEC, PIECH &
SCHNEIDER, 2013). They developed a classification model to identify a small
number of longitudinal engagement trajectories. Another study about engagement
is based on the examination of assignments and video lecture views (ANDERSON
et al., 2014). The authors clustered MOOC students into five subpopulations:
Viewers, Solvers, All-Rounders, Collectors, and Bystanders.
Other research studies utilized Learning Analytics to detect and intervene before a
student drops out of the course. For instance, a recent study by XING et al. (2016)
developed a mechanism to detect at-risk students through their activity in discus-
sion forums. The researchers used the decision tree as an Educational Data Mining
technique for building the at-risk detection algorithm. Researchers from HTW Ber-
lin were able to predict students’ success based on MOOC discussion forum by
building student profiles (KLÜSENER & FORTENBACHER, 2015).
2.3 Student Activity and Motivation
Recent work on MOOCs revealed a steep attrition in activities during the first two
weeks (BALAKRISHNAN & COETZEE, 2013). The authors reported a 50%
dropout at the end of the second week. Some researchers suggested cutting course
duration in half (LACKNER, EBNER & KHALIL, 2015). Others pushed the con-
cept of grabbing students’ attention by looking into boosting the extrinsic factors
such as offering badges, certificates and honor awards (WÜSTER & EBNER,
2016). Researchers from Northeastern University of China have noticed that the
activities performed by students in the MOOC platform reflect their motivation
(XU & YANG, 2016). The authors concluded that there is a strong relation between a
learner’s behavior and his/her motivation. From there, they tried to find a rela-
tion between grade prediction and certification ratio along with their activities in
the MOOC through a developed classification model. While such prediction might
be hard to examine because of the complex nature of the predictive models (KLÜ-
SENER & FORTENBACHER, 2015), others used online surveys and semi-
structured interviews to identify learners’ motivation (LITTLEJOHN et al., 2016).
Further research about understanding the motivation of online learners in MOOCs
can be found in the article by Stanford University researchers who listed 13 factors
that could captivate learners’ motivation (KIZILCEC & SCHNEIDER, 2015).
Despite their benefits, online surveys may miss the proper target group and can
yield inaccurate results.
This study was further influenced by a couple of Learning Analytics applications,
which were considered in our proposed framework. One of these tools was Course
Signals (ARNOLD & PISTILLI, 2012). It is an application that provides feedback
according to the traffic light system. Whenever a green light is shown, it means that
the student is on track, whereas the orange and the red lights imply at-risk situa-
tions and intervention(s) by either a teacher or an institution would be required.
The literature review described previously is strongly related to our research. By
examining engagement and activity either in the discussion forums or video events,
this research study leverages the data from MOOC variables to preserve students’
activities and motivate them to stay engaged. The existing literature, however,
provides very little research regarding direct Learning Analytics feedback for
students on MOOC platforms. Although showing statistics or gamification elements
to students as a motivation factor is commonly available in most online environments,
to the best of our knowledge, we could rarely find a module that aims at preserving
students’ activity in MOOCs while providing them with direct feedback.
3 Methodology
Our methodology focused on obtaining data from the following MOOC variables:
watching video lectures, login frequency, posts in forum, reading of forum posts,
and quizzes in order to identify a competent activity level. Subsequently, we
conducted an exploratory analysis of the empirical data, including pattern
finding in visualizations. Data collection was performed using
the iMooX Learning Analytics Prototype (iLAP). iLAP is a Learning Analytics
application developed to track students on the iMooX MOOC-platform to improve
online learning and to provide a rich repository of data for research purposes
(KHALIL & EBNER, 2016a). When students log into the MOOC platform, the
database starts to be filled with low-level data related to students’ performance and
behavior. Every action performed is recorded, saved and filtered for a large scale
processing phase. We followed the content analysis methodology (NEUENDORF,
2002), in which these variables were measured and referenced to answer the re-
search question. The study also employed WANG and HANNAFIN’s (2005) de-
sign-based research methodology that depends on identifying goals, collecting data
during the whole design process, and refining according to the required goals.
3.1 Dataset Description
The iMooX MOOC-platform offers various courses targeting people from German-
speaking countries, from secondary school level to Higher Education and beyond.
As a case study, we have chosen a MOOC called “Gratis Online Lernen” which
translates to “Free Online Learning”, abbreviated in this article as GOL-2014 and
GOL-2015 (EBNER, SCHÖN & KÄFMÜLLER, 2015). The course was offered
in two consecutive iterations, in 2014 and 2015, and educates people about
using the Internet for learning. The MOOCs’ duration was set to 8 weeks with a
workload of 2 hours/week (in total, 16 hours). Students had to score 50% in every
weekly quiz in order to pass. The MOOC platform offers self-assessment quizzes
in which every test can be repeated up to five times, with the highest grade among
the attempts counted.
The main content of the courses was video lectures with an average duration of 5
minutes per video. Students were rewarded with certificates after they successfully
passed all the quizzes.
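The grading and passing rules just described can be sketched as follows. This is an illustrative sketch under the assumptions stated in the text (up to five attempts per weekly quiz, highest grade counts, 50% needed in every weekly quiz), not the actual iMooX code:

```python
MAX_ATTEMPTS = 5   # each weekly quiz may be attempted up to five times
PASS_MARK = 50.0   # students must score at least 50% in every weekly quiz

def quiz_grade(attempts):
    """Return the counted grade: the highest of at most five attempts."""
    if not attempts:
        return 0.0
    return max(attempts[:MAX_ATTEMPTS])

def passed_course(weekly_attempts):
    """A student passes (and earns a certificate) when every weekly
    quiz reaches the pass mark."""
    return all(quiz_grade(a) >= PASS_MARK for a in weekly_attempts)

# Hypothetical example: 8 weekly quizzes; week 2 needed three attempts.
grades = [[80], [30, 45, 60], [55], [70], [90], [50], [65], [75]]
```

With these example grades, week 2 counts as 60 (the best of three attempts), so the student passes all eight quizzes.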
4 Dataset Analysis
In this section, we validate the hypothesis by examining whether certified students
show more activity across MOOC variables (forums and videos). For this
purpose, we chose to analyse the following three MOOC variables:
posts in forum, views in forum and video lectures for MOOCs GOL-2014 and
GOL-2015. We split the students into two categories: certified and non-certified.
The first group includes those who completed a MOOC and therefore received a
certificate at the end of the course, while the second group includes the students
who dropped out of the MOOC at any time during the course. The certified stu-
dents in GOL-2014 and GOL-2015 were (N= 193, N= 117) respectively, while the
non-certified students in GOL-2014 and GOL-2015 were (N= 810, N= 359). The
analysis results in the following subsections indicate that learning activities have
a considerable impact on students’ persistence in a massive open online course.
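The per-group, per-week averages reported in the following subsections can be computed as sketched below. The event records and field names are hypothetical, not the iLAP schema:

```python
from collections import defaultdict

# Hypothetical iLAP-style event records: (student_id, week, action).
events = [
    ("s1", 1, "forum_view"), ("s1", 1, "forum_view"), ("s1", 2, "forum_view"),
    ("s2", 1, "forum_view"),
]
certified = {"s1"}  # students who earned a certificate

def avg_views_per_week(events, group, weeks):
    """Average forum-thread views per student in `group`, for each week."""
    counts = defaultdict(int)
    for sid, week, action in events:
        if sid in group and action == "forum_view":
            counts[week] += 1
    n = max(len(group), 1)  # avoid division by zero for an empty group
    return {w: counts[w] / n for w in weeks}

weeks = range(1, 3)
cert_avg = avg_views_per_week(events, certified, weeks)   # {1: 2.0, 2: 1.0}
non_cert_avg = avg_views_per_week(events, {"s2"}, weeks)  # {1: 1.0, 2: 0.0}
```

The same aggregation applies to forum posts and video events by filtering on a different `action` value.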
4.1 Forum Readings Analysis
During the 8 weeks of forum discussions, there were 22,565 views of forum
threads in GOL-2014 and 8,214 views of forum threads in GOL-2015. Figure 1a
and figure 1b show the average number of thread views for both MOOCs. The
difference between the reading activity of the two groups is quite obvious. Figure
1a depicts a maximum number of reads in week 1 for both groups, which rapidly
drops until week 4. This is in line with the observation that the attrition rate becomes more
stable after the first four weeks of a MOOC (LACKNER, EBNER & KHALIL,
2015). However, in figure 1b, we observed that certified students’ forum views
escalated in week 5 and then dropped to around 4 views per user until the end of the
MOOC. A study by Wong and his colleagues recorded similar student behavior
(WONG et al., 2015). The authors revealed that active users showed higher activity
after the first weeks of the MOOC.
Figure 1a: The average number of discussion forum views in GOL-2014 MOOC
Figure 1b: The average number of discussion forum views in GOL-2015 MOOC
4.2 Forum Posts Analysis
In total, in GOL-2014 there were 828 and in GOL-2015 there were 408 posts writ-
ten in the respective forum. These posts took the forms of comments, threads, and
replies. Figure 2a and Figure 2b illustrate the average number of written posts in
both MOOCs’ forums.
Figure 2a: The average number of discussion forum posts in GOL-2014 MOOC
Figure 2b: The average number of discussion forum posts in GOL-2015 MOOC
In fact, it is apparent that certified students are more active in posting and com-
menting in MOOC forums. In Figure 2a, the average number of contributions is
very low after the fourth week. There are various reasons for this, such as the steep
drop-out rate after the first weeks (see Figure 3) or the low motivation to contribute
and comment (MANNING & SANDERS, 2013).
Figure 3: Remarkable attrition in activities after the first weeks of the GOL-2014
and GOL-2015 MOOCs
4.3 Video Lectures Analysis
The third MOOC variable we analyzed was video lectures. Video content was
hosted on YouTube; however, iLAP can only mine participants’ play and
pause/stop events that occur on the iMooX platform. We summed up the total
number of video interactions and showed the average number of events (play,
pause, and full-watch) per week. There were 17,384 video events in GOL-2014 and
8,102 video events in GOL-2015. Figure 4a and Figure 4b show a graph line of
learner interactions in GOL-2014 and GOL-2015.
Figure 4a: The average number of video events in GOL-2014 MOOC
Figure 4b: The average number of video events in GOL-2015 MOOC
The figures displayed above show that the average number of video events of certi-
fied students is markedly higher than that of non-certified students, who show
weak video lecture activity.
4.4 Data Analysis Summary
The seven figures in the previous subsections showed that certified students demon-
strated higher activity throughout the MOOC weeks. With regard to students’ forum
activities, we found an obvious gap between certified and non-certified
students that supports our hypothesis. Motivated students are more
likely to engage in discussion forums (LACKNER, EBNER & KHALIL, 2015).
Gilly Salmon identified four learner strategies in online discussions: (1) “Some do
not try to read all messages.” (2) “Some remove themselves from conferences of
little or no interest to them, and save or download others.” (3) “Others try to read
everything and spend considerable time happily online, responding where appro-
priate.” (4) “Yet others try to read everything but rarely respond.” (SALMON,
2007). The data presented in sections 4.1 and 4.2 correspond to Salmon’s learner
types 1, 2 and 4. Non-active students do not ask questions or comment in the fo-
rums. Presumably, certified students are more likely to post questions to ask a
teacher or colleague for help, which means they are more active in forums.
The video analysis also showed the difference between certified and non-certified
students. As MOOCs rely on videos, students need to watch them in order to pass
quizzes. Thus, active students who want to pass quizzes need to watch videos, ex-
cept for some cases where students try to game the MOOC system (KHALIL &
EBNER, 2016c).
5 Proposed Activity-Motivation Framework
Based on the previous analysis results and the impact of activities on students’
motivation to complete a MOOC, we propose an Activity-Motivation framework
that motivates learners to perform more activities. We designed this framework in
correspondence to the iMooX MOOC-platform’s potential of offering variables such as
quiz attempts, logins, forum posts and views, and the tested empirical data in sec-
tion 4. The Activity-Motivation framework intends to assist in increasing students’
motivation and engagement.
Figure 5: The Activity-Motivation framework
The proposed model is shown above in Figure 5 and consists of four main
dimensions. Each dimension contributes a portion to a gamification element. We
chose a battery for this element because it expresses a “filling up” animation: what
happens to a battery resembles what a student does with MOOC activities. We aim
to keep the students charged with activity,
motivation, and incentive. Gamification elements have a positive impact on student
motivation and learning (GONZALEZ, TOLEDO & MUNOZ, 2016).
The four dimensions of the proposed framework are: login, video, quiz, and forum.
It is worth noting that these dimensions are not exhaustive and can be extended.
For instance, extra dimensions involving readable content, such as a downloadable
article or assignments, can be included when required. The gamification element
was divided into four segments based on the number of selected
MOOC variables. The proposed Activity-Motivation model can be implemented as
a plugin or as an independent tab on the MOOC page and would be updated on a
weekly basis. In the following paragraphs, we briefly elaborate on the model
and describe its mechanism. As seen below, each element counts for a 25%
charged portion in the battery:
Login: When a student logs into the MOOC, this is reflected in the
gamification element (the battery): its first segment becomes 25%
charged. Additional logins will not increase the charged portion.
Video: The second dimension is the video lecture. When a student interacts
with the MOOC video lectures and completes a number of predefined
events, the battery is charged one portion further.
Quiz: The battery will be filled with one extra portion when a student does
a quiz. As previously described, the iMooX MOOC-platform allows each stu-
dent to attempt every weekly quiz up to five times. However, a single trial
is enough to indicate that the student is active. As with the previous
dimensions, several attempts will not increase the battery’s charged portion.
Forum: The analysis in sections 4.1 and 4.2 showed the relation between discus-
sion forums and student activity. Being engaged in the forums, either by
writing or reading threads, will increase the battery’s charged portion.
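The weekly charging mechanism described by the four dimensions above can be sketched as follows. This is an illustrative sketch of the proposed framework, not an iMooX implementation; the event names are assumptions:

```python
# Sketch of the Activity-Motivation battery: four dimensions, 25% each.
# A dimension charges its segment once per week, regardless of repetitions.

DIMENSIONS = ("login", "video", "quiz", "forum")
SEGMENT = 25  # each dimension contributes a 25% charged portion

def battery_charge(weekly_events):
    """Return the battery charge (0-100%) for one student-week.

    `weekly_events` is an iterable of event names, e.g. from a platform log.
    Repeated events of the same dimension do not add extra charge.
    """
    active = set(weekly_events) & set(DIMENSIONS)
    return len(active) * SEGMENT

# Example week: two logins, one video session, one quiz attempt, no forum use.
charge = battery_charge(["login", "login", "video", "quiz"])  # 75
```

Because the charge is recomputed weekly, a fully charged battery always represents activity in all four dimensions during the current week, not cumulative activity over the course.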
6 Discussion and Conclusion
Massive Open Online Courses (MOOCs) are a new trend in the domain of Tech-
nology-Enhanced Learning. Higher Education institutions have come under pres-
sure to adopt an accessible and open educational environment. MOOCs provide
such an opportunity, yet, there are issues regarding drop-out rate, engagement with
MOOC elements, the interaction between students and instructors as well as moti-
vation. On the other hand, Learning Analytics offers techniques and tools to predict
and intervene to enhance both the learning and the educational environment.
In this research study, we utilized analysis techniques on students’ data in order to
investigate the hypothesis of the relation between students’ activities and retention
in MOOCs. We found that certified students participated in more activities than
the non-certified students. Certified
students engaged more in discussion forums; they viewed more forum posts and
wrote more frequently than non-certified students. Additionally, they interacted
more with video lectures. In fact, the seven figures in section 4 show that certified
students demonstrated consistently higher activity throughout the MOOC weeks.
These results support the hypothesis that the more activities students perform, the
more likely they are to complete the MOOC.
Based on the validation of this hypothesis, we proposed an Activity-Motivation
model with the aid of Learning Analytics techniques and a gamification element.
The framework was built on the analysis results of this study, drawing on the
available MOOC indicators. The proposed framework can be extended with
extra MOOC indicators and is easy to implement and adopt in similar MOOC plat-
forms. While we agree that the didactical approaches and the intrinsic factors of
MOOCs can affect students’ motivation, we also strongly believe in the need to
develop such a model to spur students’ motivation for competency.
7 References
Arnold, K. E., & Pistilli, M. D. (2012). Course signals at Purdue: using learning
analytics to increase student success. In Proceedings of the 2nd International
Conference on Learning Analytics and Knowledge (pp. 267-270). Vancouver,
Canada: ACM.
Anderson, A., Huttenlocher, D., Kleinberg, J., & Leskovec, J. (2014). Engaging
with massive online courses. In Proceedings of the 23rd international conference on
World wide web (pp. 687-698). Seoul, Korea: ACM.
Balakrishnan, G. & Coetzee, D. (2013). Predicting student retention in massive
open online courses using hidden Markov models. Tech. Rep. UCB/EECS-2013-
109, University of California.
Clow, D. (2012). The learning analytics cycle: closing the loop effectively. In
Proceedings of the 2nd international conference on learning analytics and
knowledge (pp. 134-138). Vancouver, Canada: ACM.
Ebner, M., Schön, S., & Käfmüller, K. (2015). Inverse Blended Learning bei
„Gratis Online Lernen“ – über den Versuch, einen Online-Kurs für viele in die
Lebenswelt von EinsteigerInnen zu integrieren. In N. Nistor, & S. Schirlitz (Eds.),
Digitale Medien und Interdisziplinarität (pp. 197-206). Waxmann, Medien in der
Wissenschaft Bd 68.
Gonzalez, C. S., Toledo, P., & Munoz, V. (2016). Enhancing the Engagement of
Intelligent Tutorial Systems through Personalization of Gamification. International
Journal of Engineering Education, 32(1), 532-541.
Khalil, M., & Ebner, M. (2015). Learning Analytics: Principles and Constraints. In
Proceedings of World Conference on Educational Multimedia, Hypermedia and
Telecommunications (pp. 1789-1799). AACE.
Khalil, M., & Ebner, M. (2016a). What Massive Open Online Course (MOOC)
Stakeholders Can Learn from Learning Analytics? In M. Spector, B. Lockee, & M.
Childress (Eds.), Learning, Design, and Technology: An International Compendium
of Theory, Research, Practice, and Policy (pp. 1-30). Springer International
Publishing.
Khalil, M., & Ebner, M. (2016b). Learning Analytics in MOOCs: Can Data Improve
Students Retention and Learning?. In Proceedings of the World Conference on
Educational Media and Technology (EdMedia 2016) (pp. 569-576). Vancouver,
Canada: AACE.
Khalil, M., & Ebner, M. (2016c). Clustering patterns of engagement in Massive
Open Online Courses (MOOCs): the use of learning analytics to reveal student
categories. Journal of Computing in Higher Education, 1-19.
Khalil, M., Taraghi, B., & Ebner, M. (2016). Engaging Learning Analytics in
MOOCs: the good, the bad, and the ugly. In Proceedings of the International
Conference on Education and New Developments (END 2016) (pp. 3-7). Ljubljana,
Slovenia.
Khalil, M., Kastl, C., & Ebner, M. (2016). Portraying MOOCs Learners: a
Clustering Experience Using Learning Analytics. In M. Khalil, M. Ebner, M. Kopp,
A. Lorenz, & M. Kalz (Eds.), Proceedings of the European Stakeholder Summit on
experiences and best practices in and around MOOCs (EMOOCS 2016) (pp. 265-
278). Graz, Austria.
Kizilcec, R. F., Piech, C., & Schneider, E. (2013). Deconstructing
disengagement: analyzing learner subpopulations in massive open online courses.
In Proceedings of the third international conference on learning analytics and
knowledge (pp. 170-179). Leuven, Belgium: ACM.
Kizilcec, R. F., & Schneider, E. (2015). Motivation as a lens to understand online
learners: Toward data-driven design with the OLEI scale. ACM Transactions on
Computer-Human Interaction (TOCHI), 22(2), 6. Chicago.
Klüsener, M., & Fortenbacher, A. (2015). Predicting students’ success based on
forum activities in MOOCs. In the 8th International Conference on Intelligent Data
Acquisition and Advanced Computing Systems: Technology and Applications (pp.
925-928). Warsaw, Poland: IEEE.
Knox, J. (2014). From MOOCs to Learning Analytics: Scratching the surface of the
‘visual’. eLearn 2014, 11.
Lackner, E., Ebner, M., & Khalil, M. (2015). MOOCs as granular systems: design
patterns to foster participant activity. eLearning Papers, 42, 28-37.
Littlejohn, A., Hood, N., Milligan, C., & Mustain, P. (2016). Learning in MOOCs:
Motivations and self-regulated learning in MOOCs. The Internet and Higher
Education, 29, 40-48.
Manning, J., & Sanders, M. (2013). How widely used are MOOC forums? A first
look. Retrieved July 02, 2016, from
Neuendorf, K. A. (2002). The content analysis guidebook. Vol. 300. Thousand
Oaks. CA: Sage Publications.
Papamitsiou, Z. K., & Economides, A. A. (2014). Learning Analytics and
Educational Data Mining in Practice: A Systematic Literature Review of Empirical
Evidence. Educational Technology & Society, 17(4), 49-64.
Siemens, G., & Baker, R. S. J. D. (2012). Learning analytics and educational data
mining: towards communication and collaboration. In Proceedings of the 2nd
international conference on learning analytics and knowledge (pp. 252-254).
Vancouver, Canada: ACM.
Siemens, G., Gasevic, D., Haythornthwaite, C., Dawson, S., Shum, S. B.,
Ferguson, R., ... & Baker, R. S. J. D. (2011). Open Learning Analytics: an
integrated & modularized platform. Retrieved June 30, 2016, from
Stacey, P. (2014). The Pedagogy of MOOCs. The International Journal for
Innovation and Quality in Learning, 2(3), 111-115.
Wang, F., & Hannafin, M. J. (2005). Design-based research and technology-
enhanced learning environments. Educational technology research and
development, 53(4), 5-23.
Wong, J. S., Pursel, B., Divinsky, A., & Jansen, B. J. (2015). An Analysis of
MOOC Discussion Forum Interactions from the Most Active Users. In International
Conference on Social Computing, Behavioral-Cultural Modeling, and Prediction
(pp. 452-457). Springer International Publishing.
Wüster, M., & Ebner, M. (2016). How to integrate and automatically issue Open
Badges in MOOC platforms. In M. Khalil, M. Ebner, M. Kopp, A. Lorenz, & M. Kalz
(Eds.), Proceedings of the European Stakeholder Summit on experiences and best
practices in and around MOOCs (EMOOCS 2016) (pp.279-286). Graz, Austria.
Xing, W., Chen, X., Stein, J., & Marcinkowski, M. (2016). Temporal predication
of dropouts in MOOCs: Reaching the low hanging fruit through stacking
generalization. Computers in Human Behavior, 58, 119-129.
Xu, B., & Yang, D. (2016). Motivation classification and grade prediction for MOOCs learners. Computational Intelligence and Neuroscience, 2016, 4.
Yuan, L., & Powell, S. (2013). MOOCs and Open Education: Implications for Higher Education. CETIS White Paper. Retrieved from
Zimmerman, B. (2000). Attaining self-regulation: A social cognitive perspective. In
M. Boekaerts, M. Zeidner, & P. Pintrich (Eds.), Handbook of self-regulation (pp. 13-
39). San Diego, CA: Academic Press.
Salmon, G. (2007). E-moderating: The key to teaching and learning online. Abingdon: RoutledgeFalmer.
Mohammad KHALIL, Graz University of Technology, Educational Technology, Münzgrabenstraße 35a, A-8010 Graz
Adjunct Prof. PhD. Martin EBNER, Graz University of Technology, Educational Technology, Münzgrabenstraße 35a, A-8010 Graz