Draft. Originally published as: Khalil, M., Ebner, M. (2016). When Learning Analytics Meets MOOCs - a Review on iMooX Case Studies. In: Fahrnberger, G., Eichler, G., Erfurth, C. (eds.) Innovations for Community Services: 16th International Conference, I4CS 2016, Vienna, Austria, June 27-29, 2016, Revised Selected Papers, pp. 3-19. Springer International Publishing, Cham. DOI: 10.1007/978-3-319-49466-1_1
When Learning Analytics Meets MOOCs - a Review on
iMooX Case Studies
Mohammad Khalil and Martin Ebner
{Mohammad.khalil, martin.ebner}
Educational Technology, Graz University of Technology, Graz, Austria
Abstract. The field of Learning Analytics has proven to provide various solutions to online educational environments. Massive Open Online Courses (MOOCs) are among the most rapidly emerging of these environments, and their substantial growth attracts researchers from the analytics field to examine the rich data repositories they provide. The present paper contributes a brief literature review of both prominent fields. Further, the authors give an overview of the Learning Analytics application they developed and show the potential of Learning Analytics for tracking students in MOOCs, using empirical data from iMooX.
Keywords: Learning Analytics · Massive Open Online Courses (MOOCs) ·
Completion Rate · Literature · Engagement · Evaluation · Prototype
1 Introduction
The growth of Massive Open Online Courses (MOOCs) in the modern era of online learning has seen millions of enrollments from all over the world. MOOCs are defined as online courses that are open to the public, with open registration and open-ended outcomes, requiring no prerequisites or fees [23]. These courses have brought drastic change to higher education on one side and to elementary education on the other [13]. The number of offered MOOCs has exploded in recent years: by January 2016, there were over 4500 courses with 35 million learners across 12 MOOC providers [25]. Some of these courses are provided by prestigious and renowned universities such as Harvard, MIT, and Stanford. At the same time, other institutions have joined the MOOC hype and become providers for their own local universities, like the Austrian MOOC platform iMooX.
It is important to realize that MOOCs have split into two major types: cMOOCs and xMOOCs. cMOOCs are based on the philosophy of connectivism, which is about creating networks of learning [27]. The term xMOOC, on the other hand, is short for extended MOOC and refers to courses based on classical information transmission [10]. Further, new types of online courses related to MOOCs have emerged recently, such as Small Private Online Courses (SPOCs) and Distributed Open Collaborative Courses (DOCCs).
MOOCs have the potential to scale education across different fields and subjects. The study in [25] showed that computer science and programming grabbed the largest share of the offered courses, yet substantial growth of MOOCs has also been noticed in the Science, Technology, Engineering, and Mathematics (STEM) fields. The anticipated benefits of MOOCs range from business purposes, such as saving costs, to improving the pedagogical and educational concepts of online learning [16]. Nevertheless, there is still debate about the pedagogical approach to delivering information to students. The quality of the offered courses, the completion rate, the lack of interaction, and the grouping of students in MOOCs have additionally been debated recently [4, 12, 17].
Since MOOCs are an online learning environment, the educational process is based largely on video lecturing. Learning in MOOCs is not exclusive to that, however: social networking and active engagement are major factors too [23]. Content such as topics, articles, or documents is also considered supporting material in the learning process.
While MOOC providers initialize and host online courses, the hidden part lies in recording learners' activities. Nowadays, ubiquitous technologies have spread among online learning environments, and tracking students online has become much easier. The pressing need to ensure that the audience of eLearning platforms is getting the most out of the online learning process, together with the need to study their behavior, leads to what is called "Learning Analytics". One of its key aspects is identifying trends, discovering patterns, and evaluating learning environments, MOOCs being a case in point. Khalil and Ebner listed factors that have driven the expansion of this emerging field [14]: a) the spread of technology across educational sectors, b) the "big data" available from learning environments, and c) the availability of analytical tools.
In this research publication, we discuss the potential of the collaboration between Learning Analytics and MOOCs. There have been various discussions among researchers from different disciplines regarding these apparent trends. For instance, Knox said that "Learning Analytics promises a technological fix to the long-standing problems of education" [19]. Accordingly, we outline our experience in both fields over recent years and survey the up-to-date related work. Further, different scenarios and analyses from MOOCs offered on iMooX are discussed using the iMooX Learning Analytics Prototype. At the end, we list the interventions we propose to adopt in upcoming MOOCs.
This publication is organized as follows: Section 2 covers literature and related work. In Section 3, we present a systematic mapping from the Scopus library to understand what has been researched in Learning Analytics of MOOCs. Section 4 covers the iMooX Learning Analytics Prototype, while Section 5 covers case studies and the analytics outcomes derived from the empirically provided data.
2 Literature Review
2.1 MOOCs
The new technologies of the World Wide Web, mobile development, social networks, and the Internet of Things have advanced traditional learning. eLearning and Technology Enhanced Learning (TEL) have given rise to new models of learning environments such as Personal Learning Environments (PLE), Virtual Learning Environments (VLE), and MOOCs. Since 2008, MOOCs have held a valuable position in educational practice. Non-profit platforms like edX and for-profit platforms like Coursera have attracted millions of students. As long as they only require an Internet connection and the intention to learn, MOOCs are considered a boon for Open Educational Resources (OER) and the lifelong learning orientation [7].
Despite all these benefits, MOOCs suffer from several issues. Dropout, the failure to complete courses, is considered one of the biggest. Katy Jordan showed that the completion rate of many courses barely reached 10% [11]. The reasons given include poor course design, loss of motivation, courses taking too much time, lack of interaction, and the assumption of too much prior knowledge [16, 21]. Fetching other issues of MOOCs from available empirical data is discussed later in this paper.
2.2 Learning Analytics
Learning Analytics first saw the light in 2011, and a plethora of definitions have been used since then. The trend is strongly associated with previously well-known topics such as web analytics, academic analytics, data analysis, and data mining, as well as psychometrics and educational measurement [2]. Learning Analytics mainly targets educational data sets from modern online learning environments where learners leave traces behind. The process then includes searching, filtering, mining, and visualizing data in order to retrieve meaningful information.
Learning Analytics involves different key methods of analysis, varying from data mining, statistics and mathematics, text analysis, visualizations, and social network analysis to qualitative and gamification techniques [15, 26]. The aims of Learning Analytics, on the other hand, diversify between different frameworks, but most of them agree on common goals. Regardless of the learning environment, Papamitsiou and Economides showed that studies of Learning Analytics focus on the pedagogical analysis of behavior modeling, performance prediction, participation, and satisfaction [26]. Its benefits are utilized in prediction, intervention, recommendation, personalization, evaluation, reflection, monitoring, and assessment improvement [3, 9, 14]. In fact, these goals are considered useless without optimizing and refining them and bringing their full power to bear on stakeholders [5].
2.3 MOOCs and Learning Analytics
Learners in online learning environments such as MOOCs are not only consumers but also generators of data [14]. Lately, research on the behavior of online students in MOOCs has spread widely across journals and conferences. A recent survey study on Learning Analytics by Khalil and Ebner showed that the highest numbers of citations on Google Scholar were for MOOC articles [15]. They listed the most common techniques used by Learning Analytics in MOOCs, varying from machine learning, statistics, information visualization, Natural Language Processing (NLP), and social network analysis to gamification tools. Moissa and her colleagues noted that Learning Analytics in MOOCs is still not deeply researched in the literature [24]. We also found this to hold in the next section.
3 Learning Analytics of MOOCs
In this section, we performed a brief text analysis and mapped the screening of abstracts from the Scopus database in order to:
1. Grasp what has been researched in Learning Analytics of MOOCs.
2. Realize the main research trends of the current literature of Learning Analytics and MOOCs.
Scopus is a database powered by Elsevier Science. We selected this library because of the valuable indexing information it provides and the usability of performing search queries. The literature exploration was performed by searching for the following keywords: "Learning Analytics" and "MOOC", "MOOCs" or "Massive Open Online Course". The query used to retrieve the results was executed on 11 April 2016 and is shown in figure 1. The language was restricted to English only.
Fig. 1. Search query to conduct the literature mapping
The search returned 80 papers: one paper from 2011, none from 2012, 11 papers from 2013, 23 from 2014, 37 from 2015, and 8 from 2016. Abstracts were then extracted and processed into a Comma-Separated Values (CSV) file. After that, we created a word cloud to represent the text data and identify the most prominent terms. Figure 2 depicts the word cloud of the extracted abstracts. We looked at the common single terms, bi-grams, tri-grams, and quad-grams. The most repeated single words were "MOOCs", "education", and "engagement". "Learning Analytics", "Online Courses", and "Higher Education" were recorded as the prominent bi-grams, while "Khan Academy platform" and "Massive Open Online Courses" topped the tri-grams and quad-grams respectively. Because massive open online courses are represented by different terms in the abstracts, we abbreviated all such terms to "MOOCs" in the corpus.
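As an illustration, the n-gram counting behind such a word cloud can be sketched in a few lines of Python. The mini-corpus here is invented for the example; the real input was the 80 Scopus abstracts, which are not reproduced in this paper.

```python
import re
from collections import Counter

def ngram_counts(texts, n):
    """Count word n-grams across a list of abstracts (case-folded word tokens)."""
    counts = Counter()
    for text in texts:
        tokens = re.findall(r"[a-z]+", text.lower())
        counts.update(tuple(tokens[i:i + n]) for i in range(len(tokens) - n + 1))
    return counts

# Two invented abstracts standing in for the Scopus export.
abstracts = [
    "Learning analytics in massive open online courses supports engagement.",
    "Massive open online courses generate data for learning analytics.",
]
bigrams = ngram_counts(abstracts, 2)
print(bigrams.most_common(3))
```

The normalization step described above (mapping all variant spellings of "massive open online course" to "MOOCs") would be applied to `tokens` before counting.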
Figure 3 shows the most frequent phrases fetched from the text. Figures 2 and 3 reveal interesting observations about the researched topics of Learning Analytics in MOOCs. By doing a simple grouping of the topics and disregarding the main phrases, "Learning Analytics" and "MOOCs", we found that researchers were looking mostly at engagement and interactions.
Fig. 2. Word cloud of the most prominent terms from the abstracts
Fig. 3. The most frequent terms extracted from the abstracts
It was quite interesting that dropout and completion rate were not the major topics, as we had believed. Design and framework principles, as well as assessment, were the second most cited terms. Social factors, learning, and discussions grabbed attention next, while tools and methods were mentioned to show the mechanisms used in offering solutions and case studies.
4 Learning Analytics Prototype of iMooX
The analyses of this study are based on the different courses provided by the Austrian MOOC provider iMooX. The platform was first launched in 2013 as a cooperation between the University of Graz and Graz University of Technology [20]. iMooX offers German-language courses in different disciplines and provides free certificates for students who successfully complete a course.
A MOOC platform cannot be considered a real modern technology enhanced learning environment without a tracking approach for analysis purposes [16]. Tracking the traces students leave on MOOC platforms with a Learning Analytics application is essential to enhance the educational environment and understand students' needs. iMooX followed these steps and applied an analytical approach called the "iMooX Learning Analytics Approach" to track students for research purposes. It embodies the functionality to interpret low-level data and present them to administrators and researchers. The tool is built on the architecture of the Learning Analytics framework presented earlier by the authors [14]. Several goals were anticipated, mainly the intention to use data from the iMooX enterprise, examine what is happening on the platform, and render useful decisions upon interpretation.
4.1 Design Ontology
The tool is designed to integrate with the data generated from MOOCs. The large number of courses and participants in MOOCs creates a huge amount of low-level data related to students' performance and behavior [1]. For instance, low-level data such as the number of students who watched a certain video can be used to infer valuable information, for example about boring segments [30].
In order to fulfill our proposed framework, we divided the design architecture of the prototype into four stages, depicted in Figure 4. Briefly summarized, the first stage is the generation of the data: log files are generated when a student enrolls in a course, watches videos, discusses topics in forums, does quizzes, and answers evaluations. The next stage applies suitable data management and administration, stamping a time-referenced description onto every interaction. Parsing the log files and processing them, such as filtering unstructured data and mining keywords from bulk text, occurs in the third stage. Finally, the fourth stage is the visualization part, where the processed data are displayed to administrators and researchers.
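The middle stages of this pipeline can be sketched as follows. The log format and action names are invented for illustration; the prototype's actual schema is not published here.

```python
import csv
import io
from collections import defaultdict

# Hypothetical stage-2 output: time-stamped interaction records, one per line.
RAW_LOG = """\
2016-03-01T18:05:12,u17,watch_video
2016-03-01T18:40:02,u17,forum_read
2016-03-02T19:10:45,u23,quiz_attempt
2016-03-02T20:01:03,u17,quiz_attempt
"""

def aggregate(log_text):
    """Stage 3: parse the stamped records and count actions per student."""
    counts = defaultdict(lambda: defaultdict(int))
    for timestamp, student, action in csv.reader(io.StringIO(log_text)):
        counts[student][action] += 1
    return counts

# Stage 4 would hand these per-student counts to the dashboard charts.
print(dict(aggregate(RAW_LOG)["u17"]))
```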
Fig. 4. The iMooX Learning Analytics Prototype design architecture [16]
4.2 Implementation Architecture and User Interface
The implementation framework adopts the design architecture with more detailed processing steps for the visualization part. We aimed to develop an easy-to-read dashboard. The intent was to make visualizations that support taking action, not ones merely connected with meaning and facts [6]. Thus, the data are presented in statistical text format and in charts such as pie charts and bar plots, as shown below in figure 5.
This user dashboard is only accessible to researchers and administrators. A teacher version, however, is available in a static format showing general statistics about his/her course. The detailed personal information of students is kept confidential and is only available for research and administrative reasons. The dashboard shows various MOOC objects and indicators. These objects embody pedagogical purposes and require appropriate interpretation for proper actions [8]. The dashboard offers searching for any specific user in a particular period. The returned results cover:
- Quiz attempts, scores, and self-assessment
- Documents downloaded from the course
- Login frequency
- Forum reading frequency
- Forum posting frequency
- Watched videos
Fig. 5. iMooX Learning Analytics Prototype user dashboard - admin view
Further, comprehensive details for each indicator can be retrieved when required by clicking on the learning object tab.
5 Analysis and Case Studies
This section presents some of the detailed analyses done previously, carried out using the log data fetched from the prototype. The expected results are: a) evaluating the prototype's efficiency in revealing patterns, and b) recognizing the potential of Learning Analytics in MOOCs.
5.1 Building Activity Profiles
Building an activity profile becomes possible with the rich data made available by the prototype. We analyzed a MOOC called "Mechanics in Everyday Life". The course was ten weeks long, and the target group was secondary school students from Austria; the MOOC, however, was also open to the public. There were N=269 participants. The aim of the activity profile is to examine the activity of participants in depth and to distinguish between their activities. Figure 6 displays the activity profile for school pupils only. Green represents the certified students (N=5), while red represents the non-certified students (N=27). It is obvious that week 1, week 3, and week 4 were very active in the discussion forums. Video watching, by contrast, fell off almost completely in the last week. Thorough observations and differences between pupils and other enrollees can be found in [13].
Fig. 6. The activity profile
5.2 Tracking Forums Activity
The role of social activity in MOOC forums has been regularly debated. Recently, a study by Tseng et al. found that activity in forum discussions is strongly related to course retention and performance [28]. We have done several exploratory analyses, based on different offered MOOCs, to uncover diverse pedagogical relations and results [16, 17, 21, 22]. The following outcomes were concluded:
- A defined drop-out point: students' posting and reading in forums clearly diminishes in week 4, as shown in figure 7. We found such patterns recurring across a collection of MOOCs.
- Figure 8 shows the relation between reading and writing in the discussion forums. Different samples were tested randomly. A Pearson product-moment correlation coefficient of 0.52 with p-value < 0.01 was calculated, indicating a moderate correlation: students who write more are likely to read more. Further, an active instructor drives positive interaction toward creating a dynamic social environment.
- Figure 9 depicts forum posts in two courses. Students usually write more often in the first two weeks.
- Figure 10 shows the timing trends of learning during the whole day. Peaks were detected between 6 p.m. and 10 p.m.
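For reference, the Pearson product-moment coefficient reported above can be computed as in this small sketch. The per-student read/write counts are invented; the study's real samples are described in [22].

```python
from math import sqrt

def pearson_r(xs, ys):
    """Pearson product-moment correlation of two equal-length samples."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sqrt(sum((x - mx) ** 2 for x in xs))
    sy = sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Invented per-student forum counts: posts written vs. threads read.
writes = [0, 1, 1, 2, 3, 5, 8]
reads = [4, 2, 9, 10, 14, 11, 30]
print(round(pearson_r(writes, reads), 2))
```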
Fig. 7. Reading in forums leads to define drop-out peak points [16]
Fig. 8. Positive relationship between reading and writing in forums [22]
Fig. 9. Participants discuss and ask more often in the first two weeks [16]
Fig. 10. Time spent reading in forums [22]
5.3 Grouping and Clustering Participants
A recent systematic analysis by Veletsianos and Shepherdson showed that limited research has been done on identifying learners and examining subpopulations of MOOCs [29]. Defining dropout from MOOCs can look quite different once students are categorized. Students may register in a course and then never show up; including them in the total dropout share implies an unjustifiably low retention rate. Hence, a case study of two offered MOOCs was examined to scrutinize this issue. We divided the participants and investigated the dropout ratio of each category. New student types were defined based on activity, quizzes, and successful completion of the course: registrants, active learners, completers, and certified learners. The dropout gap between registrants and active students was the highest. However, the new dropout ratio between active students and certified learners was quite promising: the completion rate in the first MOOC was 37%, and 30% in the second. This is a very high completion rate compared to Jordan's study [11]. Figure 11 shows the newly defined student types.
Fig. 11. New types of MOOC learners are defined using Learning Analytics [16]
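Under this categorization, the per-stage dropout arithmetic is straightforward. The funnel counts below are invented for illustration; the paper reports only the resulting rates.

```python
# Hypothetical enrollment funnel for one MOOC.
funnel = {"registrants": 1000, "active": 430, "completers": 180, "certified": 160}
ORDER = ["registrants", "active", "completers", "certified"]

def stage_retention(funnel, order):
    """Fraction of each category that survives into the next one."""
    return {f"{a}->{b}": funnel[b] / funnel[a] for a, b in zip(order, order[1:])}

# Measuring dropout from the 'active' stage rather than from all registrants
# avoids penalizing the course for no-show registrants.
print(stage_retention(funnel, ORDER))
```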
On the other hand, explaining activity or engagement and its interaction with MOOCs is needed to cluster students into subpopulations. Classifying students into subpopulations improves the decisions and interventions taken by management [17, 18]. However, engagement varies and depends on the tested sample of students. In our paper "Portraying MOOCs learners: a clustering experience using learning analytics", we studied the engagement of university students using the k-means clustering algorithm [17]. Table 1 shows the activities of each cluster (reading frequency, writing frequency, video watching, quiz attendance, and certification ratio).
Table 1. University students clustering results [17] (columns: Read F. | Write F. | Watch Vid. | Quiz Att. | Cert. Ratio; one row per cluster C1-C4)
Four sorts of students were detected using the algorithm. The "dropout" cluster, shown as C1, is characterized by students with low activity in all MOOC learning activities. "On track", or excellent, students are displayed as C2 in the table: those involved in most of the MOOC activities, with a certification rate of 96.1%. The "gamblers", students who play the system, are shown as C3; these barely watch the learning videos, but they did every quiz, seeking the grade. "Social" students are shown as C4; these are more engaged in the forums, and their certification rate was around 50%.
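As a sketch of the technique (not the exact pipeline of [17], which clustered five activity features from real log data), a plain k-means over two invented features might look like this:

```python
import random

def dist2(a, b):
    """Squared Euclidean distance between two feature vectors."""
    return sum((x - y) ** 2 for x, y in zip(a, b))

def kmeans(points, k, iters=100, seed=0):
    """Plain k-means: returns final centroids and one cluster label per point."""
    rng = random.Random(seed)
    centroids = rng.sample(points, k)
    labels = [0] * len(points)
    for _ in range(iters):
        # Assignment step: nearest centroid wins.
        labels = [min(range(k), key=lambda c: dist2(p, centroids[c])) for p in points]
        # Update step: move each centroid to the mean of its members.
        new = []
        for c in range(k):
            members = [p for p, lab in zip(points, labels) if lab == c]
            new.append(tuple(sum(f) / len(members) for f in zip(*members))
                       if members else centroids[c])
        if new == centroids:
            break
        centroids = new
    return centroids, labels

# Invented (forum reads, videos watched) pairs: a low- and a high-activity group.
points = [(0, 1), (1, 0), (1, 1), (10, 9), (9, 10), (10, 10)]
centroids, labels = kmeans(points, 2)
```

Choosing k is the delicate part; in [17] the four clusters above emerged from the data rather than being fixed in advance here.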
5.4 Quiz Attendance
In this part of the overview, we were concerned with quiz attendance. The question is whether there is a relation between dropout and the number of quiz tries.
Fig. 12. Quiz attendance in one of the MOOCs (eight weeks long)
A student on iMooX has the option to attempt a quiz up to five times. In figure 12, the total number of quiz attempts clearly decreases over the first four weeks. From week 5 until the last week, the drop rate from quizzes was quite low. This reinforces our results in figure 7, in which the fourth week of a course proves critical from several points of view. Our study in [13] showed that students who made more quiz attempts tended to be retained and to reach further weeks.
6 Discussion and Conclusion
This research study is divided into three main parts. The first part reviews the literature on Learning Analytics, MOOCs, and Learning Analytics in MOOCs. With the 80 papers collected from the Elsevier Science library Scopus, we built a word cloud to identify the vital trends in these two prominent fields. Topics of engagement, interactions, social factors, as well as design and frameworks, were referred to the most. Further, Learning Analytics was employed more to improve interactions and engagement of students in the MOOC environment than to address the dropout problem. The second part describes our experience implementing the iMooX Learning Analytics Prototype, which eases collecting data on and tracking students of the examined MOOC platform. We discussed its ontology, implementation architecture, and user interface. The third part evaluated the application: different scenarios from iMooX were analyzed using advanced visualizations, statistics, clustering, and qualitative decisions.
The potential of Learning Analytics in MOOCs crystallizes in the interventions that follow from the evaluation results. We believe in designing shorter courses, for example a four-week MOOC instead of an eight-week one [21]. As a result, the workload would be cut in half, and students' efficiency would be higher. Additionally, enhancing social communication in the discussion forums, especially between instructor and students, would keep students connected, which by all means would decrease the dropout rate. We further discovered new types of students using categorization and clustering based on their activity. This will lead us to portray the engagement and behavior of subpopulations of learners on the platform.
We think Learning Analytics carries significant value to MOOCs from the pedagogical and technological perspectives. Proper interventions, predictions, and benchmarking of learning environments are difficult to achieve in MOOCs without the assistance of Learning Analytics. Finally, our future plans include an algorithm, currently under development, for an assistant tool that sends direct feedback to students in order to improve the completion rate. It will notify students directly in order to support a live awareness and reflection system.
References
1. Alario-Hoyos, C., Muñoz-Merino, P. J., Pérez-Sanagustín, M., Delgado Kloos, C., Parada,
G. H. A.: Who are the top contributors in a MOOC? Relating participants' performance
and contributions. Journal of Computer Assisted Learning. 32 (3), 232-243 (2016).
2. Baker, R. S., Siemens, G.: Educational Data Mining and Learning Analytics. Accessed: 20 April
2016 (2016).
3. Chatti, M., Dyckhoff, A., Schroeder, U., Thüs, H.: A reference model for learning analyt-
ics. International Journal of Technology Enhanced Learning. 4, 5/6, 318-331 (2012).
4. Clow, D.: MOOCs and the funnel of participation. In: the Third International Conference
on Learning Analytics and Knowledge (LAK 13), Leuven, Belgium, pp. 185-189. ACM
5. Clow, D.: The learning analytics cycle: closing the loop effectively. In: the 2nd Interna-
tional Conference on Learning Analytics and Knowledge (LAK '12), Vancouver, Canada,
pp. 134-138. ACM (2012).
6. Duval, E.: Attention please!: learning analytics for visualization and recommendation. In:
the 1st International Conference on Learning Analytics and Knowledge (LAK 11), Alber-
ta, Canada, pp. 9-17. ACM (2011).
7. Ebner, M., Schön, S., Kumar, S.: Guidelines for leveraging university didactics centers to
support OER uptake in German-speaking Europe. Education Policy Analysis Archives,
24(39) (2016).
8. Graf, S., Ives, C., Rahman, N., Ferri, A.: AAT: a tool for accessing and analysing students'
behaviour data in learning systems. In: the 1st International Conference on Learning Ana-
lytics and Knowledge (LAK 11), Alberta, Canada, pp. 174-179. ACM (2011).
9. Greller, W., Drachsler, H.: Translating Learning into Numbers: A Generic Framework for
Learning Analytics. Educational Technology & Society. 15 (3), 42-57 (2012).
10. Hollands, F. M., Tirthali, D.: MOOCs: Expectations and reality. Full report. Center for
Benefit-Cost Studies of Education, Teachers College, Columbia University, NY.
content/uploads/2014/05/MOOCs_Expectations_and_Reality.pdf. Accessed: 19 April
2016 (2014).
11. Jordan, K.: MOOC completion rates: The data.
html. Accessed: 12 April 2016 (2013).
12. Khalil, H. Ebner, M.: MOOCs Completion Rates and Possible Methods to Improve Reten-
tion - A Literature Review. In: Proceedings of World Conference on Educational Multi-
media, Hypermedia and Telecommunications 2014, pp. 1236-1244. Chesapeake, VA:
AACE (2014).
13. Khalil, M., Ebner, M.: A STEM MOOC for school children: What does learning analytics tell us?. In: the 2015 International Conference on Interactive Collaborative Learning (ICL 2015), Florence, Italy, pp. 1217-1221. IEEE (2015).
14. Khalil, M., Ebner, M.: Learning Analytics: Principles and Constraints. In: Carliner, S., Ful-
ford, C., & Ostashewski, N. (eds.), Proceedings of EdMedia: World Conference on Educa-
tional Media and Technology 2015, pp. 1789-1799. Chesapeake, VA: AACE (2015).
15. Khalil, M., Ebner, M.: What is Learning Analytics about? A Survey of Different Methods
Used in 2013-2015. In: the Smart Learning Conference, Dubai, UAE, pp. 294-304. Dubai:
HBMSU Publishing House (2016).
16. Khalil, M., Ebner, M.: What Massive Open Online Course (MOOC) Stakeholders Can
Learn from Learning Analytics?. In M. J. Spector, B. B. Lockee, & M. D. Childress (Eds.),
Learning, Design, and Technology. Springer International Publishing (in press).
17. Khalil, M., Kastl, C., Ebner, M.: Portraying MOOCs Learners: a Clustering Experience
Using Learning Analytics. In: the European Stakeholder Summit on experiences and best
practices in and around MOOCs (EMOOCS 2016), Graz, Austria, pp. 265-278. (2016).
18. Kizilcec, R. F., Piech, C., Schneider, E.: Deconstructing disengagement: analyzing learner
subpopulations in massive open online courses. In: the third international conference on
learning analytics and knowledge (LAK ’13) Leuven, Belgium, pp. 170-179. ACM (2013).
19. Knox, J.: From MOOCs to Learning Analytics: Scratching the surface of the 'visual'.
eLearn. 2014(11), ACM (2014).
20. Kopp, M., Ebner, M.: iMooX - Publikationen rund um das Pionierprojekt. Verlag Mayer.
Weinitzen (2015).
21. Lackner, E., Ebner, M., Khalil, M.: MOOCs as granular systems: design patterns to foster
participant activity. eLearning Papers. 42, 28-37 (2015).
22. Lackner, E., Khalil, M., Ebner, M.: How to foster forum discussions within MOOCs: A
case study. International Journal of Academic Research in Education. (in review).
23. McAuley, A., Stewart, B., Siemens, G.: The MOOC model for digital practice. Charlottetown: University of Prince Edward Island (2010).
24. Moissa, B., Gasparini, I., Kemczinski, A.: A Systematic Mapping on the Learning Analytics Field and Its Analysis in the Massive Open Online Courses Context. International Journal of Distance Education Technologies. 13 (3), 1-24 (2015).
25. Online Course Report: State of the MOOC 2016: A Year of Massive Landscape Change For Massive Open Online Courses. mooc-2016-a-year-of-massive-landscape-change-for-massive-open-online-courses/. Accessed: 18 April 2016 (2016).
26. Papamitsiou, Z., & Economides, A. A.: Learning Analytics for Smart Learning Environ-
ments: A Meta-Analysis of Empirical Research Results from 2009 to 2015. In M. J. Spec-
tor, B. B. Lockee, & M. D. Childress (Eds.), Learning, Design, and Technology. pp.1-23.
Springer International Publishing (2016).
27. Siemens, G.: A learning theory for the digital age. Instructional Technology and Distance
Education, 2(1), 3-10 (2005).
28. Tseng, S.F., Tsao, Y. W., Yu, L. C., Chan, C. L., Lai, K.R.: Who will pass? Analyzing
learner behaviors in MOOCs. Research and Practice in Technology Enhanced Learn-
ing, 11(1), pp. 1-11. Springer, (2016).
29. Veletsianos, G., Shepherdson, P.: A Systematic Analysis and Synthesis of the Empirical MOOC Literature Published in 2013-2015. The International Review of Research in Open and Distributed Learning. 17(2), (2016).
30. Wachtler, J., Khalil, M., Taraghi, B., Ebner, M.: On Using Learning Analytics to Track the
Activity of Interactive MOOC videos. In: Proceedings of the LAK 2016 Workshop on
Smart Environments and Analytics in Video-Based Learning, Edinburgh, Scotland, pp.8-
17. CEUR (2016).
... While much of the focus of early learning analytics research has related to the digital traces within learning management systems and MOOCs (Khalil & Ebner, 2016a), there is increasing interest in capturing and analysing students' data from real-world learning contexts such as gaze, postures, motions, and gestures inside classrooms and face-to-face sessions. Significant records of student behavior in the classroom are often constrained within learning analytics research, due to ethics, privacy, and security concerns (Khalil & Ebner, 2016b). However, multimodal analytics goes beyond the tracking of students through direct surveillance. ...
It is with a sense of irony that we offer a conclusion to this book. As we acknowledged in the introductory chapter, when we invited authors to submit proposals for a book on exploring the potential and challenges of learning analytics for open, distance and distributed learning institutions and forms of delivery, no one would have imagined how the world, and in particular the education sector would be disrupted by the Covid-19 pandemic.
... To address this problem, the field of Learning Analytics (LA) offers opportunities for automatically retrieving and interpreting information about learners' progress and thus, scaling up interventions. LA in MOOCs frequently involves techniques related to machine learning, visualizations, natural language processing, social network analysis, etc. [5]. These techniques can support instructors' awareness of learners' progress. ...
... With respect to data sources, many frameworks (n=27) depended on virtual learning environments and quantitative multichannel data sources (n=18). This is not surprising, since the field is commonly tailored for intelligent and learner-produced data from information systems [Siemens 2011; Khalil & Ebner 2016]. However, other data sources are also included. ...
While learning analytics frameworks precede the official launch of learning analytics in 2011, there has been a proliferation of learning analytics frameworks since. This systematic review of learning analytics frameworks between 2011 and 2021 in three databases resulted in an initial corpus of 268 articles and conference proceeding papers based on the occurrence of "learning analytics" and "framework" in titles, keywords and abstracts. The final corpus of 46 frameworks were analysed using a coding scheme derived from purposefully selected learning analytics frameworks. The results found that learning analytics frameworks share a number of elements and characteristics such as source, development and application focus, a form of representation, data sources and types, focus and context. Less than half of the frameworks consider student data privacy and ethics. Finally, while design and process elements of these frameworks may be transferable and scalable to other contexts, users in different contexts will be best-placed to determine their transferability/scalability.
... In that sense, recent research has been focused on issues such as self-regulated learning (K. Li, 2019; Wong et al., 2019), adaptive support systems (Jin et al., 2019; Lerís et al., 2017; Xi et al., 2018), Big data applications and learning analytics (Dessì et al., 2019; Khalil & Ebner, 2016), engagement and completion (Kashyap & Nayak, 2018; W. Li et al., 2016; Nagrecha et al., 2017; Suresh & Mallikarjuna, 2019; Whitehill et al., 2017), communication (Ossiannilsson et al., 2015), digital support systems (Zhang et al., 2017), perceptions, attitudes and students' motivations (Higashi et al., 2017; Shapiro et al., 2017), implications of free access and cost (Cross & Whitelock, 2017) and their insertion in various educational levels (Sanchez-Gordon & Luján-Mora, 2017), among others. ...
At the end of the 2000s, MOOCs broke into the educational field with the promise of learning with features more suited to the demands of our times. Their connectivist genesis provided a provocative expectation regarding the potential of collaboration, sharing, reuse, and free access, as factors of a possible transformation of the current educational system, which has been characterized as rigid and reluctant to change. Given the relevance and growing participation of MOOCs in education, there is a strong interest in understanding both their functioning and structure so that they can be considered as relevant educational options for a networked society. In this sense, a multi-method, exploratory and mixed study was conducted on 225 MOOCs based on the four categories that make up their denomination: Massive, Open, Online and Course. The study was developed through three stages: enlistment, fieldwork and report. The results of the study show that the contributions of MOOCs as generators of shared and collaborative learning experiences, as proposed in their origins, are not reflected in the reality of their current offering.
... The reasons behind this may lie in the availability of students' data, the courses, and the variety of platforms. MOOCs offer a challenging and rich space for Learning Analytics to understand learner behavior [5,14], learner engagement patterns [8,10] and learner cognitive capacity [19]. ...
Since its emergence in 2011, the field of Learning Analytics has demanded tools that deal with the exhaust of digital learning systems. This paper presents our first prototype 'OXALIC' in an attempt to introduce a standalone Learning Analytics tool for the Open edX MOOC platform. Open edX is widely used by thousands of organizations around the world. Nonetheless, one of the most challenging issues of employing Learning Analytics in Open edX platforms is having the ability to analyze "in-depth" log files. The Open edX platform is deficient in providing the same features as the 'edX' system: the latter offers data packages, while the former struggles to explore advanced analytics. The paper reports on the architecture of OXALIC, its functionalities, and the user interface. We foresee promising results for future directions of OXALIC as a solid contribution to Learning Analytics in MOOCs.
... This can be done by agreeing on an ethical framework or checklist such as the one by Greller and Drachsler (2016) when dealing with learner's data. Further, Khalil and Ebner (2016a) dealt with the challenges LA is facing and also pointed out the possibility of de-identification of learner's data (Khalil & Ebner, 2016b). If a researcher wants to use LA, the rights of the data subjects must be questioned. ...
Massive open online courses (MOOCs) provide anyone with Internet access the chance to study at university level for free. In such learning environments, and due to their ubiquitous nature, learners produce vast amounts of data representing their learning process. Learning Analytics (LA) can help identify, quantify, and understand these data traces. Within the implemented web-based tool, called LA Cockpit, basic metrics to capture the learners' activity for the Austrian MOOC platform iMooX were defined. Data is aggregated in an approach of behavioral and web analysis and paired with state-of-the-art visualization techniques to build a LA dashboard. It should act as a suitable tool to bridge the distant nature of learning in MOOCs. Together with its extendible design, the LA Cockpit shall act as a future-proof framework to be reused and improved over time. Aimed at administrators and educators, the dashboard contains interactive widgets letting users explore the datasets themselves rather than presenting fixed categories. This supports data literacy and improves the understanding of the underlying key figures, thereby helping them generate actionable insights from the data. The web-analytical feature of the LA Cockpit captures mouse activity in individual course-wide heatmaps to identify regions of learner interest and to help separate structure and content. Activity over time is aggregated in a calendar view, making regularly recurring patterns, otherwise not deducible, visible. Through the additional feedback from the LA Cockpit on the learners' behavior within the courses, it becomes easier to improve the teaching and learning process by tailoring the provided content to the needs of the online learning community.
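The calendar-view aggregation described for the LA Cockpit can be sketched minimally as a weekday-by-hour bucketing of timestamped activity events. The event format and bucket layout here are assumptions for illustration, not the tool's actual implementation:

```python
from collections import Counter
from datetime import datetime

def calendar_activity(events):
    """Aggregate ISO-timestamped activity events into a 7x24
    weekday-by-hour matrix, in the spirit of a calendar view
    that makes weekly recurring patterns visible."""
    counts = Counter()
    for ts in events:
        t = datetime.fromisoformat(ts)
        counts[(t.weekday(), t.hour)] += 1  # Monday = row 0
    return [[counts[(d, h)] for h in range(24)] for d in range(7)]

matrix = calendar_activity([
    "2016-04-18T09:15:00",  # a Monday, 09h
    "2016-04-18T09:40:00",  # same Monday, 09h
    "2016-04-19T20:05:00",  # a Tuesday, 20h
])
print(matrix[0][9])  # Monday 09:00 bucket -> 2
```

Rendering this matrix as a heatmap is then a pure visualization step on top of the aggregation.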
... In parallel, a lot of studies were carried out on how to improve online courses as well as the learning process (Khalil & Ebner, 2016a; Khalil & Ebner, 2016b). Especially the phenomenon of high drop-out rates was an issue and became better understood (Jordan, 2013; Khalil & Ebner, 2014). ...
Since 2010, Massive Open Online Courses (MOOCs) have been one of the most discussed and researched topics in the area of educational technology. Due to their open nature, such courses attract thousands of learners worldwide, and more and more higher education institutions have begun to produce their own MOOCs. Even the (international) press is full of reports and articles about how MOOCs can revolutionize education. In this chapter, we will take a look from a meta-level. After years of experience with different MOOCs, we recognize that many MOOCs are used in different ways by teachers, lecturers, trainers and learners. So, there are different learning and teaching scenarios in the background, often not visible to the broader public. Therefore, we would like to address the following research question: "How can MOOCs be used in Higher Education learning and teaching scenarios and beyond?" In the study, the authors focus on the seven identified scenarios of how particular MOOCs were used for teaching and learning, illustrating that a MOOC can be "more than a MOOC". MOOCs are one of the key drivers for open education using Open Educational Resources. The use of open licenses for MOOC resources is the mechanism for potential innovations in learning and teaching scenarios.
In 2012, the world's top universities started to offer Massive Open Online Courses (MOOCs). The trend rapidly spread as a global online education platform and immediately provoked interest as a research theme. While research was initially published mostly as practice reports and discussion papers looking out for future possibilities, empirical research has progressed in recent years by incorporating Learning Analytics. In this article, by reviewing previous studies on MOOCs with Learning Analytics, we examine the research findings and discuss issues and challenges for future research.
In Austria, located in the center of Europe, individuals and groups started earlier than in other German-speaking countries to develop and work on the idea of freely available and usable learning content on the Internet. A first Austrian milestone was the coordination of an international conference on open educational content in 2007 as the final activity of the first European project focused on OER. Within the contribution, an overview of the current state and developments of OER activities in Austria is given, also describing its infrastructure, policy, existing resources, curriculum and teaching methodologies, outcomes, stakeholders and impact on education. The chapter gives a comprehensive overview of all OER activities in Austria and outlines the benefits for the educational system as well. It can be summarized that the Austrian way seems to be successful, even though the steps forward are often small.
During the past decades, the potential of analytics and data mining - methodologies that extract useful and actionable information from large datasets - has transformed one field of scientific inquiry after another (cf. Collins, Morgan, & Patrinos, 2004; Summers et al., 1992). Analytics has become a trend over the past several years, reflected in large numbers of graduate programs promising to make someone a master of analytics, proclamations that analytics skills offer lucrative employment opportunities (Manyika et al., 2011), and airport waiting lounges filled with advertisements from different consultancies promising to significantly increase profits through analytics. When applied to education, these methodologies are referred to as learning analytics (LA) and educational data mining (EDM). In this chapter, we will focus on the shared similarities as we review both parallel areas while also noting important differences. Using the methodologies we describe in this chapter, one can scan through large datasets to discover patterns that occur in only small numbers of students or only sporadically (cf. Baker, Corbett, & Koedinger, 2004; Sabourin, Rowe, Mott, & Lester, 2011); one can investigate how different students choose to use different learning resources and obtain different outcomes (cf. Beck, Chang, Mostow, & Corbett, 2008); one can conduct fine-grained analysis of phenomena that occur over long periods of time (such as the move toward disengagement over the years of schooling - cf. Bowers, 2010); and one can analyze how the design of learning environments may impact variables of interest through the study of large numbers of exemplars (cf. Baker et al., 2009). In the sections that follow, we argue that learning analytics has the potential to substantially increase the sophistication of how the field of learning sciences understands learning, contributing both to theory and practice.
Discussion forums are an essential part of fostering interaction between teachers and students, as well as among students themselves, in virtual learning settings. If interaction can be enhanced, this has a positive influence on motivation and ultimately also on dropout rates. These days, a special form of online courses, so-called MOOCs (Massive Open Online Courses), are popping up massively. Those courses are characterized by a high number of students. In this paper, we examine discussion forums and their role concerning interaction. Therefore, Gilly Salmon's well-known Five Stage Model is taken and adapted to MOOCs based on a case study. As a method, we tracked learners' data through Learning Analytics applications and concluded that there is a positive correlation between reading on the one hand and writing in forums on the other.
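The reported reading/writing relationship comes down to a correlation over per-learner activity counts. A minimal sketch, with invented toy data rather than the study's actual tracking data:

```python
import math

def pearson(xs, ys):
    """Pearson correlation coefficient between two equally long
    series, e.g. posts read vs. posts written per learner."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

reads  = [120, 45, 300, 10, 80]  # forum posts read per learner (toy data)
writes = [14, 3, 25, 0, 7]       # forum posts written per learner (toy data)
r = pearson(reads, writes)
print(round(r, 2))  # strongly positive here -> 0.98
```

With real log data, the same computation would run over the per-learner totals extracted from the forum tracking tables.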
Many MOOC initiatives continue to report high attrition rates among distance education students. This study investigates why students dropped out of or failed their MOOCs. It also provides strategies that can be implemented to increase the retention rate as well as overall student satisfaction. Through studying the literature, accurate data analysis and personal observations, the most significant factors that cause the high attrition rate of MOOCs are identified. The reasons found are lack of time, lack of learner motivation, feelings of isolation and the lack of interactivity in MOOCs, insufficient background and skills, and finally hidden costs. As a result, some strategies are identified to increase the online retention rate and allow more online students to graduate.
The area of Learning Analytics has developed enormously since the first International Conference on Learning Analytics and Knowledge (LAK) in 2011. It is a field that combines different disciplines such as computer science, statistics, psychology and pedagogy to achieve its intended objectives. The main goals lie in creating convenient interventions in learning and its environment, and ultimately in optimization for learning-domain stakeholders. Because the field has matured and is now adopted in diverse educational settings, we believe there is a pressing need to list its research methods and specify its objectives and dilemmas. This paper surveys publications from the Learning Analytics and Knowledge conference from 2013 to 2015 and lists the significant research areas in this sphere. We consider the method profile and classify the methods into seven categories with a brief description of each. Furthermore, we show the most cited method categories using Google Scholar. Finally, the authors raise the challenges and constraints that affect the field's ethical approach through the meta-analysis study. It is believed that this paper will help researchers identify the common methods used in Learning Analytics and assist in establishing a forecast of new research work that takes into account the privacy and ethical issues of this strongly emerging field.
Massive open online courses (MOOCs) are the road that led to a revolution and a new era of learning environments. Educational institutions have come under pressure to adopt new models that assure openness in their education distribution. Nonetheless, there is still debate about the pedagogical approach and the proper delivery of information to students. On the other hand, with the use of Learning Analytics, powerful tools become available which mainly aim to enhance learning and improve learners' performance. In this chapter, the development phases of a Learning Analytics prototype and the experiment of integrating it into a MOOC platform, called iMooX, will be presented. This chapter explores how MOOC stakeholders may benefit from Learning Analytics. It also reports an exploratory analysis of some of the offered courses and demonstrates use cases as a typical evaluation of this prototype in order to discover hidden patterns, support proper future decisions, and optimize learning with applicable and convenient interventions.
It is widely known that interaction as well as communication are very important parts of successful online courses. These features are considered crucial because they help to improve students' attention in a very significant way. In this publication, the authors present an innovative application which adds different forms of interactivity to learning videos within MOOCs, such as multiple-choice questions or the possibility to communicate with the teacher. Furthermore, Learning Analytics using exploratory examination and visualizations has been applied to unveil learners' patterns and behaviors as well as to investigate the effectiveness of the application. Based upon the quantitative and qualitative observations, our study determined common practices behind dropping out using video indicators and suggested enhancements to increase the performance of the application as well as learners' attention.
Massive open online courses (MOOCs) have recently gained worldwide attention from educational institutes. MOOCs provide a new option for learning, yet the measurable learning benefits of MOOCs still need to be investigated. Collecting data from three MOOCs at Yuan Ze University (YZU), this paper intended to classify the learning behaviors of 1489 students on the MOOC platform at YZU. This study further examined learning outcomes in MOOCs by different types of learners. Ward's hierarchical and k-means non-hierarchical clustering methods were employed to classify types of learner behavior while learners engaged in learning activities on the MOOC platform. Three types of MOOC learners were classified: active learners, passive learners, and bystanders. Active learners who submitted assignments on time and frequently watched lecture videos showed a higher completion rate and a better grade in the course. MOOC learners who participated in the online discussion forum reported a higher rate of passing the course and a better score than their inactive classmates. The findings of this study suggested that the first 2 weeks are a critical point in time to retain students in MOOCs. MOOC instructors need to carefully design courses and detect risk behaviors of students early in the classes to prevent students from dropping out of the course. The discussion forum is designed to provide peer interaction and facilitate online learning. Our results suggested that timely feedback by instructors or facilitators on the discussion forum could enhance students' engagement in MOOCs.
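The k-means step of such a behavior classification can be sketched in a few lines; the engagement features and seed centroids below are illustrative assumptions, not the study's data, and the Ward's hierarchical stage the study uses to choose initial groupings is omitted:

```python
def kmeans(points, centroids, iters=10):
    """Plain k-means over learner feature vectors, e.g.
    (videos watched, assignments submitted) per learner.
    Returns final centroids and the cluster memberships."""
    for _ in range(iters):
        clusters = [[] for _ in centroids]
        for p in points:
            # assign each learner to the nearest centroid (squared distance)
            i = min(range(len(centroids)),
                    key=lambda c: sum((a - b) ** 2
                                      for a, b in zip(p, centroids[c])))
            clusters[i].append(p)
        # recompute each centroid as the mean of its members
        centroids = [
            tuple(sum(col) / len(cl) for col in zip(*cl)) if cl else centroids[i]
            for i, cl in enumerate(clusters)
        ]
    return centroids, clusters

# (videos watched, assignments submitted) for a handful of toy learners
learners = [(20, 8), (18, 7), (2, 0), (1, 0), (10, 3)]
centroids, clusters = kmeans(learners, centroids=[(0, 0), (20, 8)])
print(len(clusters[1]))  # learners grouped with the "active" seed -> 2
```

Labeling the resulting clusters "active", "passive" and "bystander" is then an interpretive step based on the centroid values.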
Although less well established than in other parts of the world, higher education institutions in German-speaking countries have seen a marked increase in the number of open educational resource (OER) initiatives and in government-supported OER funding in recent years. OER implementation, however, brings with it a unique set of challenges in German-speaking higher education contexts, stemming in part from copyright laws and use permissions that have made sharing and reuse of educational materials less prevalent. The article discusses how instructional development centers, including university didactics centers (hochschuldidaktische Zentren) and e-learning centers, can play a key role in faculty uptake and adoption of OER, and concludes by proposing a set of OER implementation guidelines that leverage the expertise and interfacing role of these centers in German-speaking countries.
Although several qualitative analyses have appeared in the domain of Learning Analytics (LA), a systematic quantitative analysis of the effects of the empirical research findings toward the development of more reliable Smart Learning Environments (SLE) is still missing. This chapter aims at preserving and enhancing the chronicles of recent LA developments as well as covering the abovementioned gap. The core question is where these two research areas intersect and how the significant LA research findings could be beneficial for guiding the construction of SLEs. This meta-analysis study synthesizes research on the effectiveness of LA and aims to determine the influence of its dimensions on learning outcomes so far. Sixty-six experimental and quasi-experimental papers published from 2009 through September 2015 in the domain of LA were coded and analyzed. Overall, the weighted random effects mean effect size (g) was 0.433 (p = 0.001). The collection was heterogeneous (Qt(66) = 78.47). Here, the results of the statistical and classification processes applied during the meta-analysis are presented and the most important issues raised are discussed.
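The summary statistics reported in such a meta-analysis (a weighted mean effect size g and a heterogeneity statistic Q) follow from inverse-variance weighting. A simplified fixed-weight sketch with invented toy study values, not the sixty-six coded papers:

```python
def weighted_mean_effect(effects, variances):
    """Inverse-variance weighted mean effect size and Cochran's Q
    over per-study standardized effects and sampling variances."""
    weights = [1 / v for v in variances]
    g = sum(w * e for w, e in zip(weights, effects)) / sum(weights)
    # Q measures how much the studies disagree beyond sampling error
    q = sum(w * (e - g) ** 2 for w, e in zip(weights, effects))
    return g, q

g, q = weighted_mean_effect(
    effects=[0.2, 0.5, 0.6],       # toy per-study effect sizes
    variances=[0.04, 0.02, 0.05],  # toy per-study sampling variances
)
print(round(g, 3))  # -> 0.442
```

A full random-effects model would additionally estimate a between-study variance (tau-squared) and fold it into the weights, which this sketch omits.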