Draft – Original finally published here: Khalil, M. and M. Ebner (2016). When Learning Analytics Meets
MOOCs - a Review on iMooX Case Studies. Innovations for Community Services: 16th International
Conference, I4CS 2016, Vienna, Austria, June 27-29, 2016, Revised Selected Papers. G. Fahrnberger, G.
Eichler and C. Erfurth. Cham, Springer International Publishing: 3-19. DOI: 10.1007/978-3-319-49466-1_1
When Learning Analytics Meets MOOCs - a Review on
iMooX Case Studies
Mohammad Khalil and Martin Ebner
{Mohammad.khalil, martin.ebner}@tugraz.at
Educational Technology, Graz University of Technology, Graz, Austria
Abstract. The field of Learning Analytics has proven to provide various solutions to online educational environments. Massive Open Online Courses (MOOCs) are among the most prominent of these emerging environments. Their substantial growth attracts researchers from the analytics field to examine the rich repositories of data they provide. The present paper contributes a brief literature review of both prominent fields. Further, the authors give an overview of their Learning Analytics application and show the potential of Learning Analytics for tracking students in MOOCs using empirical data from iMooX.
Keywords: Learning Analytics · Massive Open Online Courses (MOOCs) ·
Completion Rate · Literature · Engagement · Evaluation · Prototype
1 Introduction
The growth of Massive Open Online Courses (MOOCs) in the modern era of online learning has seen millions of enrollments from all over the world. MOOCs are defined as online courses that are open to the public, with an open registration option and open-ended outcomes, and that require no prerequisites or fees [23]. These courses have brought drastic change to higher education on the one side and to elementary education on the other [13]. The number of offered MOOCs has exploded in recent years: as of January 2016, over 4,500 courses with 35 million learners had been offered by 12 MOOC providers [25]. Some of these courses are provided by prestigious and renowned universities such as Harvard, MIT, and Stanford. At the same time, other institutions have joined the MOOC movement and become providers for their own local universities, such as the Austrian MOOC platform iMooX (www.imoox.at).
It is important to realize that MOOCs have split into two major types: cMOOCs and xMOOCs. cMOOCs are based on the philosophy of connectivism, which is about creating networks of learning [27]. The term xMOOC, on the other hand, is short for extended MOOC, based on classical information transmission [10]. Further, new types of online courses related to MOOCs have emerged recently, such as Small Private Online Courses (SPOCs) and Distributed Open Collaborative Courses (DOCCs).
MOOCs have the potential of scaling education in different fields and subjects. The study of [25] showed that computer science and programming account for the largest share of the offered courses. Yet substantial growth of MOOCs has also been noticed in the Science, Technology, Engineering, and Mathematics (STEM) fields. The anticipated results of MOOCs range from business purposes, such as saving costs, to improving the pedagogical and educational concepts of online learning [16]. Nevertheless, there is still debate about the pedagogical approach of delivering information to students. The quality of the offered courses, the completion rate, the lack of interaction, and the grouping of students in MOOCs have also been debated recently [4, 12, 17].
As environments of online learning, MOOCs base their educational process on video lecturing. In fact, learning in MOOCs is not exclusive to that: social networking and active engagement are major factors too [23]. Content such as topics, articles, or documents is also considered supporting material in the learning process.
While MOOC providers initialize and host online courses, the hidden part is the recording of learners' activities. Nowadays, ubiquitous technologies have spread among online learning environments, and tracking students online has become much easier. The pressing need to ensure that the audience of eLearning platforms gets the most out of the online learning process, together with the need to study their behavior, has led to what is called "Learning Analytics". One of its key aspects is identifying trends, discovering patterns, and evaluating learning environments, MOOCs being one example. Khalil and Ebner listed factors that have driven the expansion of this emerging field [14]: a) the spread of technology among educational sectors, b) the "big data" available from learning environments, and c) the availability of analytical tools.
In this research publication, we discuss the potential of the collaboration between Learning Analytics and MOOCs. There have been various discussions among researchers from different disciplines regarding these apparent trends. For instance, Knox said that "Learning Analytics promises a technological fix to the long-standing problems of education" [19]. Accordingly, we line up our experience in both fields over recent years and list the up-to-date related work. Further, different scenarios and analyses from MOOCs offered on iMooX are discussed using the iMooX Learning Analytics Prototype. At the end, we list the proposed interventions that will be adopted in the next MOOCs.
This publication is organized as follows: Section 2 covers literature and related work. In Section 3, we present a systematic mapping from the Scopus library to understand what has been researched in Learning Analytics of MOOCs. Section 4 covers the iMooX Learning Analytics Prototype, while Section 5 covers case studies and the analytics outcomes derived from the empirically provided data.
2 Literature Review
2.1 MOOCs
The new technologies of the World Wide Web, mobile development, social networks, and the Internet of Things have advanced traditional learning. eLearning and Technology Enhanced Learning (TEL) have given rise to new models of learning environments such as Personal Learning Environments (PLE), Virtual Learning Environments (VLE), and MOOCs. Since 2008, MOOCs have occupied a valuable position in educational practices. Non-profit platforms like edX (www.edx.org) and for-profit platforms like Coursera (www.coursera.org) have attracted millions of students. Since they require only an Internet connection and the intention to learn, MOOCs are considered a boon for Open Educational Resources (OER) and the lifelong learning orientation [7].
Despite all these benefits, MOOCs suffer from several issues. Dropout and the failure to complete courses are considered among the biggest. Katy Jordan showed that the completion rate of many courses barely reached 10% [11]. The reasons given include poor course design, loss of motivation, excessive time demands, lack of interaction, and the assumption of too much prior knowledge [16, 21]. Uncovering further issues of MOOCs through the available empirical data is discussed later in this paper.
2.2 Learning Analytics
Learning Analytics first saw the light in 2011. A plethora of definitions has been used since then. However, the trend is strongly associated with previously well-known topics such as web analytics, academic analytics, data analysis, and data mining, as well as psychometrics and educational measurement [2]. Learning Analytics mainly targets educational data sets from modern online learning environments where learners leave traces behind. The process then includes searching, filtering, mining, and visualizing data in order to retrieve meaningful information.
Learning Analytics involves different key methods of analysis. They range from data mining, statistics and mathematics, text analysis, visualizations, and social network analysis to qualitative and gamification techniques [15, 26]. On the other hand, the aims of Learning Analytics vary between frameworks, but most of them agree on common goals. Regardless of the learning environment, Papamitsiou and Economides showed that studies of Learning Analytics focus on the pedagogical analysis of behavior modeling, performance prediction, participation, and satisfaction [26]. Benefits are realized in prediction, intervention, recommendation, personalization, evaluation, reflection, monitoring, and assessment improvement [3, 9, 14]. In fact, these goals are considered useless without optimizing and refining them and bringing their full power to bear on stakeholders [5].
2.3 MOOCs and Learning Analytics
Learners in online learning environments such as MOOCs are not only consumers, they are also generators of data [14]. Lately, research studying the behavior of online students in MOOCs has spread widely across journals and conferences. A recent survey study on Learning Analytics by Khalil and Ebner showed that the most-cited articles on Google Scholar (scholar.google.com) were MOOC-related [15]. They listed the most common techniques used by Learning Analytics in MOOCs, varying from machine learning, statistics, information visualization, Natural Language Processing (NLP), and social network analysis to gamification tools. Moissa and her colleagues noted that literature studies of Learning Analytics in MOOCs are still not deeply researched [24]. We found this confirmed in the next section.
3 Learning Analytics of MOOCs
In this section, we performed a brief text analysis and mapped the screened abstracts from the Scopus database (www.scopus.com) in order to:
1. Grasp what has been researched in Learning Analytics of MOOCs.
2. Identify the main research trends in the current literature on Learning Analytics and MOOCs.
Scopus is a database powered by Elsevier Science. We selected this library because of the valuable indexing information it provides and the usability of its search queries. The literature exploration was performed by searching for the following keywords: "Learning Analytics" and "MOOC", "MOOCs" or "Massive Open Online Course". The query used to retrieve the results was executed on 11 April 2016 and is shown in Figure 1. The language was refined to English only.
Fig. 1. Search query used to conduct the literature mapping
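Figure 1 is not reproduced in this draft. Based on the keywords and the language refinement described above, the query presumably had roughly the following shape in Scopus advanced-search syntax; this is a hedged reconstruction, not the authors' exact string:

```
TITLE-ABS-KEY ( "Learning Analytics"
  AND ( "MOOC" OR "MOOCs" OR "Massive Open Online Course" ) )
  AND ( LIMIT-TO ( LANGUAGE , "English" ) )
```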
The search returned 80 papers. Only one paper was retrieved from 2011, none from 2012, 11 papers from 2013, 23 from 2014, 37 from 2015, and 8 from 2016. The abstracts were then extracted and exported to a Comma-Separated Values (CSV) file. After that, we created a word cloud to represent the text data and identify the most prominent terms. Figure 2 depicts the word cloud of the extracted abstracts. We looked at common single words, bi-grams, tri-grams, and quad-grams. The most repeated single words were "MOOCs", "education", and "engagement". On the other hand, "Learning Analytics", "Online Courses", and "Higher Education" were recorded as the prominent bi-grams. "Khan Academy platform" and "Massive Open Online Courses" topped the tri-grams and quad-grams respectively. Since massive open online courses appear under different terms in the abstracts, we normalized all of them to "MOOCs" in the corpus.
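The normalization and n-gram counting described above can be sketched as follows. The tokenization rule and the sample abstracts are illustrative assumptions, not the authors' actual pipeline:

```python
import re
from collections import Counter

# normalize all MOOC term variants to a single token, as described in the text
VARIANTS = re.compile(r"massive open online courses?|moocs?", re.IGNORECASE)

def ngram_counts(abstracts, n):
    """Count n-grams across abstracts after normalizing MOOC variants to 'MOOCs'."""
    counts = Counter()
    for text in abstracts:
        text = VARIANTS.sub("MOOCs", text)
        tokens = re.findall(r"[A-Za-z]+", text.lower())
        counts.update(tuple(tokens[i:i + n]) for i in range(len(tokens) - n + 1))
    return counts

abstracts = [  # invented sample abstracts
    "Learning analytics for massive open online courses and engagement",
    "Engagement in MOOCs: a learning analytics case study",
]
bigrams = ngram_counts(abstracts, 2)
print(bigrams.most_common(3))
```

The same counts, fed to any word-cloud library, yield figures of the kind shown in Figures 2 and 3.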
Figure 3 shows the most frequent phrases fetched from the text. Figures 2 and 3 show interesting observations about the researched topics of Learning Analytics in MOOCs. By simply grouping the topics and disregarding the main phrases, which are "Learning Analytics" and "MOOCs", we found that researchers were looking mostly at engagement and interactions.
Fig. 2. Word cloud of the most prominent terms from the abstracts
Fig. 3. The most frequent terms extracted from the abstracts
It was quite interesting that dropout and the completion rate were not the major topics, as we had believed. Design and framework principles, as well as assessment, ranked as the second most cited terms. Social factors, learning, and discussions received the next most attention, while tools and methods were mentioned to describe the mechanisms used in offering solutions and case studies.
4 Learning Analytics Prototype of iMooX
The analyses of this study are based on the different courses provided by the Austrian MOOC provider iMooX. The platform was first launched in 2013 in cooperation between the University of Graz and Graz University of Technology [20]. iMooX offers German-language courses in different disciplines and grants free certificates to students who successfully complete the courses.
A MOOC platform cannot be considered a truly modern technology-enhanced learning environment without a tracking approach for analysis purposes [16]. Tracking the traces students leave on MOOC platforms with a Learning Analytics application is essential to enhance the educational environment and understand students' needs. iMooX followed these steps and applied an analytical approach called the "iMooX Learning Analytics Approach" to track students for research purposes. It embodies the functionality to interpret low-level data and present them to administrators and researchers. The tool is built on the architecture of the Learning Analytics framework presented earlier by the authors [14]. Several goals were anticipated, but mainly to use data from the iMooX enterprise, examine what is happening on the platform, and render useful decisions based on the interpretation.
4.1 Design Ontology
The tool is designed to integrate with the data generated by MOOCs. The large number of available courses and participants in MOOCs creates a huge amount of low-level data related to students' performance and behavior [1]. For instance, low-level data such as the number of students who watched a certain video can be used to infer valuable information, for example about boring segments [30].
In order to fulfill our proposed framework, we divided the design architecture of the prototype into four stages. Figure 4 depicts these main stages. Briefly summarized, the first stage is the generation of the data. Log files start being generated when a student enrolls in a course, begins watching videos, discusses topics in forums, takes quizzes, and answers evaluations. The next stage comprises suitable data management and administration, stamping a time-referenced description onto every interaction. Parsing log files and processing them, such as filtering unstructured data and mining keywords from bulk text, occurs in the third stage. Finally, the fourth stage is the visualization part, in which the processed data are displayed to administrators and researchers.
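The four stages can be sketched end to end as follows. The log line format, field names, and sample records are purely hypothetical, since the actual iMooX log schema is not described here:

```python
import re
from collections import Counter
from datetime import datetime

# Stage 1 (hypothetical format): raw interaction log, one time-stamped event per line.
RAW_LOG = """\
2016-03-01T18:05:12 user42 video_play week1_intro
2016-03-01T18:20:44 user42 forum_read thread_7
2016-03-02T09:10:03 user17 quiz_attempt quiz_w1
"""

LINE = re.compile(r"(\S+) (\S+) (\S+) (\S+)")

def parse_events(raw):
    """Stages 2-3: time-stamp and structure every interaction, dropping malformed lines."""
    for line in raw.splitlines():
        m = LINE.match(line)
        if m:  # filtering step: skip unstructured data
            ts, user, action, obj = m.groups()
            yield {"time": datetime.fromisoformat(ts), "user": user,
                   "action": action, "object": obj}

events = list(parse_events(RAW_LOG))
# Stage 4 would visualize aggregates such as these per-action counts.
print(Counter(e["action"] for e in events))
```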
Fig. 4. The iMooX Learning Analytics Prototype design architecture [16]
4.2 Implementation Architecture and User Interface
The implementation framework adopts the design architecture with more detailed processing steps for the visualization part. We aimed to develop an easy-to-read dashboard. Visualizations were intended to drive actions, not only to convey meaning and facts [6]. Thus, the data are presented in a statistical text format and in charts such as pie charts and bar plots, as shown below in Figure 5.
This user dashboard is accessible only to researchers and administrators. A teacher version, however, is available in a static format that shows general statistics about his/her course. The detailed personal information of students is kept confidential and is available only for research and administrative reasons. The dashboard shows various MOOC objects and indicators. These objects carry inherent pedagogical purposes and require appropriate interpretation for proper actions [8]. The dashboard offers searching for any specific user in a particular period. The returned results cover:
• Quiz attempts, scores, and self-assessments
• Downloaded documents from the course
• Login frequency
• Forums reading frequency
• Forums posting frequency
• Watched videos
Fig. 5. iMooX Learning Analytics Prototype user dashboard - admin view
Further, comprehensive details of each indicator can be retrieved when required by clicking on the learning object tab.
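A minimal sketch of the per-user, per-period aggregation behind these indicators might look as follows; the event representation and action names are assumptions for illustration, not the prototype's actual schema:

```python
from datetime import date

def user_indicators(events, user, start, end):
    """Aggregate the dashboard indicators for one user within [start, end]."""
    window = [e for e in events
              if e["user"] == user and start <= e["day"] <= end]
    count = lambda action: sum(e["action"] == action for e in window)
    return {
        "quiz_attempts": count("quiz_attempt"),
        "documents_downloaded": count("download"),
        "logins": count("login"),
        "forum_reads": count("forum_read"),
        "forum_posts": count("forum_post"),
        "videos_watched": count("video_play"),
    }

events = [  # invented sample events
    {"user": "u1", "day": date(2016, 3, 1), "action": "login"},
    {"user": "u1", "day": date(2016, 3, 1), "action": "video_play"},
    {"user": "u1", "day": date(2016, 3, 2), "action": "forum_read"},
    {"user": "u2", "day": date(2016, 3, 2), "action": "quiz_attempt"},
]
print(user_indicators(events, "u1", date(2016, 3, 1), date(2016, 3, 31)))
```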
5 Analysis and Case Studies
This section presents some of the detailed analyses done previously. The examination is carried out using the log data fetched from the prototype. The expected results are: a) evaluating the prototype's efficiency in revealing patterns, and b) recognizing the potential of Learning Analytics in MOOCs.
5.1 Building Activity Profiles
Building an activity profile with Learning Analytics becomes possible using the rich data provided by the prototype. We have analyzed a MOOC called "Mechanics in Everyday Life". The course was ten weeks long, and the target group was secondary school students from Austria. The MOOC, however, was also open to the public, and there were N=269 participants. The aim of the activity profile is to examine participants' activity in depth and to distinguish between their activities. Figure 6 displays the activity profile for school pupils only. Green represents the certified students (N=5), while red represents the non-certified students (N=27). It is obvious that weeks 1, 3, and 4 were very active in the discussion forums. Watching videos attracted no interest at all in the last week. Thorough observations and differences between pupils and other enrollees can be found in [13].
Fig. 6. The activity profile
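An activity profile of this kind reduces to per-week event counts split by group; the following is a sketch under an assumed (user, week) event format, with invented sample data:

```python
from collections import defaultdict

def activity_profile(events, certified):
    """Per-week activity counts, split into certified vs. non-certified participants."""
    profile = {"certified": defaultdict(int), "non_certified": defaultdict(int)}
    for user, week in events:
        group = "certified" if user in certified else "non_certified"
        profile[group][week] += 1
    return profile

events = [("a", 1), ("a", 3), ("b", 1), ("b", 2), ("c", 4)]  # invented
certified = {"a"}
profile = activity_profile(events, certified)
print(dict(profile["certified"]), dict(profile["non_certified"]))
```

Plotting the two dictionaries per week gives the green/red comparison of Figure 6.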
5.2 Tracking Forums Activity
The role of social activity in MOOC forums has been debated regularly. Recently, a study by Tseng et al. found that activity in forum discussions is strongly related to course retention and performance [28]. We have done several exploratory analyses to uncover diverse pedagogical relations and results [16, 17, 21, 22]. The analyses are based on different offered MOOCs. The following outcomes were concluded:
─ A defined drop-out point, where students' posting and reading in forums clearly diminishes in week 4, as shown in figure 7. We found such patterns recurring across a collection of MOOCs.
─ Figure 8 shows the relation between reading and writing in the discussion forums. Different samples were tested randomly. A Pearson product-moment correlation coefficient of 0.52 with a p-value < 0.01 was calculated. This indicates a moderate correlation: students who write more are likely to read more. Further, an active instructor drives positive interaction, creating a dynamic social environment.
─ Figure 9 depicts forum posts in two courses. Students usually write more often in the first two weeks.
─ Figure 10 shows the timing trends of learning throughout the day. Peaks were detected between 6 p.m. and 10 p.m.
Fig. 7. Reading in forums leads to define drop-out peak points [16]
Fig. 8. Positive relationship between reading and writing in forums [22]
Fig. 9. Participants discuss and ask more often in the first two weeks [16]
Fig. 10. Time spent reading in forums [22]
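The reported coefficient is a standard Pearson product-moment correlation between per-student reading and writing frequencies; it can be computed as below. The sample counts are invented for illustration, not iMooX data:

```python
import math

def pearson_r(xs, ys):
    """Pearson product-moment correlation coefficient of two equal-length samples."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

reads = [5, 20, 8, 40, 12, 33]   # forum-reading frequency per student (invented)
writes = [0, 3, 1, 6, 1, 2]      # forum-posting frequency per student (invented)
print(round(pearson_r(reads, writes), 2))
```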
5.3 Grouping and Clustering Participants
A recent systematic analysis by Veletsianos and Shepherdson showed that limited research has been done on identifying learners and examining subpopulations of MOOCs [29]. Defining dropout from MOOCs can take a different form when students are categorized. Students may register for a course and then never show up; including them in the total dropout share implies an unjustified retention rate. Hence, a case study of two offered MOOCs was examined to scrutinize this issue. We divided the participants and investigated the dropout ratio of each category. New student types were defined based on their activity, quizzes, and successful completion of the course: registrants, active learners, completers, and certified learners. The dropout gap between registrants and active students was the largest. However, the new dropout ratio between active students and certified learners was quite promising. The completion rate in the first MOOC was 37%, and 30% in the second. This is a very high completion rate compared to Jordan's study [11]. Figure 11 shows the newly defined student types.
Fig. 11. New types of MOOC learners are defined using Learning Analytics [16]
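Counting the funnel of student types described above is straightforward once each participant's record carries activity, completion, and certification flags; a sketch under that assumed representation:

```python
def funnel(students):
    """students: dict user -> set of flags; returns counts per defined student type."""
    return {
        "registrants": len(students),
        "active": sum("active" in f for f in students.values()),
        "completers": sum("completed" in f for f in students.values()),
        "certified": sum("certified" in f for f in students.values()),
    }

students = {  # invented sample cohort
    "u1": set(),                                   # registered, never showed up
    "u2": {"active"},
    "u3": {"active", "completed"},
    "u4": {"active", "completed", "certified"},
}
counts = funnel(students)
# completion rate measured against active learners, not against all registrants
print(counts, round(counts["certified"] / counts["active"], 2))
```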
On the other hand, explaining activity or engagement and students' interaction with MOOCs is needed to cluster students into subpopulations. Classifying students into subpopulations improves the decisions and interventions taken by management [17, 18]. However, engagement varies and depends on the tested sample of students. In our paper "Portraying MOOCs learners: a clustering experience using learning analytics", we studied the engagement of university students using the k-means clustering algorithm [17]. Table 1 shows the activities of each cluster (reading frequency, writing frequency, watching videos, quiz attendance, and certification ratio).
Table 1. University students clustering results [17]

Cluster   Read F.    Write F.   Watch Vid.   Quiz Att.   Cert. Ratio
C1        Low        Low        Low          Low         10.53%
C2        High       Low        High         High        96.10%
C3        Moderate   Low        Low          High        94.36%
C4        High       High       Low          Moderate    50%
Four types of students were detected using the algorithm. The "dropout" cluster, shown as C1, comprises students with low activity in all MOOC learning activities. "On track" or excellent students, shown as C2, are involved in most of the MOOC activities and have a certification rate of 96.1%. The "gamblers", or students who play the system, are shown as C3; they barely watch the learning videos, but they took every quiz seeking the grade. "Social" students, shown as C4, are more engaged in forums, and their certification rate was around 50%.
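A minimal k-means on per-student feature vectors (reading, writing, video, quiz activity) illustrates the clustering step; this is a plain sketch with invented data, not the exact procedure or data of [17]:

```python
import random

def kmeans(points, k, iters=30, seed=1):
    """Minimal k-means: assign each point to its nearest center, then recenter."""
    rng = random.Random(seed)
    centers = rng.sample(points, k)
    assign = [0] * len(points)
    for _ in range(iters):
        # assignment step: nearest center by squared Euclidean distance
        for j, p in enumerate(points):
            assign[j] = min(
                range(k),
                key=lambda c: sum((a - b) ** 2 for a, b in zip(p, centers[c])))
        # update step: move each center to the mean of its members
        for c in range(k):
            members = [p for j, p in enumerate(points) if assign[j] == c]
            if members:
                centers[c] = tuple(sum(d) / len(members) for d in zip(*members))
    return assign

# hypothetical (read_freq, write_freq, videos_watched, quiz_attempts) per student
students = [(1, 0, 1, 0), (2, 1, 0, 1),       # barely active
            (40, 2, 30, 10), (38, 3, 28, 9)]  # highly engaged
print(kmeans(students, 2))
```

With k=4 and the real feature vectors, clusters of the kind in Table 1 emerge.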
5.4 Quiz Attendance
In this part of the overview, we were concerned with quiz attendance. The question is whether there is a relation between dropout and the number of quiz tries.
Fig. 12. Quiz attendance in one of the MOOCs (eight weeks long)
A student on iMooX has the option to attempt a quiz up to five times. In figure 12, the total number of quiz attempts clearly decreases over the first four weeks. From week 5 until the last week, the drop rate from quizzes was quite low. This corroborates our results in figure 7, in which week 4 of a course proves critical from different points of view. Our study in [13] showed that students who made more quiz trials tended to persist and reach the later weeks.
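The week-over-week decline can be quantified directly from quiz-attempt records; a sketch with invented records (a student may try each quiz up to five times):

```python
from collections import Counter

def attempts_per_week(attempts):
    """attempts: iterable of (user, week) quiz-attempt records."""
    return Counter(week for _, week in attempts)

def steepest_drop(per_week, n_weeks):
    """Week with the largest decrease in attempts relative to the preceding week."""
    return max(range(2, n_weeks + 1),
               key=lambda w: per_week.get(w - 1, 0) - per_week.get(w, 0))

attempts = ([("u", 1)] * 10 + [("u", 2)] * 8 + [("u", 3)] * 6
            + [("u", 4)] * 2 + [("u", 5)] * 2)  # invented attempt counts
per_week = attempts_per_week(attempts)
print(steepest_drop(per_week, 5))  # → 4
```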
6 Discussion and Conclusion
This research study is divided into three parts. The first part reviews the literature on Learning Analytics, MOOCs, and Learning Analytics in MOOCs. With the 80 papers collected from the Elsevier Science library Scopus, we built a word cloud to highlight the vital trends in these two prominent fields. Topics of engagement, interactions, and social factors, as well as design and frameworks, were referenced the most. Further, Learning Analytics was employed more to improve students' interactions and engagement in the MOOC environment than to address the dropout problem. The second part presents our experience in implementing the iMooX Learning Analytics Prototype, which eases collecting data on and tracking students of the examined MOOC platform. We discussed its ontology, implementation architecture, and user interface. The third part evaluated the application. Different scenarios from iMooX were analyzed using advanced visualizations, statistics, clustering, and qualitative decisions.
The potential of Learning Analytics in MOOCs crystallizes in the interventions that follow from the evaluation results. We believe in designing shorter courses, such as four-week MOOCs instead of eight-week ones [21]. As a result, the workload would be cut in half, and students' efficiency would be higher. Additionally, enhancing social communication in the discussion forums, especially between the instructor and the students, would keep students connected, which would in turn decrease the dropout rate. We further discovered new types of students by categorizing and clustering them based on their activity. This will lead us to portray the engagement and behavior of subpopulations of learners on the platform.
We think Learning Analytics carries significant value for MOOCs from both pedagogical and technological perspectives. Proper interventions, predictions, and benchmarking of learning environments are difficult to achieve in MOOCs without the assistance of Learning Analytics. Finally, our future plans include an algorithm, currently under development, for an assistant tool that sends direct feedback to students in order to improve the completion rate. It will notify students directly in order to support a live awareness and reflection system.
References
1. Alario-Hoyos, C., Muñoz-Merino, P. J., Pérez-Sanagustín, M., Delgado Kloos, C., Parada,
G. H. A.: Who are the top contributors in a MOOC? Relating participants' performance
and contributions. Journal of Computer Assisted Learning. 32 (3), 232-243 (2016).
2. Baker, R. S., Siemens, G.: Educational Data Mining and Learning Analytics.
http://www.columbia.edu/~rsb2162/BakerSiemensHandbook2013.pdf. Accessed: 20 April
2016 (2016).
3. Chatti, M., Dyckhoff, A., Schroeder, U., Thüs, H.: A reference model for learning analyt-
ics. International Journal of Technology Enhanced Learning. 4, 5/6, 318-331 (2012).
4. Clow, D.: MOOCs and the funnel of participation. In: the Third International Conference
on Learning Analytics and Knowledge (LAK 13), Leuven, Belgium, pp. 185-189. ACM
(2013).
5. Clow, D.: The learning analytics cycle: closing the loop effectively. In: the 2nd Interna-
tional Conference on Learning Analytics and Knowledge (LAK '12), Vancouver, Canada,
pp. 134-138. ACM (2012).
6. Duval, E.: Attention please!: learning analytics for visualization and recommendation. In:
the 1st International Conference on Learning Analytics and Knowledge (LAK 11), Alber-
ta, Canada, pp. 9-17. ACM (2011).
7. Ebner, M., Schön, S., Kumar, S.: Guidelines for leveraging university didactics centers to
support OER uptake in German-speaking Europe. Education Policy Analysis Archives,
24(39) (2016).
8. Graf, S., Ives, C., Rahman, N., Ferri, A.: AAT: a tool for accessing and analysing students'
behaviour data in learning systems. In: the 1st International Conference on Learning Ana-
lytics and Knowledge (LAK 11), Alberta, Canada, pp. 174-179. ACM (2011).
9. Greller, W., Drachsler, H.: Translating Learning into Numbers: A Generic Framework for
Learning Analytics. Educational Technology & Society. 15 (3), 42-57 (2012).
10. Hollands, F. M., Tirthali, D.: MOOCs: Expectations and reality. Full report. Center for
Benefit-Cost Studies of Education, Teachers College, Columbia University, NY.
http://cbcse.org/wordpress/wp-
content/uploads/2014/05/MOOCs_Expectations_and_Reality.pdf. Accessed: 19 April
2016 (2014).
11. Jordan, K.: MOOC completion rates: The data. http://www.katyjordan.com/MOOCproject.
html. Accessed: 12 April 2016 (2013).
12. Khalil, H. Ebner, M.: MOOCs Completion Rates and Possible Methods to Improve Reten-
tion - A Literature Review. In: Proceedings of World Conference on Educational Multi-
media, Hypermedia and Telecommunications 2014, pp. 1236-1244. Chesapeake, VA:
AACE (2014).
13. Khalil, M., Ebner, M.: A STEM MOOC for school children—What does learning analytics
tell us?. In: the 2015 International Conference on Interactive Collaborative Learning (ICL
2015), Florence, Italy, pp. 1217-1221. IEEE (2015).
14. Khalil, M., Ebner, M.: Learning Analytics: Principles and Constraints. In: Carliner, S., Ful-
ford, C., & Ostashewski, N. (eds.), Proceedings of EdMedia: World Conference on Educa-
tional Media and Technology 2015, pp. 1789-1799. Chesapeake, VA: AACE (2015).
15. Khalil, M., Ebner, M.: What is Learning Analytics about? A Survey of Different Methods
Used in 2013-2015. In: the Smart Learning Conference, Dubai, UAE, pp. 294-304. Dubai:
HBMSU Publishing House (2016).
16. Khalil, M., Ebner, M.: What Massive Open Online Course (MOOC) Stakeholders Can
Learn from Learning Analytics?. In M. J. Spector, B. B. Lockee, & M. D. Childress (Eds.),
Learning, Design, and Technology. Springer International Publishing (in press).
17. Khalil, M., Kastl, C., Ebner, M.: Portraying MOOCs Learners: a Clustering Experience
Using Learning Analytics. In: the European Stakeholder Summit on experiences and best
practices in and around MOOCs (EMOOCS 2016), Graz, Austria, pp. 265-278. (2016).
18. Kizilcec, R. F., Piech, C., Schneider, E.: Deconstructing disengagement: analyzing learner
subpopulations in massive open online courses. In: the third international conference on
learning analytics and knowledge (LAK ’13) Leuven, Belgium, pp. 170-179. ACM (2013).
19. Knox, J.: From MOOCs to Learning Analytics: Scratching the surface of the 'visual'.
eLearn. 2014(11), ACM (2014).
20. Kopp, M., Ebner, M.: iMooX - Publikationen rund um das Pionierprojekt. Verlag Mayer.
Weinitzen (2015).
21. Lackner, E., Ebner, M., Khalil, M.: MOOCs as granular systems: design patterns to foster
participant activity. eLearning Papers. 42, 28-37 (2015).
22. Lackner, E., Khalil, M., Ebner, M.: How to foster forum discussions within MOOCs: A
case study. International Journal of Academic Research in Education. (in review).
23. McAuley, A., Stewart, B., Siemens, G.: The MOOC model for digital practice. Charlottetown: University of Prince Edward Island (2010). http://www.elearnspace.org/Articles/MOOC_Final.pdf.
24. Moissa, B., Gasparini, I., Kemczinski, A.: A Systematic Mapping on the Learning Analyt-
ics Field and Its Analysis in the Massive Open Online Courses Context. International
Journal of Distance Education Technologies. 13 (3), 1-24 (2015).
25. Online Course Report.: State of the MOOC 2016: A Year of Massive Landscape Change
For Massive Open Online Courses, 2016. http://www.onlinecoursereport.com/state-of-the-
mooc-2016-a-year-of-massive-landscape-change-for-massive-open-online-courses/. Ac-
cessed: 18 April 2016 (2016).
26. Papamitsiou, Z., & Economides, A. A.: Learning Analytics for Smart Learning Environ-
ments: A Meta-Analysis of Empirical Research Results from 2009 to 2015. In M. J. Spec-
tor, B. B. Lockee, & M. D. Childress (Eds.), Learning, Design, and Technology. pp.1-23.
Springer International Publishing (2016).
27. Siemens, G.: A learning theory for the digital age. Instructional Technology and Distance
Education, 2(1), 3-10 (2005).
28. Tseng, S.F., Tsao, Y. W., Yu, L. C., Chan, C. L., Lai, K.R.: Who will pass? Analyzing
learner behaviors in MOOCs. Research and Practice in Technology Enhanced Learn-
ing, 11(1), pp. 1-11. Springer, (2016).
29. Veletsianos, G., Shepherdson, P.: A Systematic Analysis and Synthesis of the Empirical
MOOC Literature Published in 2013-2015. The International Review of Research in Open
and Distributed Learning. 17(2), (2016).
30. Wachtler, J., Khalil, M., Taraghi, B., Ebner, M.: On Using Learning Analytics to Track the
Activity of Interactive MOOC videos. In: Proceedings of the LAK 2016 Workshop on
Smart Environments and Analytics in Video-Based Learning, Edinburgh, Scotland, pp.8-
17. CEUR (2016).