Draft version – originally published in: Leitner, P., Khalil, M., Ebner, M. (2017) Learning Analytics in
Higher Education – A Literature Review. In: Learning Analytics: Fundaments, Applications, and
Trends. Peña-Ayala, A. (Ed.). Springer International Publishing, pp. 1-23.
DOI: 10.1007/978-3-319-52977-6_1
Learning Analytics in Higher Education – A Literature Review
Philipp Leitner, Mohammad Khalil and Martin Ebner
Educational Technology, Graz University of Technology
{philipp.leitner, mohammad.khalil, martin.ebner}@tugraz.at
Münzgrabenstraße 35A/I, 8010 Graz, Austria
Abstract This chapter examines research studies of the last five years and presents the state of
the art of Learning Analytics (LA) in the Higher Education (HE) arena. We used a mixed-method
analysis and searched through three popular libraries: the Learning Analytics and Knowledge
(LAK) conference, the SpringerLink, and the Web of Science (WOS) databases. We examined a
total of 101 papers in depth during our study. Thereby, we are able to present an overview of the
different techniques used by the studies and their associated projects. To gain insights into the
trend direction of the different projects, we clustered the publications by their stakeholders.
Finally, we tackled the limitations of those studies and discussed the most promising future lines
and challenges. We believe the results of this review may assist universities in launching their
own Learning Analytics projects or improving existing ones.
Keywords: Learning Analytics, Higher Education, Stakeholders, Literature Review
1 Introduction
In the area of Higher Education, Learning Analytics has proven helpful to colleges and
universities in strategic areas such as resource allocation, student success, and finance. These
institutions are collecting more data than ever before to maximize strategic outcomes. Based on
key questions, the data is analyzed and predictions are made to gain insights and take action.
Many examples of the successful use of analytics and frameworks are available across a diverse
range of institutions (Bichsel 2012). However, Higher Education institutions see the ethical and
legal issues of collecting and processing students' data as barriers to Learning Analytics
(Sclater 2014).
In this chapter, we present a literature review to evaluate the progress of Learning Analytics
in Higher Education since its early beginnings in 2011. We conducted the search in three popular
libraries: the Learning Analytics and Knowledge (LAK) conference, the SpringerLink, and the Web
of Science (WOS) databases.
We then refined the returned results and settled on including 101 relevant publications. The
main contribution of this chapter is an analysis of these publications that lists the Learning
Analytics methods used, their limitations, and their stakeholders. We expect this study to serve
as a guide for academics who would like to improve existing Learning Analytics projects, and to
assist universities in launching their own.
The next section gives a short introduction to the topic of Learning Analytics and describes
Learning Analytics in Higher Education in detail. The subsequent sections are concerned with our
research design, methodology, and the execution of the review. The outcomes of the research
questions and the literature survey are presented in the third section. The penultimate section
discusses the findings and draws the conclusions of our survey. A glance at future trends is
presented in the last section.
2 A Profile of LA and LA in HE
In this section we present a profile of Learning Analytics in general and describe the analysis
process. Furthermore, we emphasize Learning Analytics in Higher Education, discuss its
challenges, and identify the stakeholders involved.
2.1 Learning Analytics
Since its first mention in the Horizon Report 2012 (Johnson et al. 2012), Learning Analytics has
gained increasing relevance. Learning Analytics is defined as "the measurement, collection,
analysis and reporting of data about learners and their contexts for purposes of understanding
and optimizing learning and the environments in which it occurs" (Elias 2011). Another definition
states "the use of intelligent data, learner-produced data, and analysis models to discover
information and social connection, and to predict and advise on learning" (Siemens 2010).
The Horizon Report 2013 identified Learning Analytics as one of the most important trends in
technology-enhanced learning and teaching (Johnson et al. 2013). It is therefore not surprising
that Learning Analytics is the subject of many scientific papers. Research on Learning Analytics
involves the development, use, and integration of new processes and tools to improve the
performance of teaching and learning for individual students and teachers. Learning Analytics
focuses specifically on the process of learning (Siemens and Long 2011).
Due to its connections with digital teaching and learning, Learning Analytics is an
interdisciplinary research field with links to teaching and learning research, computer science,
and statistics (Johnson et al. 2013). The available data is collected and analyzed, and the
insights gained are used to understand the behavior of students in order to provide them with
additional support (Gašević et al. 2015). A key concern of Learning Analytics is the gathering
and analysis of data as well as the setting of appropriate interventions to improve the learners'
learning experience (Greller et al. 2014). This "actionable intelligence" from data mining
supports teaching and learning and provides ideas for customization, tutoring, and intervention
within the learning environment (Campbell et al. 2007).
According to Campbell and Oblinger (2007), the analysis process consists of five steps, as shown
in Fig. 1.1.
Fig. 1.1. The five steps of the analysis process.
Capturing: data is captured and collected in real time from different sources (e.g., virtual
learning environments, learning management systems, personal learning environments, web portals,
forums, chat rooms, and so on) and combined with student information (Lauría et al. 2012; Tseng
et al. 2016).
Reporting: the collected data is used to generate accurate models for identifying and measuring
students' progress. Visualization is often used in Learning Analytics dashboards for a better
understanding of the data (Muñoz-Merino et al. 2013; Leony et al. 2013).
Predicting: the data is used to identify predictors of student success and outcomes and to
identify at-risk students. Furthermore, it informs decision-making about courses and resource
allocation by the decision-makers of the institutions (Akhtar et al. 2015; Lonn et al. 2012).
Acting: the information gained from the data analysis process is used to set appropriate
interventions, e.g., in teaching or in supporting students who are at risk of failure or of
dropping out (Freitas et al. 2015; Palmer 2013).
Refining: the gathered information is used in a cyclical process to continuously improve the
model used in teaching and learning (Nam et al. 2014; Pistilli et al. 2014).
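The five-step cycle above can be sketched as a minimal loop; all data, thresholds, and
intervention messages below are hypothetical placeholders for illustration, not any system from
the reviewed literature:

```python
# Minimal sketch of the Campbell & Oblinger five-step analysis cycle.
# All data and thresholds are hypothetical placeholders.

def capture():
    # Step 1: collect activity data (here: logins per student, hard-coded).
    return {"alice": 42, "bob": 3, "carol": 17}

def report(activity):
    # Step 2: build a simple model of progress (normalize to 0..1).
    peak = max(activity.values())
    return {student: count / peak for student, count in activity.items()}

def predict(progress, threshold):
    # Step 3: flag students whose progress falls below the threshold.
    return [s for s, p in progress.items() if p < threshold]

def act(at_risk):
    # Step 4: set an intervention for each flagged student.
    return {s: "send tutoring invitation" for s in at_risk}

def refine(threshold, n_flagged, n_students):
    # Step 5: adjust the model parameter for the next cycle.
    flagged_share = n_flagged / n_students
    return threshold * 0.9 if flagged_share > 0.5 else threshold

threshold = 0.25
activity = capture()
progress = report(activity)
at_risk = predict(progress, threshold)
interventions = act(at_risk)
threshold = refine(threshold, len(at_risk), len(activity))
print(at_risk)  # the student with very low activity is flagged
```

The point of the sketch is the cyclical structure: the refined threshold feeds back into the
next round of capturing and predicting.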
Although research in the field of Learning Analytics has boomed in recent years, the field is
still in its infancy. Students, researchers, and educational managers need to discuss ideas and
opportunities on how to integrate these possibilities into their research and practice
(Ferguson 2012).
2.2 Learning Analytics in Higher Education
Higher Education faces a future of uncertainty and change. In addition to national and global as
well as political and social changes, competition among universities is increasing. Higher
Education needs to increase financial and operational efficiency, expand local and global impact,
establish new funding models in a changing economic climate, and respond to demands for greater
accountability to ensure organizational success at all levels (van Barneveld et al. 2012). Higher
Education must cope with these external pressures in an efficient and dynamic manner, but must
also understand the needs of the student body, which is both the contributor to and the
beneficiary of this system (Shacklock 2016).
In addition to strong competition, universities have to deal with the rapidly changing
technologies that have arisen with the start of the digital age. In the course of this,
institutions have collected enormous amounts of relevant data as a by-product, for instance, when
students take an online course, use an Intelligent Tutoring System (ITS) (Arnold and Pistilli
2012; Bramucci and Gaston 2012; Fritz 2011; Santos et al. 2013), play educational games (Gibson
and de Freitas 2016; Holman et al. 2013; Holman et al. 2015; Westera et al. 2013), or simply use
an online learning platform (Casquero et al. 2014; Casquero et al. 2016; Wu and Chen 2013; Ma et
al. 2015; Santos et al. 2015; Softic et al. 2013).
In recent years, more and more universities have used Learning Analytics methods to obtain
findings on the academic progress of students, predict future behaviors, and recognize potential
problems at an early stage. Furthermore, Learning Analytics in the context of Higher Education is
an appropriate tool for reflecting on the learning behavior of students and for providing
suitable assistance from teachers or tutors. This individual or group support offers new ways of
teaching and provides a way to reflect on the learning behavior of the student. Another
motivation behind the use of Learning Analytics in universities is to improve inter-institutional
cooperation and to develop an agenda for the large community of students and teachers (Atif et
al. 2013).
On an international level, the recruitment, management, and retention of students have become
high-level priorities for decision-makers in institutions of Higher Education. In particular,
improving student retention, and understanding the reasons behind and/or predicting attrition,
have come into the focus of attention due to the financial losses, lower graduation rates, and
inferior school reputation in the eyes of all stakeholders that attrition entails (Delen 2010;
Palmer 2013).
Although Learning Analytics focuses strongly on the learning process, its results are beneficial
for all stakeholders. Romero and Ventura (2013) divided the stakeholders involved, based on their
objectives, benefits, and perspectives, into the following four groups:
Learners: support the learner with adaptive feedback, recommendations, and responses to his or
her needs, in order to improve learning performance.
Educators: understand students' learning processes, reflect on teaching methods and performance,
and understand social, cognitive, and behavioral aspects.
Researchers: use the data mining technique that best fits the problem, and evaluate learning
effectiveness in different settings.
Administrators: evaluate institutional resources and their educational offer.
3 Research Design, Methodology and Execution
This research aims to provide an overview of the advancement of the Learning Analytics field in
Higher Education since it emerged in 2011. The Research Questions (RQ) to be answered are:
RQ1: What are the research strands of the Learning Analytics field in Higher Education (between
January 2011 and February 2016)?
RQ2: What kind of limitations do the research papers and articles mention?
RQ3: Who are the stakeholders and how could they be categorized?
RQ4: What techniques do they use in their papers?
In accordance with this objective, we performed a literature review following the procedure of
Machi and McEvoy (2009). Fig. 3.1 displays the six steps for a literature review used in this
process.
[ADD Picture here]
Fig. 3.1. The literature review: Six steps to success. (Machi and McEvoy 2009)
After we selected our topic, we identified data sources based on their relevance in
the computing domain:
• The papers of the Learning Analytics and Knowledge conference published in
the ACM Digital Library,
• The SpringerLink, and
• The Thomson Reuters Web of Science database,
and defined the following search parameters:
For the LAK papers, we did not need to search for the "Learning Analytics" term, because the
whole conference covers the Learning Analytics discipline. We searched the title, the abstract,
and the author keywords for "Higher Education" and/or "University".
In the SpringerLink database, we searched for the "Learning Analytics" term in conjunction with
either "Higher Education" or "University" ("Learning Analytics" AND ("Higher Education" OR
"University")).
In the Web of Science database, we searched for the topic "Learning Analytics" in conjunction
with either "Higher Education" or "University" in the research domain "science technology".
The inclusion criteria for the papers fetched from the libraries were: a) written in English, and
b) published between January 2011 and February 2016. We superficially assessed the quality of the
reported studies, considering only articles that provided substantial information on Learning
Analytics in Higher Education, and excluded articles that did not meet the outlined inclusion
principles.
The literature survey was conducted in February and March 2016. In the initial search, we found a
total of 135 publications (LAK: 65, SpringerLink: 37, Web of Science: 33). During the first
stage, the search results were analyzed based on their titles, author keywords, and abstracts.
After this stage, 101 papers remained for the literature survey. We fully read each publication
and actively searched for its research questions, techniques, stakeholders, and limitations.
Regular meetings between the authors were set on a weekly basis to discuss the results.
Additionally, we added the Google Scholar (http://scholar.google.com) citation count to our
spreadsheet as a measure of each article's impact.
In the following, we analyze and answer each of the research questions separately.
3.1 Response to Research Question 1
In order to answer RQ1, which corresponds to "What are the research strands of the Learning
Analytics field in Higher Education (between January 2011 and February 2016)?", we tried to
extract the main topics from the research questions of the publications. We found that many of
the publications do not outline their research questions clearly; many of the examined
publications described use cases instead. This concerns in particular the older publications from
2011 and 2012, and probably results from the young age of the scientific field of Learning
Analytics. As a result, we performed a brief text analysis on the fetched abstracts in
order to examine the robust trends in the prominent field of Learning Analytics
and Higher Education. We collected all the article abstracts, processed them through the R
software, and then refined the resulting corpus. In the final stages, we extracted the keywords
and chose a word cloud as a representation of the terms, as shown in Fig. 3.2. The figure was
generated using the R package "wordcloud"
(https://cran.r-project.org/web/packages/wordcloud/index.html).
[ADD Picture here]
Fig. 3.2. Word cloud of the prominent terms from the abstracts
To ease reading the cloud, we adopted four levels of representation, depicted in four colors. The
obtained list of words was classified into single terms, bi-grams, tri-grams, and quad-grams. The
most frequent single words were "academic", "performance", "behavior", and "MOOCs". "Learning
environment", "case study", and "online learning" were the most repeated bi-grams. The most
frequent tri-grams in the abstracts were "learning management systems", "Higher Education
institutions", and "social network analysis", while the quad-grams were limited to "massive open
online courses", which was merged at the final filtering stage with the "MOOCs" term.
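The n-gram extraction described above can be sketched in a few lines of Python (the original
analysis was done in R; the two sample abstracts below are hypothetical stand-ins for the real
corpus):

```python
from collections import Counter

# Sketch of the n-gram frequency count described above; the two "abstracts"
# are hypothetical stand-ins for the real corpus (which was processed in R).
abstracts = [
    "learning analytics in higher education institutions",
    "learning analytics and academic performance in higher education",
]

def ngrams(text, n):
    # Split on whitespace and slide a window of length n over the tokens.
    tokens = text.split()
    return [" ".join(tokens[i:i + n]) for i in range(len(tokens) - n + 1)]

def top_ngrams(corpus, n, k=3):
    # Count n-grams over the whole corpus and return the k most frequent.
    counts = Counter(g for text in corpus for g in ngrams(text, n))
    return counts.most_common(k)

print(top_ngrams(abstracts, 2, k=1))  # most frequent bi-gram
```

In the actual study, a stop-word filter and stemming step would precede the counting; the
counting itself is the same regardless of n, which is why single terms, bi-grams, tri-grams, and
quad-grams can all be reported from one pass.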
The word cloud gives a glance at the general topics that arise when Learning Analytics is
associated with Higher Education. Learning Analytics researchers focused on using its techniques
to enhance performance and student behavior. The most popular educational environment studied was
the MOOC platform. Furthermore, Learning Analytics was also used for interventions and for
observing dropout, videos, dashboards, and engagement.
Fig. 3.3 shows the collected articles by library data source and year. The results show an
obvious increase in the number of publications since 2011: there were 32 papers in 2015, up from
26 articles in 2014 and 17 articles in 2013, while there were only 5 articles in 2011 and 12
articles in 2012. Because the publications in this study were collected in February 2016, the
year 2016 is not yet indexed with many papers. On the other hand, the figure shows the growing
share of journal articles from the SpringerLink and Web of Science libraries from 2013 onward.
Fig. 3.3. Collected articles distributed by source and year.
We cross-referenced the relevant publications with Google Scholar to derive
their citation impact. Table 3.1 shows the 10 most cited publications.
Table 3.1. Citation impact of the publications

Paper Title | Year of Publication | No. of Google Citations (Feb. 2016)
Course Signals at Purdue: Using Learning Analytics to Increase Student Success (Arnold and Pistilli 2012) | 2012 | 164
Social Learning Analytics: Five Approaches (Ferguson and Shum 2012) | 2012 | 94
Classroom walls that talk: Using online course activity data of successful students to raise self-awareness of underperforming peers (Fritz 2011) | 2011 | 52
Goal-oriented visualizations of activity tracking: a case study with engineering students (Santos et al. 2012) | 2012 | 46
Where is Research on Massive Open Online Courses Headed? A Data Analysis of the MOOC Research Initiative (Gasevic et al. 2014) | 2014 | 46
Course Correction: Using Analytics to Predict Course Success (Barber and Sharkey 2012) | 2012 | 36
Improving retention: predicting at-risk students by analyzing clicking behavior in a virtual learning environment (Wolff et al. 2013) | 2013 | 34
Learning designs and Learning Analytics (Lockyer and Dawson 2011) | 2011 | 33
The Pulse of Learning Analytics: Understandings and Expectations from the Stakeholders (Drachsler and Greller 2012) | 2012 | 30
Inferring Higher Level Learning Information from Low Level Data for the Khan Academy Platform (Muñoz-Merino et al. 2013) | 2013 | 28
3.2 Response to Research Question 2
For RQ2, which corresponds to "What kind of limitations do the research papers and articles
mention?", we identified three different types of limitations, either clearly mentioned in the
articles or implicit in their context.
Limitations through time: some of the publications stated that continuous work is needed
(Elbadrawy et al. 2015; Ifenthaler and Widanapathirana 2014; Koulocheri and Xenos 2013; Lonn et
al. 2012; Palavitsinis et al. 2011; Sharkey 2011), either because a longitudinal study would be
necessary to prove the hypotheses or because of the short duration of the project (Fritz 2011;
Nam et al. 2014; Ramírez-Correa and Fuentes-Vega 2015).
Limitations through size: other publications mentioned the need for more detailed data (Barber
and Sharkey 2012; Best and MacGregor 2015; Rogers et al. 2014), small group sizes (Junco and Clem
2015; Jo et al. 2015; Martin and Whitmer 2016; Strang 2016), uncertain scalability, possible
problems in a wider context, and the problem of generalizing the approach or method (Prinsloo et
al. 2015; Yasmin 2013).
Limitations through culture: many of the publications mention that their approach might only work
in their educational culture and may not be applicable elsewhere (Arnold et al. 2014; Drachsler
and Greller 2012; Grau-Valldosera and Minguillón 2014; Kung-Keat and Ng 2016). Additionally,
ethics differ strongly around the world, so cooperation projects between universities in
different countries need different moderation, and the use of data could be ethically
questionable (Abdelnour-Nocera et al. 2015; Ferguson and Shum 2012; Lonn et al. 2013; Park et al.
2016).
Furthermore, ethical discussions about data ownership and privacy have recently arisen. Slade and
Prinsloo (2013) pointed out that Learning Analytics touches various research areas and therefore
overlaps with ethical perspectives in the areas of data ownership and privacy. Questions about
who should own the collected and analyzed data were highly debated. As a result, the authors
classified the overlapping categories into three parts:
• the location and interpretation of data,
• informed consent, privacy, and the de-identification of data, and
• the management, classification, and storage of data.
These three elements generate an imbalance of power between the stakeholders, which the authors
addressed by proposing a list of six grounding principles and considerations: Learning Analytics
as moral practice; students as agents; student identity and performance as temporal dynamic
constructs; student success as a complex and multidimensional phenomenon; transparency; and that
Higher Education cannot afford to not use data (Slade and Prinsloo 2013).
3.3 Response to Research Question 3
In order to answer RQ3, which corresponds to "Who are the stakeholders and how could they be
categorized?", we determined the stakeholders from the publications and categorized them into
three types. As a basis, we took the four stakeholder groups mentioned in section 2.2 and
introduced by Romero and Ventura (2013). We merged the Researchers and Administrators from the
original classification into one distinct group. Thereby, the institutional perspective (Academic
Analytics) is separated from that of the learners and teachers (Learning Analytics).
Fig. 3.4 depicts the defined Learning Analytics stakeholders as a Venn diagram. The figure shows
that more research has been conducted concerning Researchers/Administrators, with 65 publications
overall and 40 of them concerning this group alone, than in the field of Learners, with a total
of 53 publications and 21 single mentions. Teachers, it seems, are only a "side-product" of this
field, with only 20 mentions and only 7 publications dedicated to them alone.
Fig. 3.4. Venn diagram of the stakeholders in the publications
Most of the combined articles addressed Researchers/Administrators together with Learners (20
publications). Only 8 articles show an overlap between Learners and Teachers, although this
should be one of the most researched and discussed combinations within Learning Analytics in
Higher Education. Nearly no work combines Researchers/Administrators with Teachers (1
publication), and only 4 papers combined all 3 stakeholders. This lack of research will be a
matter of debate in the discussion section.
3.4 Response to Research Question 4
By analyzing the selected studies to answer RQ4, which corresponds to "What techniques do they
use in their papers?", we identified the techniques used in the Learning Analytics and Higher
Education publications. We took into account the methods presented by Romero and Ventura (2013),
Khalil and Ebner (2016), and Linan and Perez (2015). Table 3.2 gives an overview of the
techniques used in the different articles.
Table 3.2. Overview of the Learning Analytics techniques used in this study

Technique | Key applications | Examples
Prediction | Predicting student performance and detecting student behaviors. | (AbuKhousa and Atif 2016; Cambruzzi et al. 2015; Harrison et al. 2015)
Clustering | Grouping similar materials or students based on their learning and interaction patterns. | (Aguiar et al. 2014; Asif et al. 2015; Scheffel et al. 2012)
Outlier Detection | Detection of students with difficulties or irregular learning processes. | (Grau-Valldosera and Minguillón 2011; Manso-Vázquez and Llamas-Nistal 2015; Sinclair and Kalvala 2015)
Relationship Mining | Identifying relationships in learner behavior patterns and diagnosing student difficulties. | (Kim et al. 2016; Pardo et al. 2015; Piety et al. 2014)
Social Network Analysis | Interpretation of the structure and relations in collaborative activities and interactions with communication tools. | (Hecking et al. 2014; Tervakari et al. 2013; Vozniuk et al. 2014)
Process Mining | Reflecting student behavior in terms of examination traces, consisting of a sequence of course, grade, and timestamp. | (Menchaca et al. 2015; Vahdat et al. 2015; Wise 2014)
Text Mining | Analyzing the contents of forums, chats, web pages, and documents. | (Gasevic et al. 2014; Lotsari et al. 2014; Prinsloo et al. 2012)
Distillation of Data for Human Judgment | Helping instructors to visualize and analyze the ongoing activities of the students and the use of information. | (Aguilar et al. 2014; Grann and Bushway 2014; Swenson 2014)
Discovery with Models | Identification of relationships among student behaviors and characteristics or contextual variables. Integration of psychometric modelling frameworks into machine-learning models. | (Gibson et al. 2014; Kovanović et al. 2015; Lockyer and Dawson 2011)
Gamification | Including possibilities for playful learning to maintain motivation, e.g., integration of achievements, experience points, or badges as indicators of success. | (Holman et al. 2013; Øhrstrøm et al. 2013; Westera et al. 2013)
Machine Learning | Finding hidden insights in data automatically (based on models that are exposed to new data and adapt independently). | (Corrigan et al. 2015; McKay et al. 2012; Nespereira et al. 2016)
Statistics | Analysis and interpretation of quantitative data for decision making. | (Clow 2014; Khousa and Atif 2014; Simsek et al. 2015)
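As a minimal illustration of the prediction technique that dominates the table, the sketch below
fits a one-feature least-squares model of final grades against forum activity; the post counts,
grades, and pass mark are invented for illustration and do not come from the reviewed studies:

```python
# Minimal sketch of the "Prediction" technique from Table 3.2: a simple
# least-squares fit of final grades against forum activity. All numbers
# are hypothetical illustration data, not from the reviewed studies.

def fit_line(xs, ys):
    # Ordinary least squares for a single feature: y = slope * x + intercept.
    n = len(xs)
    mean_x = sum(xs) / n
    mean_y = sum(ys) / n
    cov = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
    var = sum((x - mean_x) ** 2 for x in xs)
    slope = cov / var
    return slope, mean_y - slope * mean_x

# Hypothetical training data: forum posts per student vs. final grade (0-100).
posts = [2, 5, 9, 14, 20]
grades = [48, 55, 66, 78, 90]

slope, intercept = fit_line(posts, grades)

def predict_grade(n_posts):
    return slope * n_posts + intercept

# Flag a student as at risk if the predicted grade is below a pass mark of 50.
at_risk = predict_grade(1) < 50
```

The reviewed studies typically use richer models (regression with many features, decision trees,
or neural networks), but the pattern is the same: fit on historical student data, then predict
for current students and flag those below a threshold.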
The results in Fig. 3.5 show that the research focuses mainly on prediction, with a total of 36
counts, followed by distillation of data for human judgment in the form of visualizations, with
33 counts, and outlier detection for pointing out at-risk or dropping-out students, with 29
counts. All other techniques, including rarely used ones like gamification or machine learning,
account for a total of 102 counts.
[ADD Picture here]
Fig. 3.5. The publication count of the used Learning Analytics techniques
4 Discussion and Conclusion
In this chapter, we examined hundreds of pages to compile a literature review of the Learning
Analytics field in the Higher Education domain. We presented a state-of-the-art study of both
domains based on analyzing articles from three major library sources: the Learning Analytics and
Knowledge conference, SpringerLink, and Web of Science. The total number of relevant publications
was 101 articles in the period between 2011 and 2016.
In this literature review study, we followed the procedure of Machi and McEvoy (2009): we
selected the topic, searched the literature to get answers to the research questions, surveyed
and critiqued the literature, and finally presented our review. Using this dataset, we identified
the research strands of the relevant publications. Most of the publications described use cases
rather than comprehensive research - especially the earlier publications, which is
understandable, because at that time the universities had to figure out how to handle and harness
the abilities offered by Learning Analytics for their benefit.
To give a holistic overview of the advancement of the Learning Analytics field in Higher
Education, we proposed four main research questions. These questions related to the research
strands of Learning Analytics in Higher Education, its limitations, its stakeholders, and the
techniques used by Learning Analytics experts in the Higher Education domain, respectively.
The first research question was answered by generating a word cloud from a corpus formed from all
abstracts of the included papers. The results revealed that the usage of MOOCs, enhancing
learning performance, student behavior, and benchmarking learning environments were strongly
researched by Learning Analytics experts in the domain of Higher Education. In addition, the
paper titled "Course signals at Purdue: using learning analytics to increase student success" by
Arnold and Pistilli (2012), which focused on a prediction tool, was the most cited article of our
selection. We also identified a clear increase in the number of publications from 2011 to 2015,
and showed the growing share of journal articles from the SpringerLink and Web of Science
libraries relative to the LAK conference publications between 2013 and 2015.
The second research question showed that the limitations mainly concerned the time needed to
prepare data or obtain results, the size of the available dataset and examined group, and ethical
reasons. While the discussions of privacy and ownership have risen dramatically since 2012, we
found that ethical constraints drive the limitations to the greatest extent in this literature
review study, similar to the arguments in (Khalil and Ebner 2015; Khalil and Ebner 2016b).
The analysis shows that there was debate regarding who the main stakeholders of Learning
Analytics and Higher Education are. While the leading stakeholders of Learning Analytics should
be learners and students (Khalil and Ebner 2015), we found that researchers play a major role in
the loop between Higher Education and Learning Analytics. Fig. 3.4 demonstrated the strong
involvement of researchers and administrators in carrying out decisions. A direct overlap between
learners and teachers was rarely identified in our study.
In the final stage, we elaborated on the most used techniques of Learning Analytics in Higher
Education. This research question was answered based on the articles that discussed Learning
Analytics techniques. The scan showed that prediction, distillation of data for human judgment,
and outlier detection were the most used methods in the Higher Education domain. General data
mining methodologies, from text mining to social network analysis, were also identified with high
usage in the analyzed publications. On the other hand, we noticed new techniques that seem to
have been used more frequently in the past two years, such as serious gaming, which belongs to
the gamification techniques.
5 Future Trends
In this section, we tackle future developments in the field of Learning Analytics in Higher
Education, which can be divided into short-term (1-2 years) and long-term (3-5 years) trends.
5.1 Short-Term Trends
Over the next 1 to 2 years, universities must adjust to the social and economic factors that have
changed the capabilities of students (Johnson et al. 2016).
The tuning of analysis, consultation, and the examination of individual learning outcomes, as
well as the visualization of continuously available, aggregated information in dashboards, are
gaining more and more importance. Students expect real-time feedback during learning, with
critical self-reflection on learning progress and learning goals, which strengthens their
expertise in self-organization. If adequate quantities of student data are available, predictive
analytics can subsequently be carried out (Johnson et al. 2016).
5.2 Long-Term Trends
The relevance of Learning Analytics in Higher Education will grow even more over the next 3 to 5
years. This trend is promoted by the strong interest of students in individual evaluation and
support. To serve this market, dashboards and analysis applications that specifically address the
needs of each user will be developed further. This approach offers many advantages: accessing
one's own data in an appropriate form allows better self-reflection and a healthy rivalry among
fellow students.
Teachers can survey a large number of students and precisely recognize those
who need help. Potential university and college dropouts can be detected earlier
through appropriate analysis and retained in the university system through
targeted interventions (Shacklock 2016).
To master the associated problems, the Learning Analytics market will have to
change. Currently, many different systems and analytical approaches are in use,
and the fragmentation of the market will grow even further in the future, making
interuniversity comparison very difficult or even impossible. Therefore, the
creation of standards is essential (Shacklock 2016).
Furthermore, a change in the type of analysis is foreseeable. Until now, data
have mostly been used to measure the past success of students. Today, advances
in predictive analytics are becoming more important. By analyzing the existing
data sets of many students, predictive models can be developed that warn
students who are at risk of not achieving their learning goals (Shacklock
2016).
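As a hedged sketch of such predictive analytics (not a method prescribed by the cited reports), a minimal logistic-regression model can be fitted on engagement features of past cohorts and then used to warn current students. All feature names, data values, and thresholds below are invented for illustration.

```python
# Hedged sketch of predictive analytics: a minimal logistic-regression
# model, trained on a hypothetical past cohort, flags current students
# at risk. Features, data, and threshold are invented assumptions.
import math

def train(data, labels, lr=0.1, epochs=2000):
    """Fit logistic regression by plain stochastic gradient descent."""
    w, b = [0.0] * len(data[0]), 0.0
    for _ in range(epochs):
        for x, y in zip(data, labels):
            z = sum(wi * xi for wi, xi in zip(w, x)) + b
            err = 1.0 / (1.0 + math.exp(-z)) - y  # gradient w.r.t. the logit
            w = [wi - lr * err * xi for wi, xi in zip(w, x)]
            b -= lr * err
    return w, b

def at_risk(x, w, b, threshold=0.5):
    """True if the predicted risk probability exceeds the threshold."""
    z = sum(wi * xi for wi, xi in zip(w, x)) + b
    return 1.0 / (1.0 + math.exp(-z)) > threshold

# Hypothetical past cohort: (forum activity, quiz average), both in [0, 1];
# label 1 means the student did not meet the learning goals.
history = [(0.9, 0.8), (0.7, 0.9), (0.2, 0.3), (0.1, 0.2), (0.8, 0.7), (0.3, 0.1)]
outcomes = [0, 0, 1, 1, 0, 1]
w, b = train(history, outcomes)
print(at_risk((0.15, 0.25), w, b))  # low engagement → True (warn this student)
```

In practice such a model would of course be built on validated institutional data and evaluated carefully before any intervention is triggered.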
Acknowledgments This research project is co-funded by the European Commission Erasmus+
program, in the context of the project 562167-EPP-1-2015-1-BE-EPPKA3-PI-FORWARD.
References
Abdelnour-Nocera J, Oussena S, Burns C (2015) Human Work Interaction Design of the Smart
University. In Human Work Interaction Design. Work Analysis and Interaction Design
Methods for Pervasive and Smart Workplaces. Springer International Publishing, 127-140
AbuKhousa E, Atif Y (2016) Virtual Social Spaces for Practice and Experience Sharing. In
State-of-the-Art and Future Directions of Smart Learning, Springer Singapore, pp 409-414
Aguiar E, Chawla NV, Brockman J, Ambrose GA, Goodrich V (2014) Engagement vs perfor-
mance: using electronic portfolios to predict first semester engineering student retention. In
Proceedings of the Fourth International Conference on Learning Analytics And Knowledge.
ACM. pp 103-112
Aguilar S, Lonn S, Teasley SD (2014) Perceptions and use of an early warning system during a
higher education transition program. In Proceedings of the fourth international conference on
learning analytics and knowledge. ACM. pp 113-117
Akhtar S, Warburton S, Xu W (2015) The use of an online learning and teaching system for
monitoring computer aided design student participation and predicting student success. Inter-
national Journal of Technology and Design Education, pp 1-20
Arnold KE, Lonn S, Pistilli MD (2014) An exercise in institutional reflection: The learning ana-
lytics readiness instrument (LARI). In Proceedings of the Fourth International Conference on
Learning Analytics And Knowledge. ACM. pp 163-167
Arnold KE, Pistilli MD (2012) Course signals at Purdue: using learning analytics to increase stu-
dent success. Proceedings of the 2nd international conference on learning analytics and
knowledge, ACM, pp 267-270
Asif R, Merceron A, Pathan MK (2015) Investigating performance of students: a longitudinal
study. In Proceedings of the Fifth International Conference on Learning Analytics And
Knowledge. ACM. pp 108-112
Atif A, Richards D, Bilgin A, Marrone M (2013) Learning analytics in higher education: a sum-
mary of tools and approaches. 30th Australasian Society for Computers in Learning in Ter-
tiary Education Conference, Sydney.
Barber R, Sharkey M (2012) Course correction: using analytics to predict course success. Pro-
ceedings of the 2nd international conference on learning analytics and knowledge, ACM, pp
259-262.
Best M, MacGregor D (2015) Transitioning Design and Technology Education from physical
classrooms to virtual spaces: implications for pre-service teacher education. International
Journal of Technology and Design Education, 1-13.
Bichsel J (2012) Analytics in higher education: Benefits, barriers, progress, and recommenda-
tions. EDUCAUSE Center for Applied Research
Bramucci R, Gaston J (2012) Sherpa: increasing student success with a recommendation engine.
In Proceedings of the 2nd International Conference on Learning Analytics and Knowledge.
ACM. pp 82-83
Cambruzzi WL, Rigo SJ, Barbosa JL (2015) Dropout Prediction and Reduction in Distance Edu-
cation Courses with the Learning Analytics Multitrail Approach. J. UCS, 21(1), pp 23-47
Campbell JP, DeBlois PB, Oblinger DG (2007) Academic Analytics: A New Tool for a New
Era, EDUCAUSE Review, 42(4), pp 40–57
Campbell JP, Oblinger DG (2007) Academic analytics, EDUCAUSE White Paper. Retrieved
February 10, 2016 from https://net.educause.edu/ir/library/pdf/PUB6101.pdf
Casquero O, Ovelar R, Romo J, Benito M (2014) Personal learning environments, higher educa-
tion and learning analytics: a study of the effects of service multiplexity on undergraduate
students’ personal networks/Entornos de aprendizaje personales, educación superior y analíti-
ca del aprendizaje: un estudio sobre los efectos de la multiplicidad de servicios en las redes
personales de estudiantes universitarios. Cultura y Educación, 26(4), pp 696-738
Casquero O, Ovelar R, Romo J, Benito M, Alberdi M (2016) Students' personal networks in vir-
tual and personal learning environments: a case study in higher education using learning ana-
lytics approach. Interactive Learning Environments, 24(1), pp 49-67
Clow D (2014) Data wranglers: human interpreters to help close the feedback loop. In Proceed-
ings of the Fourth International Conference on Learning Analytics And Knowledge. ACM.
pp 49-53
Corrigan O, Smeaton AF, Glynn M, Smyth S (2015) Using Educational Analytics to Improve
Test Performance. In Design for Teaching and Learning in a Networked World. Springer In-
ternational Publishing. pp 42-55
Delen D (2010) A comparative analysis of machine learning techniques for student retention
management. Decision Support Systems, 49(4), pp 498-506.
Drachsler H, Greller W (2012) The pulse of learning analytics understandings and expectations
from the stakeholders. Proceedings of the 2nd international conference on learning analytics
and knowledge, ACM, pp 120-129.
Elbadrawy A, Studham RS, Karypis G (2015) Collaborative multi-regression models for predict-
ing students' performance in course activities. In Proceedings of the Fifth International Con-
ference on Learning Analytics And Knowledge. ACM. pp 103-107
Elias T (2011) Learning Analytics: Definitions, Processes and Potential
Ferguson R (2012) Learning analytics: drivers, developments and challenges. International Jour-
nal of Technology Enhanced Learning. 4(5/6), pp 304–317.
Ferguson R, Shum SB (2012) Social learning analytics: five approaches. Proceedings of the 2nd
international conference on learning analytics and knowledge, ACM, pp 23-33
Freitas S, Gibson D, Du Plessis C, Halloran P, Williams E, Ambrose M, Dunwell I, Arnab S
(2015) Foundations of dynamic learning analytics: Using university student data to increase
retention. British Journal of Educational Technology, 46(6), pp 1175-1188
Fritz J (2011) Classroom walls that talk: Using online course activity data of successful students
to raise self-awareness of underperforming peers. The Internet and Higher Education, 14(2),
pp 89-97.
Gašević D, Dawson S, Siemens G (2015) Let’s not forget: Learning analytics are about learning.
TechTrends, 59(1), pp 64-71.
Gasevic D, Kovanovic V, Joksimovic S, Siemens G (2014) Where is research on massive open
online courses headed? A data analysis of the MOOC Research Initiative. The International
Review Of Research In Open And Distributed Learning, 15(5).
Gibson A, Kitto K, Willis J (2014) A cognitive processing framework for learning analytics. In
Proceedings of the Fourth International Conference on Learning Analytics And Knowledge.
ACM. pp 212-216
Gibson D, de Freitas S (2016) Exploratory analysis in learning analytics. Technology,
Knowledge and Learning, 21(1), pp 5-19
Grann J, Bushway D (2014) Competency map: Visualizing student learning to promote student
success. In Proceedings of the fourth international conference on learning analytics and
knowledge. ACM. pp 168-172
Grau-Valldosera J, Minguillón J (2011) Redefining dropping out in online higher education: a
case study from the UOC. In Proceedings of the 1st International Conference on Learning
Analytics and Knowledge. ACM. pp 75-80
Grau-Valldosera J, Minguillón J (2014) Rethinking dropout in online higher education: The case
of the Universitat Oberta de Catalunya. The International Review of Research in Open and
Distributed Learning, 15(1).
Greller W, Ebner M, Schön M (2014) Learning Analytics: From Theory to Practice–Data Sup-
port for Learning and Teaching. Computer Assisted Assessment. Research into E-
Assessment. Springer International Publishing, pp 79-87.
Groeneveld CM (2014) Implementation of an Adaptive Training and Tracking Game in Statistics
Teaching. In International Computer Assisted Assessment Conference, Springer International
Publishing, pp 53-58
Harrison S, Villano R, Lynch G, Chen G (2015) Likelihood analysis of student enrollment out-
comes using learning environment variables: A case study approach. In Proceedings of the
Fifth International Conference on Learning Analytics And Knowledge. ACM. pp 141-145
Hecking T, Ziebarth S, Hoppe HU (2014) Analysis of dynamic resource access patterns in a
blended learning course. In Proceedings of the Fourth International Conference on Learning
Analytics and Knowledge. ACM. pp 173-182
Holman C, Aguilar S, Fishman B (2013) GradeCraft: what can we learn from a game-inspired
learning management system?. Proceedings of the Third International Conference on Learn-
ing Analytics and Knowledge, ACM, 260-264.
Holman C, Aguilar SJ, Levick A, Stern J, Plummer B, Fishman B (2015) Planning for success:
how students use a grade prediction tool to win their classes. In Proceedings of the Fifth In-
ternational Conference on Learning Analytics And Knowledge. ACM. pp 260-264
Ifenthaler D, Widanapathirana C (2014) Development and validation of a learning analytics
framework: Two case studies using support vector machines. Technology, Knowledge and
Learning, 19(1-2), pp 221-240
Jo IH, Yu T, Lee H, Kim Y (2015) Relations between student online learning behavior and aca-
demic achievement in higher education: A learning analytics approach. In Emerging issues in
smart learning. Springer Berlin Heidelberg. pp 275-287
Johnson L, Adams Becker S, Cummins M, Freeman A, Ifenthaler D, Vardaxis N (2013) Tech-
nology Outlook for Australian Tertiary Education 2013-2018: An NMC Horizon Project Re-
gional Analysis. New Media Consortium.
Johnson L, Adams S, Cummins M (2012) The NMC Horizon Report: 2012 Higher Education
Edition. The New Media Consortium, Austin
Johnson L, Adams S, Cummins M, Estrada V, Freeman A, Hall C (2016) NMC Horizon Report:
2016 Higher Education Edition. The New Media Consortium, Austin, Texas.
http://cdn.nmc.org/media/2016-nmc-horizon-report-he-EN.pdf
Junco R, Clem C (2015) Predicting course outcomes with digital textbook usage data. The Inter-
net and Higher Education, 27, pp 54-63.
Khalil M, Ebner M (2015) Learning Analytics: Principles and Constraints. In Proceedings of
World Conference on Educational Multimedia, Hypermedia and Telecommunications pp
1326-1336
Khalil M, Ebner M (2016a) What is Learning Analytics about? A Survey of Different Methods
Used in 2013-2015. Proceedings of Smart Learning Conference, Dubai, UAE, 7-9 March,
Dubai: HBMSU Publishing House, 294-304.
Khalil M, Ebner M (2016b). De-Identification in Learning Analytics. Journal of Learning Ana-
lytics, 3(1), pp 129-138 http://dx.doi.org/10.18608/jla.2016.31.8
Khousa EA, Atif Y (2014) A Learning Analytics Approach to Career Readiness Development in
Higher Education. In International Conference on Web-Based Learning. Springer Interna-
tional Publishing. pp 133-141
Kim J, Jo IH, Park Y (2016) Effects of learning analytics dashboard: analyzing the relations
among dashboard utilization, satisfaction, and learning achievement. Asia Pacific Education
Review, 17(1), pp 13-24
Koulocheri E, Xenos M (2013) Considering formal assessment in learning analytics within a
PLE: the HOU2LEARN case. In Proceedings of the Third International Conference on
Learning Analytics and Knowledge. ACM. pp 28-32
Kovanović V, Gašević D, Dawson S, Joksimović S, Baker RS, Hatala M (2015) Penetrating the
black box of time-on-task estimation. In Proceedings of the Fifth International Conference on
Learning Analytics And Knowledge. ACM. pp 184-193
Kung-Keat T, Ng J (2016) Confused, Bored, Excited? An Emotion Based Approach to the De-
sign of Online Learning Systems. In 7th International Conference on University Learning and
Teaching (InCULT 2014) Proceedings. Springer Singapore. pp 221-233
Lauría EJ, Baron JD, Devireddy M, Sundararaju V, Jayaprakash SM (2012) Mining academic
data to improve college student retention: An open source perspective. In Proceedings of the
2nd International Conference on Learning Analytics and Knowledge, ACM, pp 139-142
Leony D, Muñoz-Merino PJ, Pardo A, Kloos CD (2013) Provision of awareness of learners’
emotions through visualizations in a computer interaction-based environment. Expert Sys-
tems with Applications, 40(13), 5093-5100.
Liñán LC, Pérez ÁAJ (2015) Educational Data Mining and Learning Analytics: differences, sim-
ilarities, and time evolution. Revista de Universidad y Sociedad del Conocimiento, 12(3), 98-
112.
Lockyer L, Dawson S (2011) Learning designs and learning analytics. Proceedings of the 1st in-
ternational conference on learning analytics and knowledge, ACM, pp 153-156.
Lonn S, Aguilar S, Teasley SD (2013) Issues, challenges, and lessons learned when scaling up a
learning analytics intervention. In Proceedings of the third international conference on learn-
ing analytics and knowledge. ACM. pp 235-239
Lonn S, Krumm AE, Waddington RJ, Teasley SD (2012) Bridging the gap from knowledge to
action: Putting analytics in the hands of academic advisors. Proceedings of the 2nd Interna-
tional Conference on Learning Analytics and Knowledge, ACM, 184-18
Lotsari E, Verykios VS, Panagiotakopoulos C, Kalles D (2014) A learning analytics methodolo-
gy for student profiling. In Hellenic Conference on Artificial Intelligence. Springer Interna-
tional Publishing. pp 300-312
Ma J, Han X, Yang J, Cheng J (2015) Examining the necessary condition for engagement in an
online learning environment based on learning analytics approach: The role of the instructor.
The Internet and Higher Education, 24, pp 26-34
Machi LA, McEvoy BT (2009) The literature review: Six steps to success. Thousand Oaks:
Corwin Sage
Manso-Vázquez M, Llamas-Nistal M (2015) A Monitoring System to Ease Self-Regulated
Learning Processes. IEEE Revista Iberoamericana de Tecnologias del Aprendizaje, 10(2), pp
52-59
Martin F, Whitmer JC (2016) Applying Learning Analytics to Investigate Timed Release in
Online Learning. Technology, Knowledge and Learning, 21(1), 59-74.
McKay T, Miller K, Tritz J (2012) What to do with actionable intelligence: E 2 Coach as an in-
tervention engine. In Proceedings of the 2nd International Conference on Learning Analytics
and Knowledge. ACM. pp 88-91
Menchaca I, Guenaga M, Solabarrieta J (2015) Project-Based Learning: Methodology and As-
sessment Learning Technologies and Assessment Criteria. In Design for Teaching and Learn-
ing in a Networked World. Springer International Publishing. pp 601-604
Mirriahi N, Liaqat D, Dawson S, Gašević D (2016) Uncovering student learning profiles with a
video annotation tool: reflective learning with and without instructional norms. Educational
Technology Research and Development, pp 1-24
Muñoz-Merino PJ, Valiente JAR, Kloos CD (2013) Inferring higher level learning information
from low level data for the Khan Academy platform. Proceedings of the third international
conference on learning analytics and knowledge, ACM, 112-116
Nam S, Lonn S, Brown T, Davis CS, Koch D (2014) Customized course advising: investigating
engineering student success with incoming profiles and patterns of concurrent course enroll-
ment. Proceedings of the Fourth International Conference on Learning Analytics And
Knowledge, ACM, 16-25.
Nespereira CG, Elhariri E, El-Bendary N, Vilas AF, Redondo RPD (2016) Machine Learning
Based Classification Approach for Predicting Students Performance in Blended Learning. In
The 1st International Conference on Advanced Intelligent System and Informatics
(AISI2015), November 28-30, 2015, Beni Suef, Egypt. Springer International Publishing. pp
47-56
Øhrstrøm P, Sandborg-Petersen U, Thorvaldsen S, Ploug T (2013) Teaching logic through web-
based and gamified quizzing of formal arguments. In European Conference on Technology
Enhanced Learning. Springer Berlin Heidelberg. pp 410-423
Palavitsinis N, Protonotarios V, Manouselis N (2011) Applying analytics for a learning portal:
the Organic. Edunet case study. Proceedings of the 1st International Conference on Learning
Analytics and Knowledge, ACM, 140-146.
Palmer S (2013) Modelling engineering student academic performance using academic analytics.
International journal of engineering education, 29(1), pp 132-138.
Pardo A, Mirriahi N, Dawson S, Zhao Y, Zhao A, Gašević D (2015) Identifying learning strate-
gies associated with active use of video annotation software. In Proceedings of the Fifth In-
ternational Conference on Learning Analytics And Knowledge. ACM. pp 255-259
Park Y, Yu JH, Jo IH (2016) Clustering blended learning courses by online behavior data: A case
study in a Korean higher education institute. The Internet and Higher Education, 29, pp 1-11
Piety PJ, Hickey DT, Bishop MJ (2014) Educational data sciences: framing emergent practices
for analytics of learning, organizations, and systems. In Proceedings of the Fourth Interna-
tional Conference on Learning Analytics And Knowledge. ACM. pp 193-202
Pistilli MD, Willis III JE, Campbell JP (2014) Analytics through an institutional lens: Definition,
theory, design, and impact. In Learning Analytics. Springer New York. pp 79-102
Prinsloo P, Archer E, Barnes G, Chetty Y, Van Zyl D (2015) Big (ger) data as better data in open
distance learning. The International Review of Research in Open and Distributed Learning,
16(1).
Prinsloo P, Slade S, Galpin F (2012) Learning analytics: challenges, paradoxes and opportunities
for mega open distance learning institutions. In Proceedings of the 2nd International Confer-
ence on Learning Analytics and Knowledge. ACM. pp 130-133
Ramírez-Correa P, Fuentes-Vega C (2015) Factors that affect the formation of networks for col-
laborative learning: an empirical study conducted at a Chilean university/Factores que afectan
la formación de redes para el aprendizaje colaborativo: un estudio empírico conducido en una
universidad chilena. Ingeniare: Revista Chilena de Ingenieria, 23(3), 341
Rogers T, Colvin C, Chiera B (2014) Modest analytics: using the index method to identify stu-
dents at risk of failure. In Proceedings of the Fourth International Conference on Learning
Analytics And Knowledge. ACM. pp 118-122
Romero C, Ventura S (2013) Data mining in education. Wiley Interdisciplinary Reviews: Data
Mining and Knowledge Discovery, 3(1), pp 12-27
Santos JL, Govaerts S, Verbert K, Duval E (2012) Goal-oriented visualizations of activity track-
ing: a case study with engineering students. Proceedings of the 2nd international conference
on learning analytics and knowledge, ACM, 143-152
Santos JL, Verbert K, Govaerts S, Duval E (2013) Addressing learner issues with StepUp!: an
evaluation. In Proceedings of the Third International Conference on Learning Analytics and
Knowledge. ACM. pp 14-22
Santos JL, Verbert K, Klerkx J, Duval E, Charleer S, Ternier S (2015) Tracking data in open
learning environments. Journal of Universal Computer Science, 21(7), pp 976-996
Scheffel M, Niemann K, Leony D, Pardo A, Schmitz, HC, Wolpers M, Kloos, CD (2012) Key
action extraction for learning analytics. In European Conference on Technology Enhanced
Learning. Springer Berlin Heidelberg. pp 320-333
Sclater N (2014) Code of practice “essential” for learning analytics.
http://analytics.jiscinvolve.org/wp/2014/09/18/code-of-practice-essential-for-learning-
analytics/
Shacklock X (2016) From Bricks to Clicks: the potential of data and analytics in Higher Educa-
tion. The Higher Education Commission’s (HEC) report.
Sharkey M (2011) Academic analytics landscape at the University of Phoenix. In Proceedings of
the 1st International Conference on Learning Analytics and Knowledge. ACM. pp 122-126
Siemens G (2010) What are learning analytics. Retrieved February 10, 2016 from
http://www.elearnspace.org/blog/2010/08/25/what-are-learning-analytics/
Siemens G, Long P (2011) Penetrating the Fog: Analytics in Learning and Education.
EDUCAUSE review, 46(5), pp 30-40
Simsek D, Sándor Á, Shum SB, Ferguson R, De Liddo A, Whitelock D (2015) Correlations be-
tween automated rhetorical analysis and tutors' grades on student essays. In Proceedings of
the Fifth International Conference on Learning Analytics And Knowledge. ACM. pp 355-359
Sinclair J, Kalvala S (2015) Engagement measures in massive open online courses. In Interna-
tional Workshop on Learning Technology for Education in Cloud. Springer International
Publishing, pp 3-15
Slade S, Prinsloo P (2013) Learning analytics ethical issues and dilemmas. American Behavioral
Scientist, 57(10), 1510-1529.
Softic S, Taraghi B, Ebner M, De Vocht L, Mannens E, Van de Walle R (2013) Monitoring
learning activities in PLE using semantic modelling of learner behaviour. In Human Factors
in Computing and Informatics. Springer Berlin Heidelberg. pp 74-90
Strang KD (2016) Beyond engagement analytics: which online mixed-data factors predict stu-
dent learning outcomes?. Education and Information Technologies, 1-21.
Swenson J (2014) Establishing an ethical literacy for learning analytics. In Proceedings of the
Fourth International Conference on Learning Analytics And Knowledge. ACM. pp 246-250
Tervakari AM, Marttila J, Kailanto M, Huhtamäki J, Koro J, Silius K (2013) Developing learn-
ing analytics for TUT Circle. In Open and Social Technologies for Networked Learning.
Springer Berlin Heidelberg. pp 101-110
Tseng SF, Tsao YW, Yu LC, Chan CL, Lai KR (2016) Who will pass? Analyzing learner behav-
iors in MOOCs. Research and Practice in Technology Enhanced Learning, 11(1), p 1
Vahdat M, Oneto L, Anguita D, Funk M, Rauterberg M (2015) A Learning Analytics Approach
to Correlate the Academic Achievements of Students with Interaction Data from an Educa-
tional Simulator. In Design for Teaching and Learning in a Networked World, Springer In-
ternational Publishing, pp 352-366
van Barneveld A, Arnold KE, Campbell JP (2012) Analytics in higher education: establishing a
common language. EDUCAUSE Learning Initiative 1, pp 1-11
Vozniuk A, Holzer A, Gillet D (2014) Peer assessment based on ratings in a social media course.
In Proceedings of the Fourth International Conference on Learning Analytics And
Knowledge. ACM. pp 133-137
Westera W, Nadolski R, Hummel H (2013) Learning analytics in serious gaming: uncovering
the hidden treasury of game log files. In International Conference on Games and Learning
Alliance. Springer International Publishing. pp 41-52
Wise AF (2014) Designing pedagogical interventions to support student use of learning analyt-
ics. In Proceedings of the Fourth International Conference on Learning Analytics And
Knowledge. ACM. pp 203-211
Wolff A, Zdrahal Z, Nikolov A, Pantucek M (2013) Improving retention: predicting at-risk stu-
dents by analysing clicking behaviour in a virtual learning environment. Proceedings of the
third international conference on learning analytics and knowledge, ACM, pp 145-149.
Wu IC, Chen WS (2013) Evaluating the Practices in the E-Learning Platform from the Perspec-
tive of Knowledge Management. In Open and Social Technologies for Networked Learning,
Springer Berlin Heidelberg. pp 81-90
Yasmin D (2013) Application of the classification tree model in predicting learner dropout be-
haviour in open and distance learning. Distance Education, 34(2), 218-231.