Draft version originally published in: Leitner, P., Khalil, M., Ebner, M. (2017) Learning Analytics in Higher Education - A Literature Review. In: Learning Analytics: Fundaments, Applications, and Trends. Peña-Ayala, A. (Ed.). Springer International Publishing. DOI: 10.1007/978-3-319-52977-6_1. pp. 1-23
Learning Analytics in Higher Education - A Literature Review
Philipp Leitner, Mohammad Khalil and Martin Ebner
Educational Technology, Graz University of Technology
{philipp.leitner, mohammad.khalil, martin.ebner}
Münzgrabenstraße 35A/I, 8010 Graz, Austria
Abstract This chapter examines research studies of the last five years and presents the state of the art of Learning Analytics (LA) in the Higher Education (HE) arena. We used mixed-method analysis and searched through three popular libraries, including the Learning Analytics and Knowledge (LAK) conference, the SpringerLink, and the Web of Science (WOS) databases. We deeply examined a total of 101 papers during our study. Thereby, we are able to present an overview of the different techniques used by the studies and their associated projects. To gain insights into the trend direction of the different projects, we clustered the publications by their stakeholders. Finally, we tackled the limitations of those studies and discussed the most promising future lines and challenges. We believe the results of this review may assist universities to launch their own Learning Analytics projects or improve existing ones.
Keywords: Learning Analytics, Higher Education, Stakeholders, Literature Review
1 Introduction
In the area of Higher Education, Learning Analytics has proven helpful to colleges and universities in strategic areas such as resource allocation, student success, and finance. These institutions are collecting more data than ever before in order to maximize strategic outcomes. Guided by key questions, the data is analyzed and predictions are made to gain insights and set actions. Many examples of successful use of analytics and frameworks are available across a diverse range of institutions (Bichsel 2012). At the same time, ethical and legal issues of collecting and processing students' data are seen by Higher Education institutions as barriers to Learning Analytics (Sclater 2014).
In this chapter, we present a literature review to evaluate the progress of Learning Analytics in Higher Education since its early beginning in 2011. We conducted the search with three popular libraries: the Learning Analytics and Knowledge (LAK) conference, the SpringerLink, and the Web of Science (WOS) databases. We then refined the returned results and settled on including 101 relevant publications. This chapter contributes mainly by analyzing them and listing the Learning Analytics methods used, their limitations, and their stakeholders. It is expected that this study will serve as a guide for academics who would like to improve existing Learning Analytics projects, and will assist universities in launching their own.
The next section gives a short introduction to the topic of Learning Analytics and describes Learning Analytics in Higher Education in detail. The subsequent sections are concerned with our research design, methodology, and execution of the review. The outcomes of the research questions and the literature survey are presented in the third section. The penultimate section discusses the findings and presents the conclusion of our survey. A glance at future trends is given in the last section.
2 A Profile of LA and LA in HE
In this section, we present a profile of Learning Analytics in general and describe the analysis process. Further, we give emphasis to Learning Analytics in Higher Education, discuss challenges, and identify the involved stakeholders.
2.1 Learning Analytics
Since its first mention in the Horizon Report 2012 (Johnson et al. 2012), Learning Analytics has gained increasing relevance. Learning Analytics is defined as "the measurement, collection, analysis and reporting of data about learners and their contexts for purposes of understanding and optimizing learning and the environments in which it occurs" (Elias 2011). Another definition describes it as "the use of intelligent data, learner-produced data, and analysis models to discover information and social connection, and to predict and advise on learning" (Siemens 2010).
The Horizon Report 2013 identified Learning Analytics as one of the most important trends in technology-enhanced learning and teaching (Johnson et al. 2013).
Therefore, it is not surprising that Learning Analytics is the subject of many scientific papers. Research on Learning Analytics involves the development, use, and integration of new processes and tools to improve the performance of teaching and learning for individual students and teachers. Learning Analytics focuses specifically on the process of learning (Siemens and Long 2011).
Due to its connections with digital teaching and learning, Learning Analytics is an interdisciplinary research field with links to teaching and learning research, computer science, and statistics (Johnson et al. 2013). The available data is collected and analyzed, and the gained insights are used to understand the behavior of students in order to provide them with additional support (Gašević et al. 2015). A key concern of Learning Analytics is the gathering and analysis of data, as well as the setting of appropriate interventions to improve the learners' learning experience (Greller et al. 2014). This "actionable intelligence" derived from data mining supports teaching and learning and provides ideas for customization, tutoring, and intervention within the learning environment (Campbell et al. 2007).
According to Campbell and Oblinger (2007), the analysis process consists of five steps, shown in Fig. 1.1.
Fig. 1.1. The five steps of the analysis process.
Capturing: data is captured and collected in real time from different sources (e.g. virtual learning environments, learning management systems, personal learning environments, web portals, forums, chat rooms, and so on) and combined with student information (Lauría et al. 2012; Tseng et al. 2016).
Reporting: the collected data is used to generate accurate models for identifying and measuring the students' progress. Visualization is often used in Learning Analytics dashboards for a better understanding of the data (Muñoz-Merino et al. 2013; Leony et al. 2013).
Predicting: the data is used to identify predictors of student success and outcomes, and to identify at-risk students. Further, it informs decision-making about courses and resource allocation by the decision-makers of the institutions (Akhtar et al. 2015; Lonn et al. 2012).
Acting: the information gained from the data analysis process is used to set appropriate interventions, e.g. in teaching or in supporting students who are at risk of failure or dropping out (Freitas et al. 2015; Palmer 2013).
Refining: the gathered information is used in a cyclical process to continuously improve the model used in teaching and learning (Nam et al. 2014; Pistilli et al. 2014).
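The five-step cycle can be sketched in code. The following is a purely illustrative pipeline; all function names, the event format, the activity threshold, and the intervention are our own assumptions and are not taken from any cited system:

```python
# Illustrative sketch of the five-step analysis cycle
# (capture -> report -> predict -> act -> refine). Hypothetical names.

def capture(sources):
    """Collect raw (student, action) events from several environments."""
    return [event for source in sources for event in source]

def report(events):
    """Aggregate events into a simple per-student activity model."""
    model = {}
    for student, _action in events:
        model[student] = model.get(student, 0) + 1
    return model

def predict(model, threshold=2):
    """Flag students whose activity falls below a threshold as at-risk."""
    return {s for s, activity in model.items() if activity < threshold}

def act(at_risk):
    """Map each at-risk student to an intervention."""
    return {s: "send reminder / offer tutoring" for s in at_risk}

def refine(threshold, feedback):
    """Adjust the model parameter based on intervention outcomes."""
    return threshold + 1 if feedback == "too few flagged" else threshold

logs = [[("alice", "view"), ("alice", "post")], [("bob", "view")]]
interventions = act(predict(report(capture(logs))))
print(interventions)  # only the low-activity student is flagged
```

The point of the sketch is the cyclical data flow, not the (deliberately trivial) model: the refining step feeds back into the prediction parameters on the next pass.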
Although research in the field of Learning Analytics has boomed in recent years, Learning Analytics is still in its infancy. Students, researchers, and educational managers need to discuss ideas and opportunities on how to integrate its possibilities into their research and practice (Ferguson 2012).
2.2 Learning Analytics in Higher Education
Higher Education looks forward to a future of uncertainty and change. In addition to national and global as well as political and social changes, competition at the university level is increasing. Higher Education needs to increase financial and operational efficiency, expand local and global impact, establish new funding models in a changing economic climate, and respond to demands for greater accountability in order to ensure organizational success at all levels (van Barneveld et al. 2012). Higher Education must cope with these external pressures in an efficient and dynamic manner, while also understanding the needs of the student body, which represents both the contributor to and the donor of this system (Shacklock 2016).
In addition to the strong competition, universities have to deal with the rapidly changing technologies that have arisen with the entry of the digital age. In the course of this, institutions have collected enormous amounts of relevant data as a by-product, generated, for instance, when students take an online course, use an Intelligent Tutoring System (ITS) (Arnold and Pistilli 2012; Bramucci and Gaston 2012; Fritz 2011; Santos et al. 2013), play educational games (Gibson and de Freitas 2016; Holman et al. 2013; Holman et al. 2015; Westera et al. 2013), or simply use an online learning platform (Casquero et al. 2014; Casquero et al. 2016; Wu and Chen 2013; Ma et al. 2015; Santos et al. 2015; Softic et al. 2013).
In recent years, more and more universities have been using Learning Analytics methods to obtain findings on the academic progress of students, predict future behaviors, and recognize potential problems at an early stage. Further, Learning Analytics in the context of Higher Education is an appropriate tool for reflecting the learning behavior of students and for providing suitable assistance from teachers or tutors. This individual or group support offers new ways of teaching and provides a way to reflect on the learning behavior of the students. Another motivation behind the use of Learning Analytics in universities is to improve inter-institutional cooperation and to develop an agenda for the large community of students and teachers (Atif et al. 2013).
On an international level, the recruitment, management, and retention of students have become high-level priorities for decision-makers in institutions of Higher Education. In particular, improving student retention, and understanding the reasons behind and/or predicting attrition, have come into the focus of attention due to the financial losses, lower graduation rates, and inferior school reputation in the eyes of all stakeholders that attrition entails (Delen 2010; Palmer 2013).
Although Learning Analytics focuses strongly on the learning process, its results are beneficial for all stakeholders. Romero and Ventura (2013) divided the involved stakeholders, based on their objectives, benefits, and perspectives, into the following four groups:
Learners: support the learner with adaptive feedback, recommendations, and responses to his or her needs, for learning performance improvement.
Educators: understand students' learning processes, reflect on teaching methods and performance, understand social, cognitive, and behavioral aspects.
Researchers: use the data mining technique that best fits the problem, evaluate learning effectiveness for different settings.
Administrators: evaluate institutional resources and their educational offer.
3 Research Design, Methodology and Execution
This research aims to provide an overview of the advancement of the Learning Analytics field in Higher Education since it emerged in 2011. The proposed Research Questions (RQ) to answer are:
RQ1: What are the research strands of the Learning Analytics field in Higher Education (between January 2011 and February 2016)?
RQ2: What kind of limitations do the research papers and articles mention?
RQ3: Who are the stakeholders and how could they be categorized?
RQ4: What techniques do they use in their papers?
In accordance with this objective, we performed a literature review following the procedure of Machi and McEvoy (2009). Fig. 3.1 displays the six steps used in this process.
Fig. 3.1. The literature review: Six steps to success. (Machi and McEvoy 2009)
After we selected our topic, we identified data sources based on their relevance in
the computing domain:
The papers of the Learning Analytics and Knowledge conference published in
the ACM Digital Library,
The SpringerLink, and
The Thomson Reuters Web of Science database.
We used the following search parameters:
In the LAK papers, we did not need to search for the "Learning Analytics" term because the whole conference covers the Learning Analytics discipline. We searched the title, the abstract, and the author keywords for "Higher Education" and/or "University".
In the SpringerLink database, we searched for the "Learning Analytics" term in conjunction with either "Higher Education" or "University" ("Learning Analytics" AND ("Higher Education" OR "University")).
In the Web of Science database, we searched for the topic “Learning Analytics”
in conjunction with either “Higher Education” or “University” and in the research
domain “science technology”.
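Written out as explicit search strings (in a generic boolean syntax, not the exact query language of any of the three libraries), the parameters above amount to the following sketch:

```python
# Compose the search strings described above. The boolean syntax is
# generic and illustrative; each library uses its own query dialect.

def quoted_or(terms):
    """Join terms into a parenthesized OR clause of quoted phrases."""
    return "(" + " OR ".join('"{}"'.format(t) for t in terms) + ")"

he_terms = quoted_or(["Higher Education", "University"])

queries = {
    # LAK proceedings already cover Learning Analytics, so only the
    # HE terms are needed (searched in title, abstract, keywords).
    "LAK": he_terms,
    "SpringerLink": '"Learning Analytics" AND ' + he_terms,
    "Web of Science": '"Learning Analytics" AND ' + he_terms
                      + ' AND research domain: "science technology"',
}

for library, query in queries.items():
    print(library + ": " + query)
```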
The defined inclusion criteria for the papers fetched from the libraries were: a) written in English, and b) published between 2011 and February 2016. We superficially assessed the quality of the reported studies, considering only articles that provided substantial information on Learning Analytics in Higher Education. Therefore, we excluded articles that did not meet the outlined inclusion principles.
The literature survey was conducted in February and March 2016. In the initial search, we found a total of 135 publications (LAK: 65, SpringerLink: 37, Web of Science: 33). During the first stage, the search results were analyzed based on their titles, author keywords, and abstracts. After this stage, 101 papers remained for the literature survey. We fully read each publication and actively searched for their research questions, techniques, stakeholders, and limitations. Regular meetings between the authors were held on a weekly basis to discuss the results. Additionally, we added the Google Scholar citation count to our spreadsheet as a measurement of each article's impact.
In order to present our findings, we analyze each of the research questions separately in the following subsections.
3.1 Response to Research Question 1
In order to answer RQ1, which corresponds to "What are the research strands of the Learning Analytics field in Higher Education (between January 2011 and February 2016)?", we tried to extract the main topics from the research questions of the publications. We identified that many of the publications do not outline their research questions clearly; many of the examined publications described use cases instead. This concerns in particular the older publications of 2011 and 2012, and probably results from the young age of the scientific field of Learning Analytics. As a result, we performed a brief text analysis of the fetched abstracts in order to examine the robust trends in the field of Learning Analytics and Higher Education. We collected all the article abstracts, processed them with the R software, and refined the resulting corpus. In the final stages, we extracted the keywords and chose a word cloud as a representation of the terms, as shown in Fig. 3.2. The figure was generated using the R library package "wordcloud".
Fig. 3.2. Word cloud of the prominent terms from the abstracts
In order to ease reading the cloud, we adopted four levels of representation depicted in four colors. The obtained words were classified into singular terms, bi-grams, tri-grams, and quad-grams. The most cited singular words were "academic", "performance", "behavior", and "MOOCs". "Learning environment", "case study", and "online learning" were the most repeated bi-grams. The most frequent tri-grams in the abstracts were "learning management systems", "Higher Education institutions", and "social network analysis", while quad-grams were limited to "massive open online courses", which was merged at the final filtering stage with the "MOOCs" term.
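The corpus processing above was done in R with the "wordcloud" package. As a rough illustration of the underlying n-gram counting step (with a deliberately simplified tokenizer and stop-word list, not the actual pipeline used in the study), the same idea can be sketched in Python:

```python
from collections import Counter
import re

def ngram_counts(texts, n, stopwords=frozenset({"the", "of", "in", "and"})):
    """Count n-grams across a corpus, skipping trivial stop words."""
    counts = Counter()
    for text in texts:
        tokens = [t for t in re.findall(r"[a-z]+", text.lower())
                  if t not in stopwords]
        counts.update(tuple(tokens[i:i + n])
                      for i in range(len(tokens) - n + 1))
    return counts

# Two toy "abstracts" standing in for the real corpus.
abstracts = [
    "Learning analytics in higher education institutions",
    "A case study of learning management systems in higher education",
]
print(ngram_counts(abstracts, 2).most_common(3))
```

Ranking the resulting singular terms, bi-grams, tri-grams, and quad-grams by frequency is exactly what determines the relative word sizes in a word cloud.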
The word cloud gives a glance at the general topics when Learning Analytics is associated with Higher Education. Learning Analytics researchers focused on utilizing its techniques to enhance performance and understand students' behaviors. The most popular adopted educational environment was MOOC platforms. Furthermore, Learning Analytics was also used for practices of intervention, and for observing dropout, videos, dashboards, and engagement.
Fig. 3.3 shows the collected articles by library data source. Results show an obvious increase in the number of publications since 2011. For instance, there were 32 papers in 2015, up from 26 articles in 2014 and 17 articles in 2013, whereas there were only 5 articles in 2011 and 12 articles in 2012. Because the publications in this study were collected in February 2016, the year 2016 was not yet indexed with many papers. On the other hand, the figure shows the apparent involvement of journal articles from the SpringerLink and Web of Science libraries from 2013 onwards.
Fig. 3.3. Collected articles distributed by source and year.
We cross-referenced the relevant publications with Google Scholar to derive
their citation impact. Table 3.1 shows the 10 most cited publications.
Table 3.1. The ten most cited publications, ranked by Google Scholar citation count (Feb. 2016)
1. Course Signals at Purdue: Using Learning Analytics to Increase Student Success (Arnold and Pistilli 2012)
2. Social Learning Analytics: Five Approaches (Ferguson and Shum 2012)
3. Classroom walls that talk: Using online course activity data of successful students to raise self-awareness of underperforming peers (Fritz 2011)
4. Goal-oriented visualizations of activity tracking: a case study with engineering students (Santos et al. 2012)
5. Where is Research on Massive Open Online Courses Headed? A Data Analysis of the MOOC Research Initiative (Gasevic et al. 2014)
6. Course Correction: Using Analytics to Predict Course Success (Barber and Sharkey 2012)
7. Improving retention: predicting at-risk students by analyzing clicking behavior in a virtual learning environment (Wolff et al.)
8. Learning designs and Learning Analytics (Lockyer and Dawson)
9. The Pulse of Learning Analytics: Understandings and Expectations from the Stakeholders (Drachsler and Greller 2012)
10. Inferring Higher Level Learning Information from Low Level Data for the Khan Academy Platform (Muñoz-Merino et al. 2013)
3.2 Response to Research Question 2
For RQ2, which corresponds to "What kind of limitations do the research papers and articles mention?", we identified three different kinds of limitations, either clearly mentioned in the articles or tacit within the context.
Limitations through time: some of the publications stated that continuous work is needed (Elbadrawy et al. 2015; Ifenthaler and Widanapathirana 2014; Koulocheri and Xenos 2013; Lonn et al. 2012; Palavitsinis et al. 2011; Sharkey 2011), either because a longitudinal study would be necessary to prove the hypotheses or because of the short duration of the project (Fritz 2011; Nam et al. 2014; Ramírez-Correa and Fuentes-Vega 2015).
Limitations through size: other publications mentioned the need for more detailed data (Barber and Sharkey 2012; Best and MacGregor 2015; Rogers et al. 2014), small group sizes (Junco and Clem 2015; Jo et al. 2015; Martin and Whitmer 2016; Strang 2016), unsure scalability, possible problems in a wider context, and the difficulty of generalizing the approach or method (Prinsloo et al. 2015; Yasmin 2013).
Limitations through culture: many of the publications mention that their approach might only work in their educational culture and is not applicable elsewhere (Arnold et al. 2014; Drachsler and Greller 2012; Grau-Valldosera and Minguillón 2014; Kung-Keat and Ng 2016). Additionally, ethics differ strongly around the world, so cooperation projects between universities in different countries need different moderation, and the use of data could be ethically questionable (Abdelnour-Nocera et al. 2015; Ferguson and Shum 2012; Lonn et al. 2013; Park et al. 2016).
Furthermore, ethical discussions about data ownership and privacy have recently arisen. Slade and Prinsloo (2013) pointed out that Learning Analytics touches various research areas and therefore overlaps with ethical perspectives in the areas of data ownership and privacy. Questions about who should own the collected and analyzed data were highly debated. As a result, the authors classified the overlapping categories into three parts:
the location and interpretation of data,
informed consent, privacy and the de-identification of data, and
the management, classification and storage of data.
These three elements generate an imbalance of power between the stakeholders, which the authors addressed by proposing a list of six grounding principles and considerations: Learning Analytics as moral practice, students as agents, student identity and performance as temporal dynamic constructs, student success as a complex and multidimensional phenomenon, transparency, and that Higher Education cannot afford to not use data (Slade and Prinsloo 2013).
3.3 Response to Research Question 3
In order to answer RQ3, which corresponds to "Who are the stakeholders and how could they be categorized?", we determined the stakeholders from the publications and categorized them into three types. As a basis, we took the four stakeholder groups mentioned in Sect. 2.2 and introduced by Romero and Ventura (2013). We merged the Researchers and Administrators from the original classification into one distinct group, thereby separating the institutional perspective (Academic Analytics) from that of learners and teachers (Learning Analytics).
Fig. 3.4 depicts the defined Learning Analytics stakeholders as a Venn diagram. The figure shows that more research had been conducted concerning Researchers/Administrators, with 65 publications overall and 40 concerning them alone, than in the field of Learners, with a total of 53 publications and 21 single mentions. Also, it seems that Teachers are only a "side-product" of this field, with only 20 mentions and only 7 publications dedicated to them alone.
Fig. 3.4. Venn diagram of stakeholders in the publications
Most of the combined articles addressed Researchers/Administrators together with Learners (20 publications). Only 8 articles can be found with an overlap between Learners and Teachers, although this should be one of the most researched and discussed combinations within Learning Analytics in Higher Education. Nearly no work has been done combining Researchers/Administrators with Teachers (1 publication), and only 4 papers combined all 3 stakeholders. This lack of research will be a matter of debate in the discussion section.
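As a consistency check, the Venn region counts reported in this subsection (taken directly from the figures above) add up both to the per-group totals and to the 101 included publications:

```python
# Venn regions for the three stakeholder groups, as reported above:
# R = Researchers/Administrators, L = Learners, T = Teachers.
regions = {
    frozenset("R"): 40, frozenset("L"): 21, frozenset("T"): 7,
    frozenset("RL"): 20, frozenset("LT"): 8, frozenset("RT"): 1,
    frozenset("RLT"): 4,
}

def group_total(group):
    """Total publications mentioning a group, across all regions."""
    return sum(n for members, n in regions.items() if group in members)

assert group_total("R") == 65   # Researchers/Administrators overall
assert group_total("L") == 53   # Learners overall
assert group_total("T") == 20   # Teachers overall
assert sum(regions.values()) == 101  # all included publications
print("region counts are consistent")
```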
3.4 Response to Research Question 4
By analyzing the selected studies to answer RQ4, which corresponds to "What techniques do they use in their papers?", we identified the techniques used in Learning Analytics and Higher Education publications. We took into account the methods presented by Romero and Ventura (2013), Khalil and Ebner (2016), and Linan and Perez (2015). Table 3.2 gives an overview of the techniques used in the different articles.
Table 3.2. Overview of the Learning Analytics techniques of this study, with their key applications
Prediction: Predicting student performance and detecting student behaviors.
Clustering: Grouping similar materials or students based on their learning and interaction patterns.
Outlier Detection: Detection of students with difficulties or irregular learning processes.
Relationship Mining: Identifying relationships in learner behavior patterns and diagnosing student difficulties.
Social Network Analysis: Interpretation of the structure and relations in collaborative activities and interactions with communication tools.
Process Mining: Reflecting student behavior in terms of its examination traces, consisting of a sequence of course, grade, and timestamp.
Text Mining: Analyzing the contents of forums, chats, web pages, and documents.
Distillation of Data for Human Judgment: Helping instructors to visualize and analyze the ongoing activities of the students and the use of information.
Discovery with Models: Identification of relationships among student behaviors and characteristics or contextual variables. Integration of psychometric modelling frameworks into machine-learning models.
Gamification: Including possibilities for playful learning to maintain motivation, e.g. integration of achievements, experience points, or badges as indicators of success.
Machine Learning: Finding hidden insights in data automatically (based on models that are exposed to new data and adapt independently).
Statistics: Analysis and interpretation of quantitative data for decision making.
The results in Fig. 3.5 show that the research focused mainly on prediction, with a total of 36 publications, followed by distillation of data for human judgment in the form of visualization, with a count of 33, and outlier detection for pointing out at-risk or dropping-out students, with a count of 29. All other techniques together, including rarely used ones such as gamification or machine learning, account for a total of 102 counts.
Fig. 3.5. The publication count of the used Learning Analytics techniques
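Prediction, the most used technique, is typically operationalized as a supervised classification task. As an illustration of the general idea only (the data, features, threshold, and model below are entirely synthetic and are not taken from any cited study), a minimal logistic-regression sketch can flag at-risk students from activity features:

```python
import math

# Synthetic (logins per week, forum posts) -> 1 = passed, 0 = dropped out.
data = [((10, 5), 1), ((12, 7), 1), ((9, 4), 1), ((11, 6), 1),
        ((2, 0), 0), ((1, 1), 0), ((3, 0), 0), ((2, 1), 0)]

w = [0.0, 0.0]  # feature weights
b = 0.0         # bias
lr = 0.1        # learning rate

def predict_prob(x):
    """Predicted probability of passing, via the logistic function."""
    z = w[0] * x[0] + w[1] * x[1] + b
    z = max(-30.0, min(30.0, z))  # numerical safety clamp
    return 1.0 / (1.0 + math.exp(-z))

# Plain per-sample gradient descent on the logistic loss.
for _ in range(500):
    for x, y in data:
        err = predict_prob(x) - y
        w[0] -= lr * err * x[0]
        w[1] -= lr * err * x[1]
        b -= lr * err

def at_risk(x, threshold=0.5):
    """Flag a student as at-risk if the predicted pass probability is low."""
    return predict_prob(x) < threshold

print(at_risk((2, 1)))   # low-activity student
print(at_risk((11, 6)))  # high-activity student
```

In a real deployment, the labels would come from past cohorts and the flagged students would feed the intervention step of the analysis cycle described in Sect. 2.1.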
4 Discussion and Conclusion
In this chapter, we examined hundreds of pages to present a comprehensive literature review of the Learning Analytics field in the Higher Education domain. We presented a state-of-the-art study of both domains based on analyzing articles from three major library references: the Learning Analytics and Knowledge conference, SpringerLink, and Web of Science. The total number of relevant publications was 101 articles in the period between 2011 and 2016.
In this literature review study, we followed the procedure of Machi and McEvoy (2009), in which we selected the topic, searched the literature to get answers to the research questions, surveyed and critiqued the literature, and finally presented our review. Using this dataset, we identified the research strands of the relevant publications. Most of the publications described use cases rather than comprehensive research, especially the earlier publications, which is understandable because at that time universities had to figure out how to handle and harness the abilities offered by Learning Analytics for their benefit.
To give a better holistic overview of the advancement of the Learning Analytics field in Higher Education, we proposed four main research questions. These questions concerned, respectively, the research strands of Learning Analytics in Higher Education, the limitations, the stakeholders, and the techniques used by Learning Analytics experts in the Higher Education domain.
The first research question was answered by generating a word cloud from a corpus formed from all abstracts of the included papers. Results revealed that the usage of MOOCs, enhancing learning performance, students' behavior, and benchmarking learning environments were strongly researched by Learning Analytics experts in the domain of Higher Education. In addition, the paper titled "Course signals at Purdue: using learning analytics to increase student success" by Arnold and Pistilli (2012), which focused on a prediction tool, was the most cited article of our inclusion. Also, we identified a clear increase in the number of publications from 2011 to 2015. Further, the journal articles from the SpringerLink and Web of Science libraries showed an apparently stronger involvement than the LAK conference publications in 2013 and 2015.
The second research question showed that the limitations mainly concerned the time needed to prepare data or obtain results, the size of the available dataset and examined group, and ethical reasons. While the discussions of privacy and ownership have risen dramatically after 2012, we found that ethical constraints drive the limitations to the greatest extent in this literature review study, similar to the arguments in (Khalil and Ebner 2015; Khalil and Ebner 2016b).
The analysis shows that there was debate regarding who the main stakeholders of Learning Analytics and Higher Education are. Although the leading stakeholders of Learning Analytics should be learners and students (Khalil and Ebner 2015), we found that researchers play a major role in the loop between Higher Education and Learning Analytics. Fig. 3.4 demonstrated the strong involvement of researchers and administrators in carrying out decisions. A direct overlap between learners and teachers was not evidently identified in our study.
At the final stage, we elaborated on the most used techniques of Learning Analytics in Higher Education. This research question was answered based on the articles that discussed Learning Analytics techniques. The scan showed that prediction, distillation of data for human judgment, and outlier detection were the most used methods in the Higher Education domain. General data mining methodologies, from text mining to social network analysis, were also identified with high usage in the analyzed publications. On the other hand, we noticed new techniques that seem to have been used more frequently in the past two years, such as serious gaming, which belongs to the gamification techniques.
5 Future Trends
In this final section, we tackle future developments in the field of Learning Analytics in Higher Education, which can be divided into short-term (1-2 years) and long-term (3-5 years) trends.
5.1 Short-term trends
Over the next one to two years, universities must adjust to the social and economic factors that have changed the capabilities of the students (Johnson et al. 2016). The tuning of analysis, consultation, and the examination of individual learning outcomes, together with the visualization of continuously available, aggregated information in dashboards, are gaining more and more importance. Students expect real-time feedback during learning, with critical self-reflection on their learning progress and learning goals, which strengthens their expertise in self-organization. If adequate quantities of student data are available, predictive analytics can subsequently be carried out (Johnson et al. 2016).
5.2 Long-term trends
The relevance of Learning Analytics in Higher Education will increase even more over the next three to five years. This trend is promoted by the strong interest of students in individual evaluation and care. To serve this market, dashboards and analysis applications that specifically address the needs of each customer will develop further. This approach offers many advantages: accessing one's own data in an appropriate form allows better self-reflection and a healthy rivalry among fellow students.
Teachers can survey a large number of students and precisely recognize those who need their help. University and college dropouts can be better detected by appropriate analysis, and with targeted interventions these students can remain in the university system (Shacklock 2016).
To master the associated problems, the Learning Analytics market will have to change. Currently, many different systems and analytical approaches are in use. This fragmentation of the market will grow even further in the future, making interuniversity comparisons very difficult or even impossible. Therefore, the creation of standards is essential (Shacklock 2016).
Furthermore, a change in the type of analysis is foreseeable. Until now, data have mostly been used to measure the success of students in retrospect. Now, advances in predictive analytics are becoming more important. By analyzing the existing data sets of many students, predictive models can be developed that warn students who are at risk of not meeting their learning goals (Shacklock 2016).
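The chapter does not prescribe a concrete model for such warnings. As a hedged sketch only, a minimal logistic-regression-style classifier in pure Python, trained on invented activity features (logins per week, assignments submitted), shows the basic idea of flagging at-risk students from historical data:

```python
import math

# Hypothetical training data: [logins_per_week, assignments_submitted]
# labelled 1 (passed) or 0 (dropped out).
X = [[9.0, 5.0], [8.0, 4.0], [7.0, 5.0], [2.0, 1.0], [1.0, 0.0], [3.0, 1.0]]
y = [1, 1, 1, 0, 0, 0]

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def train(X, y, lr=0.1, epochs=2000):
    """Fit logistic-regression weights with plain stochastic gradient descent."""
    w = [0.0] * len(X[0])
    b = 0.0
    for _ in range(epochs):
        for xi, yi in zip(X, y):
            p = sigmoid(sum(wj * xj for wj, xj in zip(w, xi)) + b)
            err = p - yi
            w = [wj - lr * err * xj for wj, xj in zip(w, xi)]
            b -= lr * err
    return w, b

w, b = train(X, y)

def at_risk(features, threshold=0.5):
    """Flag a student whose predicted pass probability falls below the threshold."""
    return sigmoid(sum(wj * xj for wj, xj in zip(w, features)) + b) < threshold

print(at_risk([1.5, 0.0]))  # low activity: flagged as at risk
print(at_risk([8.5, 5.0]))  # high activity: not flagged
```

In practice, institutions would use far richer feature sets and established libraries rather than a hand-rolled classifier; the point is that a warning flag is just a thresholded prediction from past student data.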
Acknowledgments This research project is co-funded by the European Commission Erasmus+ program, in the context of the project 562167-EPP-1-2015-1-BE-EPPKA3-PI-FORWARD.
References

Abdelnour-Nocera J, Oussena S, Burns C (2015) Human Work Interaction Design of the Smart
University. In Human Work Interaction Design. Work Analysis and Interaction Design
Methods for Pervasive and Smart Workplaces. Springer International Publishing, 127-140
AbuKhousa E, Atif Y (2016) Virtual Social Spaces for Practice and Experience Sharing. In
State-of-the-Art and Future Directions of Smart Learning, Springer Singapore, pp 409-414
Aguiar E, Chawla NV, Brockman J, Ambrose GA, Goodrich V (2014) Engagement vs perfor-
mance: using electronic portfolios to predict first semester engineering student retention. In
Proceedings of the Fourth International Conference on Learning Analytics And Knowledge.
ACM. pp 103-112
Aguilar S, Lonn S, Teasley SD (2014) Perceptions and use of an early warning system during a
higher education transition program. In Proceedings of the fourth international conference on
learning analytics and knowledge. ACM. pp 113-117
Akhtar S, Warburton S, Xu W (2015) The use of an online learning and teaching system for
monitoring computer aided design student participation and predicting student success. Inter-
national Journal of Technology and Design Education, pp 1-20
Arnold KE, Lonn S, Pistilli MD (2014) An exercise in institutional reflection: The learning analytics readiness instrument (LARI). In Proceedings of the Fourth International Conference on Learning Analytics And Knowledge. ACM.
Arnold KE, Pistilli MD (2012) Course signals at Purdue: using learning analytics to increase stu-
dent success. Proceedings of the 2nd international conference on learning analytics and
knowledge, ACM, pp 267-270
Asif R, Merceron A, Pathan MK (2015) Investigating performance of students: a longitudinal
study. In Proceedings of the Fifth International Conference on Learning Analytics And
Knowledge. ACM. pp 108-112
Atif A, Richards D, Bilgin A, Marrone M (2013) Learning analytics in higher education: a sum-
mary of tools and approaches. 30th Australasian Society for Computers in Learning in Ter-
tiary Education Conference, Sydney.
Barber R, Sharkey M (2012) Course correction: using analytics to predict course success. Proceedings of the 2nd international conference on learning analytics and knowledge, ACM
Best M, MacGregor D (2015) Transitioning Design and Technology Education from physical
classrooms to virtual spaces: implications for pre-service teacher education. International
Journal of Technology and Design Education, 1-13.
Bichsel J (2012) Analytics in higher education: Benefits, barriers, progress, and recommenda-
tions. EDUCAUSE Center for Applied Research
Bramucci R, Gaston J (2012) Sherpa: increasing student success with a recommendation engine.
In Proceedings of the 2nd International Conference on Learning Analytics and Knowledge.
ACM. pp 82-83
Cambruzzi WL, Rigo SJ, Barbosa JL (2015) Dropout Prediction and Reduction in Distance Edu-
cation Courses with the Learning Analytics Multitrail Approach. J. UCS, 21(1), pp 23-47
Campbell JP, DeBlois PB, Oblinger DG (2007). Academic Analytics: A New Tool for a New
Era, EDUCAUSE Review, 42(4), pp 40-57
Campbell JP, Oblinger DG (2007) Academic analytics, EDUCAUSE White Paper. Retrieved
February 10, 2016 from
Casquero O, Ovelar R, Romo J, Benito M (2014) Personal learning environments, higher educa-
tion and learning analytics: a study of the effects of service multiplexity on undergraduate
students’ personal networks/Entornos de aprendizaje personales, educación superior y analíti-
ca del aprendizaje: un estudio sobre los efectos de la multiplicidad de servicios en las redes
personales de estudiantes universitarios. Cultura y Educación, 26(4), pp 696-738
Casquero O, Ovelar R, Romo J, Benito M, Alberdi M (2016) Students' personal networks in vir-
tual and personal learning environments: a case study in higher education using learning ana-
lytics approach. Interactive Learning Environments, 24(1), pp 49-67
Clow D (2014) Data wranglers: human interpreters to help close the feedback loop. In Proceed-
ings of the Fourth International Conference on Learning Analytics And Knowledge. ACM.
pp 49-53
Corrigan O, Smeaton AF, Glynn M, Smyth S (2015) Using Educational Analytics to Improve Test Performance. In Design for Teaching and Learning in a Networked World. Springer International Publishing. pp 42-55
Delen D (2010) A comparative analysis of machine learning techniques for student retention
management. Decision Support Systems, 49(4), pp 498-506.
Drachsler H, Greller W (2012) The pulse of learning analytics: understandings and expectations
from the stakeholders. Proceedings of the 2nd international conference on learning analytics
and knowledge, ACM, pp 120-129.
Elbadrawy A, Studham RS, Karypis G (2015) Collaborative multi-regression models for predict-
ing students' performance in course activities. In Proceedings of the Fifth International Con-
ference on Learning Analytics And Knowledge. ACM. pp 103-107
Elias T (2011) Learning Analytics: Definitions, Processes and Potential
Ferguson R (2012) Learning analytics: drivers, developments and challenges. International Journal of Technology Enhanced Learning, 4(5/6), pp 304-317.
Ferguson R, Shum SB (2012) Social learning analytics: five approaches. Proceedings of the 2nd
international conference on learning analytics and knowledge, ACM, pp 23-33
Freitas S, Gibson D, Du Plessis C, Halloran P, Williams E, Ambrose M, Dunwell I, Arnab S
(2015) Foundations of dynamic learning analytics: Using university student data to increase
retention. British Journal of Educational Technology, 46(6), pp 1175-1188
Fritz J (2011) Classroom walls that talk: Using online course activity data of successful students
to raise self-awareness of underperforming peers. The Internet and Higher Education, 14(2),
pp 89-97.
Gašević D, Dawson S, Siemens G (2015) Let’s not forget: Learning analytics are about learning.
TechTrends, 59(1), pp 64-71.
Gasevic D, Kovanovic V, Joksimovic S, Siemens G (2014) Where is research on massive open
online courses headed? A data analysis of the MOOC Research Initiative. The International
Review Of Research In Open And Distributed Learning, 15(5).
Gibson A, Kitto K, Willis J (2014) A cognitive processing framework for learning analytics. In
Proceedings of the Fourth International Conference on Learning Analytics And Knowledge.
ACM. pp 212-216
Gibson D, de Freitas S (2016) Exploratory analysis in learning analytics. Technology,
Knowledge and Learning, 21(1), pp 5-19
Grann J, Bushway D (2014) Competency map: Visualizing student learning to promote student
success. In Proceedings of the fourth international conference on learning analytics and
knowledge. ACM. pp 168-172
Grau-Valldosera J, Minguillón J (2011) Redefining dropping out in online higher education: a
case study from the UOC. In Proceedings of the 1st International Conference on Learning
Analytics and Knowledge. ACM. pp 75-80
Grau-Valldosera J, Minguillón J (2014) Rethinking dropout in online higher education: The case
of the Universitat Oberta de Catalunya. The International Review of Research in Open and
Distributed Learning, 15(1).
Greller W, Ebner M, Schön M (2014) Learning Analytics: From Theory to Practice – Data Support for Learning and Teaching. Computer Assisted Assessment. Research into E-Assessment. Springer International Publishing, pp 79-87.
Groeneveld CM (2014) Implementation of an Adaptive Training and Tracking Game in Statistics
Teaching. In International Computer Assisted Assessment Conference, Springer International
Publishing, pp 53-58
Harrison S, Villano R, Lynch G, Chen G (2015) Likelihood analysis of student enrollment out-
comes using learning environment variables: A case study approach. In Proceedings of the
Fifth International Conference on Learning Analytics And Knowledge. ACM. pp 141-145
Hecking T, Ziebarth S, Hoppe HU (2014) Analysis of dynamic resource access patterns in a
blended learning course. In Proceedings of the Fourth International Conference on Learning
Analytics and Knowledge. ACM. pp 173-182
Holman C, Aguilar S, Fishman B (2013) GradeCraft: what can we learn from a game-inspired
learning management system?. Proceedings of the Third International Conference on Learn-
ing Analytics and Knowledge, ACM, 260-264.
Holman C, Aguilar SJ, Levick A, Stern J, Plummer B, Fishman B (2015) Planning for success:
how students use a grade prediction tool to win their classes. In Proceedings of the Fifth In-
ternational Conference on Learning Analytics And Knowledge. ACM. pp 260-264
Ifenthaler D, Widanapathirana C (2014) Development and validation of a learning analytics
framework: Two case studies using support vector machines. Technology, Knowledge and
Learning, 19(1-2), pp 221-240
Jo IH, Yu T, Lee H, Kim Y (2015) Relations between student online learning behavior and aca-
demic achievement in higher education: A learning analytics approach. In Emerging issues in
smart learning. Springer Berlin Heidelberg. pp 275-287
Johnson L, Adams Becker S, Cummins M, Freeman A, Ifenthaler D, Vardaxis N (2013) Tech-
nology Outlook for Australian Tertiary Education 2013-2018: An NMC Horizon Project Re-
gional Analysis. New Media Consortium.
Johnson L, Adams S, Cummins M (2012) The NMC Horizon Report: 2012 Higher Education
Edition. The New Media Consortium, Austin
Johnson L, Adams S, Cummins M, Estrada V, Freeman A, Hall C (2016) NMC Horizon Report:
2016 Higher Education Edition. The New Media Consortium, Austin, Texas.
Junco R, Clem C (2015) Predicting course outcomes with digital textbook usage data. The Inter-
net and Higher Education, 27, pp 54-63.
Khalil M, Ebner M (2015) Learning Analytics: Principles and Constraints. In Proceedings of World Conference on Educational Multimedia, Hypermedia and Telecommunications.
Khalil M, Ebner M (2016a) What is Learning Analytics about? A Survey of Different Methods
Used in 2013-2015. Proceedings of Smart Learning Conference, Dubai, UAE, 7-9 March,
Dubai: HBMSU Publishing House, 294-304.
Khalil M, Ebner M (2016b). De-Identification in Learning Analytics. Journal of Learning Ana-
lytics, 3(1), pp 129-138
Khousa EA, Atif Y (2014) A Learning Analytics Approach to Career Readiness Development in
Higher Education. In International Conference on Web-Based Learning. Springer Interna-
tional Publishing. pp 133-141
Kim J, Jo IH, Park Y (2016) Effects of learning analytics dashboard: analyzing the relations
among dashboard utilization, satisfaction, and learning achievement. Asia Pacific Education
Review, 17(1), pp 13-24
Koulocheri E, Xenos M (2013) Considering formal assessment in learning analytics within a
PLE: the HOU2LEARN case. In Proceedings of the Third International Conference on
Learning Analytics and Knowledge. ACM. pp 28-32
Kovanović V, Gašević D, Dawson S, Joksimović S, Baker RS, Hatala M (2015) Penetrating the
black box of time-on-task estimation. In Proceedings of the Fifth International Conference on
Learning Analytics And Knowledge. ACM. pp 184-193
Kung-Keat T, Ng J (2016) Confused, Bored, Excited? An Emotion Based Approach to the De-
sign of Online Learning Systems. In 7th International Conference on University Learning and
Teaching (InCULT 2014) Proceedings. Springer Singapore. pp 221-233
Lauría EJ, Baron JD, Devireddy M, Sundararaju V, Jayaprakash SM (2012) Mining academic
data to improve college student retention: An open source perspective. In Proceedings of the
2nd International Conference on Learning Analytics and Knowledge, ACM, pp 139-142
Leony D, Muñoz-Merino PJ, Pardo A, Kloos CD (2013) Provision of awareness of learners’
emotions through visualizations in a computer interaction-based environment. Expert Sys-
tems with Applications, 40(13), 5093-5100.
Liñán LC, Pérez ÁAJ (2015) Educational Data Mining and Learning Analytics: differences, sim-
ilarities, and time evolution. Revista de Universidad y Sociedad del Conocimiento, 12(3), 98-
Lockyer L, Dawson S (2011) Learning designs and learning analytics. Proceedings of the 1st in-
ternational conference on learning analytics and knowledge, ACM, pp 153-156.
Lonn S, Aguilar S, Teasley SD (2013) Issues, challenges, and lessons learned when scaling up a
learning analytics intervention. In Proceedings of the third international conference on learn-
ing analytics and knowledge. ACM. pp 235-239
Lonn S, Krumm AE, Waddington RJ, Teasley SD (2012) Bridging the gap from knowledge to
action: Putting analytics in the hands of academic advisors. Proceedings of the 2nd Interna-
tional Conference on Learning Analytics and Knowledge, ACM, 184-18
Lotsari E, Verykios VS, Panagiotakopoulos C, Kalles D (2014) A learning analytics methodolo-
gy for student profiling. In Hellenic Conference on Artificial Intelligence. Springer Interna-
tional Publishing. pp 300-312
Ma J, Han X, Yang J, Cheng J (2015) Examining the necessary condition for engagement in an
online learning environment based on learning analytics approach: The role of the instructor.
The Internet and Higher Education, 24, pp 26-34
Machi LA, McEvoy BT (2009) The literature review: Six steps to success. Thousand Oaks:
Corwin Sage
Manso-Vázquez M, Llamas-Nistal M (2015) A Monitoring System to Ease Self-Regulated
Learning Processes. IEEE Revista Iberoamericana de Tecnologias del Aprendizaje, 10(2), pp
Martin F, Whitmer JC (2016) Applying Learning Analytics to Investigate Timed Release in
Online Learning. Technology, Knowledge and Learning, 21(1), 59-74.
McKay T, Miller K, Tritz J (2012) What to do with actionable intelligence: E2Coach as an intervention engine. In Proceedings of the 2nd International Conference on Learning Analytics and Knowledge. ACM. pp 88-91
Menchaca I, Guenaga M, Solabarrieta J (2015) Project-Based Learning: Methodology and As-
sessment Learning Technologies and Assessment Criteria. In Design for Teaching and Learn-
ing in a Networked World. Springer International Publishing. pp 601-604
Mirriahi N, Liaqat D, Dawson S, Gašević D (2016) Uncovering student learning profiles with a
video annotation tool: reflective learning with and without instructional norms. Educational
Technology Research and Development, pp 1-24
Muñoz-Merino PJ, Valiente JAR, Kloos CD (2013) Inferring higher level learning information
from low level data for the Khan Academy platform. Proceedings of the third international
conference on learning analytics and knowledge, ACM, 112-116
Nam S, Lonn S, Brown T, Davis CS, Koch D (2014) Customized course advising: investigating
engineering student success with incoming profiles and patterns of concurrent course enroll-
ment. Proceedings of the Fourth International Conference on Learning Analytics And
Knowledge, ACM, 16-25.
Nespereira CG, Elhariri E, El-Bendary N, Vilas AF, Redondo RPD (2016) Machine Learning
Based Classification Approach for Predicting Students Performance in Blended Learning. In
The 1st International Conference on Advanced Intelligent System and Informatics
(AISI2015), November 28-30, 2015, Beni Suef, Egypt. Springer International Publishing. pp
Øhrstrøm P, Sandborg-Petersen U, Thorvaldsen S, Ploug T (2013) Teaching logic through web-
based and gamified quizzing of formal arguments. In European Conference on Technology
Enhanced Learning. Springer Berlin Heidelberg. pp 410-423
Palavitsinis N, Protonotarios V, Manouselis N (2011) Applying analytics for a learning portal:
the Organic. Edunet case study. Proceedings of the 1st International Conference on Learning
Analytics and Knowledge, ACM, 140-146.
Palmer S (2013) Modelling engineering student academic performance using academic analytics.
International journal of engineering education, 29(1), pp 132-138.
Pardo A, Mirriahi N, Dawson S, Zhao Y, Zhao A, Gašević D (2015) Identifying learning strate-
gies associated with active use of video annotation software. In Proceedings of the Fifth In-
ternational Conference on Learning Analytics And Knowledge. ACM. pp 255-259
Park Y, Yu JH, Jo IH (2016) Clustering blended learning courses by online behavior data: A case
study in a Korean higher education institute. The Internet and Higher Education, 29, pp 1-11
Piety PJ, Hickey DT, Bishop MJ (2014) Educational data sciences: framing emergent practices
for analytics of learning, organizations, and systems. In Proceedings of the Fourth Interna-
tional Conference on Learning Analytics And Knowledge. ACM. pp 193-202
Pistilli MD, Willis III JE, Campbell JP (2014) Analytics through an institutional lens: Definition,
theory, design, and impact. In Learning Analytics. Springer New York. pp 79-102
Prinsloo P, Archer E, Barnes G, Chetty Y, Van Zyl D (2015) Big (ger) data as better data in open
distance learning. The International Review of Research in Open and Distributed Learning,
Prinsloo P, Slade S, Galpin F (2012) Learning analytics: challenges, paradoxes and opportunities
for mega open distance learning institutions. In Proceedings of the 2nd International Confer-
ence on Learning Analytics and Knowledge. ACM. pp 130-133
Ramírez-Correa P, Fuentes-Vega C (2015) Factors that affect the formation of networks for col-
laborative learning: an empirical study conducted at a Chilean university/Factores que afectan
la formación de redes para el aprendizaje colaborativo: un estudio empírico conducido en una
universidad chilena. Ingeniare: Revista Chilena de Ingenieria, 23(3), 341
Rogers T, Colvin C, Chiera B (2014) Modest analytics: using the index method to identify stu-
dents at risk of failure. In Proceedings of the Fourth International Conference on Learning
Analytics And Knowledge. ACM. pp 118-122
Romero C, Ventura S (2013) Data mining in education. Wiley Interdisciplinary Reviews: Data
Mining and Knowledge Discovery, 3(1), pp 12-27
Santos JL, Govaerts S, Verbert K, Duval E (2012) Goal-oriented visualizations of activity track-
ing: a case study with engineering students. Proceedings of the 2nd international conference
on learning analytics and knowledge, ACM, 143-152
Santos JL, Verbert K, Govaerts S, Duval E (2013) Addressing learner issues with StepUp!: an
evaluation. In Proceedings of the Third International Conference on Learning Analytics and
Knowledge. ACM. pp 14-22
Santos JL, Verbert K, Klerkx J, Duval E, Charleer S, Ternier S (2015) Tracking data in open
learning environments. Journal of Universal Computer Science, 21(7), pp 976-996
Scheffel M, Niemann K, Leony D, Pardo A, Schmitz, HC, Wolpers M, Kloos, CD (2012) Key
action extraction for learning analytics. In European Conference on Technology Enhanced
Learning. Springer Berlin Heidelberg. pp 320-333
Sclater N (2014) Code of practice “essential” for learning analytics.
Shacklock X (2016) From Bricks to Clicks: the potential of data and analytics in Higher Educa-
tion. The Higher Education Commission’s (HEC) report.
Sharkey M (2011) Academic analytics landscape at the University of Phoenix. In Proceedings of
the 1st International Conference on Learning Analytics and Knowledge. ACM. pp 122-126
Siemens G (2010) What are learning analytics. Retrieved February 10, 2016 from
Siemens G, Long P (2011) Penetrating the Fog: Analytics in Learning and Education.
EDUCAUSE review, 46(5), pp 30-40
Simsek D, Sándor Á, Shum SB, Ferguson R, De Liddo A, Whitelock D (2015) Correlations be-
tween automated rhetorical analysis and tutors' grades on student essays. In Proceedings of
the Fifth International Conference on Learning Analytics And Knowledge. ACM. pp 355-359
Sinclair J, Kalvala S (2015) Engagement measures in massive open online courses. In Interna-
tional Workshop on Learning Technology for Education in Cloud. Springer International
Publishing, pp 3-15
Slade S, Prinsloo P (2013) Learning analytics: ethical issues and dilemmas. American Behavioral Scientist, 57(10), 1510-1529.
Softic S, Taraghi B, Ebner M, De Vocht L, Mannens E, Van de Walle R (2013) Monitoring
learning activities in PLE using semantic modelling of learner behaviour. In Human Factors
in Computing and Informatics. Springer Berlin Heidelberg. pp 74-90
Strang KD (2016) Beyond engagement analytics: which online mixed-data factors predict stu-
dent learning outcomes?. Education and Information Technologies, 1-21.
Swenson J (2014) Establishing an ethical literacy for learning analytics. In Proceedings of the
Fourth International Conference on Learning Analytics And Knowledge. ACM. pp 246-250
Tervakari AM, Marttila J, Kailanto M, Huhtamäki J, Koro J, Silius K (2013). Developing learn-
ing analytics for TUT Circle. In Open and Social Technologies for Networked Learning.
Springer Berlin Heidelberg. pp 101-110
Tseng SF, Tsao YW, Yu LC, Chan CL, Lai KR (2016) Who will pass? Analyzing learner behav-
iors in MOOCs. Research and Practice in Technology Enhanced Learning, 11(1), p 1
Vahdat M, Oneto L, Anguita D, Funk M, Rauterberg M (2015) A Learning Analytics Approach
to Correlate the Academic Achievements of Students with Interaction Data from an Educa-
tional Simulator. In Design for Teaching and Learning in a Networked World, Springer In-
ternational Publishing, pp 352-366
van Barneveld A, Arnold KE, Campbell JP (2012) Analytics in higher education: establishing a
common language. EDUCAUSE Learning Initiative 1, pp 1-11
Vozniuk A, Holzer A, Gillet D (2014) Peer assessment based on ratings in a social media course.
In Proceedings of the Fourth International Conference on Learning Analytics And
Knowledge. ACM. pp 133-137
Westera W, Nadolski R, Hummel H (2013). Learning analytics in serious gaming: uncovering
the hidden treasury of game log files. In International Conference on Games and Learning
Alliance. Springer International Publishing. pp 41-52
Wise AF (2014) Designing pedagogical interventions to support student use of learning analyt-
ics. In Proceedings of the Fourth International Conference on Learning Analytics And
Knowledge. ACM. pp 203-211
Wolff A, Zdrahal Z, Nikolov A, Pantucek M (2013) Improving retention: predicting at-risk stu-
dents by analysing clicking behaviour in a virtual learning environment. Proceedings of the
third international conference on learning analytics and knowledge, ACM, pp 145-149.
Wu IC, Chen WS (2013) Evaluating the Practices in the E-Learning Platform from the Perspec-
tive of Knowledge Management. In Open and Social Technologies for Networked Learning,
Springer Berlin Heidelberg. pp 81-90
Yasmin D (2013) Application of the classification tree model in predicting learner dropout be-
haviour in open and distance learning. Distance Education, 34(2), 218-231.
... Artificial intelligence (AI) and machine learning (ML) can optimise and automate several analytics processes, bringing changes to business processes (Elayyan 2021). Studies suggest that learning analytics has the potential of changing higher education (Siemens & Gasevic 2012;Leitner et al. 2017;Wong 2017;Viberg et al. 2018;Sousa et al. 2021). The success of learning analytics in supporting decision-making depends on various factors, which need to be considered and addressed by the institution. ...
... LMS may be considered the most common source of learning analytics in higher education. As students navigate through an LMS, their behaviour is recorded, leaving a "digital footprint" which may be analysed to give insights on student engagements, predictions on future trends and chances of success and make decisions accordingly, as ascertained by Leitner et al. (2017). ...
... This study aims to assess institutional readiness for using learning analytics to support decision-making. There are various decision-support systems in use in higher education, most of which focus on enhancing the student experience and improving institutional performance (Leitner et al. 2017). This paper was done to explore institutional readiness for learning analytics. ...
Full-text available
The Fourth Industrial Revolution (4IR) brought disruptive technologies, dramatically changing the way businesses operate. Higher education institutions make use of learning management systems (LMS) primarily for teaching, learning and assessment. The COVID-19 pandemic has pushed the use of technology for academic continuity, resulting in institutions using LMS for virtual engagements with students, student collaborations, assessments, and as a repository for resources. Student behaviour on the LMS can be tracked, giving useful learning analytics which may be used to improve student success, retention, experience, and institutional performance. This paper is an exploration of institutional readiness for learning analytics. We adopted a qualitative approach, using purposive sampling to select the institution and initial participants. We used the snowball technique to recruit further participants. The personality traits stated in the Technology Readiness Index model were used to formulate interview questions. The findings show that the institution has systems in place to support students, which were launched to address insights from LMS-based learning analytics. The institution is ready for using learning analytics, with participants innovatively using the LMS, showing enthusiasm, and optimisation of the full potential of learning analytics. We recommend the use of learning analytics to come up with effective student support.KeywordsFourth Industrial Revolution (4IR)Data-DrivenDecision-SupportHigher EducationLearning AnalyticsTechnology Readiness
... 2) викладачі зацікавлені в розумінні процесів навчання студентів, соціальних, когнітивних та поведінкових аспектів, що відображають застосовувані методики навчання та їх оптимізацію для досягнення кращого результату навчання [17], хочуть більш ефективно оцінювати діяльність студентів та мати можливість робити висновки про те, яких саме заходів потрібно вжити для підвищення ефективності навчання студентів; ...
... Після створення першого прототипу та його оцінювання переходять до наступного кроку, який полягає у реалізації прототипу. Вирішення цього питання є досить критичним, адже під час започаткування проєктів зі впровадження навчальної аналітики необхідно враховувати переносимість та масштабованість даних [17]. З'являються нові проблеми, пов'язані з масштабованістю реалізації, перевизначенням процесів, які були створені вручну так, щоб вони виконувалися автоматизовано чи напівавтоматизовано. ...
Full-text available
Стаття присвячена дослідженню проблем впровадження Learning Analytics – навчальної аналітики у сферу вищої освіти. Розкрито зміст поняття «Learning Analytics», проаналізовано досвід її впровадження у діяльність вищих навчальних закладів країн світу. Установлено, що навчальна аналітика як галузь наукового дослідження є поєднанням інформаційних технологій, цифрового викладання і навчання та методів інтелектуального аналізу даних, що обумовлює специфіку її формування та проблематику. Виявлено задачі, які вона дозволяє розв’язувати стосовно різних аспектів електронного навчання: прогнозування, виявлення структури, виявлення зв’язків та асоціацій на основі аналізу цифрових слідів студентів у освітніх електронних середовищах. З’ясовано перспективні напрями досліджень на сучасному етапі. Установлено, що впровадження у діяльність закладів вищої освіти основних типів навчальної аналітики: описової, прогностичної та пропонуючої, дає можливість отримувати інформацію про поточний стан електронного навчання та оперативно приймати рішення стосовно його корекції й оптимізації. Сформульовано перелік проблем, пов’язаних зі стратегічним плануванням і політикою впровадження навчальної аналітики у діяльність вищих навчальних закладів: недосконалість керівництва в реалізації проектів; нерівномірне залучення різних зацікавлених сторін; недостатній рівень педагогічних підходів при інтерпретації отримуваних даних; недостатній рівень підготовки персоналу; недостатня кількість досліджень, емпірично підтверджуючих вплив на ефективність навчального процесу; недосконалість нормативного регулювання. Показано, що ці проблеми є міждисциплінарними, а їх вирішення потребує тісної співпраці та узгоджених дій адміністраторів, ІТ-фахівців, викладачів та педагогів-дослідників упродовж усіх етапів реалізації проекту. 
Теоретично обґрунтовано пропозиції щодо заходів, спрямованих на подолання міждисциплінарного бар’єру у процесі розробки та експлуатації проектів навчальної аналітики: чіткість та прозорість цілей і ініціатив; задоволення потреб усіх зацікавлених сторін; забезпечення необхідної ІТ-інфраструктури; підготовка співробітників, які будуть надавати допомогу в інтерпретації отриманих результатів; забезпечення безпеки конфіденційних даних; розробка нормативних положень стосовно функціонування та використання навчальної аналітики.
... An overview of the recent findings in this research field has been presented in several literature reviews ( Table 1). The reviews are mainly focused on specific areas of use of intelligence, such as the use of machine learning techniques (Alenezi and Faisal 2020;Farhat et al. 2020;Khanal et al. 2020;Tang et al. 2021), educational data mining (Al-Razgan et al. 2014;Du et al. 2020;Dutt et al. 2017;Martins et al. 2018;Silva and Fonseca 2017), knowledge tracing (Am et al. 2021;Dai et al. 2021), learning analytics (Banihashem et al. 2018;Bruno et al. 2021;Leitner et al. 2017;Melesko and Kurilovas 2018b;, learner modeling (Abyaa et al. 2019;Chrysafiadi and Virvou 2013;Jando et al. 2017), and different kinds of intelligent agents (Hobert and Meyer von Wolff 2019;Martha and Santoso 2019;Soliman et al. 2010) in e-learning. In addition, intelligent techniques might be used to visualize learners' data (Bodily et al. 2018;Hooshyar et al. 2020;Matcha et al. 2019a) or for different purposes in intelligent tutoring systems (Alkhatlan and Kalita 2019; Dermeval et al. 2018;Mousavinasab et al. 2021) and learning management systems (Alshammari et al. 2016;Kasim and Khalid 2016;Oliveira et al. 2016). ...
... Statistical analysis is used for the analysis and interpretation of quantitative data for decision-making (Leitner et al. 2017). By using statistical analysis in e-learning environments, we can count the number of visits, analyze mouse clicks, and calculate time spent on tasks (Khalil and Ebner 2015). ...
Full-text available
Online learning has become increasingly important, having in mind the latest events, imposed isolation measures and closed schools and campuses. Consequently, teachers and students need to embrace digital tools and platforms, bridge the newly established physical gap between them, and consume education in various new ways. Although literature indicates that the development of intelligent techniques must be incorporated in e-learning systems to make them more effective, the need exists for research on how these techniques impact the whole process of online learning, and how they affect learners’ performance. This paper aims to provide comprehensive research on innovations in e-learning, and present a literature review of used intelligent techniques and explore their potential benefits. This research presents a categorization of intelligent techniques, and explores their roles in e-learning environments. By summarizing the state of the art in the area, the authors outline past research, highlight its gaps, and indicate important implications for practice. The goal is to understand better available intelligent techniques, their implementation and application in e-learning context, and their impact on improving learning in online education. Finally, the review concludes that AI-supported solutions not only can support learner and teacher, by recommending resources and grading submissions, but they can offer fully personalized learning experience.
... However, many LA projects are focused on departments or faculties in small-scale pilots. An evaluation of LA publications found that many describe use cases and have only targeted online courses such as MOOCs (Leitner et al., 2017); this is understandable given the large volume of data collected in these courses. Institutional culture change is often a significant obstacle to the successful implementation of LA programs (Macfadyen & Dawson, 2012); buy-in to scale up within an institution requires the support of leadership to overcome barriers to change and reduce resistance. ...
Providing timely nudges to students has been shown to improve engagement and persistence in tertiary education. However, many studies focus on small-scale pilots rather than institution-wide initiatives. This article assesses the impact of a pan-institution Early Alert System at the University of Canterbury that utilises nudging when students are at risk of disengagement. Once flagged, students received an automated text message and email encouraging re-engagement with the learning management system. Students who received the nudge re-engaged at a higher rate and spent more time engaging with online material. These benefits were sustained over two weeks, demonstrating a measurable benefit over time. Unexpectedly, the nudge resulted in persistence and engagement in other enrolled courses where a nudge was not provided, showing the transferability of benefits to other courses. Although no significant differences in GPA were found between test and control groups, future development will enable further research.
This study contributes a case study on redesigning three Learning Analytics Dashboards (LADs) of the adaptive learning platform Rhapsode™ with instructions for pedagogical actions. Applying self-determination theory's elements of competence and relatedness, together with mental models, in a design thinking process, the differences between the teachers' perceptions and the designers' intentions are highlighted through several methods to answer the questions: How might we improve the learning analytics dashboards by prioritizing course instructors' perceived competence and relatedness? and How might we redesign learning analytics dashboards by including course instructors' purpose, insights, and recommended actions? These questions are answered, first, by developing three role-based personas, Alina Action, Niels Novice, and Paul Privacy, along with scenarios and user stories. Second, prototypes of interfaces are designed and tested in three iterations showing insights, recommended actions, and an explanation of mechanics. The prototypes receive positive feedback in tests with all teacher personas. The teacher persona of Niels Novice also offers a criticism of the insights and recommended actions, on the grounds that they create undesired interpretation, introduce potential bias, take away freedom of interpretation, and make the system authoritative in that it "instructs/orders" action. Additionally, the scope of the study cannot address the persona of Paul Privacy's reservations about students' possible experience of surveillance. Keywords: Actionable Learning Analytics Dashboard; Adaptive Learning Platform; Mental model; Motivation theory; Design Thinking
Advances in reinforcement learning research have demonstrated the ways in which different agent-based models can learn how to optimally perform a task within a given environment. Reinforcement learning solves unsupervised problems where agents move through a state-action-reward loop to maximize the overall reward, which in turn optimizes the solving of a specific problem in a given environment. However, these algorithms are designed based on our understanding of actions that should be taken in a real-world environment to solve a specific problem. One such problem is the ability to identify, recommend and execute an action within a system where the users are the subject, such as in education. In recent years, the use of blended learning approaches integrating face-to-face learning with online learning in the education context has increased. Additionally, online platforms used for education require the automation of certain functions such as the identification, recommendation or execution of actions that can benefit the user, in this sense the student or learner. As promising as these scientific advances are, there is still a need to conduct research in a variety of different areas to ensure the successful deployment of these agents within education systems. Therefore, the aim of this study was to contextualise and simulate the cumulative reward within an environment for an intervention recommendation problem in the education context. Keywords: Autonomous Learning; Education; Reinforcement Learning; Multi-Armed Bandits
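The intervention-recommendation problem described above is commonly framed as a multi-armed bandit, a stateless special case of the state-action-reward loop. The following is a minimal epsilon-greedy sketch; the intervention names and their "true" success probabilities are invented for illustration, and this is not the cited study's actual simulation:

```python
import random

random.seed(42)

# Hypothetical interventions (arms) with unknown true success probabilities
true_success = {"email_nudge": 0.3, "extra_quiz": 0.5, "tutor_session": 0.7}
arms = list(true_success)

counts = {a: 0 for a in arms}    # times each arm was recommended
values = {a: 0.0 for a in arms}  # running mean reward estimate per arm

def choose(epsilon=0.1):
    # Explore a random arm with probability epsilon, else exploit the best estimate
    if random.random() < epsilon:
        return random.choice(arms)
    return max(arms, key=lambda a: values[a])

for _ in range(5000):
    arm = choose()
    # Reward 1 if the (simulated) student re-engages, 0 otherwise
    reward = 1.0 if random.random() < true_success[arm] else 0.0
    counts[arm] += 1
    # Incremental update of the mean reward estimate
    values[arm] += (reward - values[arm]) / counts[arm]

best = max(arms, key=lambda a: values[a])
print(best, {a: round(v, 2) for a, v in values.items()})
```

Over enough trials the cumulative-reward-maximizing policy concentrates recommendations on the most effective intervention, which is the behavior the abstract sets out to simulate.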
Over the last ten years learning analytics (LA) has grown from a hypothetical future into a concrete field of inquiry and a global community of researchers and practitioners. Although the LA space may appear sprawling and complex, there are some clear through-lines that the new student or interested practitioner can use as entry points. Four of these are presented in this chapter: (1) LA as a concern or problem to be solved, (2) LA as an opportunity, (3) LA as a field of inquiry, and (4) the researchers and practitioners that make up the LA community. These four ways of understanding LA and its associated constructs, technologies, domains and history can hopefully provide a launch pad not only for the other chapters in this handbook but the world of LA in general. A world that, although large, is open to all who hold an interest in data and learning and the complexities that follow from the combination of the two.
The Handbook of Learning Analytics is designed to meet the needs of a new and growing field. It aims to balance rigor, quality, open access and breadth of appeal and was devised to be an introduction to the current state of research. The Handbook is a snapshot of the field in 2017 and features a range of prominent authors from the learning analytics and educational data mining research communities. The chapters have been peer reviewed by committed members of these fields and are being published with the endorsement of both the Society for Learning Analytics Research and the International Society for Educational Data Mining. We hope you will find the Handbook of Learning Analytics a useful and informative resource.
The area of Learning Analytics has developed enormously since the first International Conference on Learning Analytics and Knowledge (LAK) in 2011. It is a field that combines disciplines such as computer science, statistics, psychology and pedagogy to achieve its intended objectives. Its main goals are to create convenient interventions in learning and its environment, and ultimately to optimize outcomes for the stakeholders of the learning domain. Because the field has matured and is now applied in diverse educational settings, we believe there is a pressing need to catalogue its research methods and specify its objectives and dilemmas. This paper surveys publications from the Learning Analytics and Knowledge conference from 2013 to 2015 and lists the significant research areas in this sphere. We profile the methods used and classify them into seven categories, with a brief description of each. Furthermore, we show the most cited method categories using Google Scholar. Finally, the authors raise the challenges and constraints that affect the field's ethical approach through a meta-analysis study. It is believed that this paper will help researchers identify the common methods used in Learning Analytics and will assist in forecasting future research directions that take into account the privacy and ethical issues of this strongly emerging field.
Learning analytics has secured its position as an important field in the educational sector. However, the large-scale collection, processing, and analysis of data has pushed the field into an abundance of ethical breaches and constraints. Revealing learners' personal information and attitudes, as well as their activities, is a major route to identifying individuals personally. Yet, de-identification can keep the process of learning analytics in progress while reducing the risk of inadvertent disclosure of learners' identities. In this paper, the authors discuss de-identification methods in the context of the learning environment and propose a first prototype of a conceptual approach that combines anonymization strategies with learning analytics techniques.
This study explores the types of learning profiles that evolve from student use of video annotation software for reflective learning. The data traces from student use of the software were analysed across four undergraduate courses with differing instructional conditions, that is, the use of graded or non-graded self-reflective annotations. Using hierarchical cluster analysis, four profiles of students emerged: minimalists, task-oriented, disenchanted, and intensive users. Students enrolled in one of the courses where grading of the video annotation software was present were exposed to either another graded course (annotations graded) or a non-graded course (annotations not graded) in their following semester of study. Further analysis revealed that in the presence of external factors (i.e., grading), more students fell within the task-oriented and intensive clusters. However, when the external factor is removed, most students exhibited the disenchanted and minimalist learning behaviours. The findings provide insight into how students engage with the different features of a video annotation tool when there are graded or non-graded annotations and, most importantly, show that experience with one course where external factors influence students' use of the tool is not sufficient to sustain their learning behaviour in subsequent courses where the external factor is removed.
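The hierarchical cluster analysis used to derive such usage profiles can be illustrated with a small agglomerative (single-linkage) sketch over per-student usage features. The feature names, values, and cluster labels below are invented assumptions, not the study's data:

```python
# Hypothetical per-student usage features: (annotations_written, videos_viewed)
profiles = {
    "s1": (1, 2), "s2": (2, 3),      # minimalist-like usage
    "s3": (10, 4), "s4": (11, 5),    # task-oriented-like usage
    "s5": (25, 30), "s6": (27, 28),  # intensive-like usage
}

def dist(a, b):
    # Euclidean distance between two feature vectors
    return sum((x - y) ** 2 for x, y in zip(a, b)) ** 0.5

def single_linkage(points, n_clusters):
    # Start with every student in their own cluster, then repeatedly merge
    # the two closest clusters (single linkage) until n_clusters remain.
    clusters = [[name] for name in points]
    while len(clusters) > n_clusters:
        best = None
        for i in range(len(clusters)):
            for j in range(i + 1, len(clusters)):
                d = min(dist(points[a], points[b])
                        for a in clusters[i] for b in clusters[j])
                if best is None or d < best[0]:
                    best = (d, i, j)
        _, i, j = best
        clusters[i] += clusters.pop(j)
    return clusters

result = sorted(sorted(c) for c in single_linkage(profiles, 3))
print(result)  # [['s1', 's2'], ['s3', 's4'], ['s5', 's6']]
```

Production analyses would typically use an optimized routine such as `scipy.cluster.hierarchy.linkage` and inspect the dendrogram to choose the number of profiles, rather than fixing it in advance.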
Interest in how educational data can be used to improve teaching and learning has recently seen unprecedented growth, accompanied by the emergence of the field of learning analytics. In other fields, analytics tools already enable the statistical evaluation of rich data sources and the identification of patterns within the data. These patterns are then used to better predict future events and make informed decisions aimed at improving outcomes (Educause, 2010). This paper reviews the literature related to this emerging field and seeks to define learning analytics, its processes, and its potential to advance teaching and learning in online education.
Nowadays, recognizing and predicting students' learning achievement presents a significant challenge, especially in blended learning environments, where online (web-based electronic interaction) and offline (direct face-to-face interaction in classrooms) learning are combined. This paper presents a Machine Learning (ML) based classification approach for students' learning achievement behavior in Higher Education. In the proposed approach, Random Forest (RF) and Support Vector Machine (SVM) classification algorithms are applied to develop prediction models that discover the underlying relationship between students' past course interactions with Learning Management Systems (LMS) and their tendency to pass or fail. We consider daily student interaction events, based on time series, with a number of Moodle LMS modules as the leading characteristics for observing students' learning performance. The dataset used for the experiments is constructed from anonymized real data samples traced from web-log files of students' access behavior concerning different modules in a Moodle online LMS throughout two academic years. Experimental results showed that the proposed RF classification system outperformed the SVM classification algorithm.
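The feature-construction step this abstract describes, turning daily interaction counts per LMS module into fixed-length time-series feature vectors, can be sketched as follows. The log format, module names, and counts are assumptions; in the actual study such vectors would then be fed to RF and SVM classifiers:

```python
from collections import defaultdict

# Hypothetical Moodle web-log rows: (student_id, day, module)
log = [
    ("s1", 1, "quiz"), ("s1", 1, "forum"), ("s1", 2, "quiz"),
    ("s2", 1, "forum"), ("s2", 3, "forum"),
]
modules = ["quiz", "forum"]
days = [1, 2, 3]

# Daily interaction counts per (student, day, module)
counts = defaultdict(int)
for student, day, module in log:
    counts[(student, day, module)] += 1

# Flatten into one fixed-length feature vector per student:
# [day1_quiz, day1_forum, day2_quiz, day2_forum, day3_quiz, day3_forum]
def features(student):
    return [counts[(student, d, m)] for d in days for m in modules]

print(features("s1"))  # [1, 1, 1, 0, 0, 0]
print(features("s2"))  # [0, 1, 0, 0, 0, 1]
```

Each vector, paired with the student's pass/fail label, forms one training example for a classifier such as scikit-learn's `RandomForestClassifier` or `SVC`.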
Tertiary institutions are placing increasing emphasis on generating, collecting and analyzing student data as a means of targeting student support services. This study uses a data set from a regional Australian university to conduct logistic regression analyzing student enrollment outcomes. The results indicate that demographic factors have a minor effect, while institutional and learning environment variables play a more significant role in determining student enrollment outcomes. Using the grade distribution rather than grade point average provides better estimates of the effect particular grades have on enrollment outcomes. Moreover, the effect of an early alert system on enrollment outcomes shows that early identification has a significant relationship to a student's choice to stay enrolled versus discontinuing, lapsing or becoming inactive in their enrollment. These results are vital to the targeting of student support services at the case study institution. The significant results indicate the importance of learning environment variables in understanding student enrollment outcomes at tertiary institutions. This analysis forms part of a much larger research project analyzing student retention at the institution.
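The logistic regression approach described above can be sketched in miniature: fit a model of retention probability from a few student-level predictors by gradient descent on the log-loss. The feature names, values, and labels below are invented assumptions, not the study's data; a real analysis would use a statistics package (e.g. statsmodels) and report coefficients and significance:

```python
import math

# Hypothetical records: features are (gpa_proxy, lms_logins_per_week,
# early_alert_flag); label is 1 if the student stayed enrolled, 0 otherwise.
data = [
    ((3.5, 20, 0), 1), ((3.2, 15, 0), 1), ((2.9, 18, 1), 1),
    ((1.8, 2, 1), 0), ((2.0, 3, 1), 0), ((2.4, 5, 0), 0),
]

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def predict(w, b, x):
    # Probability of staying enrolled given feature vector x
    return sigmoid(b + sum(wi * xi for wi, xi in zip(w, x)))

# Fit by plain stochastic gradient descent on the log-loss
w, b, lr = [0.0, 0.0, 0.0], 0.0, 0.05
for _ in range(2000):
    for x, y in data:
        err = predict(w, b, x) - y  # gradient of log-loss w.r.t. the logit
        b -= lr * err
        w = [wi - lr * err * xi for wi, xi in zip(w, x)]

# A highly engaged student should score high, a disengaged one low
print(round(predict(w, b, (3.4, 19, 0)), 2))
print(round(predict(w, b, (1.9, 2, 1)), 2))
```

The fitted coefficients play the role the abstract describes for the institutional analysis: they quantify how much each learning-environment variable shifts the odds of continued enrollment.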