International Journal on Interactive Design and Manufacturing (IJIDeM) (2022) 16:1209–1230
https://doi.org/10.1007/s12008-022-00930-0
TECHNICAL PAPER
Learning analytics: state of the art
Marcela Hernández-de-Menéndez1 · Ruben Morales-Menendez1 · Carlos A. Escobar2 · Ricardo A. Ramírez Mendoza1
1 Tecnológico de Monterrey, School of Engineering and Sciences, Ave. E. Garza Sada 2501, Monterrey 64849, NL, México
2 General Motors, Global Research & Development, Warren, MI, USA
Corresponding author: Ruben Morales-Menendez, rmm@tec.mx
Received: 26 November 2020 / Accepted: 12 May 2022 / Published online: 18 June 2022
© The Author(s) 2022
Abstract
Learning Analytics is a field that measures, analyses, and reports data about students and their contexts to understand and improve learning and the place in which it occurs. Educational institutions have different motivations for using Learning Analytics: some want to improve students' outcomes, optimize their educational technology, or reduce dropout rates, among others. This concept is presented alongside practical experiences that have been acquired and validated by 16 institutions, together with an analysis of the results, challenges, and expectations. It was found that the majority of initiatives use Learning Analytics to improve student retention; few focus solely on improving the teaching/learning process or academic issues. Some organizations invest their resources in acquiring Learning Analytics software; however, most universities develop their own technology. The technology helps organizations be preventive rather than reactive, as various models identify students at risk of failing. This information allows suitable interventions, which increases the success of the initiative. The CoViD19 pandemic is also put in context in this research; Learning Analytics could be a valuable approach to help the educational community adapt effectively to new forms of educational delivery. Based on an exhaustive bibliographic review, various educational projects and experiences were analyzed, presenting an overview of applications, results, potentialities, and opportunities, in the hope that this article will be a useful reference for researchers and faculty wishing to exploit Learning Analytics in education.
Keywords Educational innovation · Higher education · Learning analytics · Educational practices
1 Introduction
Higher Education Institutions (HEI) have different moti-
vations for using Learning Analytics (LA). Some of them
are aware of a student success movement that is arising as
international studies have found that reasons for dropouts
are related to choosing the wrong program, lack of motivation, personal situations, an unsatisfying first-year experience, the absence of university support services, and academic unpreparedness [32]. Others want to obtain public funding, which is tied to academic performance, or want to improve student outcomes or optimize their educational technology [3]. Whatever the motivation, LA
is a field that has existed since 1979, when The Open University (UK) was able to analyze ten years of progress data from its thousands of distance students [25]. Since then, and thanks
to the widespread emergence of online learning and big data,
LA has been in the spotlight of educational discussions. However, adoption has occurred mainly at the professor level, even though LA is considered a valuable tool for improving teaching and institutional management. The educational sector has had difficulty determining the value and impact of LA in enhancing learning. Motivations for adopting LA include improving (1) teaching, (2) student satisfaction, (3) student retention, and (4) additional benefits for the institution and staff [69]. Table 1 summarizes the acronym definitions.
The use of digital tools in education generates a great deal of data and experiences from various sources, such as online pedagogical platforms, academic enrollment, libraries, information systems, online assessment, social networks, etc. Any user's digital behavior can be tracked and analyzed to generate what have been called digital footprints. This
Table 1 Acronym definitions
CSBA: Computer-Supported Behavioral Analytics
CSLA: Computer-Supported Learning Analytics
CSPA: Computer-Supported Predictive Analytics
CSVA: Computer-Supported Visualization Analytics
EDM: Educational Data Mining
GPA: Grade Point Average
HEI: Higher Education Institutions
ICT: Information and Communication Technology
IT: Information Technology
LA: Learning Analytics
LMS: Learning Management Systems
OER: Open Educational Resources
TEL: Technology-Enhanced Learning
VLE: Virtual Learning Environment
significant amount of data can be recorded and diagnosed
with many different technologies. LA provides educational leaders with the knowledge to improve the teaching/learning process ([5]; Boyer & Bonnin, n.d.). It has been stated that online technologies can increase education quality [46]. They can also aid during air pollution crises, natural disasters, or pandemic diseases such as CoViD19, which spread rapidly worldwide, causing educational institutions to close their doors and affecting thousands of people in their learning process. UNESCO states that it is necessary to "provide alternative modes of learning and education, and put in place equivalency and bridging programs, recognized and accredited by the state, to ensure flexible learning in both formal and non-formal settings, including in emergencies"
[29]. Several challenges arise when using online learning in
times of disruption [29]: (1) faculty/students feel alone, (2)
demotivation of students, and (3) no time to adapt to change.
The effectiveness of the online digital learning process
depends on the material prepared, the faculty’s engagement
in the new form of teaching, and the interaction that can
be developed between faculty and students. From the student's side, this form of learning demands more self-discipline [2].
Online learning comprises a range of technologies, from a simple tweet to an avatar-based simulation. Whatever technology is used, rushing to online learning can be a challenge. Some recommendations for this task are [62]:
(a) Manage students and faculty. They need to develop
skills and behaviors, exploiting the current LMS.
(b) Divide lectures and mix them with different learning
activities. Again, there are multiple platforms for doing
this.
(c) Use online tutorials (these aid in acquiring knowledge
and skills and are tools that engage students), videos,
social media, MOOC, etc.
(d) Encourage online reflection by making students share
their experiences with others and solve real-world prob-
lems.
(e) Demonstrate the value for active professionals of the
shift to online learning.
LA can be used as a diagnosis and predictive tool. It can
help determine student retention, performance, engagement,
employability, progression, attainment, and mastery [5,32],
especially under critical situations such as the CoViD19 pan-
demic. It can also be a tool for developing an open assessment
by giving the learner timely feedback and explaining why
such feedback was given. This process is essential as it
could promote self-reflection and self-assessment competen-
cies [17]. However, although LA has excellent potential for HEI, it has not yet been widely used in this sector [5]. The lack
of LA adoption in HEI can be explained by the absence of
participation of teachers and communication issues between
stakeholders such as professors, students, institutional direc-
tors, etc. However, research in the area, as measured by published papers, has increased. These investigations explore the use of LA for increasing retention, predictive analytics, social network analysis, discourse analytics, supporting student learning, and determining its link with educational theory and learning design [34].
Due to the CoViD19 pandemic, MOOC platforms have expanded exponentially; e.g., Coursera grew 640% from mid-March to mid-April compared with the same period of the previous year. This growth is not only the result of the provider's strategy of giving free access to over 3,800 courses but also a consequence of the pandemic's spread [33].
In some places, MOOC use is the only alternative for giving continuity to the teaching/learning process. Various technological tools have been used during the CoViD19 pandemic,
including Zoom, Google Hangouts, Skype meets up, Google
classrooms, and YouTube. Teachers need to rapidly adapt
to the new technological environment, which poses a chal-
lenge [65]. Still, it is also a benefit as the crisis has made
professors develop digital competencies quickly. However, no one knows whether, after the crisis, professors will continue to give classes in an online format or return to face-to-face ones. This will depend on their experience during the CoViD19 pandemic [38].
The pandemic changed the educational dynamic, and its
effect will be seen in the new normality in educational insti-
tutions. CoViD19 will change how the future workforce
is educated. Online teaching was mandatory during this
phase of turbulence. Real-time video conferences, presentations sent to students, video recordings, and written communication through forums and chats were the formats used most. These forms of education can be challenging; learners need to be more conscious of their learning process to
achieve educational goals. Teaching staff and universities' public relations were the entities that offered students the most significant support during the pandemic. Besides, students felt they
were not performing well due to a lack of computer skills and
a high workload. Also, they felt bored, anxious, and frustrated
regarding their future professional career [2]. LA will be an
essential tool in understanding and analyzing students’ and
faculty’s digital footprints and experiences.
LA allows students to have information regarding their
performance and what they need to do (practical guide and
feedback) to reach their educational goals. Students become conscious of how they learn through continuous formative feedback. They can also compare their performance with their peers'; this adds a competitive element that many students find desirable because it fosters additional engagement. Learners can select future courses based on their past performance, which allows them an optimal and challenging pathway through their career choices. Adaptive learning systems are under construction to help students develop competencies and knowledge in a more personalized and adaptive way. On the faculty side, LA mainly permits instructors to know how effective their performance and content are and enables their continuous enhancement [63].
Faculty could track students’ interactions with the system
and make interventions opportunely, e.g., if a student has
not read a post for a long time, the professor can intervene
and determine what is happening. Also, if a student asks many questions regarding the material, faculty can access the learning environment and determine when and how often the student uses its relevant tools [20]. For the institution, LA makes it possible to know the general effectiveness of learning programs.
This is done by combining students’ learning data and other
educational data developed from the institution’s different
departments/offices [63]. Finally, LA will be a valuable tool
for analyzing the new normality derived from the CoViD19
pandemic in an educational context. It will be a tool to determine the unique conditions under which faculty and students teach and learn.
LA has been developed by three main factors that are con-
sidered drivers of the field [25]:
(1) Online learning: students may feel alone due to the lack of contact with peers/teachers; they may have technical problems; and they may lose motivation in virtual settings. This becomes particularly critical under CoViD19 conditions, where online teaching, learning, and communication became a widespread necessity;
(2) Political issues: educational institutions are asked to demonstrate and improve performance, as well as to seek international recognition in academic rankings; and
(3) Big data: a significant quantity of data is obtained from the use of different educational software/platforms, with characteristics such as high volume, velocity, variety, veracity, and value [17].
As mentioned before, LA leverages the digital footprint
left by the virtual learning environments. These environments
offer a significant quantity of information and data regarding
user activities, which can improve the teaching/learning pro-
cesses [59]. However, the data generated are complex, large, heterogeneous, and difficult to understand. This is where LA techniques can help users make sense of the data and transform it into information that supports decision-making [71], especially in unexpected conditions such as the massive move to online and/or virtual delivery (with very different technologies) due to the CoViD19 pandemic. LA is one of the fastest-growing research areas
related to education and technology. Using techniques such
as predictive modeling, user profiling, adaptive learning, and
social network analysis, LA uses data patterns to make rec-
ommendations for improving education [12], including new
possible ways of teaching/learning.
Interest in LA is high among educational institutions; however, adoption remains low, even though the HEI sector is increasing its use of LA to better understand and support student learning [70]. HEIs use LA with different focuses: research, knowledge exchange, and praxis [45]. HEI research related to LA has covered [34]: student retention, predictive analytics, discourse analytics, helping students learn, and the relationship between educational theory and learning design.
LA has mainly been used to increase student retention and success and to create personalized learning. Study success is affected by several factors, such as individual attitudes and characteristics of the educational environment. In a VLE, some factors contribute to determining
students’ success; these include predictors and visualization
tools. Predictors are varied, e.g., forum interactions (posts,
replies, etc.), engagement with learning options (videos, lec-
tures, etc.), demographics, socioeconomic information, past
Table 2 Publication areas of LA research
Results: assessment and evaluation; prediction of learning outcomes; decision-making systems
Quality: interaction between learner & platforms; assessment of the system; teaching quality assessment
Indicators: student dropout rates; learning style preferences; acceptance and enrollment rates
Support: recommendation & guiding systems; recommendation systems for faculty; structure of knowledge domain
academic experience and performance, and educational his-
tory. Visualization is done using dashboards [32].
Interest is growing in using LA to monitor the progress of
students. However, LA is not a priority for universities. Research investigating the state of LA adoption in Australian universities revealed that only 2 of the 32 organizations studied had reached an advanced stage of implementation; the other universities were in the preparatory or early stages. In the United Kingdom, results are
similar; from 53 organizations surveyed, 25 did not imple-
ment LA, 18 worked on it, nine partially implemented it, and
only one fully implemented LA [68]. However, publications, and hence interest in the area, are growing; a study aimed at determining the adoption of LA in HEI found that 60% of publications were related to the theme of interest. On a global scale, the USA has been the leader in publications [30,69], followed by Spain, the UK, Australia, Germany, Canada, India, the Netherlands, Japan, and China. As an example, the
USA emphasizes its research in monitoring or measuring the
students’ progress. In summary, the adoption of LA in HEI
globally is in an embryonic stage [69].
A study aimed at determining the four publication areas of LA identified and grouped related studies into categories [15], Table 2. It was also found that the learning contexts used in LA research are varied, Fig. 1 [15].
Research in LA has a weakness: the lack of large-scale, longitudinal, and experimental studies on its impact on learning/teaching in HEI. This is a significant challenge for future LA research [32].
It is possible to determine if an organization is ready to
adopt LA. EDUCAUSE [3] has developed maturity indices to help institutions know where they stand. The LA maturity index measures 32 items, organized into six categories: (1) decision-making culture, including leadership agreement and acceptance of analytics; (2) policies, including data collection and access; (3) data efficacy, including availability of tools/software; (4) investment & resources, consisting of funding; (5) technical infrastructure to store/manage/analyze data; and (6) institutional research involvement.
Fig. 1 Learning context in LA research
Each dimension is assessed with statements in which the
interviewees express their degree of agreement. Each dimen-
sion’s maturity score is the average of all responses, and the
overall maturity score is the mean of each dimension score.
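The scoring just described can be sketched in a few lines. The dimension names and Likert-style responses below are illustrative, not taken from the EDUCAUSE instrument itself:

```python
# Sketch of the maturity scoring described above: each dimension's score
# is the average of its item responses, and the overall score is the mean
# of the dimension scores. Data here are hypothetical.

def maturity_scores(responses):
    """responses: dict mapping dimension -> list of 1-5 agreement ratings."""
    dim_scores = {dim: sum(vals) / len(vals) for dim, vals in responses.items()}
    overall = sum(dim_scores.values()) / len(dim_scores)
    return dim_scores, overall

survey = {
    "decision-making culture": [4, 3, 5, 4],
    "policies": [2, 3, 3],
    "data efficacy": [3, 4],
    "investment & resources": [2, 2, 3],
    "technical infrastructure": [4, 4, 5],
    "institutional research": [3, 3],
}

dims, overall = maturity_scores(survey)
```

A real deployment would score all 32 items grouped under the six categories above; the averaging logic is the same.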
An extensive literature review of LA was performed. The concept is presented with practical experiences that have been acquired and validated by different institutions. In addition, the results, challenges, and expectations were analyzed. The outline of this paper is as follows: Sect. 2 describes the concept of LA. Section 3 presents the experiences different organizations have had with the theme; many bibliographic references are included. Section 4 analyzes the results and generates some recommendations and guidance. Finally, Sect. 5 concludes the paper.
2 Learning analytics
LA is one of the areas of Technology-Enhanced Learning
(TEL) research that is growing fast. It consists of analyz-
ing educational data to enhance the learning experience. For
example, it can determine the time a student spends on a specific activity and the number of visits to it. By combining such trace data with demographics and
performance history, professors can personalize students' learning and redesign their courses, if necessary, according to a given group of students [57]. Some definitions of LA that have been proposed are presented in Fig. 2.
Fig. 2 LA definitions (Banihashem et al. [6,7]; Bellini et al. 2019; [12,16,25,28,37,49]; Pazmiño-Maji et al. 2016; [55,60,63,71])
Regardless of the definition, all agree that learning data are collected to improve the teaching–learning process.
LA focuses on the relationship between the student and the
learning environment. The final goal is to enhance students’
success, defined as completing individual learning tasks and
the successful obtainment of a degree [15,31]. In the above
definitions, three parts are noticed, Table 3 [53].
There are four forms of LA: descriptive analytics, diagnostic analytics, predictive analytics, and prescriptive analytics (Boyer & Bonnin, n.d.), Table 4.
Data sources for LA are varied. The main one is the VLE, which students access to view course information, timetables, homework, etc. Another source is the information
Table 3 Parts of the LA definition
Input: students' learning data, demographics, perceptions, and school processes
Process: integration of four stages: measurement, collection, analysis, and reporting
Output: understanding and optimizing learning processes and environments
Table 4 Forms of LA
Descriptive (What happened?): examines the data using basic statistical techniques; e.g., it can score the relative performance of students
Diagnostic (Why did it happen?): uses techniques such as data discovery, pattern mining, or statistical correlations; e.g., identifying whether a teaching strategy is effective
Predictive (What will happen?): anticipates the near future based on past events; e.g., determining which students are at risk
Prescriptive (How can we make it happen?): exploits tools able to perform big graph analysis, text and data mining, simulation, etc.
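The descriptive form in Table 4 can be illustrated with a minimal sketch: scoring students' relative performance using basic statistics (z-scores). The grades below are synthetic:

```python
# Minimal sketch of descriptive analytics as characterized in Table 4:
# scoring relative performance with z-scores. Names/grades are invented.
from statistics import mean, pstdev

grades = {"ana": 85, "luis": 70, "sofia": 95, "raul": 60}

mu = mean(grades.values())
sigma = pstdev(grades.values())
relative = {name: (g - mu) / sigma for name, g in grades.items()}
# Positive z-scores indicate above-average performance.
```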
system of students, in which data such as prior qualifications, grades, socioeconomic status, etc., can be found. Attendance monitoring systems and library data are also sources of valuable information [63]. Data generated through the LMS that can be analyzed efficiently include [20]:
(1) number of times a resource was accessed, (2) date and time of access, (3) number of discussion posts generated and read, and (4) types of resources accessed.
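The four LMS data points listed above can be derived from a raw event log. A minimal sketch follows; the event fields and values are hypothetical, not from any particular LMS:

```python
# Sketch: deriving the four LMS data points listed above from a raw
# event log. Event schema and values are invented for illustration.
from collections import Counter

events = [
    {"user": "s1", "action": "access", "resource": "video:intro",  "time": "2022-03-01T10:02"},
    {"user": "s1", "action": "access", "resource": "pdf:syllabus", "time": "2022-03-01T10:05"},
    {"user": "s1", "action": "post",   "resource": "forum:week1",  "time": "2022-03-02T09:40"},
    {"user": "s1", "action": "read",   "resource": "forum:week1",  "time": "2022-03-02T09:45"},
    {"user": "s1", "action": "access", "resource": "video:intro",  "time": "2022-03-03T18:20"},
]

accesses = [e for e in events if e["action"] == "access"]
n_accesses = len(accesses)                            # (1) times resources were accessed
access_times = [e["time"] for e in accesses]          # (2) date and time of access
n_posts = sum(e["action"] == "post" for e in events)  # (3) discussion posts generated...
n_reads = sum(e["action"] == "read" for e in events)  #     ...and read
types = Counter(e["resource"].split(":")[0] for e in accesses)  # (4) resource types
```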
There are studies in the literature in which data generated
in the LMS has been used to predict students’ performance.
For example, researchers found that data such as login fre-
quency, site engagement, student pace in the course, and
assignment grades could predict course outcomes. Also, data
such as the number of discussion messages read and the num-
ber of discussion replies posted can predict students’ success
[20].
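The kind of prediction study described above can be sketched with a tiny logistic regression relating LMS activity to course outcome. The data and feature choices below are synthetic; actual studies use far richer features and proper validation:

```python
# Hedged sketch of LMS-based outcome prediction: a minimal logistic
# regression trained with SGD on invented features (logins per week,
# discussion messages read, replies posted). Not the authors' model.
import math

X = [[1, 2, 0], [2, 5, 1], [8, 30, 6], [6, 22, 4], [1, 1, 0], [7, 28, 5]]
y = [0, 0, 1, 1, 0, 1]  # 1 = passed the course

def sigmoid(z):
    z = max(-60.0, min(60.0, z))  # numeric guard against overflow
    return 1.0 / (1.0 + math.exp(-z))

def train(X, y, lr=0.1, epochs=2000):
    w, b = [0.0] * len(X[0]), 0.0
    for _ in range(epochs):
        for xi, yi in zip(X, y):
            p = sigmoid(sum(wj * xj for wj, xj in zip(w, xi)) + b)
            err = p - yi  # gradient of the log-loss w.r.t. the logit
            w = [wj - lr * err * xj for wj, xj in zip(w, xi)]
            b -= lr * err
    return w, b

w, b = train(X, y)

def predict(x):
    return sigmoid(sum(wj * xj for wj, xj in zip(w, x)) + b)

at_risk = predict([1, 3, 0]) < 0.5  # low activity -> flagged as at risk
```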
The technical infrastructure is based on different technologies, such as VLEs, student information systems, business intelligence and visualization software, emerging LA packages, in-house developments, and enrollment, learning, and advising management tools [41,63].
LA uses various techniques, including data visualization, artificial intelligence, data mining, machine learning, learning sciences, psychology, social network analysis, semantics, e-learning, and social aspects [22,55]. It also uses soft computing
methods, including decision trees, random forests, artificial
neural networks, fuzzy logic, support vector machines, and
genetic/evolutionary algorithms [15].
Data mining is one of the techniques most used in LA. Its methods can be classified into five groups: prediction, clustering, relationship mining, discovery with models, and separation of data for use in human judgment [4]. The authors of [1] reviewed the literature and found a significant number of data mining techniques and applications, which they classify into four dimensions:
(a) Computer-Supported Learning Analytics (CSLA) uses data mining algorithms to analyze students' interactions in the LMS. It identifies learning opportunities by assessing students' exchanges and their results.
(b) Computer-Supported Predictive Analytics (CSPA) is used for predicting students' performance and retention by evaluating several dimensions, such as participation, engagement, and grades.
(c) Computer-Supported Behavioral Analytics (CSBA) shows students' behavior and preferences or motivations in a learning environment while they participate in different academic activities.
(d) Computer-Supported Visualization Analytics (CSVA) offers visual/graphical results related to individual behavior in a learning activity. Figure 3 presents the different dimensions with their applications and techniques.
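As an illustration of the behavioral-analytics dimension (CSBA), students can be grouped by engagement with a simple clustering step. The sketch below uses a tiny one-dimensional k-means on invented engagement scores:

```python
# Illustrative CSBA-style sketch: grouping students by engagement with a
# minimal 1-D k-means (k=2). Engagement scores are synthetic.

def kmeans_1d(values, k=2, iters=20):
    centers = [min(values), max(values)]  # simple initialization for k=2
    for _ in range(iters):
        groups = [[] for _ in range(k)]
        for v in values:
            idx = min(range(k), key=lambda i: abs(v - centers[i]))
            groups[idx].append(v)
        centers = [sum(g) / len(g) if g else centers[i] for i, g in enumerate(groups)]
    return centers, groups

engagement = [2, 3, 1, 25, 30, 28, 4, 27]  # e.g., weekly LMS actions
centers, groups = kmeans_1d(engagement)
# groups[0]: low-engagement students; groups[1]: high-engagement students
```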
There is a concept named Educational Data Mining (EDM), which uses data mining techniques to analyze educational information; the difference between EDM and LA can be blurred. However, there are subtle differences, which are explained in Fig. 4 [15].
Other LA techniques include Bayesian modeling, natural language processing, and predictive modeling. Regardless of the method, all of them collect data about the learner and the learning process from various sources to improve and predict learners' success.
Some sources include the number of clicks, number of
posts in a discussion forum, or the number of computer-
assisted formative assessments attempted [56]. LA uses
methods and processes for answering questions such as [9]:
1. What is the best time for a student to advance to the next
topic in the course?
2. At what point is a student falling behind on a course
topic?
3. When is a student at risk of failing a course?
4. How will students perform if they are not supported during the course?
5. How should a student be managed based on their performance in a course?
6. How can the need for additional help in managing a student be determined?
Fig. 3 Dimension applications and techniques
Fig. 4 Differences between EDM and LA
As can be seen, LA uses the technology and the data generated through it to track students, help them promptly if needed, and ensure that learners complete their programs. In addition, the LA process itself is preventive, as it anticipates any situation that could cause a student to withdraw from his/her studies.
2.1 Learning analytics process
There is no consensus on what the LA process should be; authors differ on the stages it involves and the input data needed. [59] analyzed the strategies of several authors, Table 5.
As can be seen, several processes differ in various ways;
some are very specific; others are too general. Some consider
that an action must be part of the process (author-3, author-
4, and author-5), while others do not. An interested person
Table 5 LA processes
Author-1: 1. What? (determine which data will be collected and analyzed); 2. For whom? (for which stakeholders the analysis is intended: students, professors, or directors); 3. For what? (establish the purpose: monitoring, prediction, adaptation, personalization, etc.); 4. How? (techniques, methods, and analysis tools)
Author-2: 1. Sampling; 2. Collection; 3. Reduction; 4. Pattern finding
Author-3: 1. Collection of "educational transactions" (big data); 2. Information sources that can support the next stage: a) institutional system, b) big data, c) experience, intuition, professors' perspective; 3. Application of analytical software; 4. Decision making
Author-4: 1. Collection of data; 2. Storage; 3. Cleaning of data; 4. Integration; 5. Analysis; 6. Representation and visualization; 7. Action (intervention, optimization, alerts, etc.); 8. Restart the process
Author-5: 1. Learning activities; 2. Data collection; 3. Data storage and processing; 4. Analysis; 5. Visualization; 6. Action
in implementing LA has to find his or her own process. This could be done by combining different approaches and refining the process as it runs. Attention must be paid to the results; these must be measured to know the improvements made thanks to the intervention. It is also essential to have a leader who can negotiate the resources needed for deployment [69], as LA is costly in terms of time, expertise, and money [72].
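Combining the stages that recur across the authors in Table 5 (collect, clean, analyze, act, restart), a minimal LA loop might look like the following sketch. All record fields, thresholds, and the intervention hook are invented:

```python
# Minimal sketch of an LA loop built from the recurring stages in
# Table 5: collect -> clean -> analyze -> act. Data are hypothetical.

def collect():
    # Stand-in for pulling records from an LMS or student info system.
    return [
        {"student": "s1", "logins": 12, "grade": 88},
        {"student": "s2", "logins": 1,  "grade": None},   # incomplete record
        {"student": "s3", "logins": 2,  "grade": 45},
    ]

def clean(records):
    return [r for r in records if r["grade"] is not None]

def analyze(records, login_min=3, grade_min=60):
    return [r["student"] for r in records
            if r["logins"] < login_min or r["grade"] < grade_min]

def act(at_risk):
    # Stand-in for the "action" stage: alerts, tutoring referrals, etc.
    return [f"contact {s}" for s in at_risk]

interventions = act(analyze(clean(collect())))
```

In practice, the loop restarts each term, and the thresholds would be replaced by a validated model.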
2.2 Learning analytics tools
Information Technology (IT) departments or LA specialists
are the ones that design and implement LA tools [37]. Practi-
cal and well-designed LA tools reduce the time between the
analysis and the action [17]. Some of the best-known LA tools are listed in Table 6 [15,25].
[26] developed an inventory of LA tools that have been created by e-learning vendors, universities, or collaborative projects. They have different purposes; e.g., some can alert students struggling with their performance and give on-time support. Others predict students' success. Some
Specifically, for HEI, some tools are worth mentioning,
Table 7.
2.3 Learning analytics applications
LA applications are varied; they include the ones depicted in Fig. 5 [9].
Big data is one of the drivers of LA. Big data techniques
have multiple applications in LA [14], [66]:
(a) Performance prediction. This can be done by evaluat-
ing students’ interactions with faculty and peers in a virtual
learning environment.
(b) Risk detection. By analyzing students' behavior, the risk that students will leave a course can be detected. Modifications can be made to the course based on such analysis.
(c) Data visualization. Friendly visual reports can be
developed thanks to various data visualization techniques
that now exist.
(d) Intelligent feedback. Instant feedback can be offered
to students based on their inputs. This feedback will improve
the interactions of the students and their performance.
(e) Course recommendation. Courses can be recom-
mended to the students based on their interests. This rec-
ommendation is made by analyzing their activities.
(f) Student skills estimation. Estimation of the skills
acquired by the students.
(g) Others: grouping and collaboration of students, social
network analysis, developing conceptual maps, constructing
courseware and planning, scheduling, and identifying LMS
users’ behavior patterns.
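As an illustration of item (e), course recommendation from activity analysis can be sketched as a simple overlap score between the topics a student interacts with and each course's topic tags. The data model (topic-tagged activity logs and a tagged catalog) is a hypothetical simplification; production systems would typically use collaborative filtering over much richer data.

```python
from collections import Counter

def recommend_courses(student_activity, course_tags, top_n=2):
    """Rank courses by how strongly their topic tags overlap the topics
    the student has interacted with most (hypothetical data model)."""
    interest = Counter(student_activity)  # topic -> interaction count
    scores = {course: sum(interest[t] for t in tags)
              for course, tags in course_tags.items()}
    return sorted(scores, key=scores.get, reverse=True)[:top_n]

# Hypothetical activity log: topics a student clicked on in the LMS
activity = ["statistics", "python", "statistics", "visualization"]
catalog = {
    "Data Mining":        ["statistics", "python"],
    "Art History":        ["painting", "renaissance"],
    "Data Visualization": ["visualization", "python"],
}
print(recommend_courses(activity, catalog))
```

The score is deliberately crude (a raw count of matching interactions); its only purpose is to make the analysis-to-recommendation pipeline concrete.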
123
Content courtesy of Springer Nature, terms of use apply. Rights reserved.
International Journal on Interactive Design and Manufacturing (IJIDeM) (2022) 16:1209–1230 1217
Table 6 Some LA tools
Social Network Analysis It can investigate and promote collaborative
connections between learners, teachers, and
educational resources
GISMO It is used for monitoring students, taking into account
the social, cognitive, and behavioral aspects. It
presents graphical representations to analyze the
above factors
CourseVis It uses LMS data to help teachers know how their
students perform in online classes and identify
those who need extra support
Contextualised Attention Metadata It integrates information from sources such as office
tools, web browsers, multimedia players, and
computer-mediated communication. The goal is to
capture the attention metadata of users
LOCO Analyst It gives feedback regarding the quality of the learning
process
SMILI Open Learner Modelling Framework It offers a method for describing, analyzing, and
designing open learner models
Social Nets Adapting Pedagogical Practice It analyzes interaction patterns of courses that help
detect learner isolation, creativity, and community
formation
Honeycomb It visualizes large datasets, specifically networks,
including millions of connections
Gephi It performs advanced analytics; it permits filtering,
clustering, navigation, and manipulation of network
data
Sense.us It supports asynchronous collaboration; graphical
annotation and view sharing can be performed
Signals It uses large datasets to predict, in real time, students
who are in danger of failure
2.4 LA benefits and challenges
A group of researchers conducted an investigation to determine
the benefits and challenges of LA through a literature review;
see Table 8 and Figs. 6 and 7 [6].
One of the benefits of LA is that it can offer personalization
to the users. It is interesting to note that there is a
model named 70–20–10 which states that learning at the
workplace is achieved through seminars, workshops, and
eLearning courses (10%), followed by collaboration, mentoring,
and coaching activities (20%), and personalized learning
during daily work (70%) [50].
3 Experiences
A significant number of institutions are using LA in different
ways and for various reasons. Some want to enhance stu-
dents’ experience by improving achievement, giving on-time
feedback, and making students self-learners. Other organi-
zations want to improve retention. For example, Manchester
Metropolitan University increased student satisfaction by
nine percent thanks to an analysis of students’ requirements,
which indicated that the university should reorganize
its curriculum. Nottingham Trent University uses LA to identify
students at risk of failing and make timely interventions [63].
The experiences of several organizations with LA are presented
below.
3.1 Arizona State University (ASU)
ASU is committed to improving the students’ success with
the use of technology. In 2011, the university used Knew-
ton Math Readiness software to improve the mathematics
courses. This program developed a personalized learning
path for 5,000 students enrolled in remedial mathematics
courses [26]. Thanks to this strategy, retention in such
programs increased from 64 to 75%. ASU has also used Civitas,
a data analytics platform, to improve students’ success.
This program helped the university track the performance
of the students in real time. It collects data related to class
attendance, course participation, and the use of academic
Table 7 Higher education institutions LA tools
Degree Compass Using information about students’ enrollment, it gives recommendations
on which courses to take to complete the degree. It suggests which classes
they are more likely to finish
Knewton It is an adaptive learning platform for personalized education. This suggests
lessons to students based on their performance/behavior. It provides
information about their progress (immediate feedback). Faculty can see
the learning status of each learner
Loop It can be integrated with Moodle or Blackboard, and it is used to understand
the behavior of students in the LMS. It has different components that aid
professors to track the performance of students on time
Open Essayist It offers automatic feedback regarding essays with suggestions to improve
their writing. It analyzes the text and gives graphical feedback (prominent
words, key sentences, highlights structure)
OU Analyse It predicts students at risk of failure in their studies using ML. The tool
reports the aggregated prediction value of several models for all students
of a module and the reason that underlies its prediction
Student Success Plan It provides information and analysis regarding the support services of
students (counseling and coaching) to improve retention, academic
performance, persistence, graduation rates, and completion time
Tribal’s Student Insights It uses academic information, demographics, and assessment results to
predict the general performance of students and determine those at risk of
failure. Educators can use the information to provide students support and
monitor modules concerning their predicted performance
X-Ray Analytics It visualizes behaviors in the LMS at multiple levels: a course, numerous
courses, and institutional. It predicts those students who are at risk of
failure. It is helpful to make timely interventions
resources and support. ASU has been recognized for lead-
ing the student success movement by incorporating tracking
systems, adaptive learning tools, big data, and predictive ana-
lytics. Due to this strategy, the ASU’s retention rate for 2018
was 85.2% (11% higher than 15 years ago). Also, the six-
year graduation rate for cohort 2011 was 62.2% (6% higher
than the national average of 57%) [54].
3.2 Georgia State University (GSU)
GSU has used predictive analytics to improve the reten-
tion of students and graduation rates. GSU implemented
the Graduation and Progression Success system to moni-
tor the students’ performance daily. As a result, the system
helped the university award 1,700 more degrees in 2015–2016
than in 2011–2012. Owing to its strategy for improving
students’ success, GSU’s 2017 retention rate was 83%, and its
six-year graduation rate increased from 32% in 2003 to 53.7%
in 2017 [54]. In addition, the university serves minorities; in
research performed using predictive analytics, the institution
found that students with extraordinary academic results
dropped out before graduation due to non-payment. Thanks to
these insights, graduation rates went from 32% in 2003 to
54% in 2014 [26].
3.3 Rio Salado College
This college began to use data mining and predictive mod-
eling research in 2008. In 2010, the institution developed
the RioPACE model to determine students at risk. The idea
was to have an alert system that identifies those students
who are struggling academically. The program uses the
naïve Bayes model to determine appropriate warning levels
weekly using updated activity and grade information [58].
The system analyses the performance of current students and
compares it with that of previously successful ones. It uses a
color code that expresses the level of achievement weekly. The
final goal is to make timely interventions to improve the
students’ success. Students also have access to the system:
if the indicator is yellow or red, learners must contact their
teacher for help [26].
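A minimal sketch of the kind of weekly warning model described above: a hand-rolled naïve Bayes classifier over banded activity and grade features, with the posterior mapped to a traffic-light code. The feature bands, thresholds, and training records below are illustrative assumptions, not RioPACE's actual model or data.

```python
from collections import defaultdict

def train_nb(records):
    """records: list of (features_tuple, outcome) from past cohorts."""
    prior = defaultdict(int)             # outcome -> count
    cond = defaultdict(int)              # (outcome, slot, value) -> count
    for feats, y in records:
        prior[y] += 1
        for i, v in enumerate(feats):
            cond[(y, i, v)] += 1
    return prior, cond

def p_at_risk(prior, cond, feats):
    """Posterior probability of the 'risk' outcome for one student-week."""
    total = sum(prior.values())
    score = {}
    for y in prior:
        p = prior[y] / total
        for i, v in enumerate(feats):
            # Laplace smoothing keeps unseen feature values from zeroing it
            p *= (cond[(y, i, v)] + 1) / (prior[y] + 2)
        score[y] = p
    return score["risk"] / (score["risk"] + score["ok"])

def color(p):
    """Map the posterior to a traffic-light code (thresholds are invented)."""
    return "red" if p > 0.66 else "yellow" if p > 0.33 else "green"

# Hypothetical past students: (login band, grade band) -> outcome
past = [(("low_logins", "low_grades"), "risk"),
        (("low_logins", "low_grades"), "risk"),
        (("high_logins", "high_grades"), "ok"),
        (("high_logins", "high_grades"), "ok"),
        (("high_logins", "low_grades"), "ok")]
prior, cond = train_nb(past)
p = p_at_risk(prior, cond, ("low_logins", "low_grades"))
print(color(p))
```

Run weekly on updated activity and grade bands, such a classifier yields the color that the student and teacher would see on the dashboard.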
3.4 The Open University (OU)
OU has collected and analyzed data over many years. The
knowledge generated has helped the organization support
students promptly and retain them. The effort began in 2013
with a project to explore LA, focused on developing different
analytic solutions [26]. This effort’s results are
varied, e.g., relating 40 learning designs at the OU to
Fig. 5 LA applications
Table 8 Benefits of LA
Benefits Description
1 Increase the effectiveness of games when used
in class
2 Increase engagement/results of learning in a
MOOC collaborative environment
3 Help in understanding the learning behavior of
students; modify content to align it to students’
learning characteristics
4 Predict the performance of the students and
increase retention
5 Improve learning design and prevent students from
leaving their studies
6 Predict the performance of students and improve
feedback
7 Useful to make evidence-based decisions
8 Improve curriculum, improve teacher
performance
9 Understand how students learn
10 Identify knowledge gaps, improve teaching
strategy
learner behavior, satisfaction, and academic performance.
The way teachers design a learning module influences stu-
dents’ engagement over time [56]. OU has a policy document
that specifies, regarding privacy issues, how the data generated
through its systems is treated. It states that the
data will be used only to improve the students’ success. This
policy is aligned with its principle of "treat each other with
dignity and respect" [26].
3.5 University of Alabama (UA)
This organization works with analytics to retain its students.
An analysis of first-year students enrolled in 1999, 2000,
and 2001 was used to develop predictive models of students,
specifically those at risk. Techniques such as logistic
regression, decision trees, and neural networks were used
to make predictions. The model is composed of eight vari-
ables: (1) UA cumulative GPA, (2) English course, (3) English
course grade, (4) Distance from UA campus to home, (5)
Race, (6) Math course grade, (7) Total earned hours and (8)
Highest ACT score. The model can identify 150–200 first-
year students who will not finish their studies each year.
This information is shared with the advisors to make inter-
ventions opportunely. The model is continually updated to
adapt to students’ new characteristics [13].
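The paper lists the model's eight inputs but not its fitted coefficients, so a hedged sketch of how such a logistic-regression risk score would be computed might look as follows. The weights and bias are invented purely for illustration, and the categorical inputs (English course, race) are omitted for simplicity; only the shape of the computation reflects the description above.

```python
import math

# Hypothetical coefficients: signs encode plausible directions only
WEIGHTS = {
    "cum_gpa": -1.1,       # higher cumulative GPA lowers dropout risk
    "english_grade": -0.4,
    "math_grade": -0.3,
    "distance_km": 0.002,  # living farther from campus raises risk
    "earned_hours": -0.02,
    "act_score": -0.05,
}
BIAS = 3.0

def dropout_probability(student):
    """Logistic link: sigmoid of a weighted sum of the numeric inputs."""
    z = BIAS + sum(WEIGHTS[k] * student[k] for k in WEIGHTS)
    return 1.0 / (1.0 + math.exp(-z))

student = {"cum_gpa": 2.1, "english_grade": 2.0, "math_grade": 2.0,
           "distance_km": 400, "earned_hours": 12, "act_score": 21}
print(round(dropout_probability(student), 3))
```

With real data, the weights would be fitted (e.g., by maximum likelihood), and students whose probability exceeds a chosen threshold would be flagged to advisors.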
3.6 Sinclair Community College
This institution is committed to improving students’ comple-
tion rates. Since 2000, approximately 100 completion-related
projects have been developed. One strategy is to use Civitas
Learning, a predictive analytic tool [24]. The organization
also created the Student Success Plan (SSP) software to
improve retention, graduation rates, performance, persis-
tence, and completion time. It has been used for ten years
and has gained 11 awards in the United States [26]. The data
is collected and analyzed quarterly for trends. It has improved
students’ learning outcomes, mainly for low-income and
academically unprepared students who have problems with
courses [48]. From 2005 to 2011, students that used SSP were
five times more likely to graduate [26].
3.7 Northern Arizona University (NAU)
This organization uses various resources to help at-risk
first-year students. The university developed a model that
comprises three main elements (critical in the process):
1. Resources/services utilization (academic services, recre-
ational resources, social resources, academic referrals,
advising/career sessions).
2. Level of risk (admissions test scores, high school GPA,
and psychosocial factors).
3. Outcomes (first-year student GPA and enrollment reten-
tion status).
Results are promising, e.g., the GPA of students who
used 1 to 3 academic services increased by 0.192. Stu-
dents who used four services increased their GPA by 0.280
points. Finally, students at risk and using four services
increased their GPA by 0.460 points. The variables that had
the most significant impact on the model were academic
referrals and advising/career sessions [13].
3.8 Purdue University
This university uses its course management system to pre-
dict which students have problems with their studies and
make timely interventions. This initiative’s rationale is
that students’ academic success depends on student aptitude
(e.g., test scores) and effort [13]. Given the importance
of LA for the university, the organization created Course
Signals, a predictive LA system that
determines students at risk of not finishing a course. Based
on this analysis, students are assigned to a group coded by
color (red, yellow, green). The data used consists of general
information about the students and activities in the learn-
Fig. 6 Benefits for different stakeholders
Fig. 7 Challenges of LA
ing management system. Finally, the system sends emails to
those students at risk; with this, the university increased its
retention by 21% [26].
3.9 New York Institute of Technology (NYIT)
NYIT developed an in-house predictive model to identify stu-
dents at risk. The goal was to increase first-year students’
retention by identifying those who need support and giving
specific information about each student to make interven-
tions opportunely. The process consisted of mining the data,
running the analytics, and developing friendly outputs that
helped the counseling staff. Four data sources were used:
admission application information, placement test data, a
survey completed by all students, and financial information.
The model had 74% recall and 55% precision. Results were
presented in a table, with one row assigned to each student,
indicating whether the student was likely to return to classes
the following year; the confidence percentage of the prediction
and the reason for it were added. With this information,
tutors or counselors can talk with each student about the
situation and propose plans [63].
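The reported 74% recall and 55% precision can be reproduced from hypothetical confusion-matrix counts, which also clarifies what the two figures mean for advisors: precision is the fraction of flagged students who are truly at risk, recall the fraction of truly at-risk students the model catches. The counts below are invented to match the reported percentages, not NYIT's actual numbers.

```python
def precision_recall(tp, fp, fn):
    """tp: at-risk students correctly flagged; fp: flagged but not at risk;
    fn: at-risk students the model missed."""
    precision = tp / (tp + fp)
    recall = tp / (tp + fn)
    return precision, recall

# Illustrative counts: 100 truly at-risk students, 134 flagged in total
tp, fp, fn = 74, 60, 26
prec, rec = precision_recall(tp, fp, fn)
print(round(prec, 2), round(rec, 2))
```

The trade-off is visible directly: flagging more students raises recall (fewer missed at-risk students) at the cost of precision (more counseling time spent on students who were never at risk).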
3.9.1 University of Maryland, Baltimore County
This organization integrated into its VLE the Check my Activ-
ity (CmA) software. This tool helps students compare their
activity in the VLE with a summary of the whole cohort’s
activity. In one cohort of students, 92% used the VLE, and
of those, 91.5% used the CmA software. These results
are impressive: those who used the tool were 1.92% more
likely to get a grade C or higher than those who did not.
Research regarding the VLE found that one professor with
high participation rates used Blackboard’s adaptive release
feature to allow students to take quizzes before accessing the
assignments. It was found that these students scored 20%
higher than students in other sections and also performed
better in subsequent courses. In summary, analytics found
that “effective implementation of a VLE tool on a prerequi-
site course may lead to enhanced performance not just in that
course but also in subsequent courses” [63].
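The core of a CmA-style comparison can be sketched as a percentile of one student's VLE activity against the cohort's distribution. The per-student hit counts and cohort data below are hypothetical; the real tool summarizes Blackboard activity logs.

```python
from statistics import mean

def check_my_activity(student_hits, cohort_hits):
    """Return the student's activity percentile within the cohort and the
    cohort average, mirroring the kind of summary CmA shows students."""
    below = sum(1 for h in cohort_hits if h < student_hits)
    percentile = 100 * below / len(cohort_hits)
    return percentile, mean(cohort_hits)

# Hypothetical weekly VLE hit counts for an eight-student cohort
cohort = [5, 12, 30, 44, 60, 75, 90, 120]
pct, avg = check_my_activity(50, cohort)
print(pct, avg)
```

Showing students where they stand relative to peers, rather than a raw click count, is what makes the feedback actionable.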
3.9.2 University of Wollongong
This organization developed the Social Networks Adapting
Pedagogical Practice (SNAPP) initiative. The tool analyses
students’ conversations in online forums to find patterns in
real-time through social network diagrams. They found that
collaborative learning can aid in promoting students’ under-
standing. Besides, the quality of professors’ intervention in
those forums significantly impacts students’ learning expe-
rience. The tool can help instructors analyze how the group
behaves over time and find students who are isolated or who
lead discussions. As the tool gives real-time information,
professors can act on time to change, e.g., the learning strategy.
They can also use it to make a general reflection after
the course has finished. The SNAPP initiative revealed a
strong correlation between students’ learning interests and
the forums they participate in [63].
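The isolation and leadership detection that SNAPP performs can be approximated with a plain degree count over a who-replied-to-whom forum graph. The reply-log format and names here are hypothetical; SNAPP itself works on LMS forum data and renders full network diagrams rather than lists.

```python
from collections import defaultdict

def forum_network(replies, participants):
    """Build an undirected reply graph and report isolated students
    (no forum ties) and the most connected participant."""
    degree = defaultdict(int)
    for a, b in replies:           # each reply links two students
        degree[a] += 1
        degree[b] += 1
    isolated = [s for s in participants if degree[s] == 0]
    leader = max(participants, key=lambda s: degree[s])
    return isolated, leader

# Hypothetical reply log: (author of reply, author replied to)
replies = [("ana", "ben"), ("carl", "ben"), ("ben", "dora"), ("ana", "dora")]
students = ["ana", "ben", "carl", "dora", "eli"]
isolated, leader = forum_network(replies, students)
print(isolated, leader)
```

In a real deployment the graph would be weighted and time-sliced so instructors can watch isolation emerge week by week rather than discover it at the end of term.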
3.9.3 Open Universities Australia (OUA)
OUA is an online educational group formed by seven Australian
universities. It uses Personalised Adaptive Study
Success (PASS) to personalize the planning of students’
curricula. The main goal is to support those students who have
problems in their studies by suggesting alternative courses
according to their specific needs. For example, a student with
trouble in one topic can take extra modules to strengthen that
area. The sources of information for PASS are varied, ranging
from customer relationship management systems to the cur-
riculum profiles for each unit and program. PASS uses LA to
give students recommendations in real-time through a dash-
board engine that can be customized [63].
3.9.4 Tecnológico de Monterrey (Tec)
Tecnológico de Monterrey has developed various academic
administration procedures, exploiting LA and classifying
students through policies in the academic regulations in three
primary states: (1) regular student, (2) prevention student,
and (3) conditioned student. Through data mining using educational
information (i.e., current and past grades), behavior
(i.e., class attendance), trends (current and rolling grade point
average), and experience from the Office of Student Academic
Improvement, the condition of each student is determined; a
prevention or conditioned student is enrolled (by regulation)
in a specialized academic improvement program to help him
improve his academic performance. This program has been
continually enhanced and achieved excellent
results with more than 15,000 students on campus. The main
results are a reduction in the drop-out rate (~ 15%), a drop
in the percentage of students who change careers (~ 20%), a
reduction in the average study time of students (~ 15%), and
a solution of the fundamental problems of students (~ 30%),
etc. To date, most decisions are made through manual pro-
cesses by highly qualified personnel (mainly with the profile
of psychologists.)
3.9.5 University of Michigan—E2Coach
In 2012 the University of Michigan began to use E2Coach,
an LA system, in the introductory physics class to give
students personalized feedback. The goal was to determine
students’ success by predicting final grades. The system pro-
vides written feedback in personalized messages that include
advice to prepare for an exam, how to use the system better,
and feedback regarding student performance. Results indicate
that the system works well, as users score an average of
0.11 to 0.18 points higher in their final grade, while non-users
showed no difference [64].
3.9.6 Dublin City University—PredictED
Dublin City University developed the project PredictED to
improve the learning experience and students’ performance
in their final examinations. The software is running in 17
different first-year courses. The information sources are data
from the VLE and past exam grades; by combining them,
the university can determine if a student will pass or fail the
course. The software also informs students how they perform;
a weekly alert is sent to them, indicating if they need to study
more. Results suggest that participants’ average scores are
significantly higher with the intervention. These results were
determined by analyzing ten courses and a sample size of
1,270 students [23].
3.9.7 University in Ankara, Turkey
Researchers of this university developed an investigation to
identify students’ behavior patterns in an online learning
environment. The Moodle platform was the tool used for performing
the analysis. One hundred sixty-five students were registered
in the course Information and Communication Technologies.
Data such as students’ logs and interactions were used, and
results indicate that the technological tool elements that stu-
dents mainly deal with are course modules and discussion
forums. This information is valuable for teachers as they can
trace students’ behavior and, if necessary, change the teach-
ing items and strategy in the software [39].
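The pattern-counting at the core of such a Moodle log analysis can be sketched as a simple frequency count per platform component. The log format below is a simplified, hypothetical stand-in for Moodle's standard log export, which records much richer event data per click.

```python
from collections import Counter

# Hypothetical simplified Moodle log: one (student_id, component) per click
log = [
    (1, "course_module"), (1, "forum"), (2, "course_module"),
    (2, "course_module"), (3, "forum"), (1, "quiz"), (3, "course_module"),
]

# Aggregate clicks by component; most-used elements come first,
# mirroring the kind of pattern the study reports (modules, then forums)
by_component = Counter(component for _, component in log)
for component, clicks in by_component.most_common():
    print(component, clicks)
```

Grouping the same counts by student instead of by component would give the per-learner behavior traces the paragraph mentions.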
In Table 9, the main elements of the initiatives previously
described are presented.
As can be seen, there is high interest from universities
in investing resources in LA projects. For all the
universities, results from the LA process seem promising.
The majority of institutions use LA to increase retention;
this is their primary goal. Regarding the technology used,
only a few use commercial software; the rest have developed
their technological solutions or models.
4 Results
Most initiatives previously presented use LA to improve
student retention; few are focused merely on improving the
teaching/learning process or academic issues. The reason could
be that dropout rates are high because students are not engaged,
are not prepared to finish their studies, have financial
problems, or are socially isolated. Various strategies have
been proposed. One option is to personalize learning with the
use of technology. The organizations invest their resources in
acquiring LA software. However, the majority of universities
develop their technology. Civitas is software that some uni-
versities have used with good results. Others, such as Georgia
State University, Rio Salado College, University of Alabama,
Northern Arizona University, Purdue University, and New
York Institute of Technology, have developed their models.
The technology helps organizations be preventive and
not reactive as various models determine students at risk of
failing. This information allows them to make suitable interventions,
which increases the initiative’s success. In addition,
little information was found regarding privacy issues; the OU is
the only university found that specifies how the collected data
will be treated.
The results have been good. In those cases where information
was available, retention improvements vary from 11 to 21%;
others report retention rates above 80%. Additional results
indicate that LA can predict performance, and these predictions
can be used to make suitable interventions. Some universities have found
cause/effect results, e.g., Purdue University found that aca-
demic success depends on student aptitude and effort. In the
case of OU, it was found that the way teachers design a learn-
ing module influences the students’ engagement.
Those who want to implement LA in their institution need
first to see and analyze what others have done, which is the
goal of this work. The majority of institutions included have
been experimenting with LA for years. This process takes
time, which is "personal" for each institution. Also, benefits
and challenges must be considered to decide how and where
to implement LA. The effectiveness of LA has to be studied
in more depth. More results are needed to determine whether LA
improves the teaching–learning process. This research needs
to be done for specific contexts and problems. The benefits
of LA in education are numerous:
1 Identify courses that are more likely to fit with students’
interests and preferences;
2 Obtain data that will allow for improvement or change in
the curriculum;
3 Determine students’ actual outcomes and improve their
performance;
4 Personalize the learning of each individual;
5 Improve the performance of professors, as the institution
can analyze their technological behavior;
6 Use big data to identify post-education employment
opportunities and align education with market needs;
7 Perform analyses so that researchers identify gaps
between industry needs and academia;
8 Most significantly, perform early interventions when a
student is facing difficulties (Kollom et al. 2020).
Table 9 Main elements of LA initiatives

Arizona State University — Goal: increase retention. Application/Technology: Knewton Math Readiness / Civitas Learning. Result: retention increased from 64 to 75%; in 2018, retention was 85.2%.

Georgia State University — Goal: increase retention and graduation rates. Application/Technology: Graduation and Progression Success system. Result: 1,700 more degrees in 2015–2016 than in 2011–2012; in 2017, the retention rate was 83%.

Rio Salado College — Goal: increase retention and graduation rates. Application/Technology: RioPACE model. Result: the model identifies the level of achievement of each student on a weekly basis, and interventions are made opportunely.

The Open University — Goal: increase retention. Application/Technology: different analytic solutions. Result: it was found that the way teachers design a learning module influences students’ engagement.

University of Alabama — Goal: increase retention. Application/Technology: predictive models (logistic regression, decision trees, and neural networks). Result: the model is able to identify 150–200 freshmen each year who are not going to finish their studies.

Sinclair Community College — Goal: improve completion rates. Application/Technology: Civitas Learning / Student Success Plan. Result: proven successful in improving students’ learning outcomes.

Northern Arizona University — Goal: increase retention. Application/Technology: own model. Result: students’ GPAs increased between 0.192 and 0.460 points.

Purdue University — Goal: increase retention. Application/Technology: Course Signals. Result: retention increased by 21%.

New York Institute of Technology — Goal: increase retention. Application/Technology: own predictive model. Result: the model had 74% recall and 55% precision.

University of Maryland, Baltimore County — Goal: improve performance. Application/Technology: Check my Activity. Result: those who used the tool were 1.92% more likely to get a grade C or higher.

University of Wollongong — Goal: promote students’ understanding. Application/Technology: Social Networks Adapting Pedagogical Practice. Result: professors are able to change the teaching strategy in real time; there is a strong correlation between students’ learning interests and the types of forums they participate in.

Open Universities Australia — Goal: personalize students’ curriculum. Application/Technology: Personalised Adaptive Study Success. Result: the system gives recommendations to students in real time through a customizable dashboard engine.

Tecnológico de Monterrey — Goal: increase retention. Application/Technology: own predictive model (100% manual). Result: all performance indicators improved by more than 15%.

University of Michigan — Goal: determine students’ success. Application/Technology: E2Coach. Result: users average 0.11 to 0.18 points higher in their final grade.

Dublin City University — Goal: improve learning experience and performance. Application/Technology: PredictED. Result: participants’ average scores are significantly higher with the intervention.

University in Ankara, Turkey — Goal: analyze students’ behavior. Application/Technology: Moodle. Result: students dealt mostly with searching and viewing course modules and discussion forums.
Some challenges must be overcome:
1. Data tracking, which is helpful to see individuals’
performance, depends mainly on the platform used
(Moodle, Canvas, EPIC, Blackboard);
2. Data collection can be a challenge, and it has to be
delivered in a timely and accurate manner, which is not
possible with today’s systems;
3. Technical challenges also exist in the analysis of infor-
mation. There is a need for technical resources to
manage big data;
4. Researchers must discover insights from the users’ per-
spective of the learning systems. Then, more advanced
data sets such as mobile, biometric, and mood data are
needed;
5. To take advantage of the full potential of LA, a tech-
nology that is still under development is required;
6. Ethical and privacy issues also need to be considered.
Today, privacy and control of information affect the
adoption and deployment of LA systems [4];
7. There is a need for systems that give real-time feedback
to the users, i.e., systems that offer information as the
learning activity is ongoing [10],
8. Researchers are still working on reducing the computing
capacity needed to store big data. However, experts are
not so optimistic regarding this issue [9];
9. There is a lack of large-scale studies regarding LA and
its impacts on learning and teaching in HEI [32].
10. LA systems have been developed without the active
participation of students and teachers. They only have
an observational role. The design and implementation
of LA tools are mainly the responsibility of the IT
department and learning analytics specialists.
11. It is not clear if LA offers a positive effect on learn-
ing. This is because educational institutions are more
interested in grades, persistence, and non-completion
metrics than students’ motivation, engagement, satis-
faction, and more formative learning assessment [27].
12. Data accuracy and understandability are the most critical
challenges that professors consider must be faced
(Kollom et al. 2020);
13. Many companies do not know what to do with all their
generated data [44]; this is the case for both education
and manufacturing organizations.
It is stated that LA requires many resources in money, time,
and experience [37]. Also, a question arises as to whether digital
traces serve as helpful proxies for learning and measure learners’
performance effectively [49]. On the other hand, studies indi-
cate that LA has successfully allowed students to complete
their courses or continue with them [32]. LA has also been
used to evaluate if students acquire lifelong learning com-
petencies. Skills such as problem-solving, logic, debugging,
creative thinking, analytical thinking, conceptual thinking,
self-efficacy, and time management are assessed with
LA’s aid [36].
Educational technology has been commercialized, and
authors report that some platforms use "psychological,
behavioral management techniques and rankings to model
student behavior according to the system." This falls in the
area of ethics and opens the question of whether these plat-
forms improve learning. There is also a concept arising that
has to do with LA: the datafication of education based on
business intelligence. It is stated that all the data generated
is mainly used to influence people’s behavior; this is the
other side of LA [67]. However, the narrative of LA promotes
student engagement and personalized
learning [19] by processing a large amount of data that needs
to be used to improve education. The rationale behind this
is that the more data generated, the better. There has to be
an unrestricted flow of information if this goal is to be
achieved [18]. Numerous sources of information are avail-
able for making LA. LMS are ideal as they offer a fertile
ground of information. Also, massive open online courses
(MOOCs) are another data source. These data can be com-
bined with other information such as socio-demographics,
course engagement data, entrance grades, test results, and
library usage [27]. LA can also be combined with other disci-
plines such as psychology, educational science, and computer
science [42]; with this, the sources of information can be
infinite. Other disciplines in which LA has been applied are
decision science, social sciences, engineering, mathemat-
ics, arts and humanities, nursing, business management, and
accounting and medicine [47]. Regarding LA applications,
the research performed by [61] offers a significant number
of useful tools.
Regarding privacy, institutions must consider that the large amount of information generated also brings responsibility for how it will be used. Researchers argue that students have the right to limit data analysis practices and to express privacy preferences that control their information; if unaddressed, this issue could hinder the future development of LA [35]. Aspects that need to be considered include who has access to which data, where and for how long the data will be stored, and which algorithms and procedures will be implemented [32]. Research has also shown that students are unaware that their data are used for LA; they often do not even know which of their data are collected [43].
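One way such preferences could be enforced in practice, sketched here as a minimal example (the consent categories and the helper function are assumptions invented for illustration, not a system described in the literature), is to filter every student record against the student's declared consent before any analysis runs:

```python
# Illustrative sketch: drop any field a student has not consented to
# share before LA processing ever sees the record.

CONSENT = {
    "s1": {"clickstream", "grades"},  # s1 allows two categories
    "s2": {"grades"},                 # s2 allows grades only
}

def filter_by_consent(records, consent):
    """Keep only the fields each student consented to share."""
    return {
        sid: {k: v for k, v in fields.items() if k in consent.get(sid, set())}
        for sid, fields in records.items()
    }

records = {
    "s1": {"clickstream": 120, "grades": 85, "library": 4},
    "s2": {"clickstream": 30, "grades": 72},
}
safe = filter_by_consent(records, CONSENT)
print(safe["s2"])  # {'grades': 72}
```

A student absent from the consent map contributes nothing, i.e., the default is opt-out rather than opt-in.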
Setting a clear strategy is needed for implementing LA in an educational institution. Investment in infrastructure also has to be considered, and ethical and privacy considerations have
to be set from the beginning. From the educators' side, they need systems that give them immediate results; a professor cannot take timely action with data generated several months ago. They need on-time information to help the students of tomorrow [9]. One compelling aspect of LA is that it can infer trends across educational organizations, programs, and classes, while also providing feedback to individual students and professors [40]. However, the efficacy of LA depends on professors' skills in interpreting the data and providing actionable feedback [42].
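To make "timely, actionable feedback" concrete, the following minimal rule-based early-warning sketch flags risk indicators from a week's activity data. The thresholds, field names, and rules are assumptions for illustration, not any institution's actual model:

```python
# Hypothetical early-warning check: turn recent activity data into
# named risk indicators a professor can act on the same week.

RISK_RULES = [
    ("low LMS activity",     lambda s: s["logins_last_week"] < 2),
    ("missed assessments",   lambda s: s["missed_quizzes"] >= 2),
    ("falling quiz average", lambda s: s["quiz_avg"] < 60),
]

def risk_report(student):
    """Return the list of risk indicators triggered for one student."""
    return [name for name, rule in RISK_RULES if rule(student)]

student = {"logins_last_week": 1, "missed_quizzes": 2, "quiz_avg": 74}
flags = risk_report(student)
print(flags)  # ['low LMS activity', 'missed assessments']
```

Because each flag carries a human-readable name, the output doubles as the starting point for an intervention message rather than an opaque risk score.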
Some recommendations for implementing LA at the HEI and country level include:
1. Establish data standards,
2. Identify the requirements for data collection,
3. Introduce privacy, ethics, and data protection standards,
4. Promote the efficient use of data standards,
5. Make sure data is associated with metadata through formal/standard procedures,
6. Have benchmarks against which to compare,
7. Collaborate with professionals,
8. Exchange experiences,
9. Integrate into university networks and link with society,
10. Promote knowledge exchange, especially between disciplines,
11. Support institutions,
12. Promote linkage models between institutions for adoption and transfer,
13. Support institutions in evaluating available resources,
14. Establish diligence mechanisms to develop interventions,
15. Give autonomy to universities in the administration of their data.
At the school level, some recommendations include:
1. Ensure democratic control,
2. Consider how data will be used,
3. Build capacity,
4. Focus on ethical questions [31].
The future of LA is difficult to predict, as it is still a new field of study in an early stage of development. However, researchers have made some bets by looking at the experience of other industries. Gartner has predicted that LA will become more automatic and that data from all sources will be used; privacy and ethical aspects will also receive more attention. On the other hand, access to the core algorithms of predictive systems will become possible. Besides, more advanced and personalized dashboards will be developed for students and professors; these will perform advanced analysis of raw data and of the content of the work and will score competencies [3]. In the near term, LA in education will provide rich, personalized learning on a large scale [56]. Experts in HEIs believe that LA will soon be used significantly in online education to identify patterns of student behavior and improve learning and retention rates [6]. However, despite its importance, LA still lacks widespread application in HEIs. Indeed, LA can have a significant impact on those organizations, yet many have not exploited the data they generate to address the challenges they face [5]. Other authors consider that, even though LA is a new area, it has matured enough for HE application [40]. Publications on LA have also grown at a fast pace since 2011, in terms of the techniques, methods, and applications offered [47]. Developed countries, such as the USA and those of Europe, lead research in LA [52].
LA will also influence the development of the Industry 4.0 approach; the Internet of Things and the Internet of Educational Things will significantly impact LA studies [60]. Multimodal LA is another emerging term. It integrates data from different sources, such as physiological and contextual data, and several studies in this direction have been developed. An interesting one used a tool called Lelikëlen and a Microsoft Azure Kinect camera to capture students' postures while presenting. Results indicate statistical differences in students' behavior while presenting, which, combined with teachers' additional information, can give interesting results [51]. Multimodal LA is more robust regarding the understanding of the learning process: it incorporates sensor data that capture gestures, gaze, or speech, and therefore gives a more complete outlook of the learning process [34].
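A minimal sketch of the data-alignment step behind multimodal LA follows, assuming two invented timestamped streams (posture and speech events) and an arbitrary window size; real systems described in the literature are far richer, but the bucketing idea is the same:

```python
# Illustrative sketch: align several timestamped event streams (e.g.,
# posture events from a camera, utterances from a microphone) into
# fixed time windows so they can be analysed jointly.
from collections import defaultdict

def align_streams(window_s, **streams):
    """Bucket (timestamp, value) events from each named stream into
    fixed-size time windows indexed by window number."""
    windows = defaultdict(lambda: {name: [] for name in streams})
    for name, events in streams.items():
        for t, value in events:
            windows[int(t // window_s)][name].append(value)
    return dict(windows)

posture = [(1.2, "leaning"), (6.8, "upright")]
speech = [(2.0, "uh"), (3.5, "so"), (7.1, "next slide")]

fused = align_streams(5.0, posture=posture, speech=speech)
print(sorted(fused))       # [0, 1]
print(fused[0]["speech"])  # ['uh', 'so']
```

Once events share a window index, per-window features from each modality can be combined into the kind of joint record a multimodal analysis consumes.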
5 Conclusions
LA is an emerging field attracting the attention of the learning community, as it offers valuable tools and processes for improving the teaching-learning process. The definition of LA most adopted by authors is "the measurement, collection, analysis, and reporting of data about learners and their contexts, for understanding and optimizing learning and the environments in which it occurs." It comprises an input (data), a process (analysis), and an output (optimized learning). Whatever the definition used, LA has proven to enhance student success and retention.
Educational institutions that use LA are aware of the pay-
off in learning improvement and student retention that this
approach offers. Professors can track students’ progress and
be mindful of the educational strategies that are most likely
to work and be effective. Students can also analyze their
progress, determine their strengths and weaknesses, and be
promoters of their learning. Educational organizations are
using LA to improve students' satisfaction and retention, with good results. More research is needed to exploit its full potential, mainly to determine the actual effectiveness it offers; on the other hand, there is a lack of investigation into students' points of view, and it would be interesting to know their opinion on an issue of such direct concern to them. The main challenges depicted in this work need to be taken into account, mainly privacy and ethical concerns, as these could severely impact the development of LA. Research also needs to be pulled through to practice, as such transference is currently lacking.
Nowadays, it is necessary to personalize learning environments and let learners be the creators of their own education. LA is a practical approach that can explore how learners learn and support them in adjusting the learning environment to their needs; the final goal is to give them control of their learning process.
Research in LA has received great attention from the educational community; however, its application in real projects is still lacking. There is a need to transfer research into practice by applying a user-centric approach and including the final user in any course [21, 22]. In general, education research is incomplete, imprecise, and qualitative. There is an excellent opportunity to use educational data with Artificial Intelligence and Soft Computing approaches; however, the results will depend on the quality of the input data [15].
Online technologies can aid in turbulent times such as the CoViD-19 pandemic, which has forced a redesign of the teaching-learning process and offered an option for giving continuity to academic activities. This has posed both a challenge and a benefit, obliging professors and students to develop digital competency quickly. In addition, all the data generated during this disruption can be analyzed with LA techniques to help education adapt effectively to the new delivery forms and the new normality.
A review of LA was performed, in which basic concepts were discussed. The practices that 16 educational organizations, such as Arizona State University, Georgia State University, Rio Salado College, The Open University, and the University of Alabama, have adopted are described. The majority use LA to retain students; others focus on improving academic issues. Many have also developed in-house software or models. According to these organizations, the results are promising, as they have achieved good retention percentages and improvements over the years.
Acknowledgements The authors would like to acknowledge the finan-
cial and technical support of Writing Lab, TecLabs, Tecnológico de
Monterrey, Mexico Institute for the Future of Education, Tecnologico
de Monterrey, Mexico, in the production of this work.
Open Access This article is licensed under a Creative Commons
Attribution 4.0 International License, which permits use, sharing, adap-
tation, distribution and reproduction in any medium or format, as
long as you give appropriate credit to the original author(s) and the
source, provide a link to the Creative Commons licence, and indi-
cate if changes were made. The images or other third party material
in this article are included in the article’s Creative Commons licence,
unless indicated otherwise in a credit line to the material. If material
is not included in the article’s Creative Commons licence and your
intended use is not permitted by statutory regulation or exceeds the
permitted use, you will need to obtain permission directly from the copy-
right holder. To view a copy of this licence, visit http://creativecomm
ons.org/licenses/by/4.0/.
References
1. Aldowah, H., Al-Samarraie, H., Fauzy, W.M.: Educational data
mining and learning analytics for 21st century higher education: a
review and synthesis. Telemat. Inform. 37, 113–149 (2019)
2. Aristovnik, A., Keržič, D., Ravšelj, D., Tomaževič, N., Umek, L.: Impacts of the COVID-19 pandemic on life of higher education students: a global perspective. Preprints.org. https://doi.org/10.20944/preprints202008.0246.v1 (2020)
3. Arroway, P., Morgan, G., O’Keefe, M., & Yanosky, R.: Learning
analytics in higher education. Louisville, CO: ECAR. (2016)
4. Avella, J.T., Kebritchi, M., Nunn, S.G., Kanai, T.: Learning analytics methods, benefits, and challenges in higher education: a systematic literature review. Online Learn. 20(2), 13–29 (2016)
5. Axelsen, M., Heinrich, E., Henderson, M.: The evolving field of learning analytics research in higher education: from data analysis to theory generation, an agenda for future research. Australasian J. Edu. Technol. 36(2), 1–8 (2020)
6. Banihashem, S.K., Aliabadi, K., Ardakani, S.P., Delaver, A., Ahmadabadi, M.N.: Learning analytics: a systematic literature review. Interdiscip. J. Virtual Learn. Med. Sci. 9(2), 1–11 (2018). https://doi.org/10.5812/ijvlms.63024
7. Barana, A., Conte, A., Fissore, C., Marchisio, M., Rabellino, S.:
Learning analytics to improve formative assessment strategies. J of
E-Learn. and Knowledge Soc. 15(3), 75–88 (2019)
8. Bellini, C., De Santis, A., Sannicandro, K., Minerva, T.: Data management in learning analytics: terms and perspectives. J. E-Learn. Knowledge Soc. 15(3), 133–144 (2019)
9. Bienkowski, M., Feng, M., Means, B.: Enhancing teaching and learning through educational data mining and learning analytics: an issue brief. U.S. Department of Education Office of Educational Technology. Retrieved from https://tech.ed.gov/wp-content/uploads/2014/03/edm-la-brief.pdf (2012)
10. Bodily, R., Kay, J., Aleven, V., Jivet, I., Davis, D., Xhakaj, F., Verbert, K.: Open learner models and learning analytics dashboards: a systematic review. In: LAK'18: Int. Conf. on Learning Analytics and Knowledge (p. 10). Sydney: ACM Press (2018)
11. Boyer, A., Bonnin, G. (n.d.): Higher education and the revolution of learning analytics. International Council for Open and Distance Education. Retrieved from https://static1.squarespace.com/static/5b99664675f9eea7a3ecee82/t/5beb449703ce644d00213dc1/1542145198920/anne_la_report+cc+licence.pdf
12. Broadfoot, P., Timmis, S., Payton, S., Oldfield, A., Sutherland, R.: Learning Analytics and Technology Enhanced Assessment (TEA). University of Bristol. Retrieved from https://www.bristol.ac.uk/media-library/sites/education/migrated/documents/learninganalytics.pdf (2013)
13. Campbell, J., DeBlois, P., Oblinger, D.: Academic analytics: a new tool for a new era. Retrieved July 14, 2020, from https://er.educause.edu/articles/2007/7/academic-analytics-a-new-tool-for-a-new-era (2007)
14. Cantabella, M., Martínez-España, R., Ayuso, B., Yáñez, J.A.: Analysis of student behavior in learning management systems through a big data framework. Future Gen. Comput. Syst. 90, 262–272 (2019)
15. Charitopoulos, A., Rangoussi, M., Koulouriotis, D.: On the use of soft computing methods in educational data mining and learning analytics research: a review of years 2010–2018. Int. J. of Artificial Intelligence in Education. Retrieved from https://link.springer.com/article/10.1007/s40593-020-00200-8#citeas (2020)
16. Chatti, M.A., Lukarov, V., Thüs, H., Muslim, A., Mohamed, A., Yousef, F., Schroeder, U.: Learning analytics: challenges and future research directions. E-Learning and Education (2014)
17. Chatti, M.A., Muslim, A.: The PERLA framework: blending personalization and learning analytics. Int. Rev. Res. Open Distribut. Learn. 20(1), 243–261 (2019)
18. Couldry, N., Yu, J.: Deconstructing datafication’s brave new world.
New Media & Soc. 20(12), 4473–4491 (2018)
19. DeFreitas, S., Gibson, D., Du Plessis, C., Halloran, P., Williams,
E., Ambrose, M., Arnab, S.: Foundations of dynamic learning ana-
lytics: using university student data to increase retention. British J
of Edu. Technol. 46(6), 1175–1188 (2015)
20. Dietz-Uhler, B., Hurn, J.E.: Using learning analytics to predict (and improve) student success: a faculty perspective. J. Interactive Online Learn. 12(1), 17–26 (2013)
21. Dollinger, M., & Lodge, J. M.: Cocreation strategies for learning
analytics. In 8th Int Conf on Learning Analytics and Knowledge
(pp. 97–101). (2018)
22. Domínguez Figaredo, D., Reich, J., Ruipérez-Valiente, J.A.: Learn-
ing analytics and data-driven education: a growing field. RIED
Revista Iberoamericana de Educación a Distancia 23(2), 33–43
(2020)
23. Dublin City University: PredictED. Retrieved September 02, 2020, from https://predictedanalytics.wordpress.com/ (2020)
24. Fain, P.: Phase two of completion. Retrieved July 14, 2020, from https://www.insidehighered.com/news/2015/06/02/sinclair-community-colleges-15-years-completion-projects-pay (2015)
25. Ferguson, R.: Learning analytics: drivers, developments and chal-
lenges. Int J of Technol. Enhanced Learn. 4(5/6), 304–317 (2012)
26. Ferguson, R., Brasher, A., Clow, D., Cooper, A., Hillaire, G., Mittelmeier, J., Vuorikari, R.: Research Evidence on the Use of Learning Analytics: Implications for Education Policy (R. Vuorikari, J. Muñoz, Eds.). Joint Research Centre Science for Policy Report; EUR 28294 EN (2016)
27. Guzmán-Valenzuela, C., Gómez-González, C., Rojas-Murphy
Tagle, A., Lorca-Vyhmeister, A.: Learning analytics in higher edu-
cation: a preponderance of analytics but very little learning? Int. J.
Edu. Technol. Higher Edu. 18(1), 1–19 (2021)
28. Hilbig, R., Schildhauer, T.: Data analytics: the future of innovative teaching and learning. In: The ISPIM Innovation Conference (pp. 1–17) (2019)
29. Huang, R.H., Liu, D.J., Tlili, A., Yang, J.F., Wang, H.H., et al.:
Handbook on Facilitating FlexibleLearning During Educational
Disruption: The Chinese Experience in Maintaining Undisrupted
Learning in COVID-19 Outbreak. Smart Learning Institute of Bei-
jing Normal University, Beijing (2020)
30. Hwang, G., Spikol, D., Li, K.: Trends and research issues of learn-
ing analytics and educational big data. Edu. Technol. Soc. 21(2),
134–136 (2018)
31. Ifenthaler, D., Yau, J.: Higher education stakeholders' views on learning analytics policy recommendations for supporting study success. Int. J. Learn. Anal. Artificial Intell. Edu. 1(1), 28–42 (2019)
32. Ifenthaler, D., Yin, J., Yau, K.: Utilising learning analytics to support study success in higher education: a systematic review. Educational Technology Research and Development, 1–30. https://doi.org/10.1007/s11423-020-09788-z (2020)
33. Impey, C.: Massive online open courses see exponential growth during COVID-19 pandemic. World.edu. Retrieved September 02, 2020, from https://world.edu/massive-online-open-courses-see-exponential-growth-during-covid-19-pandemic/ (2020)
34. Joksimović, S., Kovanović, V., Dawson, S.: The journey of learning analytics. HERDSA Rev. Higher Edu. 6, 37–63 (2019)
35. Jones, K.M.L.: Learning analytics and higher education: a proposed model for establishing informed consent mechanisms to promote student privacy and autonomy. Int. J. Edu. Technol. Higher Edu. 16(24), 1–22 (2019)
36. Kanuru, S.L., Priyaadharshini, M.: Lifelong learning in higher
education using learning analytics. Procedia Comput. Sci. 172,
848–852 (2020)
37. Kei, L., Simon, L., Lam, K.S.C., Kwok, F.: Learning analytics: current trends and innovative practices. J. Comput. Edu. 7(1), 1–6 (2020). https://doi.org/10.1007/s40692-020-00155-8
38. Kerres, M.: Against all odds: Education in Germany coping with
Covid-19. Postdigital Science and Education, pp. 1–5. (2020)
39. Kilis, S., Uzun, A.: E-learning analytics: Moodle case. Multidisciplinary Academic Conference, pp. 223–228 (2019)
40. Klašnja-Milićević, A., Ivanović, M., Vesin, B., Satratzemi, M., Lillehaug, B.W.: Learning analytics: trends and challenges. Frontiers in Artificial Intelligence, 5 (2022)
41. Klein, C., Lester, J., Rangwala, H., Johri, A.: Technological barriers and incentives to learning analytics adoption in higher education: insights from users. J. Comput. Higher Edu. 31(3), 604–625 (2019)
42. Kollom, K., Tammets, K., Scheffel, M., Tsai, Y.S., Jivet, I.,
Muñoz-Merino, P.J., Ley, T.: A four-country cross-case analysis
of academic staff expectations about learning analytics in higher
education. The Internet and Higher Edu. 49, 100788 (2021)
43. Korir, M., Slade, S., Holmes, W., & Rienties, B.: Eliciting students’
preferences for the use of their data for learning analytics. Open
World Learning: Research, Innovation and the Challenges of High-
Quality Education. (2021)
44. Kusiak, A.: Smart manufacturing must embrace big data. Nature
544(7648), 23–25 (2017)
45. Lang, C., Siemens, G., Wise, A.F., Gašević, D.: Handbook of learning analytics. Retrieved from https://doi.org/10.18608/hla17 (2017)
46. Lee, K.: Rethinking the accessibility of online higher education: a
historical overview. The Internet and Higher Edu. 33, 15–23 (2017)
47. Lee, L.K., Cheung, S.K.: Learning analytics: current trends and
innovative practices. J. Comput. Edu. 7(1), 1–6 (2020)
48. Little, R.: The student success plan: case management
and intervention software. Retrieved July 15, 2020, from
https://er.educause.edu/articles/2011/12/the-student-success-plan-
case-management-and-intervention-software (2011)
49. Lizier, A. (2020). Remember, Learning Analytics are About Learn-
ing. Training and Development, pp. 1–3.
50. Lombardo, M.M., Eichinger, R.W.: The Career Architect Devel-
opment Planner, 1st edn. Lominger, Minneapolis (1996)
51. Morán-Mirabal, L.F.: Multimodal technologies for learning analytics research. IFE Living Lab & Data Hub
52. Namoun, A., Alshanqiti, A.: Predicting student performance using
data mining and learning analytics techniques: a systematic litera-
ture review. Appl. Sci. 11(1), 237 (2020)
53. Pazmiño-Maji, R.A., García-Peñalvo, F.J., Conde-González, M.A.: Approximation of statistical implicative analysis to learning analytics: a systematic review. In: TEEM'16 (p. 8). Salamanca: ACM Press (2016)
54. Qian, Y., Huang, G.: Technology Leadership for Innovation in Higher Education. Hershey, PA: IGI Global. Retrieved from https://books.google.com.mx/books?id=0ayMDwAAQBAJ&pg=PA145&lpg=PA145&dq=Arizona+State+University+Learning+Analytics+Hub&source=bl&ots=te1jUdo-7Z&sig=ACfU3U3Ik-SCWC6zhzOzR06ZbiuiPX9Nyg&hl=es&sa=X&ved=2ahUKEwjo-vn8-sLqAhUNca0KHXDaBFYQ6AEwAnoECAkQAQ#v=onepage&q=Arizona+State+University+Learning+Analytics+Hub&f=false (2019)
55. Ranjeeth, S., Latchoumi, T., Victer Paul, P.: A survey on predictive
models of learning analytics. Procedia Comput. Sci 167, 37–46
(2020)
56. Rienties, B., Boroowa, A., Cross, S., Kubiak, C., Mayles, K., Murphy, S.: Analytics4Action evaluation framework: a review of evidence-based learning analytics interventions at the Open University UK. J. Interactive Media in Edu. 2016(1), 1–11 (2016)
57. Rienties, B., Nguyen, Q., Holmes, W., & Reedy, K. (2017). A
review of ten years of implementation and research in aligning
learning design with learning analytics at the open University UK.
Interaction Design and Architecture(S),33, 134–154.
58. Rio Salado College.: Rio Salado College and Learning Analytics.
Retrieved from https://www.riosalado.edu/web/selfStudy/Rio
SaladoCollege Self-Study 2012/Resource Room
Documents/Self-Study Criterion 3/RioPacePredictiveAnalyt-
icsModel_June 2011.pdf (2011)
59. Rojas-Castro, P.: Learning analytics: una revisión de la literatura. Educación y Educadores 20(1), 106–128 (2017)
60. Şahin, M., Yurdugül, H.: Educational data mining and learning analytics: past, present and future. Bartın Univ. J. Faculty of Edu. 9(1), 121–131 (2020)
61. Salihoun, M.: State of art of data mining and learning analytics
tools in higher education. Int. J. Emerg. Technol. Learn. (iJET)
15(21), 58–76 (2020)
62. Sandars, J., Correia, R., Dankbaar, M., de Jong, P., Goh, P. S., Hege,
I., ... & Webb, A.: Twelve tips for rapidly migrating to online learn-
ing during the COVID-19 pandemic. MedEdPublish, p. 9. (2020)
63. Sclater, N., Peasgood, A., & Mullan, J.: Learning analytics in higher
education. A review of UK and international practice. Jisc. (2016)
64. Schmidt, M. T.: Assessing the Effectiveness of personalized
computer-administered feedback in an introductory biology
course (Doctoral dissertation, University of Saskatchewan). (2019)
65. Shenoy, V., Mahendra, S., Vijay, N.: COVID 19 lockdown technol-
ogy adaption, teaching, learning, students engagement and faculty
experience. Mukt Shabd J 9(4), 698–702 (2020)
66. Sin, K., Muthu, L.: Application of big data in education data mining and learning analytics: a literature review. ICTACT J. on Soft Computing 5(4), 1035–1049 (2015)
67. Teräs, M., Suoranta, J., Teräs, H., Cur, M.: Post-Covid-19 edu-
cation and education technology ‘Solutionism’: a Seller’s market.
Postdigital Sci. Edu. (2020). https://doi.org/10.1007/s42438-020-
00164-x
68. Tsai, Y., Gasevic, D.: Learning analytics in higher education challenges and policies: a review of eight learning analytics policies. In: LAK '17 (pp. 1–15). Vancouver (2017)
69. Tsai, Y., Rates, D., Moreno-Marcos, P.M., Muñoz-Merino, P.J., Jivet, I., Scheffel, M., Gašević, D.: Learning analytics in European higher education: trends and barriers. Comput. Edu. 155, 1–15 (2020)
70. Viberg, O., Hatakka, M., Bälter, O., Mavroudi, A.: The current landscape of learning analytics in higher education. Comput. Human Behav. 89, 98–110 (2018)
71. Vieira, C., Parsons, P., Byrd, V.: Visual learning analytics of educational data: a systematic literature review and research agenda. Comput. Edu. 122, 119–135 (2018). https://doi.org/10.1016/j.compedu.2018.03.018
72. West, D., Heath, D., Huijser, H.: Let’s talk learning analytics: a
framework for implementation in relation to student retention. J.
Asynchronous Learn. Netw. 20(2), 1–21 (2016)
Publisher’s Note Springer Nature remains neutral with regard to juris-
dictional claims in published maps and institutional affiliations.
... Within LA, it is widely appreciated that predictive analytics tools that identify at-risk students hold considerable potential to address these challenges at least in part, by providing the ability for timely interventions to be initiated with at-risk learners which can result in corrective measures being undertaken by them [26]. However, in their survey of LA applications, Hernández-de Menéndez et al. [17] conclude that LA technologies are generally not yet widely used in this sector despite the evident potential they offer to HEI. Indeed, Jang et al. [18] highlight that despite the clear opportunities offered by the predictive analytics technologies, the developed tools tend to persist only as research content. ...
... The authors concluded that there is still limited use of personal characteristics data such as psychological and social/behavioural features for developing student performance predictions and that future research should focus on including these to address dropout rates. Finally, Hernández-de Menéndez et al. [17] investigated the practices of 16 HEIs that have deployed LA projects. The authors found that they have mostly used LA technologies for student retention. ...
Preprint
Full-text available
A significant body of recent research in the field of Learning Analytics has focused on leveraging machine learning approaches for predicting at-risk students in order to initiate timely interventions and thereby elevate retention and completion rates. The overarching feature of the majority of these research studies has been on the science of prediction only. The component of predictive analytics concerned with interpreting the internals of the models and explaining their predictions for individual cases to stakeholders has largely been neglected. Additionally, works that attempt to employ data-driven prescriptive analytics to automatically generate evidence-based remedial advice for at-risk learners are in their infancy. eXplainable AI is a field that has recently emerged providing cutting-edge tools which support transparent predictive analytics and techniques for generating tailored advice for at-risk students. This study proposes a novel framework that unifies both transparent machine learning as well as techniques for enabling prescriptive analytics. This work practically demonstrates the proposed framework using predictive models for identifying at-risk learners of programme non-completion. The study then further demonstrates how predictive modelling can be augmented with prescriptive analytics on two case studies in order to generate human-readable prescriptive feedback for those who are at risk.
Article
In recent years, distance learning using learning management and e-book systems has been actively conducted in higher education institutions and various other organizations. It is possible to collect and analyze learning logs even in classes with many learners, including clickstreams and quiz scores in detail for each individual. This research proposes using Moodle's learning logs to classify learning patterns and outliers in order to identify struggling learners. The proposed method uses the descriptive statistics between the learner's teaching material clickstream and the final test score accumulated in Moodle, and students can be classified into four learning patterns. The frequency of each learning pattern was correlated with the appearance of outliers in the final test score and the teaching material clickstream. Most learners moved through four learning patterns during the weekly lessons, however, some learners scoring at the top and bottom of the weekly quiz scores repeated the same learning patterns. There was a tendency to correspond to an outlier due to the repetition of the same learning pattern. The time-series learning analytics of the teaching material clickstream revealed that learners with low final test scores and abnormal values tended to fall under a learning pattern with a smaller teaching material clickstream and a smaller access outside class hours.
Chapter
Full-text available
Research on student perspectives of learning analytics suggests that students are generally unaware of the collection and use of their data by their learning institutions, and they are often not involved in decisions about whether and how their data are used. To determine the influence of risk and benefit awareness on students' data use preferences for learning analytics, we designed two interventions: one describing the possible privacy risks of data use for learning analytics and the other describing the possible benefits. These interventions were distributed amongst 447 participants recruited via a crowdsourcing platform. Participants were randomly assigned to one of three experimental groups – risks, benefits, and risks and benefits – and received the corresponding intervention(s). Participants in the control group received a learning analytics dashboard (as did participants in the experimental conditions). Participants indicated the motivations for their data use preferences. Chapter 11 will discuss the implications of our findings in relation to how to better support learning institutions in being more transparent with students about the practice of learning analytics.
Article
Full-text available
In a context where learning mediated by technology has gained prominence in higher education, learning analytics has become a powerful tool for collecting and analysing data with the aim of improving students' learning. However, learning analytics is still a young community, and its developments deserve further exploration. Some critical stances claim that learning analytics tends to underplay the complexity of teaching-learning processes. By means of both a bibliometric and a content analysis, this paper examines the publication patterns on learning analytics in higher education and their main challenges. A total of 385 papers published in the WoScc and SciELO indexes between 2013 and 2019 were identified and analysed. Learning analytics is a vibrant and fast-developing community. However, it continues to face multiple and complex challenges, especially regarding students' learning and its implications. The paper concludes by distinguishing between a practice-based, management-oriented community of learning analytics and an academic-oriented community. Within both communities, though, it seems that the focus is more on analytics than on learning. Supplementary information: The online version contains supplementary material available at 10.1186/s41239-021-00258-x.
Article
Full-text available
The prediction of student academic performance has drawn considerable attention in education. However, although learning outcomes are believed to improve learning and teaching, predicting their attainment remains underexplored. A decade of research conducted between 2010 and November 2020 was surveyed to present a fundamental understanding of the intelligent techniques used for the prediction of student performance, where academic success is strictly measured using student learning outcomes. The electronic bibliographic databases searched include ACM, IEEE Xplore, Google Scholar, Science Direct, Scopus, Springer, and Web of Science. Eventually, we synthesized and analyzed a total of 62 relevant papers with a focus on three perspectives: (1) the forms in which the learning outcomes are predicted, (2) the predictive analytics models developed to forecast student learning, and (3) the dominant factors impacting student outcomes. Best practices for conducting systematic literature reviews, e.g., PICO and PRISMA, were applied to synthesize and report the main results. The attainment of learning outcomes was measured mainly as performance class standings (i.e., ranks) and achievement scores (i.e., grades). Regression and supervised machine learning models were frequently employed to classify student performance. Finally, student online learning activities, term assessment grades, and student academic emotions were the most evident predictors of learning outcomes. We conclude the survey by highlighting major research challenges and offering a set of recommendations to motivate future work in this field.
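The regression models this survey reports can be illustrated with the simplest case: ordinary least squares predicting a final grade from one of the evident predictors, e.g. a term assessment average. A minimal stdlib-only sketch; the variable names and data are illustrative, not drawn from any surveyed study:

```python
def fit_ols(xs, ys):
    """Least-squares fit of y = a + b*x, e.g. final score from term average."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    b = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
         / sum((x - mx) ** 2 for x in xs))
    a = my - b * mx
    return a, b

def predict(model, x):
    """Predicted outcome for a new term-average value x."""
    a, b = model
    return a + b * x
```

The surveyed studies extend this idea to multiple predictors (activity logs, grades, emotions) and to supervised classifiers when outcomes are measured as class standings rather than scores.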
Article
Full-text available
The purpose of this paper is to explore the expectations of academic staff for learning analytics services, from both an ideal and a realistic perspective. This mixed-method study focused on a cross-case analysis of staff from higher education institutions at four European universities (Spain, Estonia, the Netherlands, the UK). While there are some differences between the countries, as well as between ideal and predicted expectations, the overarching results indicate that academic staff see learning analytics as a tool for understanding learning activities, providing feedback to students, and adapting the curriculum to meet learners' needs. However, one finding consistent across cases is a generally low expectation of, and desire for, academic staff being obligated to act on data showing students at risk of failing or under-performing.
Article
Full-text available
Online teaching environments capture data at extremely high granularity, on both users' personal profiles and their behaviour and results. Learning Analytics (LA) is open to numerous research scenarios thanks to the development of technology and the speed of data collection. One characteristic element is that the data are not anonymous: they reproduce a personalized, identifiable profile. Identifiability of the student is implicit in the teaching process, but access to analytics techniques raises a fundamental question: "What is the limit?" The answer to this question should be preliminary to any use of data by students, teachers, instructors, and managers of online learning environments. We are also experiencing a particular moment of change: the European General Data Protection Regulation (GDPR) 679/2016, the general regulation on the protection of personal data, aims to standardize all national legislation and adapt it to the new needs dictated by the evolving technological context. The objective of this work is to propose a three-point checklist of questions, to be considered before conducting research, concerning the management and limits of teachers' use of data in Learning Analytics and students' right to transparency in Higher Digital Education. To this end, the paper examines the literature on privacy and ethical debates in LA, continues with a legislative review, particularly of the Italian path, and discusses online data management in two contexts of today's universities: technology and legislation.
Article
Full-text available
The Covid-19 pandemic and the social distancing that followed have affected all walks of society, including education. To keep education running, educational institutions have had to adapt quickly, resulting in an unprecedented push to online learning. Many, including commercial digital learning platform providers, have rushed to offer their support and 'solutions', sometimes for free. The Covid-19 pandemic has therefore also created a sellers' market in ed-tech. This paper employs a critical lens to reflect on the problems that may arise from the hasty adoption of commercial digital learning solutions whose design might be driven not by best pedagogical practices but by a business model that leverages user data for profit. Moreover, even before Covid-19, there was growing critique of how ed-tech is redefining and reducing concepts of teaching and learning. The paper also challenges the narrative that claims 'education is broken, and it should and can be fixed with technology'. Such technologization, often seen as neutral, is closely related to educationalization, i.e. imposing growing societal problems on education to resolve. This is therefore a critical moment to reflect on how the choices educational institutions are currently making might shape education and online learning during and after Covid-19: will they reinforce a capitalist, instrumental view of education or promote holistic human growth? This paper urges educational leaders to think carefully about the decisions they are making and whether those decisions indeed pave the way to a desirable future of education.
Article
Full-text available
The conventional education system lacks the focus needed to create employable graduates. Industries in the Information Technology sector widely recruit based on a very specific set of skills and academic performance. To create better career opportunities, colleges and universities should ensure that graduates are qualified in the basic skills required by any organization. This demands that systematic approaches, together with an innovative teaching style, be adopted during academic curriculum-based training in engineering colleges. The international accreditation organization ABET is a globally recognized educational board that provides streamlined guidelines on competency skills and on delivering quality education to students. In this research work, an extensive study of industry needs was performed and compared with the quality of the courses offered to students. The curriculum is designed based on feedback from industry experts. To achieve the Student Outcome Criteria of ABET accreditation, a structured approach is adopted, with a vision of lifelong learning. The work considers the teaching-learning process of a first-year undergraduate programming course and its evaluation techniques. Competency skills such as problem solving, critical thinking, and creative thinking are analysed using learning analytics strategies for a first-year Python programming course. Student performance is broadly categorized by metrics such as logical, conceptual, and analytical thinking, while simultaneously considering time management skills and commitment to learning. Artificial Neural Network (ANN), Naïve Bayes, and logistic regression models are used to identify and measure the competency skills learners achieve in this course and to validate these metrics against the student learning outcomes.
Implementing Artificial Intelligence concepts provides results that can aid in creating a suitable teaching-learning environment for disruptive engineering education. Learning Analytics supports the understanding and optimization of learning and its environments, thereby promoting sustainable development. This analysis presents an opportunity to identify the gap between the academic curriculum and industry expectations in terms of the competency skills learners should acquire. Additionally, it helps improve the teaching-learning process in response to dynamic changes in industry and builds a foundation for students to become lifelong learners.
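Of the three models named in the abstract above, logistic regression is the most compact to sketch. The following is an illustrative stdlib-only version trained by gradient descent on hypothetical 0-1 scaled competency metrics; it is not the study's actual implementation:

```python
import math

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def train_logreg(X, y, lr=0.1, epochs=3000):
    """Plain stochastic gradient descent on the logistic loss."""
    w, b = [0.0] * len(X[0]), 0.0
    for _ in range(epochs):
        for xi, yi in zip(X, y):
            p = sigmoid(b + sum(wj * xj for wj, xj in zip(w, xi)))
            err = p - yi  # gradient of the log-loss w.r.t. the linear score
            w = [wj - lr * err * xj for wj, xj in zip(w, xi)]
            b -= lr * err
    return w, b

def classify(model, x):
    """1 = competency threshold met, 0 = not met (hypothetical labels)."""
    w, b = model
    return 1 if sigmoid(b + sum(wj * xj for wj, xj in zip(w, x))) >= 0.5 else 0
```

ANN and Naive Bayes models, as used in the study, follow the same train-then-classify shape but with different internals; logistic regression is shown here because it fits in a few lines.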
Article
Over the past decade, the use of learning management systems (LMS) has continued to grow, making them one of the most widely adopted approaches in the learning process. Learners' online activities generate a huge amount of unused data that goes to waste because traditional learning analyses cannot process it. In this regard, a large collection of applications/tools has emerged to conduct research in educational data mining (EDM) and/or learning analytics (LA). This study looks into recent applications of Big Data technologies in education and presents some of the most widely used, accessible, and powerful tools in this field of research. The majority of these tools are aimed at researchers conducting research on educational data mining and learning analytics.
Article
The aim of this paper is to survey recent research publications that use Soft Computing methods to answer education-related problems based on the analysis of educational data ‘mined’ mainly from interactive/e-learning systems. Such systems are known to generate and store large volumes of data that can be exploited to assess the learner, the system, and the quality of the interaction between them. Educational Data Mining (EDM) and Learning Analytics (LA) are two distinct yet closely related research areas that focus on this data, aiming to address open education-related questions or issues. Besides ‘classic’ data analysis methods such as clustering, classification, identification or regression/analysis of variances, soft computing methods are often employed by EDM and LA researchers to achieve their various tasks. Their very nature as iterative optimization algorithms that avoid exhaustive search of the solution space and settle for possibly suboptimal solutions at realistic time and effort, along with their heavy reliance on rich data sets for training, makes soft computing methods ideal tools for EDM- or LA-type problems. Decision trees, random forests, artificial neural networks, fuzzy logic, support vector machines and genetic/evolutionary algorithms are a few examples of soft computing approaches that, given enough data, can successfully deal with uncertainty, qualitatively stated problems and incomplete, imprecise or even contradictory data sets – features that the field of education shares with all humanities/social sciences fields.
The present review focuses, therefore, on recent EDM and LA research that employs at least one soft computing method, and aims to identify (i) the major education problems/issues addressed and, consequently, the research goals/objectives set, (ii) the learning contexts/settings within which relevant research and educational interventions take place, (iii) the relation between the classic and soft computing methods employed to solve specific problems/issues, and (iv) the means of dissemination (publication journals) of the relevant research results. Selection and analysis of a body of 300 journal publications reveals that the top research questions in education today seeking answers through soft computing methods refer directly to the issue of quality – a critical issue given the currently dominant educational/pedagogical models that favor e-learning or computer- or technology-mediated learning contexts. Moreover, the results identify the most frequently used methods and tools within EDM/LA research and, comparatively, within their soft computing subsets, along with the major journals in which relevant research is being published worldwide. Weaknesses and issues that need further attention in order to fully exploit the benefits of research results to improve both the learning experience and the learning outcomes are discussed in the conclusions.
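Among the soft computing methods the review surveys, a decision tree is the easiest to miniaturize: a one-level tree (decision stump) already shows the threshold-search idea that full trees and random forests build on. A hedged stdlib-only sketch on hypothetical student features:

```python
def best_stump(X, y):
    """Exhaustive search for the single (feature, threshold) split that
    best separates labels 0/1 by accuracy; predicts 1 when value >= threshold."""
    best = None
    for f in range(len(X[0])):
        for t in sorted({row[f] for row in X}):
            acc = sum((1 if row[f] >= t else 0) == yi
                      for row, yi in zip(X, y)) / len(y)
            if best is None or acc > best[2]:
                best = (f, t, acc)
    return best

def stump_predict(stump, x):
    f, t, _ = stump
    return 1 if x[f] >= t else 0
```

Full decision trees recurse this split search on each branch, and random forests average many such trees over resampled data; the surveyed EDM/LA studies use those richer variants rather than a bare stump.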