Case Report
Posted Date: April 4th, 2022
License: This work is licensed under a Creative Commons Attribution 4.0 International License. 
Learning Analytics for Tracking Student Progress in LMS
Aimad QAZDAR1*, Sara QASSIMI1, Oussama HASSIDI1, Meriem HAFIDI1, El Hassan ABDELWAHED1 and Youssef MELK3
1Computer Systems Engineering Laboratory (ISI), Computer Science Department, Faculty of Sciences Semlalia, Cadi Ayyad University (UCA), Bd. Prince My Abdellah, 40000, 2390 Marrakech, Morocco
Full list of author information is available at the end of the article
Equal contributor
Abstract
The COVID-19 pandemic has accelerated the digital transformation of learning and training. Many sectors, especially higher education institutions, have been using e-Learning platforms. The Learning Management System (LMS) in particular has gained popularity as a tool capable of managing learning and training processes as well as supporting the learning process and its administration. It offers an important package of services and features capable of facilitating and enhancing the e-Learning experience. Nevertheless, an LMS offers few functionalities, if any, when it comes to tracking the instructional progress of students. Progress in a learning experience is indicative of how students interact with courses and learning materials. Based on such interaction, the teacher can detect lower performers, project students' progress, and anticipate those at risk as well as those who need intervention. The use of an LMS generates a large amount of data. This data becomes extremely useful when analysed and transformed into indicators reflecting the learner's progress. To this end, our research project used learning analytics to define indicators capable of monitoring students' progress. With such analytics, we were able to identify struggling students, lower performers, and ultimately those who need reinforcement and tutoring. The main goal was to improve students' success rates and the overall effectiveness of the learning process. We present a case study in which students' progress was tracked and monitored using a set of student Key Performance Indicators (KPIs). The data used in this study was collected from the Web Technologies course on the e-campus platform at Cadi Ayyad University. The defined indicators allowed for tracking students' progress during the course, but they were not able to determine students' performance.
Keywords: Data driven; Enhanced Learning Technologies; ABC Learning Design;
Learning Analytics; Academic performance
1 Introduction
The novel Coronavirus pandemic has drastically changed the world. In the educational field, the COVID-19 pandemic has transformed the teaching/learning process and student/teacher interactions from traditional face-to-face settings to new, virtual online ones. Due to the massive and unforeseen closures of institutions, affected countries and communities have been forced to seek quick solutions based on innovative technologies such as online learning platforms (LMS[1] and MOOC[2]),
[1]Learning Management System
[2]Massive Open Online Course
and to switch from in-person classes to fully online or hybrid classes with smaller numbers of students [1]. LMSs and MOOCs are integrated systems that support the teaching/learning process and its administration [2]. An LMS/MOOC presents a package of services and features capable of enhancing e-Learning. It supports teachers in developing e-learning courses and virtual classrooms where students can enroll and study. In addition, it allows the inclusion of external learning materials (resources/activities) and the reuse of already developed learning materials. An LMS/MOOC can support designing, administering and grading assessments, as well as planning, creating and publishing courses. It also allows for synchronous and/or asynchronous communication and interaction between all users (teachers, students, tutors and administrators) through chat rooms, discussion forums, blog posts and so on.
As the use of new information technology spreads rapidly, almost all data today is generated and exchanged digitally. By 2002, more than 92% of new data was already estimated to be stored in digital form, and the size of this new data exceeded five exabytes [3]. In the education and training field, the use of an LMS/MOOC generates a huge amount of raw data. This data is critical for any educational entity, and particularly for higher education institutions. Provided it is critically analyzed and synthesized into useful knowledge, it can guide decision-making and advise all stakeholders (teachers, tutors, students, administrators and parents/guardians). Additionally, such data can improve the quality of both teaching and learning, and improve overall student performance and academic success [4].
In the education field, the data-driven approach is closely linked to Educational Data Mining (EDM) and Learning Analytics (LA). These are not, by any means, two new fields of study. It is in fact the emergence of "disruptive" technologies in education and the massive amount of data generated that turned LA and EDM into two promising fields of research, capable of improving educational experiences and decision-making based on aggregated data for all stakeholders (teachers, tutors, students, administrators and researchers) [5].
Based on this perspective, our study used learning analytics to analyse and predict the individual performance of students enrolled in a Web Technologies course on the UCA digital campus of the Faculty of Sciences Semlalia, Cadi Ayyad University (Morocco). The main objective was to monitor the progress of students enrolled in the course on the platform, to predict any challenges to their learning and to prevent them from failing at the end. The idea was to produce a system able to generate predictive indicators using a dataset (log files) collected from the UCA digital campus. These generated indicators could then be used to monitor students' progress and to identify those who are not progressing adequately, are performing poorly, and need reinforcement and tutoring. The outline of this paper is as follows: Section 2 presents a literature review of learning analytics and related work. Section 3 describes the study context. Section 4 describes the data and the proposed indicators. Section 5 discusses the obtained results. The conclusion and planned future work are presented in the last section (Section 6).
2 Background and related work
Learning analytics is a fast-growing field of Technology-Enhanced Learning (TEL) research. Its origins date back more than ten years. In fact, since 2008 the concept of analytics in education has gained the attention of researchers, with a focus on understanding, enhancing, and optimizing the learning and teaching process. From 2010, the concept of Learning Analytics (LA) separated from the broader area of analytics and became an independent field [6]. As a new research field, LA became an interdisciplinary area between data mining, learning technology, pedagogy, machine learning, business intelligence, big data, artificial intelligence, and statistics [6, 7]. During this period, learning analytics was defined as: "Using intelligent data, learner-centered data generation and analysis models to explore information and social interactions, predict learning, and provide recommendations for learning" [8]. In 2011, the First International Conference on Learning Analytics and Knowledge (LAK) was held in Banff, Alberta (Canada) [9]. The organizing committee of the first LAK conference defined learning analytics as "the measurement, collection, analysis and reporting of data about learners and their contexts, for purposes of understanding and optimising learning and the environments in which it occurs" [9]. Implementing LA allows higher education institutions to better understand their students and the barriers to their learning, resulting in academic success and the retention of a larger and more diverse student population, which is important for operational facilities, fundraising, and admissions [10].
Many research studies have been conducted to understand students' learning behaviors and optimize the LMS learning process using LA. Most of the developed LA models and indicators have served educators and educational institutions to identify students' attitudes and to detect those who were low-performing or at-risk. Student performance indicators depend deeply on the learning activities and resources used in the LMS. The indicators can be classified into Productive, Assimilative, Evaluation, Interactive and Communicative [11]. According to a systematic review conducted by [12], the most used indicators are Evaluation and Productive activities. Evaluation activities have been used to determine the results of activities and evaluations (formative and summative). Productive indicators can capture the production actions of students, such as create, complete and do, among others. The Interactive category also showed good representativeness, while Assimilative and Communicative are the least used indicators. The analysis demonstrated that a combined indicator, or several indicators together, can best represent student engagement and participation [12]. The Learning Analytics initiative at Dublin City University (Ireland) focused on three main axes: (1) improving test performance using analytics from a general-purpose LMS like Moodle, (2) identifying study groups and the peer effect on performance using on-campus geo-location data, and (3) detecting lower-performing or at-risk students in programming modules [13]. LA can also be used in a computational environment to both analyze and visualize student discussion groups working collaboratively to accomplish a task [14]. Using Course Signals, Purdue University allowed professors to provide real-time feedback to students. Different measures, such as grades, demographic data, interaction, and students' effort, are combined using a traffic-lights metaphor, and a personalized email is delivered to students to inform them of their current status. The system helped improve retention, and performance outcomes were evaluated [15]. Learning Analytics dashboards (LADs) have supported prior findings that visualizing learning behavior helps students reflect on their learning. The LA framework LAViEW, discussed in [16], presents an overview of students' engagement indicators, so that teachers can directly send personalized feedback to selected cohorts of students, clustered by their engagement indicator scores.
3 Study context
The study was conducted at the Department of Computer Science within the Faculty of Sciences Semlalia, Cadi Ayyad University, in Marrakech, Morocco. The dataset consisted of 154 instances of students registered in the third semester of the Mathematical and Computer Sciences Bachelor program. The data was collected in autumn 2020 from the Web Technologies course hosted on the department's e-campus[3]. The course was designed using the ABC Learning Design framework [17]. Since its launch in 2015, the ABC[4] learning design method, developed by a team from University College London (UCL), has been widely adopted across universities in Europe and beyond. ABC Learning Design focuses not on the technology, but on the types of learning that will take place and on what students have to do in order to understand concepts. ABC LD was built on the six learning types proposed by Laurillard [18]: Acquisition, Inquiry, Collaboration, Discussion, Practice, and Production [19]. A good learning design contains a mixture of these learning types. Figure 1 below represents part of the course with the different learning types used.
Figure 1 Course content
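As an illustration, the link between Moodle activity modules and the six ABC LD learning types can be expressed as a simple lookup table. The assignment of module types below is a hypothetical sketch for illustration, not the course's actual design:

```python
# Hypothetical mapping from Moodle module types to the six ABC LD
# learning types (Laurillard). Illustrative only; an actual course
# design would assign its own modules to types.
ABC_LEARNING_TYPES = {
    "page": "Acquisition",
    "resource": "Acquisition",
    "url": "Inquiry",
    "workshop": "Collaboration",
    "forum": "Discussion",
    "quiz": "Practice",
    "assign": "Production",
}

def learning_type(module: str) -> str:
    """Return the ABC LD learning type for a Moodle module, or 'Other'."""
    return ABC_LEARNING_TYPES.get(module, "Other")
```

Such a mapping makes it straightforward to aggregate logged events by learning type when building indicators later.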
4 Data
4.1 Data Collection
The dataset used in this study consisted of teacher log files, student log files and the student gradebook. The course log files allow the faculty member to see which resources or activities have been accessed and when. They outline the date and time resources were accessed, the name of the student, the completed actions (i.e., view, add, update, or delete), the activities performed in different modules (e.g., the course, forum, or quiz), the Internet Protocol (IP) address from which the access was made, and additional information about the action [20]. Figure 2 shows a part of the course log file.
[4]Arena Blended Curriculum
Figure 2 Logs file
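As a sketch of how such a log file can be processed, the snippet below reads a log exported as CSV and counts actions per student. The column names (`student`, `action`, and so on) are assumptions for illustration; an actual Moodle export uses its own headers:

```python
import csv
from collections import Counter

def load_log_events(path):
    """Read a course log exported as CSV into a list of dict rows.
    Assumed columns: time, student, action, component, ip."""
    with open(path, newline="", encoding="utf-8") as f:
        return list(csv.DictReader(f))

def actions_per_student(events):
    """Count logged actions (view, add, update, delete) per student."""
    counts = Counter()
    for e in events:
        counts[(e["student"], e["action"])] += 1
    return counts
```

Weekly aggregates of these counts are the raw material for the indicators defined in Section 4.2.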
The dataset was completed using the course gradebook. All of the grades for each student in the course were recorded in the gradebook, or "assessor's report". The gradebook collects the items that were graded in the different parts of Moodle. It allows teachers to view and edit them, as well as sort them into categories and calculate totals in different ways. The grades are initially displayed as the raw scores of the assessments themselves, so their presentation depends on how the teacher has set them up [21]. The score can be presented raw or as a percentage, as shown in Figure 3.
Figure 3 Gradebooks file
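A minimal sketch of the raw-to-percentage conversion mentioned above; the maximum score per graded item is an input, since it depends on how the teacher configured the gradebook:

```python
def to_percentage(raw_score, max_score):
    """Convert a raw gradebook score to a percentage.
    The gradebook may display either form depending on teacher setup."""
    if max_score <= 0:
        raise ValueError("max_score must be positive")
    return round(100.0 * raw_score / max_score, 2)
```

For example, a 14.25/20 exam grade corresponds to 71.25%.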
4.2 Data Analysis
After data collection, an important question was what data and indicators to display to teachers and/or students in LA reports about student performance. In this context, Key Performance Indicators (KPIs), also known as Key Success Indicators (KSIs), are quantifiable measures that reflect, as faithfully as possible, the various critical success factors at different levels [22]. KPIs serve educators and educational institutions to identify students' attitudes and to detect students who are low-performing or at-risk. A student's key performance indicators depend deeply on the learning design activities used in the LMS course and on the data available in learning environments [23]. According to [24], "Learning analytics are only likely to effectively and reliably enhance learning outcomes if they are designed to measure and track signals that are genuine indicators of or proxies for learning". To this end, we defined five pedagogical KPIs capable of giving a global vision of the students' activities in the LMS. The indicators in question are adequate for the activities provided using the ABC learning design. The pedagogical indicators are: Connectivity, Acquisition, Productivity, Interactivity, and Reactivity.
1. Connectivity Indicator
The connectivity indicator measures the state and the degree to which students are connected to the course. It is calculated from the course consultation frequency "freqCons", the average session duration "sessionDuration" and the number of visited units "nbrOfUnits":

Connectivity = 0.1 × freqCons + 0.3 × sessionDuration + 0.6 × nbrOfUnits   (1)

2. Acquisition Indicator
The acquisition indicator specifies the student's progress in acquiring concepts. It is based on the number of achieved units "nbrOfAchvUnits" and the average score of quiz attempts "quizScore". We consider any visited course unit as achieved.

Acquisition = 0.7 × quizScore + 0.3 × nbrOfAchvUnits   (2)

3. Productivity Indicator
The productivity indicator evaluates the degree of the learner's productivity in the course. It is obtained from the workshop grade "workShGrad", the workshop peer-evaluation grade "wrkShPeerEvlG", and the assignment grade "AssigGrad".

Productivity = 0.4 × workShGrad + 0.2 × wrkShPeerEvlG + 0.4 × AssigGrad   (3)

4. Interactivity Indicator
The interactivity indicator evaluates the learner's degree of interaction and communication on the platform. It is determined using the number of the student's posts "nbrOfPost" and the number of forum consultations "nbrOfForumCons".

Interactivity = 0.8 × nbrOfPost + 0.2 × nbrOfForumCons   (4)

5. Reactivity Indicator
The reactivity indicator evaluates the degree of the learner's responsiveness to the course. It is calculated using the time of resource publication "timeOfResPub" and the time of the learner's first consultation "timeOfFirstCons".

Reactivity = timeOfFirstCons − timeOfResPub   (5)
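The five formulas above translate directly into code. In this sketch, the inputs are assumed to be already aggregated per student and per week (for example from the log and gradebook files); the weights are those of equations (1) to (5):

```python
def connectivity(freq_cons, session_duration, nbr_of_units):
    # Equation (1): weighted sum of consultation frequency,
    # average session duration and number of visited units.
    return 0.1 * freq_cons + 0.3 * session_duration + 0.6 * nbr_of_units

def acquisition(quiz_score, nbr_of_achv_units):
    # Equation (2): quiz attempts average and achieved units.
    return 0.7 * quiz_score + 0.3 * nbr_of_achv_units

def productivity(workshop_grade, peer_eval_grade, assignment_grade):
    # Equation (3): workshop, peer-evaluation and assignment grades.
    return 0.4 * workshop_grade + 0.2 * peer_eval_grade + 0.4 * assignment_grade

def interactivity(nbr_of_posts, nbr_of_forum_cons):
    # Equation (4): forum posts weigh more than forum consultations.
    return 0.8 * nbr_of_posts + 0.2 * nbr_of_forum_cons

def reactivity(time_of_first_cons, time_of_res_pub):
    # Equation (5): elapsed time between resource publication and the
    # learner's first consultation (smaller means more reactive).
    return time_of_first_cons - time_of_res_pub
```

Note that the paper does not specify the scaling of each input, so before combining them in practice each variable would need to be normalized to a common range.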
5 Overall results
5.1 Connectivity
From the connectivity graph shown in Figure 4, we notice that students' connectivity improves as they progress through the course. We also notice that the session duration and the number of visited units are correlated. The students were most connected to the course between week 45 (2020) and week 1 (2021), except for weeks 51 and 2, where the indicator decreased. The session duration varied between 30 minutes and 1h30, the number of visited units between 80 and 128, and the consultation frequency between 3 and 50 consultations per week. Because publication of new chapters had ended, weeks 3 to 12 saw a decrease in this indicator, except for weeks 5 and 10, in which students started to prepare for the first exam session. During the semester, connectivity showed six peaks, in weeks 46, 47, 50, 51, 53 and 2, which matches the number of course chapters. Week 50 was the week in which students consulted the most units: 128 units in 1 hour 30 minutes.
Figure 4 Connectivity indicator
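The claimed correlation between session duration and the number of visited units can be checked with a Pearson coefficient over the two weekly series. A minimal self-contained sketch (the series passed in would be the weekly aggregates, which are not reproduced here):

```python
from math import sqrt

def pearson(xs, ys):
    """Pearson correlation coefficient between two equal-length series.
    Assumes both series have non-zero variance."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sqrt(sum((x - mx) ** 2 for x in xs))
    sy = sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)
```

A value close to +1 over the weekly series would support the correlation observed in Figure 4.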
5.2 Acquisition
According to the graph in Figure 5, the students' acquisition can be subdivided into two periods. The first, between week 45 (2020) and week 2 (2021), represents the growth of acquisition; in this period, students read the course and did the quizzes. The acquisition peak was in week 50. The second period, between weeks 3 and 12 (2021), represents the decrease in acquisition; in this period, students did quizzes without revising or consulting the course units. All students had three attempts for each quiz, and almost all of them left one attempt for the end of the semester, to prepare for the exam. According to the same graph, we note that the quiz score is correlated with the number of achieved units.
Figure 5 Acquisition indicator
5.3 Productivity
Productivity reflects the students' ability to produce in the course. In the analysis phase, we mentioned that productivity is obtained using the workshop grade, the workshop peer evaluation and the assignment grade. In this use case, we used only the workshop peer evaluation and the workshop grade. According to the graph in Figure 6, we observe that the students were not productive until week 47. The period between week 47 and week 3 represents the period of productivity growth, varying between 30 and 51 percent. The productivity peak was in week 52. The workshop peer-evaluation rate was higher than the workshop grade, which can be explained by the fact that the group was composed of "good" evaluators. Week 6 shows a small change in student productivity; this week was a second session/chance we gave to students who had missed a workshop in the first weeks.
5.4 Interactivity
As mentioned before, the interactivity indicator gives the teacher an idea of the degree of students' interaction and communication in the course, especially through the forum. Based on the graph in Figure 7, the students were not very interactive on the platform. The students consulted the forum more than they participated in the discussion. At the end of the semester, week 5 saw a growth of student interactivity: students participated with more than five posts in the forum and consulted it more than 30 times. This situation could be explained by the students' need to get more information about the exam or to ask for clarification of certain concepts in the course. However, to confirm or reject this hypothesis, we would need a semantic analysis of the students' posts.
Figure 6 Productivity indicator
Figure 7 Interactivity indicator
5.5 Reactivity
Reactivity is an important indicator that informs the teacher about the degree of students' reactivity to the publication of course resources and activities on the platform. We notice that in almost all cases the students were not very reactive in the first weeks, but after week 48 (2020) they started to be more reactive. Week 53 shows a decrease in reactivity, which may be due to the end-of-year vacation (see Figure 8).
5.6 Analysing indicators for some students
Figure 8 Reactivity indicator

The following Figure 9 gives an overview of the five indicators (Connectivity, Acquisition, Productivity, Interactivity, and Reactivity) for six students from the sample (Std 3, Std 4, Std 26, Std 32, Std 143, and Std 145). The students were selected randomly according to their grades in the final exam, as shown in Table 1 below. The exam was held in person in week 11 (2021). To pass the exam, a student must obtain at least 10/20 points.
Table 1 Students' grades in the exam
Std Grade Results
Std 3 14.25 Validate
Std 4 06.00 Not Validate
Std 26 16.5 Validate
Std 32 09.00 Not Validate
Std 143 07.25 Not Validate
Std 145 11.00 Validate
According to the graph of student indicators (Figure 9), Std 3, Std 26 and Std 145 did not exceed 110 in connectivity, unlike Std 143, Std 4 and Std 32, which obtained 237, 190 and 160, respectively, for the same indicator. The acquisition for students who passed the exam was between 50 and 80; for those who did not pass, it was more than 120. All six students obtained a productivity indicator between 40 and 95. However, we notice that the students who passed the exam did not miss any workshop, whereas those who failed missed at least one. We also notice, from the same Figure 9, no variation in the indicators of Std 4 from week 8, which can be explained by this learner dropping out from week 8 (2021), unlike the other students, who logged into the platform up to the last three weeks.
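The flat-indicator pattern observed for Std 4 suggests a simple heuristic for spotting dropouts: flag students whose weekly indicators stop changing well before the end of the course. A hedged sketch (the three-week threshold and the per-student series format are illustrative assumptions, not part of the study):

```python
def last_active_week(weekly_indicator):
    """Index of the last week in which the indicator value changed.
    A long flat tail (as for Std 4 after week 8) may signal dropout."""
    last = 0
    for i in range(1, len(weekly_indicator)):
        if weekly_indicator[i] != weekly_indicator[i - 1]:
            last = i
    return last

def flag_dropouts(students, total_weeks, flat_weeks=3):
    """Flag students whose indicator stopped changing at least
    `flat_weeks` weeks before the end. `students` maps a student id
    to a weekly indicator series (assumed format)."""
    return [sid for sid, series in students.items()
            if last_active_week(series) <= total_weeks - 1 - flat_weeks]
```

A refinement would apply this check across all five indicators at once rather than to a single series.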
Figure 9 Evolution of indicators

6 Conclusions and perspectives
This study was conducted to investigate the use of indicators to monitor learners' progression and performance on the learning platform. To this end, Connectivity, Acquisition, Productivity, Interactivity, and Reactivity were defined as indicators. The indicators relate to course activities designed according to the ABC learning design framework. The dataset used consists of 154 instances of students registered in the third semester of the Mathematical and Computer Sciences Bachelor program, and was generated using the student log files, the gradebook file and the teacher log files. After generating the dataset, the next steps consisted of processing and analysing the data to determine the different indicators in each week, for all students or for a specific one.
At this stage of the work, the indicators showed students' progress in the course, but they are not able to determine a student's performance. To that end, we have defined three objectives for future work. The first is to use these indicators to predict student performance and identify students at risk. The second is to define the optimal values for each indicator. The third is to use those optimal values to develop a recommendation system for at-risk students, to prevent failure in the exam.
Competing interests
The authors declare that they have no competing interests.
Ethics approval and consent to participate
Not applicable
Consent for publication
Not applicable
Availability of data and material
The data and the material are available from the corresponding author on reasonable request.
Funding
The authors declare that this research was not funded and did not receive any specific grant from any funding agency in the public, commercial, or not-for-profit sectors.
Authors’ contributions
All the authors had contributed to the design and implementation of the case study, to the analysis of the results
and to the writing of the manuscript.
Not applicable
Authors’ information (optional)
Not applicable
Author details
1Computer Systems Engineering Laboratory (ISI), Computer Science Department, Faculty of Sciences Semlalia, Cadi Ayyad University (UCA), Bd. Prince My Abdellah, 40000, 2390 Marrakech, Morocco. 2Department of Computer Science, Faculty of Sciences and Techniques Gueliz (FSTG), Cadi Ayyad University (UCA), Marrakech, Morocco. 3Master of Education, Curriculum and Instruction Technology, Grand Canyon University (GCU), Arizona, United States.
1. Teräs, M., Suoranta, J., Teräs, H., Curcher, M.: Post-Covid-19 Education and Education Technology 'Solutionism': a Seller's Market. Postdigital Science and Education, 1–16 (2020).
2. Qazdar, A., Cherkaoui, C., Er-Raha, B., Mammass, D.: AeLF: Mixing Adaptive Learning System with Learning
Management System. International Journal of Computer Applications 119(15), 1–8 (2015).
3. Tsai, C.W., Lai, C.F., Chao, H.C., Vasilakos, A.V.: Big data analytics: a survey. Journal of Big Data 2(1), 1–32
(2015). doi:10.1186/S40537-015-0030-3/TABLES/3
4. Qazdar, A., Er-Raha, B., Cherkaoui, C., Mammass, D.: A machine learning algorithm framework for predicting
students performance: A case study of baccalaureate students in Morocco. Education and Information
Technologies 24(6) (2019). doi:10.1007/s10639-019-09946-8
5. Rotolo, D., Hicks, D., Martin, B.R.: What is an emerging technology? Research Policy 44(10), 1827–1843
(2015). doi:10.1016/j.respol.2015.06.006. 1503.00673
6. Banihashem, S.K., Aliabadi, K., Pourroostaei Ardakani, S., Delaver, A., Nili Ahmadabadi, M.: Learning
analytics: A systematic literature review. Interdisciplinary Journal of Virtual Learning in Medical Sciences 9(2)
7. Chui, K.T., Fung, D.C.L., Lytras, M.D., Lam, T.M.: Predicting at-risk university students in a virtual learning
environment via a machine learning algorithm. Computers in Human Behavior 107, 105584 (2020).
8. Siemens, G.: What are learning analytics? elearnspace. Retrieved from http://www.elearnspace.org/blog/2010/08/25/what-are-learning-analytics/ (2010)
9. Long, P.: LAK'11: Proceedings of the 1st International Conference on Learning Analytics and Knowledge, February 27–March 1, 2011, Banff, Alberta, Canada. ACM (2011)
10. Hasan, R., Palaniappan, S., Mahmood, S., Abbas, A., Sarker, K.U., Sattar, M.U.: Predicting Student
Performance in Higher Educational Institutions Using Video Learning Analytics and Data Mining Techniques.
Applied Sciences 2020, Vol. 10, Page 3894 10(11), 3894 (2020). doi:10.3390/APP10113894
11. Rienties, B., Toetenel, L.: The impact of 151 learning designs on student satisfaction and performance: Social
learning (analytics) matters. ACM International Conference Proceeding Series 25-29-April-2016, 339–343
(2016). doi:10.1145/2883851.2883875
12. Costa, L.A., Salvador, L.N., Amorim, R.R.: Evaluation of Academic Performance Based on Learning Analytics
and Ontology: A Systematic Mapping Study. Proceedings - Frontiers in Education Conference, FIE 2018-Octob
(2019). doi:10.1109/FIE.2018.8658936
13. Azcona, D., Corrigan, O., Scanlon, P., Smeaton, A.F.: Innovative Learning Analytics Research at a Data-driven HEI. In: Proceedings of the 3rd International Conference on Higher Education Advances. Universitat Politècnica de València, Valencia (2017). doi:10.4995/HEAD17.2017.5245
14. Riquelme, F., Munoz, R., Mac Lean, R., Villarroel, R., Barcelos, T.S., de Albuquerque, V.H.C.: Using
multimodal learning analytics to study collaboration on discussion groups. Universal Access in the Information
Society 18(3), 633–643 (2019)
15. Arnold, K.E., Pistilli, M.D.: Course signals at purdue: Using learning analytics to increase student success. In:
Proceedings of the 2nd International Conference on Learning Analytics and Knowledge, pp. 267–270 (2012)
16. Majumdar, R., Akçapınar, A., Akçapınar, G., Ogata, H., Flanagan, B.: LAViEW: Learning analytics dashboard towards evidence-based education. In: Companion Proceedings of the 9th International Conference on Learning Analytics and Knowledge (2019). Society for Learning Analytics Research (SoLAR)
17. Young, C.P., Perović, N.: ABC LD – a new toolkit for rapid learning design. In: European Distance Education Network (EDEN) Conference 2020 (2020)
18. Laurillard, D.: Teaching as a Design Science: Building Pedagogical Patterns for Learning and Technology. Routledge (2013)
19. ABC LD: The 6 Learning Types. ABC Learning Design (2020). Accessed
20. Moodle: Logs - MoodleDocs (2020). Accessed 2021-11-09
21. Moodle: Grader report - MoodleDocs (2021). Accessed
22. Jia, H., Wang, M., Ran, W., Yang, S.J.H., Liao, J., Chiu, D.K.W.: Design of a performance-oriented workplace
e-learning system using ontology. Expert Systems with Applications 38(4), 3372–3382 (2011).
23. Bussu, A., Detotto, C., Serra, L.: Indicators to prevent university drop-out and delayed graduation: an italian
case. Journal of Applied Research in Higher Education (2019)
24. Wilson, A., Watson, C., Thompson, T.L., Drew, V., Doyle, S.: Learning analytics: Challenges and limitations.
Teaching in Higher Education 22(8), 991–1007 (2017)
In Portugal, the demand for technology professionals is higher than the existing supply, and the gap is expected to widen. This program aims to retrain a wide range of unemployed professionals who have the potential to develop a career in the technological area. The program involves a training process of 3-6 months, taught in an educational institution, with the aim of preparing people to start a new profession in technical areas. Trainers at polytechnic institutes face a challenge in defining the training actions to consider and the corresponding training contents. To achieve the main objective of this program, the chapter intends to demonstrate that the application of PBL can contribute to and facilitate the accelerated development of skills in technological and innovation areas. The aim is to help the trainers define the courses and choose the best strategy to evaluate the students' learning process. The chapter includes (1) a description of the "UPSkills" program and PBL, (2) a description of how to apply PBL, and (3) a course design proposal.
ABC Learning Design (ABC LD) is a high-energy, hands-on curriculum development workshop from University College London (UCL). In just 90 minutes, teaching teams work together to create a visual "storyboard". The storyboard is made up of pre-printed cards representing the type and sequence of learning activities (both online and offline) required to meet the module or programme learning outcomes. All the resources have been released under Creative Commons licenses and are free to download, adapt and use. ABC LD is now popular across European tertiary education and beyond. Participants have found the workshop-based "sprint" approach to be quick, engaging and productive. The original UCL or "base" ABC LD is built around a collaborative and intensive 90-minute workshop in which module teams work together to produce a paper-based storyboard describing the student journey. Over the last two years, UCL has led an Erasmus+ project to develop and evaluate the ABC LD method with 12 partners. We have focused on localisation to institutional contexts and have explored the important link between storyboard designs and the Virtual Learning Environment. The main output is a freely downloadable Toolkit of resources and guides, enabling any college or university to adapt and adopt the method. Although developed to promote blended learning, during the COVID emergency some institutions have modified ABC LD to be facilitated remotely, to support their need for a rapid transition to online learning. ABC LD is proving an effective method in this new format, too.
The Covid-19 pandemic and the social distancing that followed have affected all walks of society, including education. In order to keep education running, educational institutions have had to adapt quickly to the situation. This has resulted in an unprecedented push to online learning. Many, including commercial digital learning platform providers, have rushed to provide their support and 'solutions', sometimes for free. The Covid-19 pandemic has therefore also created a sellers' market in ed-tech. This paper employs a critical lens to reflect on the possible problems arising from hasty adoption of commercial digital learning solutions whose design might not always be driven by best pedagogical practices but by a business model that leverages user data for profit-making. Moreover, already before Covid-19 there was increasing critique of how ed-tech is redefining and reducing the concepts of teaching and learning. The paper also challenges the narrative that claims 'education is broken, and it should and can be fixed with technology'. Such technologization, often seen as neutral, is closely related to educationalization, i.e. imposing growing societal problems on education to resolve. Therefore, this is a critical moment to reflect on how the choices educational institutions are currently making might affect education and online learning during and after Covid-19: will they reinforce a capitalist, instrumental view of education or promote holistic human growth? This paper urges educational leaders to think carefully about the decisions they are currently making and whether those decisions indeed pave the way to a desirable future of education.
Technology and innovation empower higher educational institutions (HEI) to use different types of learning systems—video learning is one such system. Analyzing the footprints left behind from these online interactions is useful for understanding the effectiveness of this kind of learning. Video-based learning with flipped teaching can help improve students' academic performance. This study was carried out with records of 772 students registered in e-commerce and e-commerce technologies modules at an HEI. The study aimed to predict students' overall performance at the end of the semester using video learning analytics and data mining techniques. Data from the student information system, learning management system and mobile applications were analyzed using eight different classification algorithms. Furthermore, data transformation and preprocessing techniques were carried out to reduce the features. Moreover, genetic search and principal component analysis were carried out to further reduce the features. Additionally, the CN2 Rule Inducer and multivariate projection can be used to assist faculty in interpreting the rules to gain insights into student interactions. The results showed that Random Forest accurately predicted successful students at the end of the class, with an accuracy of 88.3% when combined with equal-width discretization and the information gain ratio.
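The information gain ratio mentioned in the abstract above is straightforward to compute for a categorical feature. A minimal stdlib-only sketch follows; the feature values and class labels are invented toy data, not from the study:

```python
import math
from collections import Counter

def entropy(labels):
    """Shannon entropy of a sequence of discrete labels."""
    n = len(labels)
    return -sum((c / n) * math.log2(c / n) for c in Counter(labels).values())

def gain_ratio(feature, labels):
    """Information gain of `feature` w.r.t. `labels`, normalised by the
    feature's own split information (its entropy)."""
    n = len(labels)
    remainder = 0.0
    for value in set(feature):
        subset = [lab for f, lab in zip(feature, labels) if f == value]
        remainder += len(subset) / n * entropy(subset)
    gain = entropy(labels) - remainder
    split_info = entropy(feature)
    return gain / split_info if split_info > 0 else 0.0

# Toy data: video-watching level vs. pass/fail outcome (hypothetical)
watched = ["high", "high", "low", "low", "high", "low"]
passed = ["pass", "pass", "fail", "fail", "pass", "fail"]
print(round(gain_ratio(watched, passed), 3))  # prints 1.0: the toy feature perfectly separates the classes
```

In practice such a score would be computed for every candidate feature, keeping only the highest-scoring ones before training the classifier.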
The use of machine learning with educational data mining (EDM) to predict learner performance has always been an important research area. Predicting academic results is one of the solutions that aims to monitor the progress of students and anticipate those at risk of failing their academic pathways. In this paper, we present a framework for predicting student performance based on machine learning algorithms at H.E.K high school in Morocco from 2016 to 2018. The proposed model was analyzed and tested using student data collected from the school management system "MASSAR" (SMS-MASSAR). The dataset used in this study concerns 478 physics students during the school years 2015–2016, 2016–2017 and 2017–2018. The predictive performance results showed that our model can make more precise predictions of student performance.
Educators face significant challenges related to monitoring academic performance and providing instruction in online courses. Due to spatial, temporal and interactive distance, this type of education requires that students have greater dedication, responsibility, and self-regulation of learning so that educational goals can be met. In this sense, educators seek instruments to consistently monitor and evaluate the student’s academic performance. The present work aims at systematically mapping studies that use techniques of Learning Analytics and Computational Ontologies, allied to a Taxonomy of Educational Objectives, to monitor the student’s state of knowledge. The evidence, methods, ontologies and taxonomies found will assist in the development of an educational software architecture that aims to assist educators in supervising academic performance. Three hundred and twenty studies were retrieved, of which we identified 21 (6.56%) works relevant to this research. This mapping shows that there is a research gap regarding the coordinated use of Learning Analytics and Ontologies in the field of learning assessment.
Full-text available
Nowadays, companies and organizations require highly competitive professionals who have the necessary skills to confront new challenges. However, current evaluation techniques do not allow detection of skills that are valuable in the work environment, such as collaboration, teamwork, and effective communication. Multimodal Learning Analytics is a prominent discipline related to the analysis of several modalities of natural communication (e.g., speech, writing, gestures, sight) during educational processes. The main aim of this work is to develop a computational environment to both analyze and visualize student discussion groups working in a collaborative way to accomplish a task. ReSpeaker devices were used to collect speech data from students, and the collected data were modelled by using influence graphs. Three centrality measures were defined, namely permanence, persistence, and prompting, to measure the activity of each student and the influence exerted between them. As a proof of concept, we carried out a case study made up of eleven groups of undergraduate students who had to solve an engineering problem with everyday materials. Thus, we show that our system makes it possible to find and visualize non-trivial information regarding interrelations between subjects in collaborative working groups; moreover, this information can help to support complex decision-making processes.
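The influence-graph idea can be illustrated with plain weighted degree measures over a directed graph. Note this is a loose, hypothetical analogue of "activity" and "prompting"-style measures, not the paper's actual definitions of permanence, persistence and prompting; the student names and edge weights are invented:

```python
from collections import defaultdict

# Hypothetical influence graph: edge (a, b, w) means student `a` spoke
# w times immediately after student `b` during the discussion.
edges = [("ana", "ben", 3), ("ben", "ana", 1), ("carla", "ana", 4),
         ("ben", "carla", 2), ("ana", "carla", 1)]

def out_strength(edges):
    """Total weight of outgoing edges per node: how often a student
    responds to others (a simple activity measure)."""
    strength = defaultdict(int)
    for src, _dst, w in edges:
        strength[src] += w
    return dict(strength)

def in_strength(edges):
    """Total weight of incoming edges per node: how often a student's
    turns draw responses from others (a simple prompting-style measure)."""
    strength = defaultdict(int)
    for _src, dst, w in edges:
        strength[dst] += w
    return dict(strength)

print(out_strength(edges))
print(in_strength(edges))
```

Richer centralities (betweenness, eigenvector) follow the same pattern of aggregating over the weighted edge list.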
Learning analytics is considered as the third wave in educational technology, and it is a new and promising field of study. This study was conducted to clarify the benefits and challenges of learning analytics in education. Cooper's systematic literature review was used as the research method. This method has five steps as follows: a) formulation of the problem, b) collection of data, c) evaluation of the appropriateness of the data, d) analysis and interpretation of relevant data, and e) organization and presentation of the results. Based on the study selection process, 36 articles were finally selected to be analyzed. The results showed that ethics and privacy were among the most important challenges of learning analytics in education, along with the lack of attention to theoretical foundations and the scope and quality of data. The results also showed that learning analytics could bring remarkable benefits for education, such as increased engagement of students, improvement of learning outcomes, identification of students at risk, providing real-time feedback, and personalization of learning. Based on the results, it can be concluded that learning analytics offer new insights in education; however, there are ethical, educational, and technical issues in the use of learning analytics in education.
A university campus is comprised of Schools and Faculties attended by students whose primary intention is to learn and ultimately graduate with their desired qualification. From the moment students apply to a university and thereafter gain acceptance and attend the campus they create a unique digital footprint of themselves within the university IT systems. Students’ digital footprints are a source of data that is of interest to groups including teachers, analysts, administrators and policy makers in the education, sociology, and pedagogy domains. Learning analytics can offer tools to mine such data producing actionable knowledge for purposes of improving student retention, curriculum enhancement, student progress and feedback, and administrative evolution. In this paper, we summarise three ongoing Learning Analytics projects from an Irish university, demonstrating the potential that exists to enhance Higher Education pedagogical approaches. First year students often struggle with making the transition into University as they adapt to life and study at a Higher Education Institution. The research projects in the area of Learning Analytics at our institution focus on: improving test performance using analytics from a general-purpose VLE like Moodle, identifying studying groups and the performance peer effect using on-campus geolocation data, and detecting lower-performing or at-risk students on programming modules.
Purpose: Research on the association between individual characteristics of undergraduate students, drop-out and delayed graduation is still evolving; therefore, further evidence is required. The paper aims to discuss this issue.
Design/methodology/approach: This paper reports on an empirical study examining the relationship between students' individual characteristics and delayed graduation. The analysis is based on a sample of 1,167 students who have registered on and completed a full-time undergraduate programme in Italy. Using a Probit model, the findings document the individual, background and environmental indicators that play a role in explaining delayed graduation.
Findings: The study observes that students who commute to university perform better than those residing on campus. Other factors increasing the probability of completing the undergraduate programme on time include individual characteristics (e.g. gender and age), student background (family income, education), institutional environment (teaching and research quality) and student satisfaction. Finally, some policy implications are discussed.
Social implications: A direct policy implication of these findings is that supporting academic staff in order to enhance their performance in both research and teaching has a positive effect on the performance of the students.
Originality/value: This paper contributes to the debate on the impact of institutional quality on students' performance, aiming to address the question of balance between teaching and research orientation.
A university education is widely considered essential to social advancement. Ensuring that students pass their courses and graduate on time has thus become an issue of concern. This paper proposes a reduced training vector-based support vector machine (RTV-SVM) capable of predicting at-risk and marginal students. It also removes redundant training vectors to reduce the training time and the number of support vectors. To examine the effectiveness of the proposed RTV-SVM, 32,593 university students on seven courses were chosen for performance evaluation. Analysis reveals that the RTV-SVM achieved a training vector reduction of at least 59.7% without altering the margin or accuracy of the classifier. Moreover, the results showed the proposed method to be capable of achieving overall accuracy of 92.2–93.8% and 91.3–93.5% in predicting at-risk and marginal students, respectively.
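The at-risk prediction task described above can be illustrated with a deliberately simple stand-in for the paper's RTV-SVM: a nearest-centroid classifier over two features. All feature names, numbers and labels below are invented for illustration and do not come from the study:

```python
import math

# Hypothetical training data: (assessment average, weekly LMS logins) -> label
train = [((85, 12), "ok"), ((78, 9), "ok"), ((90, 15), "ok"),
         ((40, 2), "at_risk"), ((55, 3), "at_risk"), ((48, 1), "at_risk")]

def centroids(samples):
    """Mean feature vector per class label."""
    sums, counts = {}, {}
    for (x, y), label in samples:
        sx, sy = sums.get(label, (0.0, 0.0))
        sums[label] = (sx + x, sy + y)
        counts[label] = counts.get(label, 0) + 1
    return {label: (sx / counts[label], sy / counts[label])
            for label, (sx, sy) in sums.items()}

def predict(point, cents):
    """Assign the class whose centroid is nearest in Euclidean distance."""
    return min(cents, key=lambda label: math.dist(point, cents[label]))

cents = centroids(train)
print(predict((50, 2), cents))   # low marks, little LMS activity
print(predict((88, 11), cents))  # strong marks, regular LMS activity
```

A real SVM additionally learns a maximum-margin boundary, and the RTV-SVM's contribution is discarding training vectors that cannot affect that margin; the sketch only conveys the shape of the classification task.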